Patents by Inventor John J. Andersen
John J. Andersen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11836592
Abstract: Systems and methods for a cognitive system to interact with a user are provided. Aspects include receiving a cognitive system profile and observational data associated with the user. Environmental data associated with the user is received, and features are extracted from the observational data and the environmental data. The features are stored in the user profile and analyzed to determine a situational context for each of the features based on the cognitive system profile and the user profile. Trigger events are identified based on the situational context for each of the features. One or more proposed actions are determined based at least in part on the one or more trigger events. At least one action is initiated from the one or more proposed actions and is stored in the user profile along with the one or more trigger events and the one or more features.
Type: Grant
Filed: December 15, 2017
Date of Patent: December 5, 2023
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Rob High, Leah Lawrence, Jennifer Sukis, Wilson Wu
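The pipeline this abstract describes (features extracted from observational and environmental data, trigger events identified from them, proposed actions initiated and stored in the user profile) can be sketched as follows. The function names, rule predicates, and action map are illustrative assumptions, not the patented implementation:

```python
def extract_features(observational, environmental):
    """Merge observational and environmental data into one feature dict."""
    return {**observational, **environmental}

def identify_triggers(features, rules):
    """A trigger event fires when a feature's value satisfies its rule."""
    return [name for name, predicate in rules.items()
            if predicate(features.get(name))]

def propose_actions(triggers, action_map):
    """Map each fired trigger event to a proposed action."""
    return [action_map[t] for t in triggers if t in action_map]

# Hypothetical example data: user profile accumulates features,
# trigger events, and initiated actions, as the abstract describes.
profile = {}
features = extract_features({"heart_rate": 110}, {"ambient_temp_c": 35})
rules = {"heart_rate": lambda v: v is not None and v > 100,
         "ambient_temp_c": lambda v: v is not None and v > 30}
triggers = identify_triggers(features, rules)
actions = propose_actions(triggers, {"heart_rate": "suggest_rest",
                                     "ambient_temp_c": "suggest_hydration"})
profile.update({"features": features, "triggers": triggers, "actions": actions})
```

Here both rules fire, so both proposed actions are recorded in the profile alongside the trigger events and features.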
-
Patent number: 11016729
Abstract: Mechanisms are provided, in a data processing system comprising a fusion sensor service and a human computer interaction (HCI) device, for responding to a user input. The HCI device receives a user input from a first sensor monitoring a monitored environment. The fusion sensor service captures, via one or more second sensors different from the first sensor, sensor data representing characteristics of one or more entities within the monitored environment indicative of a condition within the monitored environment. The fusion sensor service determines whether the user input is specifically directed to the HCI device based on the captured sensor data. The HCI device performs an operation in response to the data processing system determining that the user input is specifically directed to the HCI device based on the captured sensor data.
Type: Grant
Filed: November 8, 2017
Date of Patent: May 25, 2021
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Dogukan Erenel, Richard O. Lyle, Connie Yee
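A minimal sketch of the fusion step, deciding from second-sensor data whether the input is directed at the HCI device, might look like the following. The weighted-average fusion, the sensor names, and the threshold are assumptions for illustration, not the patented mechanism:

```python
def is_directed_at_device(sensor_scores, weights, threshold=0.5):
    """Fuse per-sensor confidence scores (0..1) into a weighted average
    and decide whether the user input is directed at the device."""
    total_weight = sum(weights[s] for s in sensor_scores)
    fused = sum(sensor_scores[s] * weights[s]
                for s in sensor_scores) / total_weight
    return fused >= threshold, fused

# Hypothetical second-sensor readings: gaze, head pose, and proximity
# each contribute evidence that the input targets the device.
weights = {"gaze": 0.5, "head_pose": 0.3, "proximity": 0.2}
directed, score = is_directed_at_device(
    {"gaze": 0.9, "head_pose": 0.8, "proximity": 0.4}, weights)
```

In this example the fused score is 0.77, above the threshold, so the device would perform the requested operation.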
-
Patent number: 10740683
Abstract: Technical solutions are described for visually depicting health of a cognitive system. An example computer-implemented method includes accessing a query-log of a question input to the cognitive system. The method also includes generating a query-node corresponding to the question. The method also includes configuring animation parameters of the query-node based on the query-log. The method also includes displaying the query-node according to the animation parameters.
Type: Grant
Filed: July 29, 2016
Date of Patent: August 11, 2020
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Jason E. Doucette, Sanjay F. Kottaram, Robert L. Turknett, Jr., Wilson L. Wu
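The step of configuring a query-node's animation parameters from its query-log entry could be sketched as below. The specific mapping (drift speed from latency, color from answer confidence) and the field names are assumptions, not the patented design:

```python
def animation_parameters(query_log_entry):
    """Derive query-node animation parameters from one query-log entry."""
    latency = query_log_entry["latency_ms"]
    confidence = query_log_entry["confidence"]
    return {
        # Illustrative mapping: slower queries drift more slowly.
        "speed": max(0.1, 1000.0 / latency),
        # Illustrative mapping: low-confidence answers render red.
        "color": "green" if confidence >= 0.7 else "red",
        "label": query_log_entry["question"][:32],
    }

node = animation_parameters(
    {"question": "What is the boiling point of water?",
     "latency_ms": 250, "confidence": 0.92})
```

A display layer would then animate each query-node according to these parameters.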
-
Patent number: 10692498
Abstract: An approach is provided that identifies a current mood state of a requestor from whom a question has been submitted to a question answering (QA) system. The approach determines, based on the identified mood state, an urgency associated with the requestor. Data pertaining to a number of candidate answers generated by the QA system is analyzed, resulting in an urgency characteristic that pertains to each of the candidate answers. Scoring of the candidate answers is adjusted based on a comparison of the requestor's urgency and the urgency characteristic associated with the candidate answers. Answers are selected from the candidate answers based on the adjusted scoring and returned to the requestor.
Type: Grant
Filed: October 23, 2017
Date of Patent: June 23, 2020
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Dogukan Erenel, Richard O. Lyle, Ajiemar D. Santiago, Wilson L. Wu
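The scoring adjustment, boosting candidate answers whose urgency characteristic matches the requestor's urgency, can be sketched as follows. The boost formula and the example candidates are assumptions for illustration, not the patented scoring method:

```python
def adjust_scores(candidates, requestor_urgency):
    """Re-score (answer, base_score, answer_urgency) tuples: candidates
    whose urgency matches the requestor's keep their score; mismatched
    candidates are penalized. Returns answers ranked best-first."""
    adjusted = []
    for answer, base_score, answer_urgency in candidates:
        match = 1.0 - abs(requestor_urgency - answer_urgency)
        adjusted.append((answer, base_score * (0.5 + 0.5 * match)))
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)

# Hypothetical example: an anxious requestor (urgency 0.9) lifts the
# urgent answer above an otherwise higher-scoring routine answer.
ranked = adjust_scores(
    [("Call emergency services", 0.80, 0.9),
     ("Schedule a checkup", 0.85, 0.2)],
    requestor_urgency=0.9)
```

The urgent candidate ends up ranked first despite its lower base score, which is the behavior the abstract describes.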
-
Patent number: 10685648
Abstract: Mechanisms are provided, in a smart speaker system having at least one smart speaker device comprising an audio capture device, and smart speaker system logic, for processing audio sample data captured by the audio capture device. The audio capture device captures an audio sample from a monitored environment and one or more sensor devices capture sensor data representing non-verbal attention indicators associated with a speaker of a speech portion of the audio sample. The smart speaker system logic evaluates the non-verbal attention indicators of the sensor data to determine whether or not the speech portion of the audio sample is directed to the smart speaker device. In response to determining that the speech portion of the audio sample is directed to the smart speaker device, a cognitive system associated with the smart speaker system generates a response to the speech portion.
Type: Grant
Filed: November 8, 2017
Date of Patent: June 16, 2020
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Dogukan Erenel, Richard O. Lyle, Connie Yee
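The non-verbal gate this abstract describes, responding only when attention indicators show the speech was meant for the device, might be sketched like this. The indicator names and the simple majority-vote rule are illustrative assumptions:

```python
def speech_directed_at_speaker(indicators):
    """Treat speech as directed at the device when a majority of the
    non-verbal attention indicators are present."""
    positive = sum(1 for value in indicators.values() if value)
    return positive * 2 > len(indicators)

def respond(speech_text, indicators):
    """Generate a response only for speech directed at the device;
    side conversation is ignored."""
    if speech_directed_at_speaker(indicators):
        return f"Responding to: {speech_text}"
    return None

# Hypothetical indicator readings from companion sensors.
reply = respond("what's the weather?",
                {"facing_device": True, "eye_contact": True,
                 "gesturing": False})
ignored = respond("pass the salt",
                  {"facing_device": False, "eye_contact": False,
                   "gesturing": True})
```

The first utterance passes the gate and gets a response; the second is treated as conversation between people and dropped.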
-
Patent number: 10679398
Abstract: Technical solutions are described for representing health of a cognitive system. An example computer-implemented method includes displaying an animated set of icons, each icon representing a question input to the cognitive system. Each icon has a respective movement pattern. The computer-implemented method also includes receiving a selection of a first icon from the animated set of icons. The computer-implemented method also includes, in response, identifying a category of a first question corresponding to the first icon, determining a subset of the icons corresponding to questions from the category, and displaying connectors between the subset of the icons.
Type: Grant
Filed: July 29, 2016
Date of Patent: June 9, 2020
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Jason E. Doucette, Sanjay F. Kottaram, Robert L. Turknett, Jr., Wilson L. Wu
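The selection-response step, finding the icons whose questions share the selected icon's category and forming connectors between them, can be sketched as follows; the data shapes and example questions are illustrative assumptions:

```python
def connectors_for_selection(icons, selected_id):
    """Given {icon_id: {"question": ..., "category": ...}}, return
    connector pairs linking the selected icon to every other icon
    whose question belongs to the same category."""
    category = icons[selected_id]["category"]
    subset = [i for i, icon in icons.items()
              if icon["category"] == category]
    return [(selected_id, other) for other in subset
            if other != selected_id]

# Hypothetical icon set: two account questions and one billing question.
icons = {1: {"question": "Reset my password?", "category": "account"},
         2: {"question": "Change my email?", "category": "account"},
         3: {"question": "What is my bill?", "category": "billing"}}
links = connectors_for_selection(icons, selected_id=1)
```

Selecting icon 1 yields a single connector to icon 2, the only other question in the "account" category.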
-
Publication number: 20190188552
Abstract: Systems and methods for a cognitive system to interact with a user are provided. Aspects include receiving a cognitive system profile and observational data associated with the user. Environmental data associated with the user is received, and features are extracted from the observational data and the environmental data. The features are stored in the user profile and analyzed to determine a situational context for each of the features based on the cognitive system profile and the user profile. Trigger events are identified based on the situational context for each of the features. One or more proposed actions are determined based at least in part on the one or more trigger events. At least one action is initiated from the one or more proposed actions and is stored in the user profile along with the one or more trigger events and the one or more features.
Type: Application
Filed: December 15, 2017
Publication date: June 20, 2019
Inventors: John J. Andersen, Rob High, Leah Lawrence, Jennifer Sukis, Wilson Wu
-
Publication number: 20190138268
Abstract: Mechanisms are provided, in a data processing system comprising a fusion sensor service and a human computer interaction (HCI) device, for responding to a user input. The HCI device receives a user input from a first sensor monitoring a monitored environment. The fusion sensor service captures, via one or more second sensors different from the first sensor, sensor data representing characteristics of one or more entities within the monitored environment indicative of a condition within the monitored environment. The fusion sensor service determines whether the user input is specifically directed to the HCI device based on the captured sensor data. The HCI device performs an operation in response to the data processing system determining that the user input is specifically directed to the HCI device based on the captured sensor data.
Type: Application
Filed: November 8, 2017
Publication date: May 9, 2019
Inventors: John J. Andersen, Dogukan Erenel, Richard O. Lyle, Connie Yee
-
Publication number: 20190139541
Abstract: Mechanisms are provided, in a smart speaker system having at least one smart speaker device comprising an audio capture device, and smart speaker system logic, for processing audio sample data captured by the audio capture device. The audio capture device captures an audio sample from a monitored environment and one or more sensor devices capture sensor data representing non-verbal attention indicators associated with a speaker of a speech portion of the audio sample. The smart speaker system logic evaluates the non-verbal attention indicators of the sensor data to determine whether or not the speech portion of the audio sample is directed to the smart speaker device. In response to determining that the speech portion of the audio sample is directed to the smart speaker device, a cognitive system associated with the smart speaker system generates a response to the speech portion.
Type: Application
Filed: November 8, 2017
Publication date: May 9, 2019
Inventors: John J. Andersen, Dogukan Erenel, Richard O. Lyle, Connie Yee
-
Publication number: 20190122667
Abstract: An approach is provided that identifies a current mood state of a requestor from whom a question has been submitted to a question answering (QA) system. The approach determines, based on the identified mood state, an urgency associated with the requestor. Data pertaining to a number of candidate answers generated by the QA system is analyzed, resulting in an urgency characteristic that pertains to each of the candidate answers. Scoring of the candidate answers is adjusted based on a comparison of the requestor's urgency and the urgency characteristic associated with the candidate answers. Answers are selected from the candidate answers based on the adjusted scoring and returned to the requestor.
Type: Application
Filed: October 23, 2017
Publication date: April 25, 2019
Inventors: John J. Andersen, Dogukan Erenel, Richard O. Lyle, Ajiemar D. Santiago, Wilson L. Wu
-
Patent number: 10254914
Abstract: An approach is provided that selects three attributes that correspond to objects included in a dataset, where each of the three attributes is assigned to a different coordinate value (x, y, and z coordinates). The approach creates a simulated three dimensional (3D) scene of the objects on a display screen by using the x, y, and z coordinate values corresponding to the attributes of each of the objects. The approach further displays, on a 2D screen, a gyroscopic graphical user interface (GUI) that provides three dimensional (3D) control of the simulated 3D scene. In the approach, a gesture from a user is received at the gyroscopic GUI. Responsively, the approach adjusts the 3D scene displayed on the 2D screen based on the gesture that was received.
Type: Grant
Filed: August 29, 2016
Date of Patent: April 9, 2019
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Jacob A. Daigle, Jason E. Doucette, Wilson L. Wu
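The attribute-to-axis projection this abstract describes, assigning three chosen attributes of each dataset object to x, y, and z coordinates, can be sketched as follows; the attribute names and example records are illustrative assumptions:

```python
def to_scene(objects, x_attr, y_attr, z_attr):
    """Project dataset objects into simulated 3D scene points by
    assigning the three chosen attributes to the x, y, and z axes."""
    return [{"id": obj["id"],
             "x": obj[x_attr], "y": obj[y_attr], "z": obj[z_attr]}
            for obj in objects]

# Hypothetical dataset: products plotted by price, weight, and rating.
scene = to_scene(
    [{"id": "a", "price": 10, "weight": 2, "rating": 4.5},
     {"id": "b", "price": 25, "weight": 1, "rating": 3.0}],
    x_attr="price", y_attr="weight", z_attr="rating")
```

A renderer would then draw these points as the simulated 3D scene, with the gyroscopic GUI controlling the viewing angle.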
-
Patent number: 10168856
Abstract: An approach is provided that displays, on a two dimensional (2D) screen, a gyroscopic graphical user interface (GUI). The gyroscopic GUI provides three dimensional (3D) control of a simulated 3D scene displayed on the 2D screen. In the approach, a gesture from a user is received at the gyroscopic GUI. Responsively, the approach adjusts the 3D scene displayed on the 2D screen based on the gesture that was received.
Type: Grant
Filed: August 29, 2016
Date of Patent: January 1, 2019
Assignee: International Business Machines Corporation
Inventors: John J. Andersen, Jacob A. Daigle, Jason E. Doucette, Wilson L. Wu
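One way the gesture-to-adjustment step could work, translating a 2D drag into a rotation of the simulated 3D scene, is sketched below. The axis mapping and the degrees-per-pixel factor are assumptions for illustration, not the patented control scheme:

```python
def apply_gesture(rotation, drag_dx, drag_dy, degrees_per_pixel=0.25):
    """Adjust scene rotation from a 2D drag gesture: a horizontal drag
    spins the scene about the y axis, a vertical drag about the x axis.
    Angles wrap at 360 degrees."""
    return {"x": (rotation["x"] + drag_dy * degrees_per_pixel) % 360,
            "y": (rotation["y"] + drag_dx * degrees_per_pixel) % 360}

# Hypothetical gesture: drag 120 px right and 40 px up.
rotation = apply_gesture({"x": 0.0, "y": 0.0}, drag_dx=120, drag_dy=-40)
```

The scene would then be redrawn on the 2D screen at the new rotation, giving the gyroscopic-GUI behavior the abstract describes.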
-
Publication number: 20180059899
Abstract: An approach is provided that displays, on a two dimensional (2D) screen, a gyroscopic graphical user interface (GUI). The gyroscopic GUI provides three dimensional (3D) control of a simulated 3D scene displayed on the 2D screen. In the approach, a gesture from a user is received at the gyroscopic GUI. Responsively, the approach adjusts the 3D scene displayed on the 2D screen based on the gesture that was received.
Type: Application
Filed: August 29, 2016
Publication date: March 1, 2018
Inventors: John J. Andersen, Jacob A. Daigle, Jason E. Doucette, Wilson L. Wu
-
Publication number: 20180059900
Abstract: An approach is provided that selects three attributes that correspond to objects included in a dataset, where each of the three attributes is assigned to a different coordinate value (x, y, and z coordinates). The approach creates a simulated three dimensional (3D) scene of the objects on a display screen by using the x, y, and z coordinate values corresponding to the attributes of each of the objects. The approach further displays, on a 2D screen, a gyroscopic graphical user interface (GUI) that provides three dimensional (3D) control of the simulated 3D scene. In the approach, a gesture from a user is received at the gyroscopic GUI. Responsively, the approach adjusts the 3D scene displayed on the 2D screen based on the gesture that was received.
Type: Application
Filed: August 29, 2016
Publication date: March 1, 2018
Inventors: John J. Andersen, Jacob A. Daigle, Jason E. Doucette, Wilson L. Wu
-
Publication number: 20180032873
Abstract: Technical solutions are described for visually depicting health of a cognitive system. An example computer-implemented method includes accessing a query-log of a question input to the cognitive system. The method also includes generating a query-node corresponding to the question. The method also includes configuring animation parameters of the query-node based on the query-log. The method also includes displaying the query-node according to the animation parameters.
Type: Application
Filed: July 29, 2016
Publication date: February 1, 2018
Inventors: John J. Andersen, Jason E. Doucette, Sanjay F. Kottaram, Robert L. Turknett, Jr., Wilson L. Wu
-
Publication number: 20180032231
Abstract: Technical solutions are described for representing health of a cognitive system. An example computer-implemented method includes displaying an animated set of icons, each icon representing a question input to the cognitive system. Each icon has a respective movement pattern. The computer-implemented method also includes receiving a selection of a first icon from the animated set of icons. The computer-implemented method also includes, in response, identifying a category of a first question corresponding to the first icon, determining a subset of the icons corresponding to questions from the category, and displaying connectors between the subset of the icons.
Type: Application
Filed: July 29, 2016
Publication date: February 1, 2018
Inventors: John J. Andersen, Jason E. Doucette, Sanjay F. Kottaram, Robert L. Turknett, Jr., Wilson L. Wu