Patents by Inventor Rachel K.E. Bellamy

Rachel K.E. Bellamy has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11263188
    Abstract: A method for automatically generating documentation for an artificial intelligence model includes receiving, by a computing device, an artificial intelligence model. The computing device accesses a model facts policy that indicates data to be collected for artificial intelligence models. The computing device collects artificial intelligence model facts regarding the artificial intelligence model according to the model facts policy. The computing device accesses a factsheet template. The factsheet template provides a schema for an artificial intelligence model factsheet for the artificial intelligence model. The computing device populates the artificial intelligence model factsheet using the factsheet template with the artificial intelligence model facts related to the artificial intelligence model.
    Type: Grant
    Filed: November 1, 2019
    Date of Patent: March 1, 2022
    Assignee: International Business Machines Corporation
    Inventors: Matthew R. Arnold, Rachel K. E. Bellamy, Kaoutar El Maghraoui, Michael Hind, Stephanie Houde, Kalapriya Kannan, Sameep Mehta, Aleksandra Mojsilovic, Ramya Raghavendra, Darrell C. Reimer, John T. Richards, David J. Piorkowski, Jason Tsay, Kush R. Varshney, Manish Kesarwani
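The collect-then-populate workflow this abstract describes can be sketched in a few lines. All names, the metadata format, and the template schema below are illustrative assumptions, not taken from the patent:

```python
# Minimal sketch of the factsheet-generation flow: collect facts per a policy,
# then populate a factsheet template (a section -> fact-fields schema).
# Every identifier here is hypothetical, invented for illustration.

def collect_facts(model, policy):
    """Gather only the facts the model facts policy asks for."""
    return {field: model.get(field, "unknown") for field in policy}

def populate_factsheet(template, facts):
    """Fill each section of the template with the collected facts."""
    return {section: {f: facts[f] for f in fields if f in facts}
            for section, fields in template.items()}

# A model represented as plain metadata, a policy listing facts to collect,
# and a template giving the factsheet's schema.
model = {"name": "churn-predictor", "training_data": "2019 customer logs",
         "accuracy": 0.91, "intended_use": "churn scoring"}
policy = ["name", "training_data", "accuracy", "intended_use"]
template = {"Overview": ["name", "intended_use"],
            "Performance": ["accuracy"],
            "Provenance": ["training_data"]}

factsheet = populate_factsheet(template, collect_facts(model, policy))
```

Keeping the policy (what to collect) separate from the template (how to present it) mirrors the two distinct artifacts the abstract names.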
  • Patent number: 11182600
    Abstract: A processor may record a first location at an event with at least one person. The processor may monitor a plurality of actions of the at least one person at the first location. The processor may interpret at least one action of the at least one person that indicates a change of interest to a second location at the event. Based on the at least one action, the processor may determine the second location at the event. The processor may record the second location at the event.
    Type: Grant
    Filed: September 24, 2015
    Date of Patent: November 23, 2021
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
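The record-monitor-interpret loop in this abstract can be outlined as follows; the action vocabulary and data shapes are invented stand-ins for whatever sensing the patent contemplates:

```python
# Illustrative sketch: actions that signal a change of interest cause the
# processor to determine and record a new location at the event.

INTEREST_ACTIONS = {"points_at", "walks_toward", "turns_toward"}  # hypothetical vocabulary

def track_locations(first_location, actions):
    """Record the first location, then each location that an
    interest-change action indicates."""
    recorded = [first_location]
    for action, target in actions:
        if action in INTEREST_ACTIONS:
            recorded.append(target)
    return recorded

locations = track_locations("stage", [("claps", "stage"),
                                      ("points_at", "booth-3"),
                                      ("walks_toward", "exhibit-A")])
```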
  • Patent number: 11100407
    Abstract: Embodiments are provided for building domain models from dialog interactions by a processor. A domain knowledge may be elicited from one or more dialog interactions with one or more users according to one or more dialog strategies. One or more domain models may be built and/or enhanced according to the domain knowledge.
    Type: Grant
    Filed: October 10, 2018
    Date of Patent: August 24, 2021
    Assignee: International Business Machines Corporation
    Inventors: Oznur Alkan, Rachel K. E. Bellamy, Elizabeth Daly, Matthew Davis, Vera Liao, Biplav Srivastava
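The elicit-then-build flow in this abstract can be sketched with a toy dialog strategy; the extraction rule and model representation here are invented for illustration and are far simpler than anything the patent claims:

```python
# Toy sketch: elicit (entity, attribute) facts from dialog turns and
# accumulate them into a domain model.

def elicit(turn):
    """Stand-in elicitation step: extract a fact from 'X has Y' statements."""
    if " has " in turn:
        entity, attr = turn.split(" has ", 1)
        return entity.strip().lower(), attr.strip().lower()
    return None

def build_domain_model(turns):
    """Build/enhance a domain model (entity -> set of attributes) from dialog."""
    model = {}
    for turn in turns:
        fact = elicit(turn)
        if fact:
            model.setdefault(fact[0], set()).add(fact[1])
    return model

model = build_domain_model(["A flight has a departure time",
                            "A flight has a destination",
                            "Thanks!"])
```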
  • Publication number: 20210133162
    Abstract: A method for automatically generating documentation for an artificial intelligence model includes receiving, by a computing device, an artificial intelligence model. The computing device accesses a model facts policy that indicates data to be collected for artificial intelligence models. The computing device collects artificial intelligence model facts regarding the artificial intelligence model according to the model facts policy. The computing device accesses a factsheet template. The factsheet template provides a schema for an artificial intelligence model factsheet for the artificial intelligence model. The computing device populates the artificial intelligence model factsheet using the factsheet template with the artificial intelligence model facts related to the artificial intelligence model.
    Type: Application
    Filed: November 1, 2019
    Publication date: May 6, 2021
    Inventors: Matthew R. Arnold, Rachel K.E. Bellamy, Kaoutar El Maghraoui, Michael Hind, Stephanie Houde, Kalapriya Kannan, Sameep Mehta, Aleksandra Mojsilovic, Ramya Raghavendra, Darrell C. Reimer, John T. Richards, David J. Piorkowski, Jason Tsay, Kush R. Varshney, Manish Kesarwani
  • Patent number: 10956831
    Abstract: In one embodiment, in accordance with the present invention, a method, computer program product, and system for performing actions based on captured interpersonal interactions during a meeting is provided. One or more computer processors capture the interpersonal interactions between people in a physical space during a period of time, using machine learning algorithms to detect the interpersonal interactions and a state of each person based on vision and audio sensors in the physical space. The one or more computer processors analyze and categorize the interactions and state of each person, and tag representations of each person with the respectively analyzed and categorized interactions and states of the respective person over the period of time. The one or more computer processors then take an action based on the analysis.
    Type: Grant
    Filed: November 13, 2017
    Date of Patent: March 23, 2021
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
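The capture, categorize, tag, and act pipeline in this abstract can be outlined as below. Where the patent describes machine-learning detection from vision and audio sensors, this sketch substitutes hand-written keyword rules purely for illustration:

```python
# Sketch of the pipeline: categorize interactions, tag each person with their
# categorized interactions over time, then take an action based on the analysis.

def categorize(interaction):
    """Stand-in for the ML classifier: label an interaction by keywords."""
    if "interrupt" in interaction:
        return "interruption"
    if "question" in interaction:
        return "inquiry"
    return "statement"

def tag_people(events):
    """Tag each person's representation with their categorized interactions."""
    tags = {}
    for person, interaction in events:
        tags.setdefault(person, []).append(categorize(interaction))
    return tags

def take_action(tags):
    """Example action: flag anyone who interrupted more than once."""
    return [p for p, cats in tags.items() if cats.count("interruption") > 1]

events = [("alice", "asks question"), ("bob", "interrupts alice"),
          ("bob", "interrupts carol"), ("carol", "makes statement")]
flagged = take_action(tag_people(events))
```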
  • Publication number: 20200118008
    Abstract: Embodiments are provided for building domain models from dialog interactions by a processor. A domain knowledge may be elicited from one or more dialog interactions with one or more users according to one or more dialog strategies. One or more domain models may be built and/or enhanced according to the domain knowledge.
    Type: Application
    Filed: October 10, 2018
    Publication date: April 16, 2020
    Applicant: International Business Machines Corporation
    Inventors: Oznur Alkan, Rachel K. E. Bellamy, Elizabeth Daly, Matthew Davis, Vera Liao, Biplav Srivastava
  • Publication number: 20190147367
    Abstract: In one embodiment, in accordance with the present invention, a method, computer program product, and system for performing actions based on captured interpersonal interactions during a meeting is provided. One or more computer processors capture the interpersonal interactions between people in a physical space during a period of time, using machine learning algorithms to detect the interpersonal interactions and a state of each person based on vision and audio sensors in the physical space. The one or more computer processors analyze and categorize the interactions and state of each person, and tag representations of each person with the respectively analyzed and categorized interactions and states of the respective person over the period of time. The one or more computer processors then take an action based on the analysis.
    Type: Application
    Filed: November 13, 2017
    Publication date: May 16, 2019
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
  • Patent number: 10244013
    Abstract: A computer-implemented method manages remote electronic drop-ins on local conversations. A local audio sensor transmits a captured conversation from a local cluster of persons to a remote communication device where members of the local cluster of persons are within a predefined distance of one another, and where the remote communication device is at a location that is beyond a human hearing range from the local audio sensor. One or more processors determine that the captured conversation is about a particular topic. A request from a remote user is received from the remote communication device to electronically drop in on a particular remote cluster of persons who are having a conversation about the particular topic. In response to receiving the request from the remote user, one or more processors selectively connect a local communication device proximate to the cluster of persons to the remote communication device.
    Type: Grant
    Filed: December 8, 2017
    Date of Patent: March 26, 2019
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
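The topic-matched connection step in this abstract can be sketched as follows; the keyword-count topic detector and the cluster/transcript representation are hypothetical simplifications:

```python
# Sketch: determine each captured conversation's topic, then connect a remote
# drop-in request to the cluster whose conversation matches the requested topic.

def detect_topic(transcript, topics):
    """Stand-in topic detector: pick the candidate topic with the most word hits."""
    words = transcript.lower().split()
    return max(topics, key=lambda t: words.count(t), default=None)

def connect_drop_in(clusters, requested_topic, topics):
    """Return the id of the cluster to connect the remote device to."""
    for cluster_id, transcript in clusters.items():
        if detect_topic(transcript, topics) == requested_topic:
            return cluster_id
    return None  # no conversation on that topic

clusters = {"kitchen": "the budget budget meeting ran long",
            "lobby": "that soccer soccer match last night"}
match = connect_drop_in(clusters, "soccer", ["budget", "soccer"])
```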
  • Patent number: 10013160
    Abstract: Detecting user input based on multiple gestures is provided. One or more interactions are received from a user via a user interface. An inferred interaction is determined based, at least in part, on a geometric operation, wherein the geometric operation is based on the one or more interactions. The inferred interaction is presented via the user interface. Whether a confirmation has been received for the inferred interaction is determined.
    Type: Grant
    Filed: May 29, 2014
    Date of Patent: July 3, 2018
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Bonnie E. John, Peter K. Malkin, John T. Richards, Calvin B. Swart, John C. Thomas, Jr., Sharon M. Trewin
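One way to picture "an inferred interaction determined from a geometric operation on the interactions" is fitting the user's touch points to a shape. The circle-versus-line heuristic below is an invented example of such a geometric operation, not the patent's method:

```python
import math

def infer_shape(points):
    """Guess whether gesture points trace a line or a circle by checking
    whether each point's distance from the centroid is roughly constant
    (as it is for a circle)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    spread = max(dists) - min(dists)
    return "circle" if spread < 0.25 * max(dists) else "line"

# Points roughly on a unit circle around the origin, and collinear points.
circle_pts = [(1, 0), (0, 1), (-1, 0), (0, -1)]
line_pts = [(0, 0), (1, 1), (2, 2), (3, 3)]
inferred = (infer_shape(circle_pts), infer_shape(line_pts))
```

In the abstract's terms, the returned shape would then be presented via the user interface for the user to confirm or reject.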
  • Publication number: 20180109571
    Abstract: A computer-implemented method manages remote electronic drop-ins on local conversations. A local audio sensor transmits a captured conversation from a local cluster of persons to a remote communication device where members of the local cluster of persons are within a predefined distance of one another, and where the remote communication device is at a location that is beyond a human hearing range from the local audio sensor. One or more processors determine that the captured conversation is about a particular topic. A request from a remote user is received from the remote communication device to electronically drop in on a particular remote cluster of persons who are having a conversation about the particular topic. In response to receiving the request from the remote user, one or more processors selectively connect a local communication device proximate to the cluster of persons to the remote communication device.
    Type: Application
    Filed: December 8, 2017
    Publication date: April 19, 2018
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
  • Patent number: 9923938
    Abstract: A computer-implemented method manages drop-ins on conversations near a focal point of proximal activity in a gathering place. One or more processors receive a first set of sensor data from one or more sensors in a gathering place, and then identify a focal point of proximal activity based on the first set of received sensor data received from the one or more sensors. One or more processors characterize a conversation near the focal point based on a second set of received sensor data from the one or more sensors, and then present a characterization of the conversation to an electronic device. One or more processors enable the electronic device to allow a user to drop-in on the conversation.
    Type: Grant
    Filed: July 13, 2015
    Date of Patent: March 20, 2018
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
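The two sensor-driven steps this abstract names, identifying a focal point of proximal activity and characterizing the conversation near it, can be sketched as below; the sensor-reading format and summary fields are assumptions for illustration:

```python
# Sketch: pick the location with the highest activity reading as the focal
# point, then build a characterization of the nearby conversation to present
# on an electronic device.

def focal_point(readings):
    """Identify the location whose sensors report the most activity."""
    return max(readings, key=readings.get)

def characterize(conversations, location):
    """Summarize the conversation near the focal point for presentation."""
    transcript = conversations.get(location, "")
    return {"location": location,
            "turns": transcript.count(":"),   # crude speaker-turn count
            "preview": transcript[:30]}

readings = {"coffee-bar": 3, "whiteboard": 9, "entrance": 1}
conversations = {"whiteboard": "ann: draft is done. bo: ship it."}
summary = characterize(conversations, focal_point(readings))
```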
  • Patent number: 9881171
    Abstract: Information regarding one or more sensing devices in an environment is broadcasted. The broadcasted information is received by a user application running on a user device in the environment. The broadcasted information comprises information regarding presence of the one or more sensing devices in the environment and at least one of a capacity profile and an activity profile of the one or more sensing devices.
    Type: Grant
    Filed: November 16, 2015
    Date of Patent: January 30, 2018
    Assignee: International Business Machines Corporation
    Inventors: Rachel K.E. Bellamy, Thomas D. Erickson
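The broadcast-and-receive exchange in this abstract can be outlined with a toy message format (the JSON shape and field names here are invented, not from the patent):

```python
import json

def broadcast(device):
    """Serialize a sensing device's announcement: its presence plus a
    capacity profile and/or an activity profile."""
    return json.dumps({"device_id": device["id"],
                       "capacity": device.get("capacity"),
                       "activity": device.get("activity")})

def receive(messages):
    """User-application side: index received announcements by device id."""
    seen = {}
    for raw in messages:
        msg = json.loads(raw)
        seen[msg["device_id"]] = msg
    return seen

devices = [{"id": "cam-1", "capacity": "video 1080p", "activity": "recording"},
           {"id": "mic-2", "capacity": "audio 48kHz"}]
inventory = receive(broadcast(d) for d in devices)
```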
  • Patent number: 9782069
    Abstract: Systems and methods are provided for post-hoc correction of calibration errors in eye tracking data, which take into consideration calibration errors that result from changes in user position during a user session in which the user's fixations on a display screen are captured and recorded by an eye tracking system, and which take into consideration errors that occur when the user looks away from a displayed target item before selecting the target item.
    Type: Grant
    Filed: November 6, 2014
    Date of Patent: October 10, 2017
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Bonnie E. John, John T. Richards, Calvin B. Swart, John C. Thomas, Jr., Sharon M. Trewin
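The simplest form of post-hoc correction is estimating a systematic offset between recorded fixations and the on-screen targets the user actually selected, then subtracting it. This constant-offset model is only an illustration of the idea; the patent addresses subtler error sources such as changes in user position and pre-selection look-aways:

```python
# Sketch: estimate a constant calibration offset from fixation/target pairs,
# then apply the correction to every recorded fixation.

def estimate_offset(fixations, targets):
    """Average displacement between recorded fixations and known targets."""
    n = len(fixations)
    dx = sum(f[0] - t[0] for f, t in zip(fixations, targets)) / n
    dy = sum(f[1] - t[1] for f, t in zip(fixations, targets)) / n
    return dx, dy

def correct(fixations, offset):
    """Subtract the estimated offset from each fixation."""
    dx, dy = offset
    return [(x - dx, y - dy) for x, y in fixations]

# Fixations recorded with a systematic (+5, -3) pixel drift from the targets.
targets = [(100, 100), (200, 150), (300, 200)]
fixations = [(105, 97), (205, 147), (305, 197)]
corrected = correct(fixations, estimate_offset(fixations, targets))
```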
  • Patent number: 9740398
    Abstract: Detecting user input based on multiple gestures is provided. One or more interactions are received from a user via a user interface. An inferred interaction is determined based, at least in part, on a geometric operation, wherein the geometric operation is based on the one or more interactions. The inferred interaction is presented via the user interface. Whether a confirmation has been received for the inferred interaction is determined.
    Type: Grant
    Filed: November 2, 2016
    Date of Patent: August 22, 2017
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Bonnie E. John, Peter K. Malkin, John T. Richards, Calvin B. Swart, John C. Thomas, Jr., Sharon M. Trewin
  • Publication number: 20170140164
    Abstract: Information regarding one or more sensing devices in an environment is broadcasted. The broadcasted information is received by a user application running on a user device in the environment. The broadcasted information comprises information regarding presence of the one or more sensing devices in the environment and at least one of a capacity profile and an activity profile of the one or more sensing devices.
    Type: Application
    Filed: November 16, 2015
    Publication date: May 18, 2017
    Inventors: Rachel K.E. Bellamy, Thomas D. Erickson
  • Publication number: 20170094179
    Abstract: A processor may record a first location at an event with at least one person. The processor may monitor a plurality of actions of the at least one person at the first location. The processor may interpret at least one action of the at least one person that indicates a change of interest to a second location at the event. Based on the at least one action, the processor may determine the second location at the event. The processor may record the second location at the event.
    Type: Application
    Filed: September 24, 2015
    Publication date: March 30, 2017
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
  • Publication number: 20170046064
    Abstract: Detecting user input based on multiple gestures is provided. One or more interactions are received from a user via a user interface. An inferred interaction is determined based, at least in part, on a geometric operation, wherein the geometric operation is based on the one or more interactions. The inferred interaction is presented via the user interface. Whether a confirmation has been received for the inferred interaction is determined.
    Type: Application
    Filed: November 2, 2016
    Publication date: February 16, 2017
    Inventors: Rachel K. E. Bellamy, Bonnie E. John, Peter K. Malkin, John T. Richards, Calvin B. Swart, John C. Thomas, Jr., Sharon M. Trewin
  • Patent number: 9563354
    Abstract: Detecting user input based on multiple gestures is provided. One or more interactions are received from a user via a user interface. An inferred interaction is determined based, at least in part, on a geometric operation, wherein the geometric operation is based on the one or more interactions. The inferred interaction is presented via the user interface. Whether a confirmation has been received for the inferred interaction is determined.
    Type: Grant
    Filed: July 26, 2016
    Date of Patent: February 7, 2017
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Bonnie E. John, Peter K. Malkin, John T. Richards, Calvin B. Swart, John C. Thomas, Jr., Sharon M. Trewin
  • Publication number: 20170017640
    Abstract: A computer-implemented method manages drop-ins on conversations near a focal point of proximal activity in a gathering place. One or more processors receive a first set of sensor data from one or more sensors in a gathering place, and then identify a focal point of proximal activity based on the first set of received sensor data received from the one or more sensors. One or more processors characterize a conversation near the focal point based on a second set of received sensor data from the one or more sensors, and then present a characterization of the conversation to an electronic device. One or more processors enable the electronic device to allow a user to drop-in on the conversation.
    Type: Application
    Filed: July 13, 2015
    Publication date: January 19, 2017
    Inventors: Rachel K. E. Bellamy, Jonathan H. Connell, II, Robert G. Farrell, Brian P. Gaucher, Jonathan Lenchner, David O. S. Melville, Valentina Salapura
  • Patent number: 9495098
    Abstract: Detecting user input based on multiple gestures is provided. One or more interactions are received from a user via a user interface. An inferred interaction is determined based, at least in part, on a geometric operation, wherein the geometric operation is based on the one or more interactions. The inferred interaction is presented via the user interface. Whether a confirmation has been received for the inferred interaction is determined.
    Type: Grant
    Filed: April 4, 2016
    Date of Patent: November 15, 2016
    Assignee: International Business Machines Corporation
    Inventors: Rachel K. E. Bellamy, Bonnie E. John, Peter K. Malkin, John T. Richards, Calvin B. Swart, John C. Thomas, Jr., Sharon M. Trewin