Patents by Inventor Janet E. Galore

Janet E. Galore has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10599393
    Abstract: The subject disclosure relates to user input into a computer system, and a technology by which one or more users interact with a computer system via a combination of input modalities. When the input data of two or more input modalities are related, they are combined to interpret an intended meaning of the input. For example, speech when combined with one input gesture has one intended meaning, e.g., convert the speech to verbatim text for consumption by a program, while the exact same speech when combined with a different input gesture has a different meaning, e.g., convert the speech to a command that controls the operation of that same program.
    Type: Grant
    Filed: August 1, 2018
    Date of Patent: March 24, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar E. Murillo, Janet E. Galore, Jonathan C. Cluts, Colleen G. Estrada, Michael Koenig, Jack Creasey, Subha Bhattacharyay
  • Patent number: 10398972
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Grant
    Filed: September 16, 2016
    Date of Patent: September 3, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar E. Murillo, Andy D. Wilson, Alex A. Kipman, Janet E. Galore
  • Publication number: 20190138271
    Abstract: The subject disclosure relates to user input into a computer system, and a technology by which one or more users interact with a computer system via a combination of input modalities. When the input data of two or more input modalities are related, they are combined to interpret an intended meaning of the input. For example, speech when combined with one input gesture has one intended meaning, e.g., convert the speech to verbatim text for consumption by a program, while the exact same speech when combined with a different input gesture has a different meaning, e.g., convert the speech to a command that controls the operation of that same program.
    Type: Application
    Filed: August 1, 2018
    Publication date: May 9, 2019
    Inventors: Oscar E. Murillo, Janet E. Galore, Jonathan C. Cluts, Colleen G. Estrada, Michael Koenig, Jack Creasey, Subha Bhattacharyay
  • Patent number: 10067740
    Abstract: The subject disclosure relates to user input into a computer system, and a technology by which one or more users interact with a computer system via a combination of input modalities. When the input data of two or more input modalities are related, they are combined to interpret an intended meaning of the input. For example, speech when combined with one input gesture has one intended meaning, e.g., convert the speech to verbatim text for consumption by a program, while the exact same speech when combined with a different input gesture has a different meaning, e.g., convert the speech to a command that controls the operation of that same program.
    Type: Grant
    Filed: May 10, 2016
    Date of Patent: September 4, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar E. Murillo, Janet E. Galore, Jonathan C. Cluts, Colleen G. Estrada, Michael Koenig, Jack Creasey, Subha Bhattacharyay
  • Publication number: 20170144067
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Application
    Filed: September 16, 2016
    Publication date: May 25, 2017
    Inventors: Oscar E. Murillo, Andy D. Wilson, Alex A. Kipman, Janet E. Galore
  • Publication number: 20160350071
    Abstract: The subject disclosure relates to user input into a computer system, and a technology by which one or more users interact with a computer system via a combination of input modalities. When the input data of two or more input modalities are related, they are combined to interpret an intended meaning of the input. For example, speech when combined with one input gesture has one intended meaning, e.g., convert the speech to verbatim text for consumption by a program, while the exact same speech when combined with a different input gesture has a different meaning, e.g., convert the speech to a command that controls the operation of that same program.
    Type: Application
    Filed: May 10, 2016
    Publication date: December 1, 2016
    Inventors: Oscar E. Murillo, Janet E. Galore, Jonathan C. Cluts, Colleen G. Estrada, Michael Koenig, Jack Creasey, Subha Bhattacharyay
  • Patent number: 9348417
    Abstract: The subject disclosure relates to user input into a computer system, and a technology by which one or more users interact with a computer system via a combination of input modalities. When the input data of two or more input modalities are related, they are combined to interpret an intended meaning of the input. For example, speech when combined with one input gesture has one intended meaning, e.g., convert the speech to verbatim text for consumption by a program, while the exact same speech when combined with a different input gesture has a different meaning, e.g., convert the speech to a command that controls the operation of that same program.
    Type: Grant
    Filed: November 1, 2010
    Date of Patent: May 24, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar E. Murillo, Janet E. Galore, Jonathan C. Cluts, Colleen G. Estrada, Michael Koenig, Jack Creasey, Subha Bhattacharyay
  • Publication number: 20120109868
    Abstract: The subject disclosure relates to a technology by which output data in the form of audio, visual, haptic, and/or other output is automatically selected and tailored by a system, including adapting in real time, to address one or more users' specific needs, context and implicit/explicit intent. State data and preference data are input into a real time adaptive output system that uses the data to select among output modalities, e.g., to change output mechanisms, add/remove output mechanisms, and/or change rendering characteristics. The output may be rendered on one or more output mechanisms to a single user or multiple users, including via a remote output mechanism.
    Type: Application
    Filed: November 1, 2010
    Publication date: May 3, 2012
    Applicant: Microsoft Corporation
    Inventors: Oscar E. Murillo, Janet E. Galore, Jonathan C. Cluts, Colleen G. Estrada, Tim Wantland, Blaise H. Aguera-Arcas
  • Publication number: 20120105257
    Abstract: The subject disclosure relates to user input into a computer system, and a technology by which one or more users interact with a computer system via a combination of input modalities. When the input data of two or more input modalities are related, they are combined to interpret an intended meaning of the input. For example, speech when combined with one input gesture has one intended meaning, e.g., convert the speech to verbatim text for consumption by a program, while the exact same speech when combined with a different input gesture has a different meaning, e.g., convert the speech to a command that controls the operation of that same program.
    Type: Application
    Filed: November 1, 2010
    Publication date: May 3, 2012
    Applicant: Microsoft Corporation
    Inventors: Oscar E. Murillo, Janet E. Galore, Jonathan C. Cluts, Colleen G. Estrada, Michael Koenig, Jack Creasey, Subha Bhattacharyay
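The abstract shared by patent 10599393 and its related filings describes combining two input modalities so that a gesture disambiguates what accompanying speech means. The following is a minimal illustrative sketch of that idea only; the class names, gesture kinds, and time window are hypothetical and do not come from the patent itself.

```python
from dataclasses import dataclass

# Hypothetical sketch of the multimodal-input idea in the abstract of
# patent 10599393: when speech and a gesture are related (here, close
# in time), the gesture decides how the speech is interpreted.

@dataclass
class SpeechInput:
    text: str
    timestamp: float  # seconds

@dataclass
class GestureInput:
    kind: str         # e.g. "point_at_document" or "point_at_toolbar" (illustrative)
    timestamp: float

RELATED_WINDOW = 1.0  # assumed threshold: inputs this close are "related"

def interpret(speech: SpeechInput, gesture: GestureInput) -> dict:
    """Combine two modalities into one intended meaning."""
    if abs(speech.timestamp - gesture.timestamp) > RELATED_WINDOW:
        # Unrelated inputs: treat the speech on its own as dictation.
        return {"action": "dictate", "payload": speech.text}
    if gesture.kind == "point_at_document":
        # Same words plus one gesture: verbatim text for the program.
        return {"action": "dictate", "payload": speech.text}
    # Same words plus a different gesture: a command controlling the program.
    return {"action": "command", "payload": speech.text}
```

For example, the same utterance "save draft" becomes a command when paired with a toolbar-pointing gesture, but dictated text when paired with a document-pointing gesture.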
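The abstract of patent 10398972 describes running short calibration tests and assigning a gesture dictionary based on how the user performs the gestures. Below is a toy sketch of that selection step; the dictionaries, the single "speed" score, and the nearest-profile rule are all assumptions made for illustration, not details from the patent.

```python
# Hypothetical sketch of the calibration idea in the abstract of patent
# 10398972: score the user's calibration gestures, then assign the
# gesture dictionary whose profile best matches the observed scores.

# Each dictionary is characterized here by one nominal score in [0, 1];
# real systems would use a much richer profile.
DICTIONARIES = {
    "small_motions": 0.3,  # user tends toward compact, slow gestures
    "broad_motions": 0.9,  # user tends toward large, fast gestures
}

def assign_dictionary(calibration_scores: list[float]) -> str:
    """Pick the dictionary closest to the average calibration score."""
    observed = sum(calibration_scores) / len(calibration_scores)
    return min(DICTIONARIES, key=lambda name: abs(DICTIONARIES[name] - observed))
```

A user whose calibration tests average 0.3 would be assigned "small_motions", while an average near 0.9 selects "broad_motions"; the abstract's real-time aspect would amount to re-running this selection as new gesture data is captured.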
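The abstract of publication 20120109868 describes feeding state data and preference data into a real-time adaptive system that selects among output modalities. This is a minimal sketch of such a selector; the specific state keys, preference keys, and rules are invented for illustration.

```python
# Hypothetical sketch of the adaptive-output idea in publication
# 20120109868: state and preference data drive the choice of which
# output mechanisms (audio, visual, haptic) to render on.

def select_outputs(state: dict, preferences: dict) -> list[str]:
    """Return the output modalities to render, adapted to state and prefs."""
    outputs = []
    if not state.get("noisy_environment") and preferences.get("audio", True):
        outputs.append("audio")    # audio only when it can be heard and is wanted
    if state.get("screen_available", True):
        outputs.append("visual")   # visual whenever a display is present
    if preferences.get("haptics"):
        outputs.append("haptic")   # haptics strictly opt-in
    return outputs or ["visual"]   # always render on at least one mechanism
```

Re-invoking the selector whenever state changes (e.g., the environment becomes noisy) gives the real-time adaptation the abstract describes: audio is dropped and the remaining modalities carry the output.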