Patents by Inventor Crystal Lee Parker

Crystal Lee Parker has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Two short illustrative sketches of the mechanisms described in the abstracts follow the listing.

  • Patent number: 10845871
Abstract: A controller is adapted to recognize an input from a user using an input interface, determine whether user gaze information indicates that the user is gazing at a device, and, when the user gaze information indicates that the user is gazing at the device, route response information to the device.
    Type: Grant
    Filed: August 7, 2018
    Date of Patent: November 24, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Publication number: 20180341330
Abstract: A controller is adapted to recognize an input from a user using an input interface, determine whether user gaze information indicates that the user is gazing at a device, and, when the user gaze information indicates that the user is gazing at the device, route response information to the device.
    Type: Application
    Filed: August 7, 2018
    Publication date: November 29, 2018
Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Patent number: 10067563
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Grant
    Filed: October 27, 2017
    Date of Patent: September 4, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Publication number: 20180059781
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Application
    Filed: October 27, 2017
    Publication date: March 1, 2018
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Patent number: 9823742
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Grant
    Filed: May 18, 2012
    Date of Patent: November 21, 2017
Assignee: Microsoft Technology Licensing, LLC
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Publication number: 20130307771
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Application
    Filed: May 18, 2012
    Publication date: November 21, 2013
    Applicant: Microsoft Corporation
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
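
The first two entries above (patent 10845871 and published application 20180341330) describe a controller that routes response information to whichever device the user is gazing at. The following is a minimal sketch of that idea, not Microsoft's implementation; the Controller, Device, and GazeInfo names and the fallback-to-default-device behavior are assumptions made purely for illustration.

```python
# Minimal sketch (hypothetical names, not the patented implementation):
# route a response to the device the user is gazing at, else to a default device.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Device:
    name: str

    def present(self, response: str) -> None:
        # Stand-in for displaying or speaking the response on this device.
        print(f"[{self.name}] {response}")


@dataclass
class GazeInfo:
    target: Optional[Device]  # device the user is currently looking at, if any


class Controller:
    def __init__(self, default_device: Device) -> None:
        self.default_device = default_device

    def handle_input(self, user_input: str, gaze: GazeInfo) -> None:
        # Recognize the input (stubbed here as a simple echo response).
        response = f"response to '{user_input}'"
        # If gaze information indicates the user is looking at a device,
        # route the response there; otherwise use the default device.
        target = gaze.target if gaze.target is not None else self.default_device
        target.present(response)


phone, tv = Device("phone"), Device("living-room TV")
controller = Controller(default_device=phone)
controller.handle_input("what's the weather?", GazeInfo(target=tv))  # routed to the TV
```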
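
The remaining entries (patents 10067563 and 9823742 and their published applications) describe selecting a context and interaction set, such as a speech grammar or gesture vocabulary, based on the user's point of focus. Below is a minimal sketch under the same caveat: InteractionSet, ContextSelector, and the example objects are hypothetical and only illustrate gaze-driven activation of an interaction set that is then used to interpret user inputs.

```python
# Minimal sketch (hypothetical names): gaze selects which interaction set
# (grammar or gesture vocabulary) interprets subsequent user inputs.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class InteractionSet:
    name: str
    grammar: List[str] = field(default_factory=list)   # phrases a speech recognizer accepts
    gestures: List[str] = field(default_factory=list)  # movements a gesture recognizer accepts

    def interprets(self, user_input: str) -> bool:
        return user_input in self.grammar or user_input in self.gestures


class ContextSelector:
    def __init__(self, sets_by_object: Dict[str, InteractionSet]) -> None:
        self.sets_by_object = sets_by_object
        self.active: Optional[InteractionSet] = None

    def on_gaze(self, focused_object: Optional[str]) -> None:
        # Activate the interaction set associated with the gazed-at object or area.
        self.active = self.sets_by_object.get(focused_object) if focused_object else None

    def interpret(self, user_input: str) -> str:
        if self.active and self.active.interprets(user_input):
            return f"handled by {self.active.name}"
        return "ignored (no active interaction set matches)"


selector = ContextSelector({
    "thermostat": InteractionSet("thermostat", grammar=["raise temperature", "lower temperature"]),
    "tv": InteractionSet("tv", grammar=["volume up", "mute"], gestures=["swipe left"]),
})
selector.on_gaze("thermostat")                   # gaze detector reports the point of focus
print(selector.interpret("raise temperature"))   # handled by thermostat
print(selector.interpret("volume up"))           # ignored: not in the active set
```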