Patents by Inventor Mark Louis Wilson O'Hanlon

Mark Louis Wilson O'Hanlon has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10845871
    Abstract: A controller is adapted to recognize an input from a user via an input interface, determine whether user gaze information indicates that the user is gazing at a device, and, when the user gaze information indicates that the user is gazing at the device, route response information to that device.
    Type: Grant
    Filed: August 7, 2018
    Date of Patent: November 24, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Publication number: 20190098267
    Abstract: Features of the present disclosure implement a light illumination system that uses a scanning device that pivots on an axis between a plurality of positions. To this end, the system may partition an image frame into at least a first sub-image frame and a second sub-image frame. The system may adjust the scanning device, during a first time period, to a first position to reflect light associated with the first sub-image frame, and adjust the scanning device, during a second time period, to a second position to reflect light associated with the second sub-image frame. By implementing the techniques described herein, the overall size of a linear array (e.g., liquid crystal on silicon, or LCoS) in an optical system (as well as of some optical components in the system) may be reduced, thereby achieving a compact optical system that is mobile and user friendly.
    Type: Application
    Filed: September 27, 2017
    Publication date: March 28, 2019
    Inventors: Yarn Chee Poon, Richard A. James, Jeb Wu, Mark Louis Wilson O'Hanlon, Vijay Krishna Paruchuru, Keita Oka
  • Publication number: 20180341330
    Abstract: A controller is adapted to recognize an input from a user via an input interface, determine whether user gaze information indicates that the user is gazing at a device, and, when the user gaze information indicates that the user is gazing at the device, route response information to that device.
    Type: Application
    Filed: August 7, 2018
    Publication date: November 29, 2018
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Patent number: 10067563
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Grant
    Filed: October 27, 2017
    Date of Patent: September 4, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Publication number: 20180059781
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Application
    Filed: October 27, 2017
    Publication date: March 1, 2018
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Patent number: 9823742
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Grant
    Filed: May 18, 2012
    Date of Patent: November 21, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
  • Publication number: 20130307771
    Abstract: User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user. The interaction sets may include grammars for a speech recognition system, movements for a gesture recognition system, physiological states for a user health parameter detection system, or other possible inputs. When a user focuses on a selected object or area, an interaction set associated with that object or area is activated and used to interpret user inputs. Interaction sets may also be selected based upon areas that a user is not viewing. Multiple devices can share gaze information so that a device does not require its own gaze detector.
    Type: Application
    Filed: May 18, 2012
    Publication date: November 21, 2013
    Applicant: Microsoft Corporation
    Inventors: Crystal Lee Parker, Mark Louis Wilson O'Hanlon, Andrew Lovitt, Jason Ryan Farmer
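
The gaze-based routing idea that recurs across these filings (patent 10845871 and its related applications) can be illustrated with a short sketch: a controller receives an input, checks whether gaze information points at a particular device, and routes the response there, falling back to a default otherwise. The class names, the `route` method, and the fallback behavior below are hypothetical illustrations for explanation only; they are not taken from the patent claims.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Device:
    """A hypothetical output device that can receive routed responses."""
    name: str


class GazeRouter:
    """Minimal sketch of gaze-based response routing.

    If gaze information identifies a device the user is looking at,
    the response goes to that device; otherwise it goes to a default.
    """

    def __init__(self, default: Device):
        self.default = default

    def route(self, gazed_device: Optional[Device], response: str) -> Tuple[Device, str]:
        # When gaze information indicates a target device, route there;
        # otherwise fall back to the default device.
        target = gazed_device if gazed_device is not None else self.default
        return target, response


phone = Device("phone")
tv = Device("tv")
router = GazeRouter(default=phone)

# User is gazing at the TV: response is routed to the TV.
assert router.route(tv, "weather update")[0] is tv
# No gaze target detected: response falls back to the default device.
assert router.route(None, "weather update")[0] is phone
```

The same shape extends naturally to the interaction-set idea in patents 9823742 and 10067563: instead of (or in addition to) selecting a target device, the gaze information would select which grammar or gesture set is active for interpreting the user's input.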