Patents by Inventor Bernard James Kerr

Bernard James Kerr has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170336867
    Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received, the image data comprising at least one target visual and target visual metadata that identifies the at least one target visual. The target visual metadata is used to identify a target location of the at least one target visual. The estimated gaze location of the viewer is monitored and a probability that the viewer is gazing at the target location is estimated. The gaze tracking system is calibrated using the probability, the estimated gaze location and the target location to generate an updated estimated gaze location of the viewer.
    Type: Application
    Filed: August 7, 2017
    Publication date: November 23, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
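The abstract above describes calibrating a gaze tracker by estimating the probability that the viewer is gazing at a known target location, then using that probability together with the estimated gaze and target locations. A minimal illustrative sketch of that idea follows; all names, the Gaussian probability model, and the blending rule are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch of probability-weighted gaze calibration.
# The Gaussian falloff and blending rule are illustrative assumptions.
import math

def gaze_probability(estimated, target, sigma=50.0):
    """Probability that the viewer is gazing at the target, modeled here
    as a Gaussian falloff with screen distance (an assumption)."""
    dx = estimated[0] - target[0]
    dy = estimated[1] - target[1]
    return math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))

def calibrate(estimated, target, sigma=50.0):
    """Blend the estimated gaze toward the target location, weighted by
    the probability that the viewer was actually looking at it."""
    p = gaze_probability(estimated, target, sigma)
    return (estimated[0] + p * (target[0] - estimated[0]),
            estimated[1] + p * (target[1] - estimated[1]))
```

With this weighting, a target the viewer almost certainly looked at pulls the estimate strongly, while a distant, unlikely target barely changes it.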
  • Patent number: 9727136
Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received and, without using input from the viewer, at least one target visual that may attract a gaze of the viewer and a target location of the target visual are identified within the image data. The estimated gaze location of the viewer is compared with the target location of the target visual. An offset vector is calculated based on the estimated gaze location and the target location. The gaze tracking system is calibrated using the offset vector.
    Type: Grant
    Filed: May 19, 2014
    Date of Patent: August 8, 2017
Assignee: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
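The offset-vector calibration described in the abstract above reduces to two small steps: compute the offset from the estimated gaze location to the target location, then shift subsequent raw estimates by that offset. A minimal sketch, with illustrative names not taken from the patent:

```python
# Hypothetical sketch of offset-vector gaze calibration.
def offset_vector(estimated, target):
    """Offset from the estimated gaze location to the target location."""
    return (target[0] - estimated[0], target[1] - estimated[1])

def apply_calibration(estimated, offset):
    """Shift a raw gaze estimate by the stored calibration offset."""
    return (estimated[0] + offset[0], estimated[1] + offset[1])
```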
  • Publication number: 20170098152
    Abstract: In various implementations, one or more specific attributes found in an image can be modified utilizing one or more specific attributes found in another image. Machine learning, deep neural networks, and other computer vision techniques can be utilized to extract attributes of images, such as color, composition, font, style, and texture from one or more images. A user may modify at least one of these attributes in a first image based on the attribute(s) of another image and initiate a visual-based search using the modified image.
    Type: Application
    Filed: January 20, 2016
    Publication date: April 6, 2017
    Inventors: Bernard James Kerr, Zhe Lin, Patrick Reynolds, Baldo Faieta
  • Publication number: 20170097948
    Abstract: In various implementations, specific attributes found in images can be used in a visual-based search. Utilizing machine learning, deep neural networks, and other computer vision techniques, attributes of images, such as color, composition, font, style, and texture can be extracted from a given image. A user can then select a specific attribute from a sample image the user is searching for and the search can be refined to focus on that specific attribute from the sample image. In some embodiments, the search includes specific attributes from more than one image.
    Type: Application
    Filed: January 20, 2016
    Publication date: April 6, 2017
Inventors: Bernard James Kerr, Zhe Lin, Patrick Reynolds, Baldo Faieta
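The attribute-refined visual search described above can be sketched in miniature: given precomputed attribute vectors per image, refine the search by ranking a catalog on similarity to the query image for one chosen attribute. In practice the patent contemplates deep neural networks for attribute extraction; here each image is simply a dict of precomputed vectors, and all names are illustrative assumptions.

```python
# Hypothetical sketch of refining a visual search by a single attribute.
def attribute_distance(a, b):
    """Squared Euclidean distance between two attribute vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def search_by_attribute(query_image, catalog, attribute):
    """Rank catalog images by similarity to the query on one attribute."""
    q = query_image[attribute]
    return sorted(catalog, key=lambda img: attribute_distance(img[attribute], q))
```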
  • Patent number: 9400553
Abstract: Embodiments that relate to scaling a visual element displayed via a display device are disclosed. In one embodiment a method includes receiving and using gaze tracking data to determine gaze locations at which a user is gazing on the display device. Depth tracking data is received and used to determine that a user's pointer is at a predetermined location. In response, a gaze location on the screen is locked, where the locked gaze location includes at least a portion of the visual element. In response to locking the locked gaze location, the visual element is programmatically scaled by a predetermined amount to an enlarged size. A user input selecting the visual element is then received.
    Type: Grant
    Filed: October 11, 2013
    Date of Patent: July 26, 2016
Assignee: Microsoft Technology Licensing, LLC
    Inventors: Bernard James Kerr, Edward Elliott, Morgan Kolya Venable
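The gaze-lock-and-scale interaction described above has a simple control flow: once depth tracking shows the pointer at a trigger location, the current gaze location is locked and the element under it is enlarged by a predetermined factor. A minimal sketch; the class, the trigger depth, and the scale factor are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of gaze-lock plus programmatic scaling.
SCALE_FACTOR = 2.0  # predetermined enlargement amount (assumed)

class GazeScaler:
    def __init__(self):
        self.locked_gaze = None

    def update(self, gaze, pointer_depth, trigger_depth=0.3):
        """Lock the current gaze location once the pointer reaches
        the trigger depth; later gaze samples do not move the lock."""
        if self.locked_gaze is None and pointer_depth <= trigger_depth:
            self.locked_gaze = gaze
        return self.locked_gaze

    def scaled_size(self, element_size):
        """Return the element's size, enlarged while a gaze lock is active."""
        if self.locked_gaze is None:
            return element_size
        return (element_size[0] * SCALE_FACTOR, element_size[1] * SCALE_FACTOR)
```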
  • Patent number: 9244539
    Abstract: Embodiments that relate to positioning a target indicator via a display system are disclosed. For example, one disclosed embodiment provides a method for positioning a target indicator using gaze tracking data having a coarse accuracy from a gaze tracking system of a computing device. Head pose data having a fine accuracy greater than the coarse accuracy is received from a head tracking system. Using the gaze tracking data, an approximate user gaze region within a display region is determined, and the target indicator is displayed at an initial location within the approximate user gaze region. A reposition input from the user is received. In response, subsequently received head pose data is used to calculate an adjusted location for the target indicator. The target indicator is then displayed at the adjusted location.
    Type: Grant
    Filed: January 7, 2014
    Date of Patent: January 26, 2016
Assignee: Microsoft Technology Licensing, LLC
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Weerapan Wilairat
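The two-stage targeting above places the indicator coarsely from gaze, then refines it from head pose after a reposition input. A minimal sketch of that division of labor; the function names and the gain applied to the head-pose delta are illustrative assumptions.

```python
# Hypothetical sketch of coarse gaze placement plus fine head-pose adjustment.
def initial_location(gaze_region_center):
    """Place the target indicator at the center of the approximate
    user gaze region determined from (coarse) gaze tracking data."""
    return gaze_region_center

def adjust_with_head_pose(location, head_delta, gain=0.1):
    """Nudge the indicator by a scaled head-pose delta (fine adjustment),
    as used after the user issues a reposition input."""
    return (location[0] + gain * head_delta[0],
            location[1] + gain * head_delta[1])
```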
  • Publication number: 20150331485
Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received and, without using input from the viewer, at least one target visual that may attract a gaze of the viewer and a target location of the target visual are identified within the image data. The estimated gaze location of the viewer is compared with the target location of the target visual. An offset vector is calculated based on the estimated gaze location and the target location. The gaze tracking system is calibrated using the offset vector.
    Type: Application
    Filed: May 19, 2014
    Publication date: November 19, 2015
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
  • Publication number: 20150193018
    Abstract: Embodiments that relate to positioning a target indicator via a display system are disclosed. For example, one disclosed embodiment provides a method for positioning a target indicator using gaze tracking data having a coarse accuracy from a gaze tracking system of a computing device. Head pose data having a fine accuracy greater than the coarse accuracy is received from a head tracking system. Using the gaze tracking data, an approximate user gaze region within a display region is determined, and the target indicator is displayed at an initial location within the approximate user gaze region. A reposition input from the user is received. In response, subsequently received head pose data is used to calculate an adjusted location for the target indicator. The target indicator is then displayed at the adjusted location.
    Type: Application
    Filed: January 7, 2014
    Publication date: July 9, 2015
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Weerapan Wilairat
  • Publication number: 20150103003
Abstract: Embodiments that relate to scaling a visual element displayed via a display device are disclosed. In one embodiment a method includes receiving and using gaze tracking data to determine gaze locations at which a user is gazing on the display device. Depth tracking data is received and used to determine that a user's pointer is at a predetermined location. In response, a gaze location on the screen is locked, where the locked gaze location includes at least a portion of the visual element. In response to locking the locked gaze location, the visual element is programmatically scaled by a predetermined amount to an enlarged size. A user input selecting the visual element is then received.
    Type: Application
    Filed: October 11, 2013
    Publication date: April 16, 2015
    Inventors: Bernard James Kerr, Edward Elliott, Morgan Kolya Venable
  • Patent number: 8988344
    Abstract: Embodiments that relate to navigating a hierarchy of visual elements are disclosed. In one embodiment a method includes presenting one or more visual elements from a two-dimensional plane via a display device. A home location within a viewable region of the display is established. A proportional size relationship between each element and each of the other elements is established. Using gaze tracking data, a gaze location at which a user is gazing within the viewable region is determined. The gaze location is mapped to a target location, and movement of the target location toward the home location is initiated. As the target location moves closer to the home location, each of the visual elements is progressively enlarged while the proportional size relationship between each of the visual elements and each of the other visual elements is also maintained.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: March 24, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
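The proportional zoom navigation above can be sketched as a uniform scale applied to every element, growing as the target location approaches the home location; scaling uniformly is what preserves the proportional size relationship between elements. The distance model, zoom limits, and names below are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch of proportional zoom toward a home location.
def zoom_factor(target, home, max_dist=500.0, max_zoom=3.0):
    """Grow the zoom factor as the target location nears the home location."""
    dist = ((target[0] - home[0]) ** 2 + (target[1] - home[1]) ** 2) ** 0.5
    closeness = max(0.0, 1.0 - dist / max_dist)
    return 1.0 + (max_zoom - 1.0) * closeness

def scale_elements(sizes, factor):
    """Scale every element uniformly; pairwise size ratios are preserved."""
    return [s * factor for s in sizes]
```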
  • Publication number: 20140380230
    Abstract: Embodiments are disclosed herein that relate to selecting user interface elements via a periodically updated position signal. For example, one disclosed embodiment provides a method comprising displaying on a graphical user interface a representation of a user interface element and a representation of an interactive target.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
  • Publication number: 20140375544
    Abstract: Embodiments that relate to navigating a hierarchy of visual elements are disclosed. In one embodiment a method includes presenting one or more visual elements from a two-dimensional plane via a display device. A home location within a viewable region of the display is established. A proportional size relationship between each element and each of the other elements is established. Using gaze tracking data, a gaze location at which a user is gazing within the viewable region is determined. The gaze location is mapped to a target location, and movement of the target location toward the home location is initiated. As the target location moves closer to the home location, each of the visual elements is progressively enlarged while the proportional size relationship between each of the visual elements and each of the other visual elements is also maintained.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister