Patents by Inventor David Nister

David Nister has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150346814
    Abstract: One or more techniques and/or systems are provided for gaze tracking of one or more users. A user tracking component (e.g., a depth camera or a relatively lower resolution camera) may be utilized to obtain user tracking data for a user. The user tracking data is evaluated to identify a spatial location of the user. An eye capture camera (e.g., a relatively higher resolution camera) may be selected from an eye capture camera configuration based upon the eye capture camera having a view frustum corresponding to the spatial location of the user. The eye capture camera may be invoked to obtain eye region imagery of the user. Other eye capture cameras within the eye capture camera configuration are maintained in a powered down state to reduce power and/or bandwidth consumption. Gaze tracking information may be generated based upon the eye region imagery, and may be used to perform a task.
    Type: Application
    Filed: May 30, 2014
    Publication date: December 3, 2015
    Inventors: Vaibhav Thukral, Ibrahim Eden, Shivkumar Swaminathan, David Nister, Morgan Venable
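
The selection step at the heart of 20150346814 is geometric: given the user's tracked spatial location, pick the one eye capture camera whose view frustum contains that location, and keep the rest powered down. The following is a minimal Python sketch of that logic; the conical frustum model and the camera fields are illustrative assumptions, not details from the application.

```python
import math
from dataclasses import dataclass

@dataclass
class EyeCaptureCamera:
    """Hypothetical camera model: pose plus a simple conical frustum."""
    name: str
    position: tuple        # (x, y, z) in room coordinates
    forward: tuple         # unit view direction
    half_fov_deg: float    # half-angle of the view cone
    max_range: float       # far limit of the frustum
    powered: bool = False

def contains(cam, point):
    """True if `point` lies inside the camera's conical frustum."""
    v = tuple(p - c for p, c in zip(point, cam.position))
    dist = math.sqrt(sum(x * x for x in v))
    if dist == 0 or dist > cam.max_range:
        return False
    cos_angle = sum(a * b for a, b in zip(v, cam.forward)) / dist
    return cos_angle >= math.cos(math.radians(cam.half_fov_deg))

def select_camera(cameras, user_position):
    """Power up the first camera whose frustum covers the user; keep all
    others powered down to reduce power and bandwidth consumption."""
    chosen = None
    for cam in cameras:
        cam.powered = chosen is None and contains(cam, user_position)
        if cam.powered:
            chosen = cam
    return chosen
```
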
  • Publication number: 20150331485
    Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received and, without using input from the viewer, at least one target visual that may attract a gaze of the viewer and a target location of the target visual are identified within the image data. The estimated gaze location of the viewer is compared with the target location of the target visual. An offset vector is calculated based on the estimated gaze location and the target location. The gaze tracking system is calibrated using the offset vector.
    Type: Application
    Filed: May 19, 2014
    Publication date: November 19, 2015
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
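
The calibration in 20150331485 reduces to vector arithmetic: the offset vector is the difference between the target visual's location and the estimated gaze location, and folding it into future estimates calibrates the tracker. A minimal sketch, assuming exponential smoothing for accumulation (the abstract does not specify how successive offsets are combined):

```python
class GazeCalibrator:
    """Accumulates offset vectors between estimated gaze locations and
    the locations of target visuals the viewer is likely looking at."""

    def __init__(self, smoothing=0.1):
        self.offset = (0.0, 0.0)    # running correction, screen units
        self.smoothing = smoothing  # assumed exponential-smoothing factor

    def observe(self, estimated_gaze, target_location):
        """Update the running offset from one gaze/target pair."""
        dx = target_location[0] - estimated_gaze[0]
        dy = target_location[1] - estimated_gaze[1]
        a = self.smoothing
        self.offset = ((1 - a) * self.offset[0] + a * dx,
                       (1 - a) * self.offset[1] + a * dy)

    def correct(self, estimated_gaze):
        """Apply the learned offset to a raw gaze estimate."""
        return (estimated_gaze[0] + self.offset[0],
                estimated_gaze[1] + self.offset[1])
```
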
  • Publication number: 20150310253
    Abstract: Embodiments are disclosed for eye tracking systems and methods. An example eye tracking system comprises a plurality of light sources and a camera configured to capture an image of light from the light sources as reflected from an eye. The eye tracking system further comprises a logic device and a storage device storing instructions executable by the logic device to acquire frames of eye tracking data by iteratively projecting light from different combinations of light sources of the plurality of light sources and capturing an image of the eye during projection of each combination. The instructions may be further executable to select a selected combination of light sources for eye tracking based on a determination of occlusion detected in the image arising from a transparent or semi-transparent optical structure positioned between the eye and the camera and project light via the selected combination of light sources for eye tracking.
    Type: Application
    Filed: April 29, 2014
    Publication date: October 29, 2015
    Inventors: Mudit Agrawal, Vaibhav Thukral, Ibrahim Eden, David Nister, Shivkumar Swaminathan
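
In outline, 20150310253 searches over light-source subsets: capture one frame per combination, score the occlusion caused by the intervening optical structure in each frame, and keep the least-occluded combination for subsequent tracking. A sketch of that loop, where `capture_frame` and `occlusion_score` stand in for hardware capture and image analysis the abstract leaves unspecified:

```python
from itertools import combinations

def choose_light_sources(light_sources, capture_frame, occlusion_score, k=2):
    """Try every k-subset of light sources, capture an eye image under
    each, and return the combination whose image shows the least
    occlusion. `capture_frame(combo)` and `occlusion_score(image)` are
    assumed callables supplied by the tracking hardware and vision code."""
    best_combo, best_score = None, float("inf")
    for combo in combinations(light_sources, k):
        image = capture_frame(combo)    # illuminate only this subset
        score = occlusion_score(image)  # e.g. glints lost to lens glare
        if score < best_score:
            best_combo, best_score = combo, score
    return best_combo
```
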
  • Publication number: 20150293587
    Abstract: Examples relating to using non-visual feedback to alert a viewer of a display that a visual change has been triggered are disclosed. One disclosed example provides a method comprising using gaze tracking data from a gaze tracking system to determine that a viewer has changed a gaze location. Based on determining that the viewer has changed the gaze location, a visual change is triggered and non-visual feedback indicating the triggering of the visual change is provided to the viewer. If a cancel change input is received within a predetermined timeframe, the visual change is not displayed. If a cancel change input is not received within the timeframe, the visual change is displayed via the display.
    Type: Application
    Filed: April 10, 2014
    Publication date: October 15, 2015
    Inventors: Weerapan Wilairat, Ibrahim Eden, Vaibhav Thukral, David Nister, Vivek Pradeep
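
The interaction in 20150293587 is a small timed state machine: trigger the change, give non-visual feedback, then hold the change back for a cancellation window. A minimal sketch, with the feedback, cancel, and render calls as hypothetical placeholders:

```python
import time

def apply_visual_change(render_change, play_feedback, wait_for_cancel,
                        timeout_s=1.0):
    """Give non-visual feedback that a change was triggered, then show
    the change only if the viewer does not cancel within the window.
    `play_feedback()` might vibrate or play a tone; `wait_for_cancel(t)`
    is assumed to block up to t seconds and return True on a cancel
    input. Both are placeholders for platform-specific calls."""
    play_feedback()                      # e.g. haptic pulse or earcon
    deadline = time.monotonic() + timeout_s
    if wait_for_cancel(deadline - time.monotonic()):
        return False                     # viewer canceled; keep old view
    render_change()                      # window elapsed; show the change
    return True
```
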
  • Publication number: 20150261293
    Abstract: Embodiments are disclosed that relate to gaze-based remote device control. For example, one disclosed embodiment provides, on a computing device, a method comprising detecting a gaze direction of a user, detecting an indication from the user to control a remotely controllable device located in the gaze direction, and adapting a user interface of a controller device to enable user control of the remotely controllable device.
    Type: Application
    Filed: March 12, 2014
    Publication date: September 17, 2015
    Inventors: Weerapan Wilairat, Vaibhav Thukral, Ibrahim Eden, David Nister
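
Locating the remotely controllable device in 20150261293 amounts to a geometric test of which registered device lies closest to the user's gaze ray. A sketch of that test; the device registry and the 5-degree angular tolerance are assumptions for illustration:

```python
import math

def device_in_gaze(devices, eye_position, gaze_direction, max_angle_deg=5.0):
    """Return the registered device nearest the gaze ray, or None.
    `devices` maps names to 3-D positions; `gaze_direction` is assumed
    to be a unit vector; the 5-degree tolerance is an assumed value."""
    best_name, best_angle = None, math.radians(max_angle_deg)
    for name, pos in devices.items():
        v = [p - e for p, e in zip(pos, eye_position)]
        norm = math.sqrt(sum(x * x for x in v))
        if norm == 0:
            continue
        cos_a = sum(a * b / norm for a, b in zip(v, gaze_direction))
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name
```

Once a device is selected, the controller's user interface would be swapped to that device's control scheme, per the abstract.
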
  • Publication number: 20150242680
    Abstract: Embodiments that relate to determining gaze locations are disclosed. In one embodiment a method includes shining light along an outbound light path to the eyes of a user wearing glasses. Upon detecting the glasses, the light is dynamically polarized in a polarization pattern that switches between a random polarization phase and a single polarization phase, wherein the random polarization phase includes a first polarization along an outbound light path and a second polarization orthogonal to the first polarization along a reflected light path. The single polarization phase has a single polarization. During the random polarization phases, glares reflected from the glasses are filtered out and pupil images are captured. Glint images are captured during the single polarization phase. Based on pupil characteristics and glint characteristics, gaze locations are repeatedly detected.
    Type: Application
    Filed: February 26, 2014
    Publication date: August 27, 2015
    Inventors: Vaibhav Thukral, Sudipta Sinha, Vivek Pradeep, Timothy Andrew Large, Nigel Stuart Keam, David Nister
  • Publication number: 20150193920
    Abstract: The technology disclosed herein provides various embodiments for mapping glints that reflect off an object to the light sources responsible for the glints. Embodiments disclosed herein are able to correctly map glints to light sources by capturing just a few images with a camera. Each image is captured while illuminating the object with a different pattern of light sources. A glint-free image may also be determined. A glint-free image is one in which the glints have been removed by image processing techniques.
    Type: Application
    Filed: January 7, 2014
    Publication date: July 9, 2015
    Inventors: Derek Knee, John Eldridge, Robert Havlik, Ronald Boskovic, Christopher Mei, Gerhard Schneider, Djordje Nijemcevic, David Nister
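
The abstract of 20150193920 does not say how the illumination patterns are chosen, but one plausible scheme that maps glints to sources with only a few images is binary-coded illumination: each source blinks a unique on/off code across ceil(log2(N+1)) frames, and a glint's visibility signature across the stack identifies its source. A hedged sketch of that scheme, including a per-pixel-minimum approximation of the glint-free image:

```python
import numpy as np

def binary_patterns(n_sources):
    """Pattern j lights every source whose 1-based index has bit j set,
    so each source blinks a unique binary code across the image stack."""
    n_bits = max(1, n_sources.bit_length())
    return [[bool((i + 1) >> j & 1) for i in range(n_sources)]
            for j in range(n_bits)]

def identify_glint(glint_on_flags):
    """Recover the 0-based source index from a glint's on/off signature
    across the captured images (True where the glint was visible)."""
    code = sum(1 << j for j, on in enumerate(glint_on_flags) if on)
    return code - 1

def glint_free(images):
    """Per-pixel minimum across the stack suppresses any glint whose
    source was dark in at least one frame; a source lit in every frame
    would need an extra complementary capture."""
    return np.minimum.reduce(images)
```
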
  • Patent number: 9064174
    Abstract: Architecture that enables optical character recognition (OCR) of text in video frames at the rate at which the frames are received. Additionally, conflation is performed on multiple text recognition results in the frame sequence. The architecture comprises an OCR text recognition engine and a tracker system; the tracker system establishes a common coordinate system in which OCR results from different frames may be compared and/or combined. From a set of sequential video frames, a keyframe is chosen from which the reference coordinate system is established. An estimated transformation from keyframe coordinates to subsequent video frames is computed using the tracker system. When text recognition is completed for any subsequent frame, the result coordinates can be related to the keyframe using the inverse transformation from the processed frame to the reference keyframe. The results can be rendered for viewing as the results are obtained.
    Type: Grant
    Filed: October 18, 2012
    Date of Patent: June 23, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David Nister, Frederik Schaffalitzky, Michael Grabner, Matthew S. Ashman, Milan Vugdelija, Ivan Stojiljkovic
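
The tracker in 9064174 maintains an estimated transformation from keyframe coordinates to each subsequent frame; OCR results found in any frame are pulled back into the keyframe's reference coordinate system with the inverse transform, so results from different frames can be compared and conflated. A NumPy sketch of that pull-back, assuming the scene transform is a 3x3 homography supplied by the tracker:

```python
import numpy as np

def to_keyframe(points_f, H_key_to_f):
    """Map 2-D OCR result coordinates detected in frame f back to the
    keyframe's reference coordinate system. `H_key_to_f` is the 3x3
    homography the tracker estimated from the keyframe to frame f."""
    H_f_to_key = np.linalg.inv(H_key_to_f)
    pts = np.asarray(points_f, dtype=float)
    pts = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    mapped = pts @ H_f_to_key.T
    return mapped[:, :2] / mapped[:, 2:3]           # back to Cartesian

# Results from different frames, once expressed in keyframe coordinates,
# can be clustered by location and their recognized strings conflated.
```
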
  • Publication number: 20150103000
    Abstract: Various embodiments related to entering text into a computing device via eye-typing are disclosed. For example, one embodiment provides a method that includes receiving a data set including a plurality of gaze samples, each gaze sample including a gaze location and a corresponding point in time. The method further comprises processing the plurality of gaze samples to determine one or more likely terms represented by the data set.
    Type: Application
    Filed: December 17, 2014
    Publication date: April 16, 2015
    Inventors: David Nister, Vaibhav Thukral, Djordje Nijemcevic, Ruchi Bhargava
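
The abstract of 20150103000 leaves the gaze-sample processing unspecified; a common approach is to detect dwell-based fixations and map each fixation to the nearest on-screen key before ranking candidate terms against a lexicon. The sketch below is that generic approach with assumed thresholds, not the patented method itself:

```python
def fixations(samples, radius=30.0, min_dwell=0.15):
    """Group consecutive gaze samples that stay within `radius` pixels
    for at least `min_dwell` seconds into fixation points.
    `samples` is a list of ((x, y), t) pairs; thresholds are assumed."""
    out, start = [], 0
    for i in range(1, len(samples) + 1):
        moved = (i == len(samples) or
                 abs(samples[i][0][0] - samples[start][0][0]) > radius or
                 abs(samples[i][0][1] - samples[start][0][1]) > radius)
        if moved:
            if samples[i - 1][1] - samples[start][1] >= min_dwell:
                xs = [s[0][0] for s in samples[start:i]]
                ys = [s[0][1] for s in samples[start:i]]
                out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            start = i
    return out

def likely_string(fix_points, key_centers):
    """Map each fixation to the nearest key center; a lexicon model
    would then rank candidate terms for the resulting letter sequence."""
    def nearest(p):
        return min(key_centers,
                   key=lambda k: (key_centers[k][0] - p[0]) ** 2 +
                                 (key_centers[k][1] - p[1]) ** 2)
    return "".join(nearest(p) for p in fix_points)
```
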
  • Patent number: 8996557
    Abstract: Various embodiments enable audio data, such as music data, to be captured, by a device, from a background environment and processed to formulate a query that can then be transmitted to a content recognition service. In one or more embodiments, multiple queries are transmitted to the content recognition service. In at least some embodiments, subsequent queries can progressively incorporate previous queries plus additional data that is captured. In one or more embodiments, responsive to receiving the query, the content recognition service can employ a multi-stage matching technique to identify content items responding to the query. This matching technique can be employed as queries are progressively received.
    Type: Grant
    Filed: May 18, 2011
    Date of Patent: March 31, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kazuhito Koishida, David Nister, Ian Simon, Tom Butcher
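
The client-side loop in 8996557 is the notable part: each successive query incorporates everything sent so far plus newly captured audio, and the content recognition service answers once its multi-stage matching is confident. A sketch of that loop, where `capture_chunk` and `recognize` are assumed interfaces to the microphone and the service:

```python
def progressive_query(capture_chunk, recognize, max_rounds=5):
    """Send progressively longer audio queries until the content
    recognition service returns a confident match. `capture_chunk()`
    yields the next bytes of background audio and `recognize(query)`
    returns a match or None; both are assumed interfaces."""
    query = b""
    for _ in range(max_rounds):
        query += capture_chunk()    # subsequent queries incorporate
        match = recognize(query)    # all previously captured audio
        if match is not None:
            return match
    return None
```
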
  • Patent number: 8988344
    Abstract: Embodiments that relate to navigating a hierarchy of visual elements are disclosed. In one embodiment a method includes presenting one or more visual elements from a two-dimensional plane via a display device. A home location within a viewable region of the display is established. A proportional size relationship between each element and each of the other elements is established. Using gaze tracking data, a gaze location at which a user is gazing within the viewable region is determined. The gaze location is mapped to a target location, and movement of the target location toward the home location is initiated. As the target location moves closer to the home location, each of the visual elements is progressively enlarged while the proportional size relationship between each of the visual elements and each of the other visual elements is also maintained.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: March 24, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
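
Each animation step in 8988344 moves the gaze-mapped target location toward the fixed home location and scales every visual element by the same factor, which is what preserves the proportional size relationship between elements. A minimal sketch of one step; the gain constants are illustrative assumptions:

```python
def zoom_step(elements, target, home, pull=0.1, zoom_gain=0.005):
    """One animation frame: pull `target` toward `home` and scale all
    elements uniformly, preserving their proportional sizes.
    `elements` is a list of dicts with 'pos' and 'size'; the gains are
    illustrative, not values from the patent."""
    dx, dy = home[0] - target[0], home[1] - target[1]
    target = (target[0] + pull * dx, target[1] + pull * dy)
    # Any per-frame scale > 1 enlarges the elements progressively while
    # the target is still en route to the home location.
    scale = 1.0 + zoom_gain * (abs(dx) + abs(dy))
    for e in elements:
        # Scale positions about the home point so the zoom centers there.
        e["pos"] = (home[0] + scale * (e["pos"][0] - home[0]),
                    home[1] + scale * (e["pos"][1] - home[1]))
        e["size"] = (scale * e["size"][0], scale * e["size"][1])
    return target
```
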
  • Publication number: 20150035744
    Abstract: Embodiments are disclosed for adjusting alignment of a near-eye optic of a see-through head-mounted display system. In one embodiment, a method of detecting eye location for a head-mounted display system includes directing positioning light to an eye of a user and detecting the positioning light reflected from the eye of the user. The method further includes determining a distance between the eye and a near-eye optic of the head-mounted display system based on attributes of the detected positioning light, and providing feedback for adjusting the distance between the eye and the near-eye optic.
    Type: Application
    Filed: August 22, 2013
    Publication date: February 5, 2015
    Inventors: Steve Robbins, Scott C. McEldowney, Xinye Lou, David D. Bohn, Quentin Simon Charles Miller, David Nister, Gerhard Schneider, Christopher Maurice Mei, Nathan Ackerman
  • Publication number: 20150002940
    Abstract: A system and related methods for near-eye display of an image are provided. In one example, a near-eye display system includes a light source comprising a surface and a plurality of pixels having a pixel pitch of 5 microns or less. An aperture array is located between 2 mm and 5 mm from the surface of the light source. The aperture array comprises non-overlapping apertures that are each centered on a vertex of an equilateral triangle within a grid of equilateral triangles. The center of each aperture is spaced from the center of each adjacent aperture by an aperture spacing of between 1 mm and 9 mm. The aperture array selectively passes the light emitted from the pixels to display the image without a double image condition.
    Type: Application
    Filed: June 28, 2013
    Publication date: January 1, 2015
    Inventors: David Nister, Georg Klein
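
The aperture layout in 20150002940 is an equilateral-triangle lattice: every aperture center sits on a vertex of a grid of equilateral triangles with a fixed center-to-center spacing. A short sketch that generates such centers for checking the geometry; the 3 mm spacing is just one value inside the claimed 1 mm to 9 mm range:

```python
import math

def aperture_centers(spacing_mm=3.0, rows=5, cols=5):
    """Vertices of an equilateral-triangle grid: centers in a row are
    spacing_mm apart, successive rows are offset by half a spacing and
    separated vertically by spacing_mm * sqrt(3) / 2."""
    dy = spacing_mm * math.sqrt(3) / 2
    return [((col + 0.5 * (row % 2)) * spacing_mm, row * dy)
            for row in range(rows) for col in range(cols)]
```
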
  • Publication number: 20140376770
    Abstract: A method of object detection includes receiving a first image taken by a first stereo camera, receiving a second image taken by a second stereo camera, and offsetting the first image relative to the second image by an offset distance selected such that each corresponding pixel of the offset first and second images depicts the same object locus if the object locus is at an assumed distance from the first and second stereo cameras. The method further includes locating a target object in the offset first and second images.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: David Nister, Piotr Dollar, Wolf Kienzle, Mladen Radojevic, Matthew S. Ashman, Ivan Stojiljkovic, Magdalena Vukosavljevic
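
The offset in 20140376770 is the standard stereo disparity for a chosen depth: with focal length f in pixels, baseline B, and assumed distance Z, shifting one rectified image horizontally by d = f * B / Z makes pixels that image a point at depth Z coincide across the two views. A NumPy sketch:

```python
import numpy as np

def offset_pair(left, right, focal_px, baseline_m, assumed_depth_m):
    """Shift the right image by the disparity of the assumed depth so
    that any object actually at that depth lands on the same pixels in
    both images. Images are 2-D arrays rectified to a common row space."""
    disparity = int(round(focal_px * baseline_m / assumed_depth_m))
    shifted = np.roll(right, disparity, axis=1)  # horizontal shift
    shifted[:, :disparity] = 0                   # wrapped columns are invalid
    return left, shifted

# Regions where `left` and `shifted` agree are candidates for objects at
# the assumed distance, which narrows the search for the target object.
```
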
  • Publication number: 20140375544
    Abstract: Embodiments that relate to navigating a hierarchy of visual elements are disclosed. In one embodiment a method includes presenting one or more visual elements from a two-dimensional plane via a display device. A home location within a viewable region of the display is established. A proportional size relationship between each element and each of the other elements is established. Using gaze tracking data, a gaze location at which a user is gazing within the viewable region is determined. The gaze location is mapped to a target location, and movement of the target location toward the home location is initiated. As the target location moves closer to the home location, each of the visual elements is progressively enlarged while the proportional size relationship between each of the visual elements and each of the other visual elements is also maintained.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
  • Publication number: 20140380230
    Abstract: Embodiments are disclosed herein that relate to selecting user interface elements via a periodically updated position signal. For example, one disclosed embodiment provides a method comprising displaying on a graphical user interface a representation of a user interface element and a representation of an interactive target.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
  • Publication number: 20140375540
    Abstract: A system and method are disclosed for sensing a position and/or angular orientation of a head-mounted display device with respect to a wearer's eyes, and for providing feedback for adjusting the position and/or angular orientation of the head-mounted display device so that it is optimally centered and oriented with respect to the wearer's eyes.
    Type: Application
    Filed: June 24, 2013
    Publication date: December 25, 2014
    Inventors: Nathan Ackerman, Andy Hodge, David Nister
  • Publication number: 20140375541
    Abstract: Embodiments are disclosed that relate to tracking a user's eye based on time-of-flight depth image data of the user's eye. For example, one disclosed embodiment provides an eye tracking system comprising a light source, a sensing subsystem configured to obtain a two-dimensional image of a user's eye and depth data of the user's eye using a depth sensor having an unconstrained baseline distance, and a logic subsystem configured to control the light source to emit light, control the sensing subsystem to acquire a two-dimensional image of the user's eye while the light source is emitting light, control the sensing subsystem to acquire depth data of the user's eye, determine a gaze direction of the user's eye from the two-dimensional image, determine a location on a display at which the gaze direction intersects the display based on the gaze direction and the depth data, and output the location.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: David Nister, Ibrahim Eden
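
With a gaze direction recovered from the two-dimensional image and an eye position from the depth data, the display location in 20140375541 is a ray-plane intersection. A small sketch that assumes the display is the plane z = 0 with screen axes aligned to world x and y:

```python
def gaze_point_on_display(eye_pos, gaze_dir):
    """Intersect the gaze ray with an assumed display plane z = 0.
    `eye_pos` comes from depth data, `gaze_dir` from the 2-D eye image;
    returns the (x, y) display location, or None if the ray is parallel
    to or pointing away from the plane."""
    ez, dz = eye_pos[2], gaze_dir[2]
    if dz == 0:
        return None
    t = -ez / dz                  # ray parameter where z reaches 0
    if t <= 0:
        return None               # looking away from the display
    return (eye_pos[0] + t * gaze_dir[0],
            eye_pos[1] + t * gaze_dir[1])
```
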
  • Publication number: 20140375790
    Abstract: Embodiments are disclosed for a see-through head-mounted display system. In one embodiment, the see-through head-mounted display system comprises a freeform prism, and a display device configured to emit display light through the freeform prism to an eye of a user. The see-through head-mounted display system may also comprise an imaging device having an entrance pupil positioned at a back focal plane of the freeform prism, the imaging device configured to receive gaze-detection light reflected from the eye and directed through the freeform prism.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Steve Robbins, Scott McEldowney, Xinye Lou, David Nister, Drew Steedly, Quentin Simon Charles Miller, David D. Bohn, James Peele Terrell, JR., Andrew C. Goris, Nathan Ackerman
  • Patent number: 8917238
    Abstract: Various embodiments related to entering text into a computing device via eye-typing are disclosed. For example, one embodiment provides a method that includes receiving a data set including a plurality of gaze samples, each gaze sample including a gaze location and a corresponding point in time. The method further comprises processing the plurality of gaze samples to determine one or more likely terms represented by the data set.
    Type: Grant
    Filed: June 28, 2012
    Date of Patent: December 23, 2014
    Assignee: Microsoft Corporation
    Inventors: David Nister, Vaibhav Thukral, Djordje Nijemcevic, Ruchi Bhargava