Patents by Inventor Vaibhav Thukral
Vaibhav Thukral has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20170155868
Abstract: Systems and methods for controlling closed captioning using an eye tracking device are provided. The system for controlling closed captioning may comprise a display device, a closed captioning controller configured to display closed captioning text for a media item during playback on the display device, and an eye tracking device configured to detect a location of a user's gaze relative to the display device and send the location to the closed captioning controller. The closed captioning controller may be configured to recognize a predetermined gaze pattern of the user's gaze and, upon detecting the predetermined gaze pattern, partially or completely deemphasize the display of the closed captioning text.
Type: Application
Filed: February 13, 2017
Publication date: June 1, 2017
Applicant: Microsoft Technology Licensing, LLC
Inventors: Weerapan Wilairat, Vaibhav Thukral
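The abstract above describes deemphasizing captions once a predetermined gaze pattern is recognized. A minimal sketch of that idea, with an entirely hypothetical controller, region, and thresholds (none of these names come from the patent): captions fade after the gaze stays outside the caption region for several consecutive samples, and a glance back restores them.

```python
# Hypothetical sketch of gaze-controlled caption deemphasis. The caption
# region, sample threshold, and opacity values are illustrative only.

CAPTION_REGION = (0, 800, 1280, 1000)  # x0, y0, x1, y1 in screen pixels

def in_region(gaze, region):
    x, y = gaze
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

class CaptionController:
    def __init__(self, away_threshold=5):
        self.opacity = 1.0          # fully emphasized
        self.away_count = 0
        self.away_threshold = away_threshold

    def on_gaze_sample(self, gaze):
        if in_region(gaze, CAPTION_REGION):
            self.away_count = 0
            self.opacity = 1.0      # re-emphasize on a glance back
        else:
            self.away_count += 1
            if self.away_count >= self.away_threshold:
                self.opacity = 0.2  # partially deemphasize
        return self.opacity
```

A real controller would presumably smooth the transition and distinguish richer gaze patterns (e.g. reading motion across the caption text) rather than a simple in/out count.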
-
Patent number: 9568997
Abstract: Systems and methods for controlling closed captioning using an eye tracking device are provided. The system for controlling closed captioning may comprise a display device, a closed captioning controller configured to display closed captioning text for a media item during playback on the display device, and an eye tracking device configured to detect a location of a user's gaze relative to the display device and send the location to the closed captioning controller. The closed captioning controller may be configured to recognize a predetermined gaze pattern of the user's gaze and, upon detecting the predetermined gaze pattern, partially or completely deemphasize the display of the closed captioning text.
Type: Grant
Filed: March 25, 2014
Date of Patent: February 14, 2017
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Weerapan Wilairat, Vaibhav Thukral
-
Publication number: 20160358009
Abstract: Embodiments are disclosed for eye tracking systems and methods. An example eye tracking system comprises a plurality of light sources and a camera configured to capture an image of light from the light sources as reflected from an eye. The eye tracking system further comprises a logic device and a storage device storing instructions executable by the logic device to acquire frames of eye tracking data by iteratively projecting light from different combinations of light sources of the plurality of light sources and capturing an image of the eye during projection of each combination. The instructions may be further executable to select a selected combination of light sources for eye tracking based on a determination of occlusion detected in the image arising from a transparent or semi-transparent optical structure positioned between the eye and the camera and project light via the selected combination of light sources for eye tracking.
Type: Application
Filed: August 22, 2016
Publication date: December 8, 2016
Applicant: Microsoft Technology Licensing, LLC
Inventors: Mudit Agrawal, Vaibhav Thukral, Ibrahim Eden, David Nister, Shivkumar Swaminathan
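The selection loop this abstract describes (iterate over light-source combinations, capture a frame under each, keep the combination with the least occlusion) can be sketched as follows. The scoring callback is a stand-in for real image capture and glare analysis; all names here are illustrative, not from the patent.

```python
from itertools import combinations

# Illustrative sketch: try each k-LED combination, score the occlusion
# (e.g. glare from glasses) in the frame captured under it, and return
# the combination with the lowest occlusion score.

def select_light_sources(num_leds, capture_occlusion_score, k=2):
    best_combo, best_score = None, float("inf")
    for combo in combinations(range(num_leds), k):
        score = capture_occlusion_score(combo)  # frame captured under this combo
        if score < best_score:
            best_combo, best_score = combo, score
    return best_combo
```

In practice the search would likely be interleaved with ongoing tracking rather than run exhaustively every frame.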
-
Patent number: 9454699
Abstract: Embodiments are disclosed for eye tracking systems and methods. An example eye tracking system comprises a plurality of light sources and a camera configured to capture an image of light from the light sources as reflected from an eye. The eye tracking system further comprises a logic device and a storage device storing instructions executable by the logic device to acquire frames of eye tracking data by iteratively projecting light from different combinations of light sources of the plurality of light sources and capturing an image of the eye during projection of each combination. The instructions may be further executable to select a selected combination of light sources for eye tracking based on a determination of occlusion detected in the image arising from a transparent or semi-transparent optical structure positioned between the eye and the camera and project light via the selected combination of light sources for eye tracking.
Type: Grant
Filed: April 29, 2014
Date of Patent: September 27, 2016
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Mudit Agrawal, Vaibhav Thukral, Ibrahim Eden, David Nister, Shivkumar Swaminathan
-
Patent number: 9342147
Abstract: Examples relating to using non-visual feedback to alert a viewer of a display that a visual change has been triggered are disclosed. One disclosed example provides a method comprising using gaze tracking data from a gaze tracking system to determine that a viewer changes a gaze location. Based on determining that the viewer changes the gaze location, a visual change is triggered and non-visual feedback indicating the triggering of the visual change is provided to the viewer. If a cancel change input is received within a predetermined timeframe, the visual change is not displayed. If a cancel change input is not received within the timeframe, the visual change is displayed via the display.
Type: Grant
Filed: April 10, 2014
Date of Patent: May 17, 2016
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Weerapan Wilairat, Ibrahim Eden, Vaibhav Thukral, David Nister, Vivek Pradeep
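The cancel-window behaviour described above reduces to a simple rule: a triggered change is held for a fixed timeframe, a cancel input inside that window suppresses it, and otherwise it is displayed. A minimal sketch, with an assumed window length and plain-float timestamps:

```python
# Minimal sketch of the cancel-window rule. CANCEL_WINDOW is an
# illustrative value, not one specified by the patent.

CANCEL_WINDOW = 0.5  # seconds

def resolve_change(trigger_time, cancel_time=None, window=CANCEL_WINDOW):
    """Return True if the triggered visual change should be displayed."""
    if cancel_time is not None and cancel_time - trigger_time <= window:
        return False   # cancelled in time: the change is never shown
    return True        # no cancel input (or too late): display the change
```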
-
Patent number: 9330302
Abstract: Embodiments that relate to determining gaze locations are disclosed. In one embodiment a method includes shining light along an outbound light path to the eyes of the user wearing glasses. Upon detecting the glasses, the light is dynamically polarized in a polarization pattern that switches between a random polarization phase and a single polarization phase, wherein the random polarization phase includes a first polarization along an outbound light path and a second polarization orthogonal to the first polarization along a reflected light path. The single polarization phase has a single polarization. During the random polarization phases, glares reflected from the glasses are filtered out and pupil images are captured. Glint images are captured during the single polarization phase. Based on pupil characteristics and glint characteristics, gaze locations are repeatedly detected.
Type: Grant
Filed: February 26, 2014
Date of Patent: May 3, 2016
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Vaibhav Thukral, Sudipta Sinha, Vivek Pradeep, Timothy Andrew Large, Nigel Stuart Keam, David Nister
-
Publication number: 20160103484
Abstract: A method to furnish input representing gaze direction in a computer system operatively coupled to a vision system. In this method, a first image of an eye at a first level of illumination is acquired by a camera of the vision system. The first image is obtained from the camera, and a second image of the eye corresponding to a second, different level of illumination is also obtained. Brightness of corresponding pixels of the first and second images is compared in order to distinguish a reflection of the illumination by the eye from a reflection of the illumination by eyewear. The input is then furnished based on the reflection of the illumination by the eye.
Type: Application
Filed: October 8, 2014
Publication date: April 14, 2016
Inventors: Huimin Guo, Ibrahim Eden, Vaibhav Thukral, David Zachris Nister
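One plausible reading of the pixel-brightness comparison above: an eyewear glare tends to stay saturated at both illumination levels, while a true corneal reflection dims noticeably when the illumination drops. A hedged sketch of that heuristic, with made-up thresholds and images as lists of 0-255 values:

```python
# Illustrative per-pixel classification from two images of the same eye
# at different illumination levels. `sat` and `drop` are guessed
# thresholds, not values taken from the patent.

def classify_reflections(bright_img, dim_img, sat=240, drop=0.3):
    """Return a per-pixel mask of 'glare', 'eye', or None."""
    mask = []
    for row_b, row_d in zip(bright_img, dim_img):
        out = []
        for b, d in zip(row_b, row_d):
            if b >= sat and d >= sat:
                out.append("glare")   # saturated in both: eyewear reflection
            elif b >= sat and d <= b * (1 - drop):
                out.append("eye")     # dims with the illumination: corneal
            else:
                out.append(None)      # not a candidate reflection
        mask.append(out)
    return mask
```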
-
Patent number: 9304586
Abstract: Various embodiments related to entering text into a computing device via eye-typing are disclosed. For example, one embodiment provides a method that includes receiving a data set including a plurality of gaze samples, each gaze sample including a gaze location and a corresponding point in time. The method further comprises processing the plurality of gaze samples to determine one or more likely terms represented by the data set.
Type: Grant
Filed: December 17, 2014
Date of Patent: April 5, 2016
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: David Nister, Vaibhav Thukral, Djordje Nijemcevic, Ruchi Bhargava
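One common way to turn timestamped gaze samples into likely terms (the patent does not specify its method; this is a generic sketch under that assumption) is to group samples into dwells, map each dwell to the nearest on-screen key, and match the resulting letter sequence against a lexicon. The keyboard layout and lexicon below are invented for illustration.

```python
# Hypothetical eye-typing decoder: dwell detection + nearest-key lookup
# + lexicon match. KEYS, LEXICON, and the dwell threshold are made up.

KEYS = {"c": (0, 0), "a": (10, 0), "t": (20, 0)}
LEXICON = {"cat", "act", "tact"}

def nearest_key(pos):
    return min(KEYS, key=lambda k: (KEYS[k][0] - pos[0]) ** 2
                                   + (KEYS[k][1] - pos[1]) ** 2)

def likely_terms(samples, dwell_ms=200):
    """samples: list of ((x, y), t_ms); a letter is emitted once the gaze
    dwells on one key for at least dwell_ms."""
    letters, start, last_key = [], None, None
    for pos, t in samples:
        key = nearest_key(pos)
        if key != last_key:
            start, last_key = t, key          # new dwell begins
        elif t - start >= dwell_ms and (not letters or letters[-1] != key):
            letters.append(key)               # dwell long enough: emit letter
    word = "".join(letters)
    return [w for w in LEXICON if w == word]
```

A production decoder would score noisy sequences probabilistically rather than require an exact lexicon match.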
-
Publication number: 20150346814
Abstract: One or more techniques and/or systems are provided for gaze tracking of one or more users. A user tracking component (e.g., a depth camera or a relatively lower resolution camera) may be utilized to obtain user tracking data for a user. The user tracking data is evaluated to identify a spatial location of the user. An eye capture camera (e.g., a relatively higher resolution camera) may be selected from an eye capture camera configuration based upon the eye capture camera having a view frustum corresponding to the spatial location of the user. The eye capture camera may be invoked to obtain eye region imagery of the user. Other eye capture cameras within the eye capture camera configuration are maintained in a powered down state to reduce power and/or bandwidth consumption. Gaze tracking information may be generated based upon the eye region imagery, and may be used to perform a task.
Type: Application
Filed: May 30, 2014
Publication date: December 3, 2015
Inventors: Vaibhav Thukral, Ibrahim Eden, Shivkumar Swaminathan, David Nister, Morgan Venable
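The camera-selection step above can be sketched simply: given the user's spatial location from the low-resolution tracker, power up the one eye camera whose view frustum covers that location and leave the rest powered down. Frustums are simplified to 2-D boxes here, and all names are illustrative.

```python
# Illustrative frustum-based camera selection. Real view frustums are
# 3-D volumes; axis-aligned 2-D boxes stand in for them in this sketch.

def select_eye_camera(user_pos, cameras):
    """cameras: {name: (x0, y0, x1, y1)}. Returns (active, powered_down)."""
    x, y = user_pos
    active = None
    for name, (x0, y0, x1, y1) in cameras.items():
        if x0 <= x <= x1 and y0 <= y <= y1 and active is None:
            active = name                     # first matching frustum wins
    powered_down = [n for n in cameras if n != active]
    return active, powered_down
```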
-
Publication number: 20150331485
Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received and, without using input from the viewer, at least one target visual that may attract a gaze of the viewer and a target location of the target visual are identified within the image data. The estimated gaze location of the viewer is compared with the target location of the target visual. An offset vector is calculated based on the estimated gaze location and the target location. The gaze tracking system is calibrated using the offset vector.
Type: Application
Filed: May 19, 2014
Publication date: November 19, 2015
Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
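At its core, the offset-vector calibration above is the difference between the estimated gaze location and the location of a visual the viewer is presumed to be looking at, applied as a correction to later estimates. A minimal sketch (function names are illustrative):

```python
# Offset-vector calibration in its simplest form: compute the correction
# once, then apply it to subsequent gaze estimates. A real system would
# accumulate many samples and likely fit an affine correction instead.

def offset_vector(estimated, target):
    """Correction that moves the estimate onto the target location."""
    return (target[0] - estimated[0], target[1] - estimated[1])

def apply_calibration(estimated, offset):
    return (estimated[0] + offset[0], estimated[1] + offset[1])
```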
-
Publication number: 20150310253
Abstract: Embodiments are disclosed for eye tracking systems and methods. An example eye tracking system comprises a plurality of light sources and a camera configured to capture an image of light from the light sources as reflected from an eye. The eye tracking system further comprises a logic device and a storage device storing instructions executable by the logic device to acquire frames of eye tracking data by iteratively projecting light from different combinations of light sources of the plurality of light sources and capturing an image of the eye during projection of each combination. The instructions may be further executable to select a selected combination of light sources for eye tracking based on a determination of occlusion detected in the image arising from a transparent or semi-transparent optical structure positioned between the eye and the camera and project light via the selected combination of light sources for eye tracking.
Type: Application
Filed: April 29, 2014
Publication date: October 29, 2015
Inventors: Mudit Agrawal, Vaibhav Thukral, Ibrahim Eden, David Nister, Shivkumar Swaminathan
-
Publication number: 20150293587
Abstract: Examples relating to using non-visual feedback to alert a viewer of a display that a visual change has been triggered are disclosed. One disclosed example provides a method comprising using gaze tracking data from a gaze tracking system to determine that a viewer changes a gaze location. Based on determining that the viewer changes the gaze location, a visual change is triggered and non-visual feedback indicating the triggering of the visual change is provided to the viewer. If a cancel change input is received within a predetermined timeframe, the visual change is not displayed. If a cancel change input is not received within the timeframe, the visual change is displayed via the display.
Type: Application
Filed: April 10, 2014
Publication date: October 15, 2015
Inventors: Weerapan Wilairat, Ibrahim Eden, Vaibhav Thukral, David Nister, Vivek Pradeep
-
Publication number: 20150277552
Abstract: Systems and methods for controlling closed captioning using an eye tracking device are provided. The system for controlling closed captioning may comprise a display device, a closed captioning controller configured to display closed captioning text for a media item during playback on the display device, and an eye tracking device configured to detect a location of a user's gaze relative to the display device and send the location to the closed captioning controller. The closed captioning controller may be configured to recognize a predetermined gaze pattern of the user's gaze and, upon detecting the predetermined gaze pattern, partially or completely deemphasize the display of the closed captioning text.
Type: Application
Filed: March 25, 2014
Publication date: October 1, 2015
Inventors: Weerapan Wilairat, Vaibhav Thukral
-
Publication number: 20150261293
Abstract: Embodiments are disclosed that relate to gaze-based remote device control. For example, one disclosed embodiment provides, on a computing device, a method comprising detecting a gaze direction of a user, detecting an indication from the user to control a remotely controllable device located in the gaze direction, and adapting a user interface of a controller device to enable user control of the remotely controllable device.
Type: Application
Filed: March 12, 2014
Publication date: September 17, 2015
Inventors: Weerapan Wilairat, Vaibhav Thukral, Ibrahim Eden, David Nister
-
Publication number: 20150242680
Abstract: Embodiments that relate to determining gaze locations are disclosed. In one embodiment a method includes shining light along an outbound light path to the eyes of the user wearing glasses. Upon detecting the glasses, the light is dynamically polarized in a polarization pattern that switches between a random polarization phase and a single polarization phase, wherein the random polarization phase includes a first polarization along an outbound light path and a second polarization orthogonal to the first polarization along a reflected light path. The single polarization phase has a single polarization. During the random polarization phases, glares reflected from the glasses are filtered out and pupil images are captured. Glint images are captured during the single polarization phase. Based on pupil characteristics and glint characteristics, gaze locations are repeatedly detected.
Type: Application
Filed: February 26, 2014
Publication date: August 27, 2015
Inventors: Vaibhav Thukral, Sudipta Sinha, Vivek Pradeep, Timothy Andrew Large, Nigel Stuart Keam, David Nister
-
Publication number: 20150103000
Abstract: Various embodiments related to entering text into a computing device via eye-typing are disclosed. For example, one embodiment provides a method that includes receiving a data set including a plurality of gaze samples, each gaze sample including a gaze location and a corresponding point in time. The method further comprises processing the plurality of gaze samples to determine one or more likely terms represented by the data set.
Type: Application
Filed: December 17, 2014
Publication date: April 16, 2015
Inventors: David Nister, Vaibhav Thukral, Djordje Nijemcevic, Ruchi Bhargava
-
Patent number: 8988344
Abstract: Embodiments that relate to navigating a hierarchy of visual elements are disclosed. In one embodiment a method includes presenting one or more visual elements from a two-dimensional plane via a display device. A home location within a viewable region of the display is established. A proportional size relationship between each element and each of the other elements is established. Using gaze tracking data, a gaze location at which a user is gazing within the viewable region is determined. The gaze location is mapped to a target location, and movement of the target location toward the home location is initiated. As the target location moves closer to the home location, each of the visual elements is progressively enlarged while the proportional size relationship between each of the visual elements and each of the other visual elements is also maintained.
Type: Grant
Filed: June 25, 2013
Date of Patent: March 24, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
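A point worth noting in the abstract above: if every element is enlarged by the same factor per step, the proportional size relationship between elements is preserved automatically. A hedged sketch of one zoom step under that reading (fraction moved toward home and zoom factor are invented values):

```python
# Illustrative zoom step: move the target a fraction t toward the home
# location and enlarge all elements uniformly, which preserves their
# proportional sizes. t and zoom_per_step are made-up parameters.

def zoom_step(sizes, target, home, t=0.2, zoom_per_step=1.1):
    new_target = (target[0] + t * (home[0] - target[0]),
                  target[1] + t * (home[1] - target[1]))
    new_sizes = {name: s * zoom_per_step for name, s in sizes.items()}
    return new_sizes, new_target
```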
-
Publication number: 20140375544
Abstract: Embodiments that relate to navigating a hierarchy of visual elements are disclosed. In one embodiment a method includes presenting one or more visual elements from a two-dimensional plane via a display device. A home location within a viewable region of the display is established. A proportional size relationship between each element and each of the other elements is established. Using gaze tracking data, a gaze location at which a user is gazing within the viewable region is determined. The gaze location is mapped to a target location, and movement of the target location toward the home location is initiated. As the target location moves closer to the home location, each of the visual elements is progressively enlarged while the proportional size relationship between each of the visual elements and each of the other visual elements is also maintained.
Type: Application
Filed: June 25, 2013
Publication date: December 25, 2014
Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
-
Publication number: 20140380230
Abstract: Embodiments are disclosed herein that relate to selecting user interface elements via a periodically updated position signal. For example, one disclosed embodiment provides a method comprising displaying on a graphical user interface a representation of a user interface element and a representation of an interactive target.
Type: Application
Filed: June 25, 2013
Publication date: December 25, 2014
Inventors: Morgan Kolya Venable, Bernard James Kerr, Vaibhav Thukral, David Nister
-
Patent number: 8917238
Abstract: Various embodiments related to entering text into a computing device via eye-typing are disclosed. For example, one embodiment provides a method that includes receiving a data set including a plurality of gaze samples, each gaze sample including a gaze location and a corresponding point in time. The method further comprises processing the plurality of gaze samples to determine one or more likely terms represented by the data set.
Type: Grant
Filed: June 28, 2012
Date of Patent: December 23, 2014
Assignee: Microsoft Corporation
Inventors: David Nister, Vaibhav Thukral, Djordje Nijemcevic, Ruchi Bhargava