Patents by Inventor Weerapan Wilairat

Weerapan Wilairat has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10447960
    Abstract: Systems and methods for controlling closed captioning using an eye tracking device are provided. The system for controlling closed captioning may comprise a display device, a closed captioning controller configured to display closed captioning text for a media item during playback on the display device, and an eye tracking device configured to detect a location of a user's gaze relative to the display device and send the location to the closed captioning controller. The closed captioning controller may be configured to recognize a predetermined gaze pattern of the user's gaze and, upon detecting the predetermined gaze pattern, partially or completely deemphasize the display of the closed captioning text.
    Type: Grant
    Filed: February 13, 2017
    Date of Patent: October 15, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral
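
The entry above (patent 10447960, with related filings further down the list) describes gaze-controlled de-emphasis of caption text. As a rough illustration only, here is a minimal Python sketch of such a control loop; the class name, the caption-region rectangle, and the dwell and opacity values are assumptions made for this sketch, not details taken from the patent.

```python
import time

class CaptionController:
    """Illustrative controller: de-emphasizes captions once the viewer's
    gaze stays away from the caption region (one possible gaze pattern)."""

    def __init__(self, caption_region, dwell_s=2.0):
        self.caption_region = caption_region      # (x0, y0, x1, y1) in pixels
        self.dwell_s = dwell_s                    # seconds of "looking away"
        self.opacity = 1.0
        self._away_since = None

    def on_gaze(self, x, y, now=None):
        now = time.monotonic() if now is None else now
        x0, y0, x1, y1 = self.caption_region
        if x0 <= x <= x1 and y0 <= y <= y1:
            self._away_since = None
            self.opacity = 1.0                    # re-emphasize on gaze return
        else:
            if self._away_since is None:
                self._away_since = now
            elif now - self._away_since >= self.dwell_s:
                self.opacity = 0.2                # partially de-emphasize
        return self.opacity

# Usage: feed gaze samples from an eye tracking device.
ctrl = CaptionController(caption_region=(0, 900, 1920, 1080))
print(ctrl.on_gaze(400, 950, now=0.0))   # looking at captions -> 1.0
print(ctrl.on_gaze(400, 300, now=0.5))   # just looked away -> still 1.0
print(ctrl.on_gaze(420, 310, now=3.0))   # away long enough -> 0.2
```
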
  • Patent number: 10248199
    Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received, the image data comprising at least one target visual and target visual metadata that identifies the at least one target visual. The target visual metadata is used to identify a target location of the at least one target visual. The estimated gaze location of the viewer is monitored and a probability that the viewer is gazing at the target location is estimated. The gaze tracking system is calibrated using the probability, the estimated gaze location and the target location to generate an updated estimated gaze location of the viewer.
    Type: Grant
    Filed: August 7, 2017
    Date of Patent: April 2, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
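
The calibration method in patent 10248199 above weights a correction by the probability that the viewer is gazing at a target identified by metadata in the image data. The Python sketch below shows one plausible reading of that idea; the Gaussian-shaped probability, the learning rate, and every name are assumptions, not the patented algorithm.

```python
import math

def gaze_probability(gaze, target, sigma=80.0):
    """Crude probability that the viewer is gazing at `target`, based on the
    pixel distance between the estimated gaze point and the target location.
    The Gaussian form and sigma value are assumptions for this sketch."""
    d2 = (gaze[0] - target[0]) ** 2 + (gaze[1] - target[1]) ** 2
    return math.exp(-d2 / (2 * sigma ** 2))

class ProbabilisticCalibrator:
    def __init__(self, learning_rate=0.2):
        self.offset = (0.0, 0.0)          # correction added to raw estimates
        self.learning_rate = learning_rate

    def update(self, estimated_gaze, target_location):
        # Weight the correction by how likely the viewer was actually
        # looking at the metadata-identified target.
        p = gaze_probability(estimated_gaze, target_location)
        step = self.learning_rate * p
        ox, oy = self.offset
        self.offset = (
            ox + step * (target_location[0] - estimated_gaze[0]),
            oy + step * (target_location[1] - estimated_gaze[1]),
        )

    def corrected(self, estimated_gaze):
        return (estimated_gaze[0] + self.offset[0],
                estimated_gaze[1] + self.offset[1])

cal = ProbabilisticCalibrator()
cal.update(estimated_gaze=(510, 300), target_location=(480, 310))
print(cal.corrected((510, 300)))   # updated estimated gaze location
```
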
  • Publication number: 20170336867
    Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received, the image data comprising at least one target visual and target visual metadata that identifies the at least one target visual. The target visual metadata is used to identify a target location of the at least one target visual. The estimated gaze location of the viewer is monitored and a probability that the viewer is gazing at the target location is estimated. The gaze tracking system is calibrated using the probability, the estimated gaze location and the target location to generate an updated estimated gaze location of the viewer.
    Type: Application
    Filed: August 7, 2017
    Publication date: November 23, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
  • Patent number: 9727136
    Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received and, without using input from the viewer, at least one target visual that may attract a gaze of the viewer and a target location of the target visual are identified within the image data. The estimated gaze location of the viewer is compared with the target location of the target visual. An offset vector is calculated based on the estimated gaze location and the target location. The gaze tracking system is calibrated using the offset vector.
    Type: Grant
    Filed: May 19, 2014
    Date of Patent: August 8, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
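
Patent 9727136 above calibrates with an offset vector between the estimated gaze location and a target visual identified in the image data without viewer input. A compact Python sketch of that flow follows; the salience score used to pick the target is a stand-in for target-selection logic the abstract does not specify.

```python
def find_salient_target(image_regions):
    """Pick the region most likely to attract the viewer's gaze.
    `image_regions` is a list of (center, salience) pairs; how salience is
    computed is outside this sketch and simply assumed to exist."""
    center, _ = max(image_regions, key=lambda r: r[1])
    return center

def offset_vector(estimated_gaze, target_location):
    # Vector that would move the estimate onto the target.
    return (target_location[0] - estimated_gaze[0],
            target_location[1] - estimated_gaze[1])

def apply_calibration(estimated_gaze, offset):
    return (estimated_gaze[0] + offset[0], estimated_gaze[1] + offset[1])

regions = [((640, 360), 0.9), ((100, 100), 0.2)]   # hypothetical salience map
target = find_salient_target(regions)
offset = offset_vector(estimated_gaze=(610, 380), target_location=target)
print(apply_calibration((610, 380), offset))        # -> (640, 360)
```
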
  • Publication number: 20170155868
    Abstract: Systems and methods for controlling closed captioning using an eye tracking device are provided. The system for controlling closed captioning may comprise a display device, a closed captioning controller configured to display closed captioning text for a media item during playback on the display device, and an eye tracking device configured to detect a location of a user's gaze relative to the display device and send the location to the closed captioning controller. The closed captioning controller may be configured to recognize a predetermined gaze pattern of the user's gaze and, upon detecting the predetermined gaze pattern, partially or completely deemphasize the display of the closed captioning text.
    Type: Application
    Filed: February 13, 2017
    Publication date: June 1, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral
  • Patent number: 9568997
    Abstract: Systems and methods for controlling closed captioning using an eye tracking device are provided. The system for controlling closed captioning may comprise a display device, a closed captioning controller configured to display closed captioning text for a media item during playback on the display device, and an eye tracking device configured to detect a location of a user's gaze relative to the display device and send the location to the closed captioning controller. The closed captioning controller may be configured to recognize a predetermined gaze pattern of the user's gaze and, upon detecting the predetermined gaze pattern, partially or completely deemphasize the display of the closed captioning text.
    Type: Grant
    Filed: March 25, 2014
    Date of Patent: February 14, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Vaibhav Thukral
  • Patent number: 9342147
    Abstract: Examples relating to using non-visual feedback to alert a viewer of a display that a visual change has been triggered are disclosed. One disclosed example provides a method comprising using gaze tracking data from a gaze tracking system to determine that a viewer changes a gaze location. Based on determining that the viewer changes the gaze location, a visual change is triggered and non-visual feedback indicating the triggering of the visual change is provided to the viewer. If a cancel change input is received within a predetermined timeframe, the visual change is not displayed. If a cancel change input is not received within the timeframe, the visual change is displayed via the display.
    Type: Grant
    Filed: April 10, 2014
    Date of Patent: May 17, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Weerapan Wilairat, Ibrahim Eden, Vaibhav Thukral, David Nister, Vivek Pradeep
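
Patent 9342147 above defers a gaze-triggered visual change behind non-visual feedback and a cancel window. The sketch below models that flow with a timer; the callback names and the timeout value are illustrative, not taken from the patent.

```python
import threading

class GazeChangeController:
    """Deferred-commit sketch: a gaze-triggered visual change is announced
    with non-visual feedback (here, a sound callback) and only displayed if
    no cancel input arrives within `timeout_s`."""

    def __init__(self, play_sound, display_change, timeout_s=1.0):
        self.play_sound = play_sound
        self.display_change = display_change
        self.timeout_s = timeout_s
        self._pending = None

    def on_gaze_location_changed(self, change):
        self.play_sound()                      # non-visual feedback
        self._pending = threading.Timer(self.timeout_s,
                                        self.display_change, args=(change,))
        self._pending.start()

    def on_cancel_input(self):
        if self._pending is not None:
            self._pending.cancel()             # change is never displayed
            self._pending = None

ctrl = GazeChangeController(play_sound=lambda: print("beep"),
                            display_change=lambda c: print("showing", c),
                            timeout_s=0.1)
ctrl.on_gaze_location_changed("scroll to next page")
ctrl.on_cancel_input()                         # cancelled within the timeframe
```
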
  • Patent number: 9244539
    Abstract: Embodiments that relate to positioning a target indicator via a display system are disclosed. For example, one disclosed embodiment provides a method for positioning a target indicator using gaze tracking data having a coarse accuracy from a gaze tracking system of a computing device. Head pose data having a fine accuracy greater than the coarse accuracy is received from a head tracking system. Using the gaze tracking data, an approximate user gaze region within a display region is determined, and the target indicator is displayed at an initial location within the approximate user gaze region. A reposition input from the user is received. In response, subsequently received head pose data is used to calculate an adjusted location for the target indicator. The target indicator is then displayed at the adjusted location.
    Type: Grant
    Filed: January 7, 2014
    Date of Patent: January 26, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Weerapan Wilairat
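
Patent 9244539 above combines coarse gaze-based placement of a target indicator with fine head-pose adjustment after a reposition input. A schematic Python version follows; the gain values and handler names are assumptions for illustration.

```python
class TargetIndicator:
    """Coarse-then-fine positioning sketch: gaze places the indicator in an
    approximate region; after a reposition input, head-pose deltas nudge it
    with finer granularity."""

    def __init__(self, head_gain=0.1):
        self.position = (0.0, 0.0)
        self.repositioning = False
        self.head_gain = head_gain            # assumed fine-adjustment gain

    def on_gaze(self, gaze_point):
        if not self.repositioning:            # coarse placement only
            self.position = gaze_point

    def on_reposition_input(self):
        self.repositioning = True             # freeze coarse placement

    def on_head_pose_delta(self, dx, dy):
        if self.repositioning:                # fine adjustment from head pose
            x, y = self.position
            self.position = (x + self.head_gain * dx, y + self.head_gain * dy)

ti = TargetIndicator()
ti.on_gaze((500, 400))          # coarse: indicator lands in the gaze region
ti.on_reposition_input()
ti.on_head_pose_delta(30, -10)  # fine: small nudge from head motion
print(ti.position)
```
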
  • Publication number: 20150331485
    Abstract: Examples relating to calibrating an estimated gaze location are disclosed. One example method comprises monitoring the estimated gaze location of a viewer using gaze tracking data from a gaze tracking system. Image data for display via a display device is received and, without using input from the viewer, at least one target visual that may attract a gaze of the viewer and a target location of the target visual are identified within the image data. The estimated gaze location of the viewer is compared with the target location of the target visual. An offset vector is calculated based on the estimated gaze location and the target location. The gaze tracking system is calibrated using the offset vector.
    Type: Application
    Filed: May 19, 2014
    Publication date: November 19, 2015
    Inventors: Weerapan Wilairat, Vaibhav Thukral, David Nister, Morgan Kolya Venable, Bernard James Kerr, Chris Aholt
  • Publication number: 20150293587
    Abstract: Examples relating to using non-visual feedback to alert a viewer of a display that a visual change has been triggered are disclosed. One disclosed example provides a method comprising using gaze tracking data from a gaze tracking system to determine that a viewer changes a gaze location. Based on determining that the viewer changes the gaze location, a visual change is triggered and non-visual feedback indicating the triggering of the visual change is provided to the viewer. If a cancel change input is received within a predetermined timeframe, the visual change is not displayed. If a cancel change input is not received within the timeframe, the visual change is displayed via the display.
    Type: Application
    Filed: April 10, 2014
    Publication date: October 15, 2015
    Inventors: Weerapan Wilairat, Ibrahim Eden, Vaibhav Thukral, David Nister, Vivek Pradeep
  • Publication number: 20150277552
    Abstract: Systems and methods for controlling closed captioning using an eye tracking device are provided. The system for controlling closed captioning may comprise a display device, a closed captioning controller configured to display closed captioning text for a media item during playback on the display device, and an eye tracking device configured to detect a location of a user's gaze relative to the display device and send the location to the closed captioning controller. The closed captioning controller may be configured to recognize a predetermined gaze pattern of the user's gaze and, upon detecting the predetermined gaze pattern, partially or completely deemphasize the display of the closed captioning text.
    Type: Application
    Filed: March 25, 2014
    Publication date: October 1, 2015
    Inventors: Weerapan Wilairat, Vaibhav Thukral
  • Publication number: 20150261293
    Abstract: Embodiments are disclosed that relate to gaze-based remote device control. For example, one disclosed embodiment provides, on a computing device, a method comprising detecting a gaze direction of a user, detecting an indication from the user to control a remotely controllable device located in the gaze direction, and adapting a user interface of a controller device to enable user control of the remotely controllable device.
    Type: Application
    Filed: March 12, 2014
    Publication date: September 17, 2015
    Inventors: Weerapan Wilairat, Vaibhav Thukral, Ibrahim Eden, David Nister
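
The application above (20150261293) detects a gaze direction, picks the remotely controllable device located in that direction, and adapts a controller UI for it. The sketch below selects the device whose known position lies closest to the gaze ray; the device positions, the angular threshold, and the requirement that the gaze direction be a unit vector are all assumptions of this sketch.

```python
import math

def device_in_gaze_direction(gaze_origin, gaze_dir, devices, max_angle_deg=10):
    """Return the device whose position lies closest to the gaze ray.
    `gaze_dir` is assumed to be a unit vector; `devices` maps a name to a
    3-D position known to the system."""
    best, best_angle = None, math.radians(max_angle_deg)
    for name, pos in devices.items():
        v = tuple(p - o for p, o in zip(pos, gaze_origin))
        norm = math.sqrt(sum(c * c for c in v)) or 1.0
        cos_angle = sum(a * b for a, b in zip(gaze_dir, v)) / norm
        angle = math.acos(max(-1.0, min(1.0, cos_angle)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

devices = {"lamp": (2.0, 0.0, 1.5), "tv": (0.0, 3.0, 1.0)}
target = device_in_gaze_direction((0, 0, 1.5), (1.0, 0.0, 0.0), devices)
print(target)    # "lamp"; the controller UI would now adapt to this device
```
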
  • Publication number: 20150213252
    Abstract: A multiple-access-level lock screen system allows different levels of functionality to be accessed on a computing device. For example, when a device is in a locked state, a user can select (e.g., by making one or more gestures on a touchscreen) a full-access lock screen pane and provide input that causes the device to be fully unlocked, or a user can select a partial-access lock screen pane and provide input that causes only certain resources (e.g., particular applications, attached devices, documents, etc.) to be accessible. Lock screen panes also can be selected (e.g., automatically) in response to events. For example, when a device is in a locked state, a messaging access lock screen pane can be selected automatically in response to an incoming message, and a user can provide input at the messaging access lock screen pane that causes only a messaging application to be accessible.
    Type: Application
    Filed: April 8, 2015
    Publication date: July 30, 2015
    Applicant: Microsoft Technology Licensing, LLC
    Inventor: Weerapan Wilairat
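
The lock-screen application above (20150213252, granted as 9027117 further down) exposes a different set of resources for each lock screen pane, including event-driven pane selection. A toy Python model follows; the pane names and resource lists are invented purely for illustration.

```python
class LockScreen:
    """Multiple-access-level lock screen sketch: each pane, once its input
    is accepted, makes a different set of resources accessible."""

    PANES = {
        "full":      {"all"},                 # fully unlocked
        "partial":   {"camera", "music"},     # only selected resources
        "messaging": {"messaging"},           # only the messaging app
    }

    def __init__(self):
        self.locked = True
        self.accessible = set()

    def select_pane(self, pane, credential_ok=True):
        if not credential_ok:
            return self.accessible
        self.accessible = set(self.PANES[pane])
        self.locked = "all" not in self.accessible
        return self.accessible

    def on_incoming_message(self):
        # Event-driven pane selection: jump straight to the messaging pane.
        return self.select_pane("messaging")

ls = LockScreen()
print(ls.on_incoming_message())   # {'messaging'}; device otherwise stays locked
print(ls.select_pane("full"))     # {'all'}; device fully unlocked
```
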
  • Publication number: 20150193018
    Abstract: Embodiments that relate to positioning a target indicator via a display system are disclosed. For example, one disclosed embodiment provides a method for positioning a target indicator using gaze tracking data having a coarse accuracy from a gaze tracking system of a computing device. Head pose data having a fine accuracy greater than the coarse accuracy is received from a head tracking system. Using the gaze tracking data, an approximate user gaze region within a display region is determined, and the target indicator is displayed at an initial location within the approximate user gaze region. A reposition input from the user is received. In response, subsequently received head pose data is used to calculate an adjusted location for the target indicator. The target indicator is then displayed at the adjusted location.
    Type: Application
    Filed: January 7, 2014
    Publication date: July 9, 2015
    Inventors: Morgan Kolya Venable, Bernard James Kerr, Weerapan Wilairat
  • Patent number: 9027117
    Abstract: A multiple-access-level lock screen system allows different levels of functionality to be accessed on a computing device. For example, when a device is in a locked state, a user can select (e.g., by making one or more gestures on a touchscreen) a full-access lock screen pane and provide input that causes the device to be fully unlocked, or a user can select a partial-access lock screen pane and provide input that causes only certain resources (e.g., particular applications, attached devices, documents, etc.) to be accessible. Lock screen panes also can be selected (e.g., automatically) in response to events. For example, when a device is in a locked state, a messaging access lock screen pane can be selected automatically in response to an incoming message, and a user can provide input at the messaging access lock screen pane that causes only a messaging application to be accessible.
    Type: Grant
    Filed: October 4, 2010
    Date of Patent: May 5, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Weerapan Wilairat
  • Patent number: 8836648
    Abstract: Embodiments of a touch pull-in gesture are described. In various embodiments, a touch input is detected that starts at an edge of a touch-screen and progresses as an expanding contact region from the edge of the touch-screen toward approximately a center region of the touch-screen while the touch input remains in contact with the touch-screen. The expanding contact region is determined as a touch pull-in gesture to initiate a display of a user interface component.
    Type: Grant
    Filed: May 27, 2009
    Date of Patent: September 16, 2014
    Assignee: Microsoft Corporation
    Inventor: Weerapan Wilairat
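
Patent 8836648 above detects a touch that starts at a screen edge and progresses toward the center while staying in contact. The sketch below simplifies the abstract's "expanding contact region" to a track of touch points; the pixel thresholds are assumptions.

```python
def is_pull_in_gesture(samples, screen_width, edge_px=20, min_travel=0.25):
    """Classify a touch track as a pull-in gesture: it must start at the
    screen edge and end noticeably closer to the center. `samples` is a
    list of (x, y) points from touch-down to touch-up."""
    if len(samples) < 2:
        return False
    start_x = samples[0][0]
    starts_at_edge = start_x <= edge_px or start_x >= screen_width - edge_px
    center_x = screen_width / 2
    moved_toward_center = (abs(samples[-1][0] - center_x)
                           < abs(start_x - center_x) - min_travel * screen_width)
    return starts_at_edge and moved_toward_center

track = [(5, 300), (80, 300), (200, 310), (420, 320)]
if is_pull_in_gesture(track, screen_width=1080):
    print("display user interface component")   # e.g. slide in a panel
```
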
  • Patent number: 8751956
    Abstract: Embodiments of a variable rate scrollbar are described. In various embodiments, media items, such as email messages, text messages, digital photos, or song selections are grouped by a page, a day, a week, or a month that corresponds to a date associated with a media item. A list of the media items is displayed from one of the groups, and the media items that are displayed are selectable from the list. A variable rate scrollbar is displayed and configured for selection to navigate the media items at variable rates according to the groups of the media items.
    Type: Grant
    Filed: May 27, 2009
    Date of Patent: June 10, 2014
    Assignee: Microsoft Corporation
    Inventor: Weerapan Wilairat
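
Patent 8751956 above groups media items (by page, day, week, or month) and lets a scrollbar navigate at the group level rather than item by item. A small sketch of month grouping plus a scrollbar-position-to-group mapping; the linear mapping is one assumption about how "variable rate" navigation could look in code.

```python
from collections import defaultdict
from datetime import date

def group_by_month(items):
    """Group media items by month; `items` maps a title to its date."""
    groups = defaultdict(list)
    for title, d in items.items():
        groups[(d.year, d.month)].append(title)
    return dict(groups)

def scrollbar_jump(groups, fraction):
    """Map a scrollbar position in [0, 1] to a group, so dragging skips
    whole months instead of individual items."""
    keys = sorted(groups)
    index = min(int(fraction * len(keys)), len(keys) - 1)
    return keys[index], groups[keys[index]]

photos = {"beach.jpg": date(2009, 5, 2),
          "party.jpg": date(2009, 5, 20),
          "hike.jpg":  date(2009, 6, 11)}
groups = group_by_month(photos)
print(scrollbar_jump(groups, 0.9))   # lands in the June 2009 group
```
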
  • Patent number: 8458609
    Abstract: A multi-context service is described. In embodiments, a first input on a touch-screen of a portable device is detected as a selection of an object icon displayed on the touch-screen. A second input on the touch-screen is detected as a selection of an application icon displayed on the touch-screen. The object icon is associated with a content object, and the application icon is associated with a command provider that performs one or more actions on the content object. The content object is associated with the command provider to initiate the command provider to perform at least one of the actions on the content object based on the content object being associated with the command provider.
    Type: Grant
    Filed: September 24, 2009
    Date of Patent: June 4, 2013
    Assignee: Microsoft Corporation
    Inventor: Weerapan Wilairat
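
Patent 8458609 above associates a selected content object with a selected command provider and has the provider perform an action on the object. A minimal sketch of that two-tap flow; the provider names and actions are made up for illustration.

```python
class MultiContextService:
    """Two-tap flow sketch: the first touch selects a content object, the
    second selects a command provider, and the service associates them and
    runs the provider's action on the object."""

    def __init__(self, providers):
        self.providers = providers        # icon id -> callable(content_object)
        self.selected_object = None

    def on_object_icon_tap(self, content_object):
        self.selected_object = content_object

    def on_application_icon_tap(self, provider_id):
        if self.selected_object is None:
            return None
        provider = self.providers[provider_id]
        return provider(self.selected_object)   # action on the content object

svc = MultiContextService(providers={
    "email": lambda obj: f"emailing {obj}",
    "print": lambda obj: f"printing {obj}",
})
svc.on_object_icon_tap("vacation.jpg")
print(svc.on_application_icon_tap("email"))      # emailing vacation.jpg
```
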
  • Patent number: 8416192
    Abstract: For each of multiple user inputs, multiple keys of a keyboard that are touched as part of the user input are identified. Additionally, for each of the multiple user inputs, multiple characters that are to be displayed concurrently are determined based on the multiple keys that are touched as part of the user input. Both a character input field and the multiple characters determined for each of the multiple user inputs are displayed. One or more suggested inputs can also be displayed, and a user-selected one of the suggested inputs identified as an input to the character input field.
    Type: Grant
    Filed: February 5, 2009
    Date of Patent: April 9, 2013
    Assignee: Microsoft Corporation
    Inventor: Weerapan Wilairat
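
Patent 8416192 above keeps every key touched by an imprecise input as a candidate character for that position and offers suggested inputs the user can pick. The sketch below expands the candidate sets by brute force against a tiny dictionary; that expansion strategy is an assumption for clarity, not the patent's method.

```python
from itertools import product

def candidates_per_touch(touched_keys):
    """Each user input may touch several adjacent keys; keep all of them as
    candidate characters to display concurrently for that position."""
    return [tuple(keys) for keys in touched_keys]

def suggest(touched_keys, dictionary):
    """Suggest dictionary words consistent with some combination of the
    candidate characters (brute-force expansion, assumed for this sketch)."""
    combos = {"".join(c) for c in product(*touched_keys)}
    return [word for word in dictionary if word in combos]

# Two inputs, each landing on two neighboring keys:
touches = [("c", "v"), ("a", "s")]
print(candidates_per_touch(touches))             # shown alongside the field
print(suggest(touches, dictionary=["ca", "va", "cs"]))
```
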
  • Patent number: 8269736
    Abstract: Embodiments of drop target gestures are described. In various embodiments, a first input on a touch-screen is detected and determined as a selection of an object displayed on the touch-screen. A second input on the touch-screen is detected and determined as a selection of a target position to move the object, the second input being detected while the first input remains in contact with the touch-screen. The first input is then detected as no longer being in contact with the touch-screen, and a display of the object is initiated at the target position on the touch-screen giving the appearance that the object moves from a location of the first input to the second input.
    Type: Grant
    Filed: May 22, 2009
    Date of Patent: September 18, 2012
    Assignee: Microsoft Corporation
    Inventor: Weerapan Wilairat
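
Patent 8269736 above moves an object to the position of a second touch once the first, holding touch is lifted. A small state-machine sketch with illustrative handler names:

```python
class DropTargetGesture:
    """Two-touch move sketch: the first touch selects and holds an object,
    a second touch marks the target position while the first remains in
    contact, and lifting the first touch moves the object to the target."""

    def __init__(self):
        self.held_object = None
        self.target_position = None

    def on_first_touch_down(self, obj):
        self.held_object = obj

    def on_second_touch_down(self, position):
        if self.held_object is not None:
            self.target_position = position

    def on_first_touch_up(self):
        if self.held_object and self.target_position:
            print(f"move {self.held_object} to {self.target_position}")
        self.held_object, self.target_position = None, None

g = DropTargetGesture()
g.on_first_touch_down("photo tile")
g.on_second_touch_down((800, 240))
g.on_first_touch_up()            # photo tile appears at the target position
```
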