Patents by Inventor Robert Thomas Held

Robert Thomas Held has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190208190
    Abstract: Examples are disclosed herein related to reducing binocular rivalry in a near-eye display. One example provides a head-mounted display device having a near-eye display system configured to output a first-eye image to a first eyebox and a second-eye image to a second eyebox. The head-mounted display device is configured to receive an input of a three-dimensional (3D) location of a pupil of a first eye and a 3D location of a pupil of a second eye relative to the near-eye display system, to determine, based upon the 3D locations of the pupils of the first and second eyes, a location at which the pupil of the first eye begins to exit the first eyebox, and to attenuate a luminance of the second-eye image at a location in the second-eye image based upon the location at which the pupil of the first eye begins to exit the first eyebox.
    Type: Application
    Filed: December 29, 2017
    Publication date: July 4, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Matthew Calbraith Crisler, Robert Thomas Held, Bernard Charles Kress
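
A minimal, one-dimensional sketch of the eyebox-exit attenuation idea in publication 20190208190. The eyebox width, roll-off distance, and function names are assumptions for illustration only, not values or APIs from the filing.

    # Illustrative sketch (not the patented implementation): attenuate the
    # second-eye image as the first-eye pupil nears the edge of its eyebox.
    # `eyebox_half_width_mm` and `roll_off_mm` are assumed parameters.

    def eyebox_exit_fraction(pupil_x_mm: float,
                             eyebox_half_width_mm: float = 5.0,
                             roll_off_mm: float = 1.5) -> float:
        """Return 0.0 while the pupil is well inside the eyebox and ramp
        toward 1.0 as it crosses the region where it begins to exit."""
        overshoot = abs(pupil_x_mm) - (eyebox_half_width_mm - roll_off_mm)
        return min(max(overshoot / roll_off_mm, 0.0), 1.0)

    def attenuate_second_eye_image(luminance: float, exit_fraction: float) -> float:
        """Scale the luminance of the affected region of the second-eye image,
        reducing binocular rivalry while the first eye is losing its image."""
        return luminance * (1.0 - exit_fraction)

    if __name__ == "__main__":
        for pupil_x_mm in (0.0, 3.8, 4.5, 5.2):
            fraction = eyebox_exit_fraction(pupil_x_mm)
            lum = attenuate_second_eye_image(1.0, fraction)
            print(f"first-eye pupil at {pupil_x_mm:+.1f} mm -> second-eye luminance {lum:.2f}")
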
  • Publication number: 20190204910
    Abstract: Via a near-eye display, one or more pre-saccade image frames are displayed to a user eye. Based on a detected movement of the user eye, the user eye is determined to be performing a saccade. One or more saccade-contemporaneous image frames are displayed with a temporary saccade-specific image effect not applied to the pre-saccade image frames.
    Type: Application
    Filed: January 2, 2018
    Publication date: July 4, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Robert Thomas Held, Christopher Maurice Mei, Christopher Charles Aholt, Nava Kayla Balsam, Shivkumar Swaminathan, Jeffrey N. Margolis
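
A minimal sketch of the saccade-gated image effect in publication 20190204910, assuming an angular-velocity threshold is used to detect a saccade. The threshold value and the dimming effect are illustrative placeholders.

    SACCADE_VELOCITY_DEG_PER_S = 300.0  # assumed detection threshold

    def is_saccade(gaze_velocity_deg_per_s: float) -> bool:
        return gaze_velocity_deg_per_s >= SACCADE_VELOCITY_DEG_PER_S

    def frame_brightness(base_brightness: float, gaze_velocity_deg_per_s: float) -> float:
        """Apply a temporary, saccade-specific effect (here: dimming) only to
        frames rendered while the eye is performing a saccade."""
        if is_saccade(gaze_velocity_deg_per_s):
            return base_brightness * 0.5  # saccade-contemporaneous frames
        return base_brightness            # pre-saccade frames are unmodified

    if __name__ == "__main__":
        for velocity in (40.0, 120.0, 450.0):
            print(f"{velocity:6.1f} deg/s -> brightness {frame_brightness(1.0, velocity):.2f}")
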
  • Publication number: 20190171463
    Abstract: A technique is described herein for presenting notifications associated with applications in a context-based manner. In one implementation, the technique maintains a data store that provides application annotation information that describes a plurality of anchors. For instance, the application annotation information for an illustrative anchor identifies: a location at which the anchor is virtually placed in an interactive world; an application associated with the anchor; and triggering information that describes a set of one or more triggering conditions to be satisfied to enable presentation of a notification pertaining to the application. In use, the technique presents the notification pertaining to the application in prescribed proximity to the anchor when it is determined that the user's engagement with the interactive world satisfies the anchor's set of triggering conditions. The triggering conditions can specify any combination of spatial factors, temporal factors, user co-presence factors, etc.
    Type: Application
    Filed: February 11, 2019
    Publication date: June 6, 2019
    Inventors: Semih Energin, Anatolie Gavriliuc, Robert Thomas Held, Maxime Ouellet, Riccardo Giraldi, Andrew Frederick Muehlhausen, Sergio Paolantonio
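
An illustrative sketch of the anchor data store and trigger check described in publication 20190171463. The field names and the distance- and time-based conditions are assumptions, not the filing's actual schema.

    from dataclasses import dataclass, field
    import math

    @dataclass
    class Anchor:
        app_name: str
        position: tuple            # (x, y, z) where the anchor is virtually placed
        trigger_radius_m: float    # spatial triggering condition
        active_hours: range = field(default_factory=lambda: range(0, 24))  # temporal condition

    def should_present_notification(anchor: Anchor, user_position: tuple, hour_of_day: int) -> bool:
        """Present the application's notification near the anchor only when the
        user's engagement with the world satisfies the anchor's triggering conditions."""
        distance = math.dist(user_position, anchor.position)
        return distance <= anchor.trigger_radius_m and hour_of_day in anchor.active_hours

    if __name__ == "__main__":
        kitchen_timer = Anchor("recipe_app", (2.0, 0.0, 1.5), trigger_radius_m=1.0,
                               active_hours=range(17, 21))
        print(should_present_notification(kitchen_timer, (2.3, 0.0, 1.4), hour_of_day=18))
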
  • Patent number: 10313645
    Abstract: A scanned-beam display comprises an emitter configured to emit light of controlled variable intensity, a beam-steering optic configured to steer the light to a controlled variable beam position, and a drive circuit coupled operatively to the emitter and beam-steering optic. The drive circuit is configured to apply a signal to the beam-steering optic to move the beam position along a path of pixel positions, the path including a first pixel position at the center of the field-of-view and a second pixel position at the periphery of the field-of-view. The drive circuit is further configured to drive a series of current pulses through the emitter in synchronicity with the signal, to illuminate the first pixel position during a first interval and to equivalently illuminate the second pixel position during a second interval, the emitter being driven at a higher duty cycle during the second interval than during the first.
    Type: Grant
    Filed: January 19, 2018
    Date of Patent: June 4, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Robert Thomas Held, Andrew Martin Pearson
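
A minimal sketch of the duty-cycle idea in patent 10313645, assuming a sinusoidal scan whose beam slows toward the periphery and a drive circuit that keeps per-pixel optical energy roughly constant by raising the duty cycle (and lowering peak current) there. The scan model and all numbers are illustrative.

    import math

    def scan_speed(normalized_position: float) -> float:
        """Relative beam speed for a sinusoidal scan: fastest at the center
        (position 0.0), approaching zero at the periphery (position 1.0)."""
        return max(math.cos(normalized_position * math.pi / 2), 1e-3)

    def emitter_drive(normalized_position: float, center_duty: float = 0.2):
        """Return (duty_cycle, peak_current) in relative units for a pixel so
        that duty_cycle * peak_current * dwell_time stays constant across the field."""
        dwell = 1.0 / scan_speed(normalized_position)   # slower beam -> longer dwell
        duty = min(center_duty * dwell, 1.0)            # higher duty cycle at the periphery
        peak = 1.0 / (duty * dwell)                     # lower peak current keeps energy equal
        return duty, peak

    if __name__ == "__main__":
        for pos in (0.0, 0.5, 0.9):
            duty, peak = emitter_drive(pos)
            print(f"field position {pos:.1f}: duty cycle {duty:.2f}, relative peak current {peak:.2f}")
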
  • Patent number: 10311543
    Abstract: A method for moving a virtual object includes displaying a virtual object and moving the virtual object based on a user input. When the user input attempts to move the virtual object in violation of an obstacle, a collision indicator and an input indicator are displayed. The collision indicator is moved based on user input and movement constraints imposed by the obstacle. The input indicator is moved based on user input without movement constraints imposed by the obstacle.
    Type: Grant
    Filed: October 27, 2016
    Date of Patent: June 4, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Semih Energin, Sergio Paolantonio, David Evans, Eric Scott Rehmeyer, Robert Thomas Held, Maxime Ouellet, Anatolie Gavriliuc, Riccardo Giraldi, Andrew Frederick Muehlhausen
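
An illustrative, one-dimensional sketch of the two indicators in patent 10311543: when user input would push a virtual object through an obstacle, a collision indicator stays clamped against the obstacle while an input indicator keeps tracking the unconstrained input.

    def update_indicators(requested_x: float, obstacle_x: float):
        """Return (collision_indicator_x, input_indicator_x) for input that tries
        to move the object to `requested_x` against an obstacle at `obstacle_x`."""
        collision_x = min(requested_x, obstacle_x)   # constrained by the obstacle
        input_x = requested_x                        # unconstrained, follows the user
        return collision_x, input_x

    if __name__ == "__main__":
        for requested in (0.5, 1.0, 1.7):
            collision, raw = update_indicators(requested, obstacle_x=1.0)
            print(f"input {raw:.1f} -> collision indicator {collision:.1f}, input indicator {raw:.1f}")
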
  • Patent number: 10277943
    Abstract: Peripheral visualizations are based on user movements and/or interactions with elements in a scene. Various user movements are detected while a scene is being rendered. Afterwards, the embodiments determine whether one of the movements corresponds with an increase in a level of focus by the user on the one or more elements and/or an interaction by the user with the one or more elements. Thereafter, peripheral visualizations are rendered on one or more peripheral displays proximate elements that correspond with the user movements, interactions, or focus. In some instances, the selective rendering is performed in response to a determination that the user movement does correspond with the increase in the level of focus by the user on the one or more elements and/or the interaction by the user with the one or more elements.
    Type: Grant
    Filed: March 27, 2017
    Date of Patent: April 30, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Matthew Calbraith Crisler, Robert Thomas Held, Eliezer Glik
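
A minimal sketch of the focus- and interaction-gated rendering in patent 10277943, assuming gaze dwell time as the proxy for an increase in the user's level of focus. The dwell threshold and the rendering call are placeholders.

    FOCUS_DWELL_S = 0.8  # assumed threshold for an increase in focus

    def render_peripheral_visualization(element_id: str) -> None:
        # Placeholder for driving a low-resolution peripheral display.
        print(f"peripheral display: render visualization near '{element_id}'")

    def on_user_movement(element_id: str, gaze_dwell_s: float, interacted: bool) -> None:
        """Render a peripheral visualization only when the detected movement
        corresponds to increased focus on, or interaction with, the element."""
        if gaze_dwell_s >= FOCUS_DWELL_S or interacted:
            render_peripheral_visualization(element_id)

    if __name__ == "__main__":
        on_user_movement("hologram_button", gaze_dwell_s=1.2, interacted=False)
        on_user_movement("background_poster", gaze_dwell_s=0.1, interacted=False)
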
  • Patent number: 10249095
    Abstract: A technique is described herein for presenting notifications associated with applications in a context-based manner. In one implementation, the technique maintains a data store that provides application annotation information that describes a plurality of anchors. For instance, the application annotation information for an illustrative anchor identifies: a location at which the anchor is virtually placed in an interactive world; an application associated with the anchor; and triggering information that describes a set of one or more triggering conditions to be satisfied to enable presentation of a notification pertaining to the application. In use, the technique presents the notification pertaining to the application in prescribed proximity to the anchor when it is determined that the user's engagement with the interactive world satisfies the anchor's set of triggering conditions. The triggering conditions can specify any combination of spatial factors, temporal factors, user co-presence factors, etc.
    Type: Grant
    Filed: April 7, 2017
    Date of Patent: April 2, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Semih Energin, Anatolie Gavriliuc, Robert Thomas Held, Maxime Ouellet, Riccardo Giraldi, Andrew Frederick Muehlhausen, Sergio Paolantonio
  • Patent number: 10216260
    Abstract: Peripheral visualizations are based on various attributes associated with a scene. Characteristics of elements in a scene are determined. Based on these characteristics, the salience of the elements is determined. When an element is salient, this determination also includes a saliency magnitude for the element. Thereafter, the embodiments determine whether the element's saliency magnitude exceeds a particular saliency threshold. If the magnitude does exceed this threshold, then the embodiments render a corresponding peripheral visualization with the peripheral display(s) proximate the salient element(s).
    Type: Grant
    Filed: March 27, 2017
    Date of Patent: February 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Matthew Calbraith Crisler, Robert Thomas Held, Eliezer Glik
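
An illustrative sketch of the saliency-thresholded rendering in patent 10216260: a saliency magnitude is derived from element characteristics and compared against a threshold. The feature set, weights, and threshold are assumptions for demonstration only.

    SALIENCY_THRESHOLD = 0.5

    def saliency_magnitude(brightness: float, speed: float, proximity: float) -> float:
        """Combine element characteristics (each normalized to 0..1) into a
        single saliency magnitude."""
        return 0.3 * brightness + 0.5 * speed + 0.2 * proximity

    def maybe_render_peripheral(element_id: str, brightness: float, speed: float, proximity: float) -> bool:
        magnitude = saliency_magnitude(brightness, speed, proximity)
        if magnitude > SALIENCY_THRESHOLD:
            print(f"render peripheral visualization near '{element_id}' (saliency {magnitude:.2f})")
            return True
        return False

    if __name__ == "__main__":
        maybe_render_peripheral("incoming_drone", brightness=0.6, speed=0.9, proximity=0.7)
        maybe_render_peripheral("static_wall_art", brightness=0.4, speed=0.0, proximity=0.2)
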
  • Patent number: 10186086
    Abstract: An augmented reality head-mounted device includes a gaze detector, a camera, and a communication interface. The gaze detector determines a gaze vector of an eye of a wearer of the augmented reality head-mounted device. The camera images a physical space including a display of a computing device. The communication interface sends a control signal to the computing device in response to a wearer input. The control signal indicates a location at which the gaze vector intersects the display and is useable by the computing device to adjust its operation.
    Type: Grant
    Filed: September 2, 2015
    Date of Patent: January 22, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Riccardo Giraldi, Anatolie Gavriliuc, Michelle Chua, Andrew Frederick Muehlhausen, Robert Thomas Held, Joseph van den Heuvel
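
A minimal sketch of the gaze/display intersection in patent 10186086, assuming the external display is modeled as a plane with an origin and two in-plane axes; the "control signal" is reduced to a printed message here.

    def gaze_hit_on_display(eye_pos, gaze_dir, plane_origin, plane_u, plane_v, plane_normal):
        """Intersect the gaze ray with the display plane and return (u, v)
        coordinates on the display, or None if the gaze points away from it."""
        denom = sum(g * n for g, n in zip(gaze_dir, plane_normal))
        if abs(denom) < 1e-6:
            return None
        t = sum((o - e) * n for o, e, n in zip(plane_origin, eye_pos, plane_normal)) / denom
        if t < 0:
            return None
        hit = [e + t * g for e, g in zip(eye_pos, gaze_dir)]
        rel = [h - o for h, o in zip(hit, plane_origin)]
        u = sum(r * a for r, a in zip(rel, plane_u))
        v = sum(r * a for r, a in zip(rel, plane_v))
        return u, v

    if __name__ == "__main__":
        uv = gaze_hit_on_display(eye_pos=(0, 0, 0), gaze_dir=(0, 0, 1),
                                 plane_origin=(-0.3, -0.2, 1.0),
                                 plane_u=(1, 0, 0), plane_v=(0, 1, 0),
                                 plane_normal=(0, 0, -1))
        # In the patent, sending this location is gated on a wearer input.
        print(f"control signal: gaze intersects display at {uv}")
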
  • Publication number: 20180293798
    Abstract: A technique is described herein for presenting notifications associated with applications in a context-based manner. In one implementation, the technique maintains a data store that provides application annotation information that describes a plurality of anchors. For instance, the application annotation information for an illustrative anchor identifies: a location at which the anchor is virtually placed in an interactive world; an application associated with the anchor; and triggering information that describes a set of one or more triggering conditions to be satisfied to enable presentation of a notification pertaining to the application. In use, the technique presents the notification pertaining to the application in prescribed proximity to the anchor when it is determined that the user's engagement with the interactive world satisfies the anchor's set of triggering conditions. The triggering conditions can specify any combination of spatial factors, temporal factors, user co-presence factors, etc.
    Type: Application
    Filed: April 7, 2017
    Publication date: October 11, 2018
    Inventors: Semih Energin, Anatolie Gavriliuc, Robert Thomas Held, Maxime Ouellet, Riccardo Giraldi, Andrew Frederick Muehlhausen, Sergio Paolantonio
  • Publication number: 20180278993
    Abstract: Peripheral visualizations are based on user movements and/or interactions with elements in a scene. Various user movements are detected while a scene is being rendered. Afterwards, the embodiments determine whether one of the movements corresponds with an increase in a level of focus by the user on the one or more elements and/or an interaction by the user with the one or more elements. Thereafter, peripheral visualizations are rendered on one or more peripheral displays proximate elements that correspond with the user movements, interactions, or focus. In some instances, the selective rendering is performed in response to a determination that the user movement does correspond with the increase in the level of focus by the user on the one or more elements and/or the interaction by the user with the one or more elements.
    Type: Application
    Filed: March 27, 2017
    Publication date: September 27, 2018
    Inventors: Matthew Calbraith Crisler, Robert Thomas Held, Eliezer Glik
  • Publication number: 20180275745
    Abstract: Peripheral visualizations are based on various attributes associated with a scene. Characteristics of elements in a scene are determined. Based on these characteristics, the salience of the elements is determined. When an element is salient, this determination also includes a saliency magnitude for the element. Thereafter, the embodiments determine whether the element's saliency magnitude exceeds a particular saliency threshold. If the magnitude does exceed this threshold, then the embodiments render a corresponding peripheral visualization with the peripheral display(s) proximate the salient element(s).
    Type: Application
    Filed: March 27, 2017
    Publication date: September 27, 2018
    Inventors: Matthew Calbraith Crisler, Robert Thomas Held, Eliezer Glik
  • Publication number: 20180122043
    Abstract: A method for moving a virtual object includes displaying a virtual object and moving the virtual object based on a user input. When the user input attempts to move the virtual object in violation of an obstacle, a collision indicator and an input indicator are displayed. The collision indicator is moved based on user input and movement constraints imposed by the obstacle. The input indicator is moved based on user input without movement constraints imposed by the obstacle.
    Type: Application
    Filed: October 27, 2016
    Publication date: May 3, 2018
    Inventors: Semih Energin, Sergio Paolantonio, David Evans, Eric Scott Rehmeyer, Robert Thomas Held, Maxime Ouellet, Anatolie Gavriliuc, Riccardo Giraldi, Andrew Frederick Muehlhausen
  • Patent number: 9874932
    Abstract: One embodiment provides a method to display video such as computer-rendered animation or other video. The method includes assembling a sequence of video frames featuring a moving object, each video frame including a plurality of subframes sequenced for display according to a schedule. The method also includes determining a vector-valued differential velocity of the moving object relative to a head of an observer of the video. At a time scheduled for display of a first subframe of a given frame, first-subframe image content transformed by a first transform is displayed. At a time scheduled for display of a second subframe of the given frame, second-subframe image content transformed by a second transform is displayed. The first and second transforms are computed based on the vector-valued differential velocity to mitigate artifacts.
    Type: Grant
    Filed: April 9, 2015
    Date of Patent: January 23, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Matthew Calbraith Crisler, Robert Thomas Held, Stephen Latta, Ashraf Ayman Michail, Martin Shetter, Arthur Tomlin
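
An illustrative sketch of the per-subframe correction in patent 9874932: each subframe of a frame is shifted by the object's differential velocity times that subframe's time offset, so the moving object lands in the same perceived position in every subframe. A 2-D translation stands in here for the patent's more general per-subframe transforms, and the subframe schedule is assumed.

    def subframe_offsets(differential_velocity, subframe_times_s):
        """Return one (dx, dy) translation per subframe, computed from the
        vector-valued differential velocity of the object relative to the head."""
        vx, vy = differential_velocity
        return [(vx * t, vy * t) for t in subframe_times_s]

    if __name__ == "__main__":
        # Object drifting 120 px/s horizontally relative to the observer's head,
        # with three subframes displayed 0, 3, and 6 ms into the frame.
        for i, (dx, dy) in enumerate(subframe_offsets((120.0, 0.0), (0.0, 0.003, 0.006))):
            print(f"subframe {i}: shift content by ({dx:.2f}, {dy:.2f}) px")
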
  • Patent number: 9865091
    Abstract: Examples are disclosed herein that relate to identifying and localizing devices in an environment via an augmented reality display device. One example provides, on a portable augmented reality computing device, a method including establishing a coordinate frame for an environment, discovering, via a location-sensitive input device, a location of a physical manifestation of a device in the environment, assigning a device location for the device in the coordinate frame based upon the location of the physical manifestation, and modifying an output of the portable augmented reality computing device based upon a change in relative position between the portable augmented reality computing device and the physical manifestation in the environment.
    Type: Grant
    Filed: September 2, 2015
    Date of Patent: January 9, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Riccardo Giraldi, Anatolie Gavriliuc, Michelle Chua, Andrew Frederick Muehlhausen, Robert Thomas Held, Joseph van den Heuvel, Todd Alan Omotani, Richard J. Wifall, Christian Sadak, Gregory Alt
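
A minimal sketch of the localization-and-output idea in patent 9865091, assuming the device's physical manifestation (for example, an on-screen marker) has already been located with a location-sensitive input device; distance-based volume attenuation stands in for "modifying an output".

    import math

    class LocalizedDevice:
        def __init__(self, name: str, location_in_frame):
            # Device location assigned in the shared coordinate frame, based on
            # where its physical manifestation was discovered.
            self.name = name
            self.location = location_in_frame

        def output_gain(self, headset_position, full_gain_radius_m: float = 1.0) -> float:
            """Fade this device's audio as the wearer moves away from it."""
            distance = math.dist(headset_position, self.location)
            return min(1.0, full_gain_radius_m / max(distance, 1e-6))

    if __name__ == "__main__":
        speaker = LocalizedDevice("kitchen_speaker", (3.0, 0.0, 1.0))
        for pos in ((3.2, 0.0, 1.0), (6.0, 0.0, 1.0)):
            print(f"headset at {pos}: gain {speaker.output_gain(pos):.2f}")
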
  • Publication number: 20170099481
    Abstract: Examples are disclosed herein that relate to calibrating a user's eye for a stereoscopic display. One example provides, on a head-mounted display device including a see-through display, a method of calibrating a stereoscopic display for a user's eyes, the method including for a first eye, receiving an indication of alignment of a user-controlled object with a first eye reference object viewable via the head-mounted display device from a perspective of the first eye, determining a first ray intersecting the user-controlled object and the first eye reference object from the perspective of the first eye, and determining a position of the first eye based on the first ray. The method further includes repeating such steps for a second eye, determining a position of the second eye based on a second ray, and calibrating the stereoscopic display based on the position of the first eye and the position of the second eye.
    Type: Application
    Filed: October 2, 2015
    Publication date: April 6, 2017
    Inventors: Robert Thomas Held, Anatolie Gavriliuc, Riccardo Giraldi, Szymon P. Stachniak, Andrew Frederick Muehlhausen, Maxime Ouellet
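
An illustrative sketch of the calibration geometry in publication 20170099481: each alignment of a user-controlled object with a reference object defines a ray that passes through the eye, and the eye position is estimated here as the midpoint of the closest points of two such rays. A real calibration would use more alignments and a least-squares fit; the example coordinates are invented.

    def _dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def _sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def _point_on_ray(p, d, t):
        return tuple(pi + t * di for pi, di in zip(p, d))

    def eye_from_two_alignments(p1, d1, p2, d2):
        """p1/p2: aligned object positions; d1/d2: ray directions pointing back
        toward the eye. Returns the midpoint of the rays' closest points."""
        w0 = _sub(p1, p2)
        a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
        d, e = _dot(d1, w0), _dot(d2, w0)
        denom = a * c - b * b
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
        q1 = _point_on_ray(p1, d1, t)
        q2 = _point_on_ray(p2, d2, s)
        return tuple((x + y) / 2 for x, y in zip(q1, q2))

    if __name__ == "__main__":
        # Two alignments whose back-projected rays meet near (0.03, 0.0, 0.0).
        print(eye_from_two_alignments((0.5, 0.1, 1.0), (-0.47, -0.1, -1.0),
                                      (0.5, -0.1, 1.0), (-0.47, 0.1, -1.0)))
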
  • Publication number: 20170061694
    Abstract: A head-mounted device includes a gaze detector, a camera, and a communication interface. The gaze detector determines a gaze vector of an eye of a wearer of the head-mounted device. The camera images a physical space including a display of a computing device. The communication interface sends a control signal to the computing device in response to a wearer input. The control signal indicates a location at which the gaze vector intersects the display and is useable by the computing device to adjust its operation.
    Type: Application
    Filed: September 2, 2015
    Publication date: March 2, 2017
    Inventors: Riccardo Giraldi, Anatolie Gavriliuc, Michelle Chua, Andrew Frederick Muehlhausen, Robert Thomas Held, Joseph van den Heuvel
  • Publication number: 20170061692
    Abstract: Examples are disclosed herein that relate to identifying and localizing devices in an environment via an augmented reality display device. One example provides, on a portable augmented reality computing device, a method including establishing a coordinate frame for an environment, discovering, via a location-sensitive input device, a location of a physical manifestation of a device in the environment, assigning a device location for the device in the coordinate frame based upon the location of the physical manifestation, and modifying an output of the portable augmented reality computing device based upon a change in relative position between the portable augmented reality computing device and the physical manifestation in the environment.
    Type: Application
    Filed: September 2, 2015
    Publication date: March 2, 2017
    Inventors: Riccardo Giraldi, Anatolie Gavriliuc, Michelle Chua, Andrew Frederick Muehlhausen, Robert Thomas Held, Joseph van den Heuvel, Todd Alan Omotani, Richard J. Wifall, Christian Sadak, Gregory Alt
  • Publication number: 20160299567
    Abstract: One embodiment provides a method to display video such as computer-rendered animation or other video. The method includes assembling a sequence of video frames featuring a moving object, each video frame including a plurality of subframes sequenced for display according to a schedule. The method also includes determining a vector-valued differential velocity of the moving object relative to a head of an observer of the video. At a time scheduled for display of a first subframe of a given frame, first-subframe image content transformed by a first transform is displayed. At a time scheduled for display of a second subframe of the given frame, second-subframe image content transformed by a second transform is displayed. The first and second transforms are computed based on the vector-valued differential velocity to mitigate artifacts.
    Type: Application
    Filed: April 9, 2015
    Publication date: October 13, 2016
    Inventors: Matthew Calbraith Crisler, Robert Thomas Held, Stephen Latta, Ashraf Ayman Michail, Martin Shetter, Arthur Tomlin
  • Patent number: 8284235
    Abstract: A method for modifying images for display on a three-dimensional display includes receiving a first image and a corresponding second image which together represent a three-dimensional image when displayed on the three-dimensional display. Based upon the first and second images, the method determines whether their content is expected to cause substantial discomfort to a viewer when displayed on the three-dimensional display. At least one of the first and second images is modified, based at least in part on a physical value related to the pixel pitch of the display, in such a manner as to reduce the expected discomfort to the viewer.
    Type: Grant
    Filed: September 28, 2009
    Date of Patent: October 9, 2012
    Assignee: Sharp Laboratories of America, Inc.
    Inventors: Robert Thomas Held, Chang Yuan, Scott J. Daly, Hao Pan
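
An illustrative sketch of the pixel-pitch dependence in patent 8284235: screen disparity in pixels is converted to a physical disparity using the display's pixel pitch and, if it exceeds an assumed comfort limit, the stereo pair's disparities are scaled down. The comfort limit and the scaling strategy are placeholders, not the patent's model of viewer discomfort.

    COMFORT_LIMIT_MM = 20.0  # assumed maximum comfortable screen disparity

    def physical_disparity_mm(disparity_px: float, pixel_pitch_mm: float) -> float:
        return disparity_px * pixel_pitch_mm

    def comfort_scale(max_disparity_px: float, pixel_pitch_mm: float) -> float:
        """Return the factor by which to shrink disparities so the largest one
        stays within the comfort limit (1.0 means no modification is needed)."""
        worst_mm = physical_disparity_mm(max_disparity_px, pixel_pitch_mm)
        return min(1.0, COMFORT_LIMIT_MM / worst_mm) if worst_mm > 0 else 1.0

    if __name__ == "__main__":
        # The same image content is more comfortable on a finer-pitch display.
        for pitch_mm in (0.25, 0.50):
            print(f"pixel pitch {pitch_mm} mm: scale disparities by {comfort_scale(120, pitch_mm):.2f}")
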