Patents by Inventor Andrew D. Wilson

Andrew D. Wilson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11409361
    Abstract: A computer device is provided that includes a display device, and a sensor system configured to be mounted adjacent to a user's head and to measure an electrical potential near one or more electrodes of the sensor system. The computer device further includes a processor configured to present a periodic motion-based visual stimulus having a changing motion that is frequency-modulated for a target frequency or code-modulated for a target code, detect changes in the electrical potential via the one or more electrodes, identify a corresponding visual evoked potential feature in the detected changes in electrical potential that corresponds to the periodic motion-based visual stimulus, and recognize a user input to the computing device based on identifying the corresponding visual evoked potential feature.
    Type: Grant
    Filed: February 3, 2020
    Date of Patent: August 9, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Hakim Si Mohammed, Christian Holz, Adrian Kuo Ching Lee, Ivan Jelev Tashev, Hannes Gamper, Edward Bryan Cutrell, David Emerson Johnston, Dimitra Emmanouilidou, Mihai R. Jalobeanu
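As a rough illustration of the detection step this abstract describes (identifying an evoked-potential feature tied to a frequency-modulated stimulus), the sketch below looks for elevated spectral power at the stimulus frequency in an electrode trace. The function name, window length, and SNR threshold are all illustrative assumptions, not details from the patent.

```python
import numpy as np

def detect_evoked_potential(signal, fs, target_hz, threshold=4.0):
    """Crude proxy for the 'visual evoked potential feature': return True
    when spectral power at target_hz stands out from the background."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    target_bin = np.argmin(np.abs(freqs - target_hz))
    snr = spectrum[target_bin] / np.median(spectrum)
    return snr > threshold

# Synthetic electrode trace: a 7 Hz response buried in noise.
fs = 256
t = np.arange(fs * 4) / fs
rng = np.random.default_rng(0)
trace = 0.8 * np.sin(2 * np.pi * 7.0 * t) + rng.normal(0.0, 1.0, t.size)
print(detect_evoked_potential(trace, fs, 7.0))   # stimulus frequency present
print(detect_evoked_potential(trace, fs, 13.0))  # absent frequency
```

A real brain-computer interface would use more robust statistics (e.g., canonical correlation against the stimulus waveform), but the peak-versus-background comparison captures the basic idea.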
  • Publication number: 20220053174
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Application
    Filed: October 28, 2021
    Publication date: February 17, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
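The "capture time slot" tradeoff in the abstract above is simple timing arithmetic: compressing the projector's on-state within each frame leaves a dark interval for the camera, at the cost of brightness. The sketch below shows that arithmetic; the 60 Hz frame rate and 80% on-fraction are assumed example values.

```python
def capture_slot(frame_hz=60.0, on_fraction=0.8):
    """Split each frame period into a compressed projector on-state and a
    dark 'capture time slot' during which the camera can expose without
    video feedback. Smaller on_fraction -> longer slot, dimmer image."""
    period_ms = 1000.0 / frame_hz
    on_ms = period_ms * on_fraction
    slot_ms = period_ms - on_ms
    return on_ms, slot_ms

on_ms, slot_ms = capture_slot()
print(f"projector on {on_ms:.2f} ms, camera slot {slot_ms:.2f} ms")
```

At 60 Hz with 80% on-time, roughly a third of the frame's brightness budget per percentage point is traded for about 3.3 ms of flicker-free capture per frame.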
  • Patent number: 11190741
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Grant
    Filed: March 23, 2018
    Date of Patent: November 30, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
  • Publication number: 20210330325
    Abstract: A system for end effector position set point adjustment includes an instrument and a control unit. The instrument includes an end effector and a drive mechanism for actuating the end effector. The control unit is configured to actuate the end effector using the drive mechanism, determine an actual position of the end effector, identify an expected position associated with the determined actual position of the end effector, determine a position offset based on the expected position and the determined actual position of the end effector, adjust a position set point based on the position offset, and actuate the end effector to the adjusted position set point using the drive mechanism.
    Type: Application
    Filed: July 8, 2021
    Publication date: October 28, 2021
    Inventors: Andrew D. WILSON, Amir CHAGHAJERDI, David W. WEIR
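The control loop in the abstract above reduces to a small correction: offset = expected position minus actual position, then shift the commanded set point by that offset. A minimal sketch, with illustrative units and names:

```python
def adjusted_set_point(set_point, expected_pos, actual_pos):
    """Offset-based correction per the abstract: compute the offset
    between expected and actual end effector position and shift the
    commanded set point accordingly."""
    offset = expected_pos - actual_pos
    return set_point + offset

# Drive toward 10.0; the effector actually reached 9.2 where 9.5 was
# expected, so the next command overshoots by the 0.3 shortfall.
new_sp = adjusted_set_point(10.0, 9.5, 9.2)
print(new_sp)
```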
  • Publication number: 20210240264
    Abstract: A computer device is provided that includes a display device, and a sensor system configured to be mounted adjacent to a user's head and to measure an electrical potential near one or more electrodes of the sensor system. The computer device further includes a processor configured to present a periodic motion-based visual stimulus having a changing motion that is frequency-modulated for a target frequency or code-modulated for a target code, detect changes in the electrical potential via the one or more electrodes, identify a corresponding visual evoked potential feature in the detected changes in electrical potential that corresponds to the periodic motion-based visual stimulus, and recognize a user input to the computing device based on identifying the corresponding visual evoked potential feature.
    Type: Application
    Filed: February 3, 2020
    Publication date: August 5, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. WILSON, Hakim SI MOHAMMED, Christian HOLZ, Adrian Kuo Ching LEE, Ivan Jelev TASHEV, Hannes GAMPER, Edward Bryan CUTRELL, David Emerson JOHNSTON, Dimitra EMMANOUILIDOU, Mihai R. JALOBEANU
  • Publication number: 20210241536
    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
    Type: Application
    Filed: April 9, 2021
    Publication date: August 5, 2021
    Inventors: Andrew D. WILSON, Christian HOLZ, Eyal OFEK, Jeremy HARTMANN
  • Patent number: 11055891
    Abstract: Examples of the present disclosure describe systems and methods for providing real-time motion styling in virtual reality (VR), augmented reality (AR), and/or mixed reality (MR) environments. In aspects, input data corresponding to user interaction with a VR, an AR, or an MR environment may be received. The input data may be featurized to generate a feature set. The feature set may be compared to a set of stored motion data comprising motion capture data representing one or more motion styles for executing an action or activity. Based on the comparison, the feature set may be matched to feature data for one or more motions styles in the stored motion data. The one or more motions styles may then be executed by a virtual avatar or a virtual object in the VR/AR/MR environment.
    Type: Grant
    Filed: March 10, 2020
    Date of Patent: July 6, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Mar Gonzalez Franco, Andrew D. Wilson, Karan Ahuja, Christian Holz
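The matching step described above (comparing a featurized input against stored motion-capture feature data) can be sketched as a nearest-neighbour lookup. The style bank, feature vectors, and labels below are invented for illustration; the patent does not specify this particular distance metric.

```python
import numpy as np

def match_motion_style(feature_vec, style_bank):
    """Nearest-neighbour stand-in for the comparison step: match the
    featurized input against stored per-style feature vectors and return
    the closest style's label. style_bank maps name -> feature vector."""
    names = list(style_bank)
    dists = [np.linalg.norm(np.asarray(feature_vec) - style_bank[name])
             for name in names]
    return names[int(np.argmin(dists))]

# Toy two-dimensional features (e.g., stride length, cadence).
bank = {"walk": np.array([1.0, 0.1]), "run": np.array([3.0, 0.9])}
style = match_motion_style([2.8, 0.8], bank)
print(style)
```

The matched style would then drive the avatar or virtual object's animation in the VR/AR/MR environment.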
  • Publication number: 20210177412
    Abstract: A system for end effector position set point correction includes an instrument having an end effector and a control unit. In some embodiments, the control unit actuates the end effector to a first position, determines an actuation level, determines an offset based on the actuation level, adjusts a position set point based on the offset, and actuates the end effector to the adjusted position set point. In some embodiments, the control unit actuates the end effector, determines an actuation level, and determines whether the actuation level is above a threshold. In response to determining that the actuation level is above the threshold, the control unit determines a position of the end effector, identifies a nominal position associated with the determined position, determines an offset based on the nominal position and the determined position, adjusts a position set point based on the offset, and actuates the end effector to the adjusted position set point.
    Type: Application
    Filed: November 2, 2018
    Publication date: June 17, 2021
    Inventors: Andrew D. WILSON, Amir CHAGHAJERDI, David W. WEIR
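The second embodiment in this abstract gates the correction on an actuation level crossing a threshold. A minimal sketch of that gating, with all parameter names and values invented for illustration:

```python
def corrected_set_point(set_point, actuation_level, threshold,
                        actual_pos, nominal_pos):
    """Threshold-gated correction per the abstract: below the actuation
    threshold the set point is left alone; above it, the offset between
    the nominal and measured positions is folded into the set point."""
    if actuation_level <= threshold:
        return set_point
    return set_point + (nominal_pos - actual_pos)

# Light actuation: no correction. Heavy actuation: correct by 0.1.
print(corrected_set_point(5.0, 0.2, 0.5, 4.8, 4.9))
print(corrected_set_point(5.0, 0.9, 0.5, 4.8, 4.9))
```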
  • Patent number: 11004269
    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
    Type: Grant
    Filed: April 22, 2019
    Date of Patent: May 11, 2021
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Andrew D. Wilson, Christian Holz, Eyal Ofek, Jeremy Hartmann
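A minimal version of the blending this abstract describes is a depth-masked composite: pixels whose depth indicates a nearby physical object show the passthrough camera feed inside the virtual frame. The 1 m cutoff and array shapes below are assumed for illustration.

```python
import numpy as np

def blend_passthrough(virtual_rgb, physical_rgb, depth_m, near=1.0):
    """Composite physical objects closer than `near` metres into the
    virtual frame, so the user stays aware of nearby obstacles while
    remaining immersed. virtual_rgb/physical_rgb: HxWx3; depth_m: HxW."""
    mask = depth_m < near
    out = virtual_rgb.copy()
    out[mask] = physical_rgb[mask]
    return out

# 1x2 toy frame: left pixel is near (0.5 m), right pixel is far (2.0 m).
depth = np.array([[0.5, 2.0]])
virt = np.zeros((1, 2, 3), dtype=np.uint8)
phys = np.full((1, 2, 3), 255, dtype=np.uint8)
frame = blend_passthrough(virt, phys, depth)
```

The patent's approach is richer (it builds virtual objects from selected depth-map portions rather than blitting pixels), but the mask-and-composite step is the core of the idea.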
  • Patent number: 10976816
    Abstract: Various embodiments are provided herein for modifying a virtual scene based on eye-tracking. A computing device coupled to an HMD can provide a virtual scene for display on the HMD. The computing device can receive sensor data from a set of eye-tracking sensors coupled to the HMD. Based on the received sensor data, the computing device can determine a set of focal regions of the displayed virtual scene, including a perifoveal region of the displayed virtual scene. A portion of the virtual scene can then be modified based, in part, on a determination that the portion is outside of the determined perifoveal region.
    Type: Grant
    Filed: June 25, 2019
    Date of Patent: April 13, 2021
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Andrew D. Wilson, Sebastian Lennard Marwecki, Eyal Ofek, Christian Holz
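The eligibility test in this abstract (is a scene portion outside the perifoveal region around the gaze point?) is a simple angular-distance check. The 8-degree radius below is an assumed figure for illustration; the patent determines the regions from eye-tracking sensor data.

```python
import math

def outside_perifoveal(point_deg, gaze_deg, radius_deg=8.0):
    """True when a scene location (in visual degrees) falls outside the
    perifoveal region around the tracked gaze point, making it a
    candidate for modification (e.g., reduced rendering detail)."""
    return math.dist(point_deg, gaze_deg) > radius_deg
```

For example, with gaze at the origin, a point 10 degrees away is a candidate for modification while one 3.6 degrees away is not.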
  • Patent number: 10885710
    Abstract: In various embodiments, computerized methods and systems are provided for dynamically updating a fully-immersive virtual environment based on tracked physical environment data. A computing device coupled to an HMD receives sensor data from a variety of sensors. The computing device can generate a virtual scene based on the received sensor data, whereby the virtual scene includes at least a portion of a virtual path that corresponds to at least a portion of a navigable path determined based on the received sensor data. The computing device can modify the virtual scene to include a virtual obstruction that corresponds to a physical object detected based on additional sensor data received from the sensors. The modified virtual scene is presented to the user for display, so that the user can safely traverse the physical environment while staying fully-immersed in the virtual environment.
    Type: Grant
    Filed: March 14, 2019
    Date of Patent: January 5, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Christian Holz, Eyal Ofek, Andrew D. Wilson, Lung-Pan Cheng, Junrui Yang
  • Publication number: 20200409455
    Abstract: Various embodiments are provided herein for modifying a virtual scene based on eye-tracking. A computing device coupled to an HMD can provide a virtual scene for display on the HMD. The computing device can receive sensor data from a set of eye-tracking sensors coupled to the HMD. Based on the received sensor data, the computing device can determine a set of focal regions of the displayed virtual scene, including a perifoveal region of the displayed virtual scene. A portion of the virtual scene can then be modified based, in part, on a determination that the portion is outside of the determined perifoveal region.
    Type: Application
    Filed: June 25, 2019
    Publication date: December 31, 2020
    Inventors: Andrew D. WILSON, Sebastian Lennard MARWECKI, Eyal OFEK, Christian HOLZ
  • Publication number: 20200334908
    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
    Type: Application
    Filed: April 22, 2019
    Publication date: October 22, 2020
    Inventors: Andrew D. WILSON, Christian HOLZ, Eyal OFEK, Jeremy HARTMANN
  • Publication number: 20200294311
    Abstract: In various embodiments, computerized methods and systems are provided for dynamically updating a fully-immersive virtual environment based on tracked physical environment data. A computing device coupled to an HMD receives sensor data from a variety of sensors. The computing device can generate a virtual scene based on the received sensor data, whereby the virtual scene includes at least a portion of a virtual path that corresponds to at least a portion of a navigable path determined based on the received sensor data. The computing device can modify the virtual scene to include a virtual obstruction that corresponds to a physical object detected based on additional sensor data received from the sensors. The modified virtual scene is presented to the user for display, so that the user can safely traverse the physical environment while staying fully-immersed in the virtual environment.
    Type: Application
    Filed: March 14, 2019
    Publication date: September 17, 2020
    Inventors: Christian HOLZ, Eyal OFEK, Andrew D. WILSON, Lung-Pan CHENG, Junrui YANG
  • Patent number: 10768696
    Abstract: Representative embodiments disclose mechanisms for calibrating an eye gaze selection system. When the calibration is triggered, a snapshot of an area around the current user's gaze point is taken. The snapshot area is then animated to cause motion of the snapshot area. As the snapshot is animated, the user's gaze will naturally track the thing the user was focusing on. This creates an eye tracking vector with a magnitude and direction. The magnitude and direction of the eye tracking vector can then be used to calculate a correction factor for the current user's gaze point. Calibration can be triggered manually by the user or based on some criteria such as error rates in item selection by the user.
    Type: Grant
    Filed: October 5, 2017
    Date of Patent: September 8, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Shane Williams, Andrew D. Wilson
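The calibration mechanism above animates a snapshot of the area around the current gaze point and measures the eye-tracking vector as the user's gaze follows it. One plausible way to turn that into a correction factor, sketched below with invented names and a simple linear model (the patent does not commit to this exact formulation), is to take the residual between the known animation shift and the measured tracking vector:

```python
def gaze_correction(gaze_start, gaze_end, snapshot_shift):
    """The snapshot area is animated by snapshot_shift (pixels); the
    user's gaze tracks it, producing an eye tracking vector. The residual
    between the known shift and the measured vector is returned as a
    per-axis correction for subsequent gaze estimates."""
    tracked = (gaze_end[0] - gaze_start[0], gaze_end[1] - gaze_start[1])
    return (snapshot_shift[0] - tracked[0], snapshot_shift[1] - tracked[1])

# Snapshot moved 40 px right; the tracker only reported a 36 px move,
# so future gaze points get nudged 4 px right.
cx, cy = gaze_correction((100, 80), (136, 80), (40, 0))
```

As the abstract notes, this recalibration could be triggered manually or automatically when item-selection error rates rise.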
  • Patent number: 10346529
    Abstract: An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.
    Type: Grant
    Filed: May 18, 2016
    Date of Patent: July 9, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Björn U. Hartmann, Andrew D. Wilson, Hrvoje Benko, Meredith J. Morris
  • Patent number: 10339662
    Abstract: A method of registering first and second cameras in a multi-camera imager comprising generating virtual fiducials at different locations relative to the multi-camera imager and using coordinates of the virtual fiducials to determine a fundamental matrix for the cameras.
    Type: Grant
    Filed: May 23, 2016
    Date of Patent: July 2, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Michael Anthony Hall
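Given corresponding virtual-fiducial coordinates in the two cameras, the fundamental matrix can be estimated with the classic eight-point algorithm; the sketch below is a generic unnormalized version of that standard technique, not the patent's specific procedure (a production version would apply Hartley normalization first).

```python
import numpy as np

def fundamental_from_fiducials(pts1, pts2):
    """Eight-point estimate of the fundamental matrix F from at least
    eight corresponding fiducial coordinates (one row per fiducial).
    F satisfies x2^T F x1 = 0 for homogeneous correspondences."""
    x1, y1 = pts1[:, 0], pts1[:, 1]
    x2, y2 = pts2[:, 0], pts2[:, 1]
    # Each correspondence contributes one row of the linear system A f = 0.
    A = np.stack([x2 * x1, x2 * y1, x2, y2 * x1, y2 * y1, y2,
                  x1, y1, np.ones_like(x1)], axis=1)
    _, _, vt = np.linalg.svd(A)
    F = vt[-1].reshape(3, 3)          # null vector of A, up to scale
    u, s, vt = np.linalg.svd(F)       # enforce the rank-2 constraint
    return u @ np.diag([s[0], s[1], 0.0]) @ vt
```

Because the fiducials are virtual, they can be generated at arbitrary, well-spread (non-coplanar) locations, which avoids the degenerate configurations that plague physical calibration targets.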
  • Patent number: 10341857
    Abstract: A privacy preserving sensor apparatus is described herein. The privacy preserving sensor apparatus includes a microphone that is configured to output a signal that is indicative of audio in an environment. The privacy preserving sensor apparatus further includes feature extraction circuitry integrated in the apparatus with the microphone, the feature extraction circuitry configured to extract features from the signal output by the microphone that are usable to detect occurrence of an event in the environment, wherein the signal output by the microphone is unable to be reconstructed based solely upon the features.
    Type: Grant
    Filed: January 24, 2018
    Date of Patent: July 2, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Yong Rui, Daniel Morris, Andrew D. Wilson, Nikunj Raghuvanshi, Desney S. Tan, Jeannette M. Wing
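The key property in this abstract is that the extracted features support event detection but are too lossy to reconstruct the waveform. A sketch of one feature family with that property, coarse log band energies, is below; the band count and feature choice are illustrative, not the circuit the patent describes.

```python
import numpy as np

def privacy_features(frame, n_bands=8):
    """Collapse a microphone frame into a handful of log band energies:
    enough to classify events (door slam, glass break), but discarding
    phase and fine spectral detail, so the audio cannot be recovered."""
    power = np.abs(np.fft.rfft(frame)) ** 2
    bands = np.array_split(power, n_bands)
    return np.array([np.log1p(b.sum()) for b in bands])

# 1024 audio samples reduce to just 8 numbers.
feats = privacy_features(np.random.default_rng(2).normal(size=1024))
```

Performing this reduction in circuitry integrated with the microphone, as the abstract specifies, means raw audio never leaves the sensor package.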
  • Patent number: 10297082
    Abstract: Various technologies pertaining to shared spatial augmented reality (SSAR) are described. Sensor units in a room output sensor signals that are indicative of positions of two or more users in the room and gaze directions of the two or more users. Views of at least one virtual object are computed separately for each of the two or more users, and projectors project such views in the room. The projected views cause the two or more users to simultaneously perceive the virtual object in space.
    Type: Grant
    Filed: April 9, 2015
    Date of Patent: May 21, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Hrvoje Benko
  • Patent number: 10290153
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Grant
    Filed: September 13, 2017
    Date of Patent: May 14, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian, Mark Hancock
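The body-warping idea above can be sketched as a progress-weighted offset: as the reach advances, the rendered hand is shifted by the gap between the virtual and physical targets so both hands arrive together. The clamp stands in for the thresholding the abstract mentions; units, names, and the 15 cm limit are illustrative assumptions.

```python
def warped_hand(real_hand, physical_target, virtual_target,
                progress, max_shift=0.15):
    """Blend the virtual-vs-physical target offset into the rendered hand
    position as the reach progresses (progress in [0, 1]). The per-axis
    max_shift clamp keeps the redirection below a noticeable threshold.
    Positions are (x, y, z) in metres."""
    rendered = []
    for h, p, v in zip(real_hand, physical_target, virtual_target):
        offset = max(-max_shift, min(max_shift, v - p))
        rendered.append(h + progress * offset)
    return tuple(rendered)

# Virtual target sits 10 cm right of the physical prop; at full reach
# (progress = 1.0) the rendered hand carries the whole 10 cm shift.
hand = warped_hand((0.0, 0.0, 0.0), (0.4, 0.0, 0.2), (0.5, 0.0, 0.2), 1.0)
```

World warping would instead rotate or translate the entire scene to improve the object alignment; the patent combines both, within thresholds, to keep either from becoming perceptible.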