Patents by Inventor Andrew D. Wilson

Andrew D. Wilson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240121280
    Abstract: Systems, methods, and computer-readable storage devices are disclosed for simulated choral audio chatter in communication systems. One method including: receiving audio data from each of a plurality of users participating in a first group of a plurality of groups for an event using a communication system; generating first simulated choral audio chatter based on the audio data received from each of the plurality of users in the first group; and providing the generated first simulated choral audio data to at least one user of a plurality of users of the event.
    Type: Application
    Filed: October 7, 2022
    Publication date: April 11, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Edward Sean Lloyd RINTEL, Amos MILLER, Andrew D. WILSON, Sasa JUNUZOVIC
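The claimed pipeline (receive per-user audio, generate combined choral chatter for the group, provide it to event participants) can be sketched as a simple mixing step. This is a hypothetical illustration: the function name `mix_group_chatter`, the `gain` parameter, and the float-sample representation are assumptions, not details from the filing.

```python
# Hypothetical sketch: mix per-user audio frames from one group into a
# single "choral chatter" track. Samples are floats in [-1.0, 1.0].

def mix_group_chatter(user_frames, gain=None):
    """Average time-aligned audio frames from all users in a group.

    user_frames: list of equal-length lists of float samples, one per user.
    gain: optional attenuation for the mix (defaults to 1/N so the
          aggregate chatter stays quieter than any single speaker).
    """
    if not user_frames:
        return []
    n = len(user_frames)
    length = len(user_frames[0])
    if gain is None:
        gain = 1.0 / n
    mixed = []
    for i in range(length):
        total = sum(frames[i] for frames in user_frames)
        # Clamp so simultaneous loud speakers cannot clip the output.
        mixed.append(max(-1.0, min(1.0, total * gain)))
    return mixed
```

The generated track could then be streamed to any participant of the event, per the last step of the claim.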
  • Patent number: 11946722
    Abstract: A detector for detecting the removal and/or insertion of a weapon out of and/or into a holster. The detector may transmit a message each time the weapon is removed from the holster. A recording system may receive the message and determine whether or not it will begin recording the data it captures. A detector may detect the change in a magnitude of an inductance and/or an impedance of a circuit to detect insertion and removal of the weapon into and out of the holster. The holster is configured to couple to the detector to position the detector to detect insertion and removal of the weapon. An adhesive tape may couple a detector to a holster.
    Type: Grant
    Filed: May 2, 2023
    Date of Patent: April 2, 2024
    Assignee: Axon Enterprise, Inc.
    Inventors: Daniel Joseph Wagner, Nache D. Shekarri, Jonathan R. Hatcher, John W. Wilson, Andrew G. Terajewicz, Lucas Kraft, Brian Piquette, Zachary B. Williams, Elliot William Weber, Jason W. Haensly
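The detection idea in the abstract — watching the magnitude of a sensed inductance and reporting removal or insertion when it changes — can be sketched as a threshold detector with hysteresis. The class name, normalized units, and threshold values below are illustrative assumptions, not from the patent.

```python
# Illustrative sketch: report weapon insertion/removal from changes in a
# normalized inductance magnitude, with hysteresis to avoid chattering.

class HolsterDetector:
    def __init__(self, inserted_threshold=1.2, removed_threshold=0.8):
        # Inductance (normalized) is assumed to rise when the weapon is
        # seated near the sensing coil and fall when it is drawn.
        self.inserted_threshold = inserted_threshold
        self.removed_threshold = removed_threshold
        self.weapon_present = True
        self.events = []

    def sample(self, inductance):
        """Process one inductance reading; return presence state."""
        if self.weapon_present and inductance < self.removed_threshold:
            self.weapon_present = False
            # In the abstract's system, this is where a message would be
            # transmitted so a recording system can decide to start recording.
            self.events.append("removed")
        elif not self.weapon_present and inductance > self.inserted_threshold:
            self.weapon_present = True
            self.events.append("inserted")
        return self.weapon_present
```

The gap between the two thresholds models the hysteresis a real detector would need so that noise near the boundary does not produce spurious removal messages.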
  • Publication number: 20240005579
    Abstract: Systems and methods for representing two-dimensional representations as three-dimensional avatars are provided herein. In some examples, one or more input video streams are received. A first subject, within the one or more input video streams, is identified. Based on the one or more input video streams, a first view of the first subject is identified. Based on the one or more input video streams, a second view of the first subject is identified. The first subject is segmented into a plurality of planar objects. The plurality of planar objects are transformed with respect to each other. The plurality of planar objects are based on the first and second views of the first subject. The plurality of planar objects are output in an output video stream. The plurality of planar objects provide perspective of the first subject to one or more viewers.
    Type: Application
    Filed: June 30, 2022
    Publication date: January 4, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mar GONZALEZ FRANCO, Payod PANDA, Andrew D. WILSON, Kori M. INKPEN, Eyal OFEK, William Arthur Stewart BUXTON
  • Patent number: 11847261
    Abstract: A computer device is provided that includes a display device, and a sensor system configured to be mounted adjacent to a user's head and to measure an electrical potential near one or more electrodes of the sensor system. The computer device further includes a processor configured to present a periodic motion-based visual stimulus having a changing motion that is frequency-modulated for a target frequency or code-modulated for a target code, detect changes in the electrical potential via the one or more electrodes, identify a corresponding visual evoked potential feature in the detected changes in electrical potential that corresponds to the periodic motion-based visual stimulus, and recognize a user input to the computing device based on identifying the corresponding visual evoked potential feature.
    Type: Grant
    Filed: July 27, 2022
    Date of Patent: December 19, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Hakim Si Mohammed, Christian Holz, Adrian Kuo Ching Lee, Ivan Jelev Tashev, Hannes Gamper, Edward Bryan Cutrell, David Emerson Johnston, Dimitra Emmanouilidou, Mihai R. Jalobeanu
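The recognition step this abstract describes — identifying a visual evoked potential feature that corresponds to a frequency-modulated stimulus — can be sketched as a single-bin DFT: correlate the electrode signal with sine and cosine at the target frequency and threshold the resulting power. The function names and the threshold value are assumptions for illustration only.

```python
import math

# Minimal sketch: detect a response at the stimulus's target frequency
# in a sensed electrode signal via a single-bin DFT (correlation with
# sin/cos at that frequency).

def evoked_power(samples, sample_rate, target_hz):
    """Return the normalized power of `samples` at `target_hz`."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * target_hz * i / sample_rate)
             for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * target_hz * i / sample_rate)
             for i, s in enumerate(samples))
    return (re * re + im * im) / (n * n)

def recognize_input(samples, sample_rate, target_hz, threshold=0.01):
    """Treat power above threshold as the user attending to the stimulus."""
    return evoked_power(samples, sample_rate, target_hz) > threshold
```

A real implementation would also need the code-modulated variant and artifact rejection; this only shows the frequency-tagged case.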
  • Patent number: 11778151
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Grant
    Filed: October 28, 2021
    Date of Patent: October 3, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
  • Publication number: 20230290041
    Abstract: Technologies for transitioning between two-dimensional (2D) and three-dimensional (3D) display views for video conferencing are described herein. Video conferencing applications can have multiple display views for a user participating in a video conference. In certain situations a user may want to transition from a 2D display view of the video conference to a more immersive 3D display view. These transitions can be visually jarring and create an uncomfortable user experience. The transition from a 2D display view to a 3D display view can be improved by executing the transition to a 3D display view by manipulating visual properties of a virtual camera that is employed to generate the display views.
    Type: Application
    Filed: March 10, 2022
    Publication date: September 14, 2023
    Inventor: Andrew D. WILSON
  • Publication number: 20230233202
    Abstract: Techniques for automated rotation of a needle in a computer-assisted system include an end effector having a drive mechanism configured to be coupled to a curved needle and configured to rotationally actuate the curved needle along an arcuate path and a control unit coupled to the drive mechanism. The control unit is configured to, in response to receiving a first input, cause the drive mechanism to rotationally actuate the curved needle by a first preset rotation amount along the arcuate path, and, in response to receiving a second input, cause the drive mechanism to rotationally actuate the curved needle by a second preset rotation amount along the arcuate path.
    Type: Application
    Filed: March 31, 2023
    Publication date: July 27, 2023
    Inventors: Andrew D. WILSON, Gabriel F. BRISSON, Robert C. REID
  • Publication number: 20230236713
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Application
    Filed: March 17, 2023
    Publication date: July 27, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
  • Publication number: 20230190262
    Abstract: Techniques for automated rotation of a needle in a computer-assisted system include an end effector having a drive mechanism configured to be coupled to a curved needle and configured to rotationally actuate the curved needle along an arcuate path and a control unit coupled to the drive mechanism. The control unit is configured to, in response to receiving a first input, cause the drive mechanism to rotationally actuate the curved needle by a first preset rotation amount along the arcuate path, and, in response to receiving a second input, cause the drive mechanism to rotationally actuate the curved needle by a second preset rotation amount along the arcuate path.
    Type: Application
    Filed: April 13, 2021
    Publication date: June 22, 2023
    Inventors: Andrew D. WILSON, Gabriel F. BRISSON, Robert C. REID
  • Patent number: 11656747
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Grant
    Filed: September 21, 2021
    Date of Patent: May 23, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: John C. Tang, William Arthur Stewart Buxton, Andrew D. Wilson, Kori M. Inkpen, Sasa Junuzovic, Abigail J. Sellen, Edward Sean Lloyd Rintel
  • Patent number: 11642127
    Abstract: End effector position set point correction includes an instrument having an end effector and a control unit. In some embodiments, the control unit actuates the end effector to a first position, determines an actuation level, determines an offset based on the actuation level, adjusts a position set point based on the offset, and actuates the end effector to the adjusted position set point. In some embodiments, the control unit actuates the end effector, determines an actuation level, and determines whether the actuation level is above a threshold. In response to determining that the actuation level is above the threshold, the control unit determines a position of the end effector, identifies a nominal position associated with the determined position, determines an offset based on the nominal position and the determined position, adjusts a position set point based on the offset, and actuates the end effector to the adjusted position set point.
    Type: Grant
    Filed: November 2, 2018
    Date of Patent: May 9, 2023
    Assignee: INTUITIVE SURGICAL OPERATIONS, INC.
    Inventors: Andrew D. Wilson, Amir Chaghajerdi, David W. Weir
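The second embodiment in this abstract (when the actuation level exceeds a threshold, compare the measured end effector position with a nominal position and fold the difference into the set point) can be sketched as below. All names, units, and numeric values are illustrative assumptions; the patent does not specify this interface.

```python
# Hypothetical sketch: correct an end effector position set point using
# the offset between its measured position and the nearest nominal
# position, but only when the actuation level is above a threshold.

def corrected_set_point(set_point, actuation_level, measured_position,
                        nominal_positions, threshold=0.5):
    """Return a set point compensating for drive-train compliance.

    nominal_positions: list of expected positions; the one closest to
    the measured position is taken as the nominal position for this pose.
    """
    if actuation_level <= threshold:
        return set_point  # below threshold: no correction applied
    nominal = min(nominal_positions, key=lambda p: abs(p - measured_position))
    offset = nominal - measured_position
    # Shift the commanded set point so the actual position lands closer
    # to where the nominal model says it should be.
    return set_point + offset
```

Gating the correction on actuation level matches the claim's structure: compliance-induced error matters most under high grip forces, so low-load motion is left uncorrected.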
  • Patent number: 11633185
    Abstract: End effector position set point adjustment includes an instrument and a control unit. The instrument includes an end effector and a drive mechanism for actuating the end effector. The control unit is configured to actuate the end effector using the drive mechanism, determine an actual position of the end effector, identify an expected position associated with the determined actual position of the end effector, determine a position offset based on the expected position and the determined actual position of the end effector, adjust a position set point based on the position offset, and actuate the end effector to the adjusted position set point using the drive mechanism.
    Type: Grant
    Filed: July 8, 2021
    Date of Patent: April 25, 2023
    Assignee: INTUITIVE SURGICAL OPERATIONS, INC.
    Inventors: Andrew D. Wilson, Amir Chaghajerdi, David W. Weir
  • Publication number: 20230086906
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Application
    Filed: September 21, 2021
    Publication date: March 23, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
  • Patent number: 11580704
    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to a HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: February 14, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Andrew D. Wilson, Christian Holz, Eyal Ofek, Jeremy Hartmann
  • Publication number: 20230045386
    Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
    Type: Application
    Filed: October 20, 2022
    Publication date: February 9, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Sasa JUNUZOVIC, William Thomas BLANK, Bruce Arnold CLEARY, III, Anoop GUPTA, Andrew D. WILSON
  • Patent number: 11567633
    Abstract: A computer-implemented method for determining focus of a user is provided. User input is received. An intention image of a scene including a plurality of interactive objects is generated. The intention image includes pixels encoded with intention values determined based on the user input. An intention value indicates a likelihood that the user intends to focus on the pixel. An intention score is determined for each interactive object based on the intention values of pixels that correspond to the interactive object. An interactive object of the plurality of interactive objects is determined to be a focused object that has the user's focus based on the intention scores of the plurality of interactive objects.
    Type: Grant
    Filed: February 8, 2021
    Date of Patent: January 31, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Julia Schwarz, Andrew D. Wilson, Sophie Stellmach, Erian Vazquez, Kristian Jose Davila, Adam Edwin Behringer, Jonathan Palmer, Jason Michael Ray, Mathew Julian Lamb
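The scoring step this abstract walks through — sum per-pixel intention values over each interactive object, then pick the object with the highest score — can be sketched directly. The data layout (a 2D intention image plus a parallel label image mapping pixels to objects) and the function name are assumptions for illustration.

```python
# Minimal sketch: given an "intention image" (per-pixel likelihood that
# the user intends to focus there) and a label image mapping pixels to
# interactive objects, score each object and return the focused one.

def focused_object(intention_image, object_labels):
    """intention_image: 2D list of floats in [0, 1].
    object_labels: 2D list of object ids (None = background).
    Returns the object id with the highest summed intention score,
    or None if no pixel belongs to an interactive object."""
    scores = {}
    for img_row, label_row in zip(intention_image, object_labels):
        for value, obj in zip(img_row, label_row):
            if obj is not None:
                scores[obj] = scores.get(obj, 0.0) + value
    if not scores:
        return None
    return max(scores, key=scores.get)
```

In the claimed system the intention values would be derived from user input such as gaze or gesture; here they are simply taken as given.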
  • Patent number: 11509861
    Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
    Type: Grant
    Filed: December 15, 2016
    Date of Patent: November 22, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sasa Junuzovic, William Thomas Blank, Bruce Arnold Cleary, III, Anoop Gupta, Andrew D. Wilson
  • Publication number: 20220365599
    Abstract: A computer device is provided that includes a display device, and a sensor system configured to be mounted adjacent to a user's head and to measure an electrical potential near one or more electrodes of the sensor system. The computer device further includes a processor configured to present a periodic motion-based visual stimulus having a changing motion that is frequency-modulated for a target frequency or code-modulated for a target code, detect changes in the electrical potential via the one or more electrodes, identify a corresponding visual evoked potential feature in the detected changes in electrical potential that corresponds to the periodic motion-based visual stimulus, and recognize a user input to the computing device based on identifying the corresponding visual evoked potential feature.
    Type: Application
    Filed: July 27, 2022
    Publication date: November 17, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. WILSON, Hakim SI MOHAMMED, Christian HOLZ, Adrian Kuo Ching LEE, Ivan Jelev TASHEV, Hannes GAMPER, Edward Bryan CUTRELL, David Emerson JOHNSTON, Dimitra EMMANOUILIDOU, Mihai R. JALOBEANU
  • Publication number: 20220345678
    Abstract: Aspects of the present disclosure relate to distributed virtual reality. In examples, a depth buffer and a color buffer are generated at a presenter device as part of rendering a virtual environment. The virtual environment may be perceived by a user in three dimensions (3D), for example via a virtual reality (VR) headset. Virtual environment information comprising the depth buffer and the color buffer may be transmitted to a viewer device, where it is used to render the virtual environment for display to a viewer. For example, the viewer may similarly view the virtual environment in 3D via a VR headset. A viewer perspective (e.g., from which the virtual environment is generated for the viewer) may differ from a presenter perspective and may be manipulated by the viewer, thereby decoupling the viewer's perception of the virtual environment from that of the presenter.
    Type: Application
    Filed: April 21, 2021
    Publication date: October 27, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. WILSON, Balasaravanan Thoravi KUMARAVEL
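The transmission step in this abstract — packaging a presenter's depth and color buffers so a viewer device can re-render the scene from its own perspective — can be sketched as a simple serialization round trip. The wire format below (struct-packed dimensions, float depth, raw RGB bytes) is an assumption for illustration; the filing does not specify an encoding.

```python
import struct

# Illustrative sketch: pack a presenter's depth and color buffers into
# one message; a viewer device unpacks them and can reproject the frame
# from its own viewpoint.

def pack_frame(width, height, depth, color):
    """depth: list of floats (one per pixel); color: bytes (3 per pixel, RGB)."""
    header = struct.pack("<II", width, height)
    depth_blob = struct.pack(f"<{len(depth)}f", *depth)
    return header + depth_blob + bytes(color)

def unpack_frame(blob):
    """Inverse of pack_frame; returns (width, height, depth, color)."""
    width, height = struct.unpack_from("<II", blob, 0)
    n = width * height
    depth = list(struct.unpack_from(f"<{n}f", blob, 8))
    color = blob[8 + 4 * n:]
    return width, height, depth, color
```

The key property the abstract relies on is that depth travels with color: with both buffers, the viewer can synthesize a nearby perspective independently of the presenter's, which is what decouples the two viewpoints.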
  • Publication number: 20220253182
    Abstract: A computer-implemented method for determining focus of a user is provided. User input is received. An intention image of a scene including a plurality of interactive objects is generated. The intention image includes pixels encoded with intention values determined based on the user input. An intention value indicates a likelihood that the user intends to focus on the pixel. An intention score is determined for each interactive object based on the intention values of pixels that correspond to the interactive object. An interactive object of the plurality of interactive objects is determined to be a focused object that has the user's focus based on the intention scores of the plurality of interactive objects.
    Type: Application
    Filed: February 8, 2021
    Publication date: August 11, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Julia SCHWARZ, Andrew D. WILSON, Sophie STELLMACH, Erian VAZQUEZ, Kristian Jose DAVILA, Adam Edwin BEHRINGER, Jonathan PALMER, Jason Michael RAY, Mathew Julian LAMB