Patents by Inventor Andrew D. Wilson
Andrew D. Wilson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230290041
Abstract: Technologies for transitioning between two-dimensional (2D) and three-dimensional (3D) display views for video conferencing are described herein. Video conferencing applications can have multiple display views for a user participating in a video conference. In certain situations, a user may want to transition from a 2D display view of the video conference to a more immersive 3D display view. These transitions can be visually jarring and create an uncomfortable user experience. The transition from a 2D display view to a 3D display view can be improved by manipulating visual properties of a virtual camera that is employed to generate the display views.
Type: Application
Filed: March 10, 2022
Publication date: September 14, 2023
Inventor: Andrew D. WILSON
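As a rough illustration of the camera manipulation the abstract describes, the Python sketch below eases a virtual camera's position, field of view, and look-at target from a flat 2D gallery configuration toward an immersive 3D one. The class and function names are hypothetical and not taken from the filing.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    position: tuple      # (x, y, z) in world space
    fov_degrees: float   # vertical field of view
    look_at: tuple       # point the camera faces

def ease_in_out(t: float) -> float:
    """Smoothstep easing so the transition is not visually jarring."""
    return t * t * (3.0 - 2.0 * t)

def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def transition_camera(flat_2d: VirtualCamera, immersive_3d: VirtualCamera,
                      elapsed: float, duration: float) -> VirtualCamera:
    """Blend the 2D gallery camera into the 3D scene camera over `duration` seconds."""
    t = ease_in_out(min(max(elapsed / duration, 0.0), 1.0))
    return VirtualCamera(
        position=lerp(flat_2d.position, immersive_3d.position, t),
        fov_degrees=flat_2d.fov_degrees
                    + (immersive_3d.fov_degrees - flat_2d.fov_degrees) * t,
        look_at=lerp(flat_2d.look_at, immersive_3d.look_at, t),
    )
```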
-
Publication number: 20230233202
Abstract: Techniques for automated rotation of a needle in a computer-assisted system include an end effector having a drive mechanism configured to be coupled to a curved needle and configured to rotationally actuate the curved needle along an arcuate path, and a control unit coupled to the drive mechanism. The control unit is configured to, in response to receiving a first input, cause the drive mechanism to rotationally actuate the curved needle by a first preset rotation amount along the arcuate path, and, in response to receiving a second input, cause the drive mechanism to rotationally actuate the curved needle by a second preset rotation amount along the arcuate path.
Type: Application
Filed: March 31, 2023
Publication date: July 27, 2023
Inventors: Andrew D. WILSON, Gabriel F. BRISSON, Robert C. REID
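The control behavior described above amounts to mapping each input to its own preset rotation amount along the arcuate path. A minimal Python sketch, with a hypothetical rotate_along_arc callable standing in for the drive mechanism:

```python
class NeedleDriveController:
    """Illustrative mapping of two inputs to two preset rotation amounts."""

    def __init__(self, rotate_along_arc, first_preset_deg=180.0, second_preset_deg=30.0):
        self.rotate_along_arc = rotate_along_arc   # hypothetical drive-mechanism callable
        self.presets = {"first": first_preset_deg, "second": second_preset_deg}

    def handle_input(self, which: str) -> None:
        # Each input triggers its own preset rotation, e.g. a full throw vs. a fine step.
        self.rotate_along_arc(self.presets[which])

controller = NeedleDriveController(rotate_along_arc=lambda deg: print(f"rotate needle {deg} deg"))
controller.handle_input("first")
controller.handle_input("second")
```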
-
Publication number: 20230236713
Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
Type: Application
Filed: March 17, 2023
Publication date: July 27, 2023
Applicant: Microsoft Technology Licensing, LLC
Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
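One way to picture the spatially consistent seating the abstract describes: place attendees at fixed seats around a virtual table and let each client compute bearings to the others from its own seat. The Python sketch below is illustrative only; the circular geometry and function names are assumptions, not the patented method.

```python
import math

def seat_positions(num_seats: int, radius: float = 1.5):
    """Place seats evenly around a circular conference table (x, y in meters)."""
    return [
        (radius * math.cos(2 * math.pi * i / num_seats),
         radius * math.sin(2 * math.pi * i / num_seats))
        for i in range(num_seats)
    ]

def view_of_others(my_seat: int, seats):
    """Bearing (degrees) from my seat to every other seat.

    Every client runs the same function with its own seat index, so the shared
    arrangement stays spatially consistent while each rendered view is personal.
    """
    mx, my = seats[my_seat]
    return {
        i: math.degrees(math.atan2(y - my, x - mx))
        for i, (x, y) in enumerate(seats) if i != my_seat
    }

seats = seat_positions(5)
print(view_of_others(0, seats))
```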
-
Publication number: 20230190262
Abstract: Techniques for automated rotation of a needle in a computer-assisted system include an end effector having a drive mechanism configured to be coupled to a curved needle and configured to rotationally actuate the curved needle along an arcuate path, and a control unit coupled to the drive mechanism. The control unit is configured to, in response to receiving a first input, cause the drive mechanism to rotationally actuate the curved needle by a first preset rotation amount along the arcuate path, and, in response to receiving a second input, cause the drive mechanism to rotationally actuate the curved needle by a second preset rotation amount along the arcuate path.
Type: Application
Filed: April 13, 2021
Publication date: June 22, 2023
Inventors: Andrew D. WILSON, Gabriel F. BRISSON, Robert C. REID
-
Patent number: 11656747
Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
Type: Grant
Filed: September 21, 2021
Date of Patent: May 23, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: John C. Tang, William Arthur Stewart Buxton, Andrew D. Wilson, Kori M. Inkpen, Sasa Junuzovic, Abigail J. Sellen, Edward Sean Lloyd Rintel
-
Patent number: 11642127
Abstract: End effector position set point correction includes an instrument having an end effector and a control unit. In some embodiments, the control unit actuates the end effector to a first position, determines an actuation level, determines an offset based on the actuation level, adjusts a position set point based on the offset, and actuates the end effector to the adjusted position set point. In some embodiments, the control unit actuates the end effector, determines an actuation level, and determines whether the actuation level is above a threshold. In response to determining that the actuation level is above the threshold, the control unit determines a position of the end effector, identifies a nominal position associated with the determined position, determines an offset based on the nominal position and the determined position, adjusts a position set point based on the offset, and actuates the end effector to the adjusted position set point.
Type: Grant
Filed: November 2, 2018
Date of Patent: May 9, 2023
Assignee: INTUITIVE SURGICAL OPERATIONS, INC.
Inventors: Andrew D. Wilson, Amir Chaghajerdi, David W. Weir
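As an illustration of the two offset strategies the abstract mentions (an actuation-level-based offset, and a nominal-versus-measured offset above a threshold), here is a hedged Python sketch; the calibration table, parameter names, and the decision to combine both behaviors in one function are assumptions, not the patented control law.

```python
def corrected_set_point(desired_position, measured_position, actuation_level,
                        threshold, offset_per_unit_actuation, nominal_for):
    """Toy set-point correction for an end effector.

    At or below the threshold, model the offset as proportional to the actuation
    level (e.g. drivetrain stretch under grip force). Above it, look up the
    nominal position closest to the measured position and correct by the difference.
    """
    if actuation_level <= threshold:
        offset = offset_per_unit_actuation * actuation_level
    else:
        key = min(nominal_for, key=lambda k: abs(k - measured_position))
        offset = nominal_for[key] - measured_position
    return desired_position + offset

# Hypothetical calibration table mapping measured positions to nominal positions
nominal = {0.0: 0.0, 0.5: 0.52, 1.0: 1.05}
print(corrected_set_point(0.8, 0.78, actuation_level=2.5, threshold=2.0,
                          offset_per_unit_actuation=0.01, nominal_for=nominal))
```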
-
Patent number: 11633185
Abstract: End effector position set point adjustment includes an instrument and a control unit. The instrument includes an end effector and a drive mechanism for actuating the end effector. The control unit is configured to actuate the end effector using the drive mechanism, determine an actual position of the end effector, identify an expected position associated with the determined actual position of the end effector, determine a position offset based on the expected position and the determined actual position of the end effector, adjust a position set point based on the position offset, and actuate the end effector to the adjusted position set point using the drive mechanism.
Type: Grant
Filed: July 8, 2021
Date of Patent: April 25, 2023
Assignee: INTUITIVE SURGICAL OPERATIONS, INC.
Inventors: Andrew D. Wilson, Amir Chaghajerdi, David W. Weir
-
Publication number: 20230086906
Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
Type: Application
Filed: September 21, 2021
Publication date: March 23, 2023
Applicant: Microsoft Technology Licensing, LLC
Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
-
Patent number: 11580704
Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
Type: Grant
Filed: April 9, 2021
Date of Patent: February 14, 2023
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Andrew D. Wilson, Christian Holz, Eyal Ofek, Jeremy Hartmann
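A toy version of the blending idea, assuming a registered RGB camera frame and depth map from the HMD: mask out physical objects within a distance band and alpha-blend them into the rendered VR frame. The thresholds, array shapes, and function names are illustrative, not taken from the patent.

```python
import numpy as np

def obstacle_mask(depth_m: np.ndarray, near: float = 0.3, far: float = 1.5) -> np.ndarray:
    """Select the portion of the depth map close enough to matter while roaming."""
    return (depth_m > near) & (depth_m < far)

def composite_passthrough(vr_frame: np.ndarray, camera_frame: np.ndarray,
                          depth_m: np.ndarray, opacity: float = 0.6) -> np.ndarray:
    """Blend camera pixels of nearby physical objects into the rendered VR frame."""
    mask = obstacle_mask(depth_m)[..., None]          # broadcast over RGB channels
    blended = (1 - opacity) * vr_frame + opacity * camera_frame
    return np.where(mask, blended, vr_frame).astype(vr_frame.dtype)

# Toy usage with random data in place of real HMD feeds
vr = np.random.rand(480, 640, 3).astype(np.float32)
cam = np.random.rand(480, 640, 3).astype(np.float32)
depth = np.random.uniform(0.2, 4.0, size=(480, 640)).astype(np.float32)
out = composite_passthrough(vr, cam, depth)
```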
-
Publication number: 20230045386
Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
Type: Application
Filed: October 20, 2022
Publication date: February 9, 2023
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sasa JUNUZOVIC, William Thomas BLANK, Bruce Arnold CLEARY, III, Anoop GUPTA, Andrew D. WILSON
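To give a flavor of N-way surface sharing by video compositing, the sketch below combines remote surface captures with a per-pixel minimum (dark ink wins on a light tabletop) and excludes the local capture to avoid visual echo. This is a simplification under stated assumptions, not the pipeline claimed in the publication.

```python
import numpy as np

def composite_remote_surfaces(remote_frames):
    """Combine N remote surface captures into one image to project locally.

    Taking the per-pixel minimum keeps ink and objects from every site visible
    on a white tabletop; the local capture is omitted so it is not re-projected
    back onto itself (visual echo).
    """
    return np.stack(remote_frames, axis=0).min(axis=0)

site_a = np.full((480, 640, 3), 255, dtype=np.uint8)   # blank white surface
site_b = np.full((480, 640, 3), 255, dtype=np.uint8)
site_b[200:220, 100:400] = (20, 20, 160)               # a drawn stroke at site B
projected_at_site_c = composite_remote_surfaces([site_a, site_b])
```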
-
Patent number: 11567633
Abstract: A computer-implemented method for determining focus of a user is provided. User input is received. An intention image of a scene including a plurality of interactive objects is generated. The intention image includes pixels encoded with intention values determined based on the user input. An intention value indicates a likelihood that the user intends to focus on the pixel. An intention score is determined for each interactive object based on the intention values of pixels that correspond to the interactive object. An interactive object of the plurality of interactive objects is determined to be a focused object that has the user's focus based on the intention scores of the plurality of interactive objects.
Type: Grant
Filed: February 8, 2021
Date of Patent: January 31, 2023
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Andrew D. Wilson, Sophie Stellmach, Erian Vazquez, Kristian Jose Davila, Adam Edwin Behringer, Jonathan Palmer, Jason Michael Ray, Mathew Julian Lamb
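The scoring step lends itself to a short sketch: given a per-pixel intention image and a per-pixel map of interactive-object ids, aggregate the intention values over each object and pick the maximum. The array shapes and the choice of mean (rather than sum) as the score are assumptions for illustration.

```python
import numpy as np

def focused_object(intention_image: np.ndarray, object_id_map: np.ndarray):
    """Pick the interactive object with the highest intention score.

    intention_image: per-pixel likelihood (0..1) that the user intends to focus there.
    object_id_map:   per-pixel interactive-object id, with -1 for background.
    """
    scores = {}
    for obj_id in np.unique(object_id_map):
        if obj_id < 0:
            continue
        mask = object_id_map == obj_id
        scores[int(obj_id)] = float(intention_image[mask].mean())
    return max(scores, key=scores.get), scores

# Toy scene: two objects, with gaze/pointer evidence concentrated on object 1
ids = np.full((120, 160), -1, dtype=np.int32)
ids[20:60, 20:60] = 0
ids[20:60, 100:140] = 1
intent = np.zeros((120, 160), dtype=np.float32)
intent[30:50, 110:130] = 0.9
print(focused_object(intent, ids))
```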
-
Patent number: 11509861
Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
Type: Grant
Filed: December 15, 2016
Date of Patent: November 22, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sasa Junuzovic, William Thomas Blank, Bruce Arnold Cleary, III, Anoop Gupta, Andrew D. Wilson
-
Publication number: 20220365599
Abstract: A computer device is provided that includes a display device, and a sensor system configured to be mounted adjacent to a user's head and to measure an electrical potential near one or more electrodes of the sensor system. The computer device further includes a processor configured to present a periodic motion-based visual stimulus having a changing motion that is frequency-modulated for a target frequency or code-modulated for a target code, detect changes in the electrical potential via the one or more electrodes, identify a corresponding visual evoked potential feature in the detected changes in electrical potential that corresponds to the periodic motion-based visual stimulus, and recognize a user input to the computing device based on identifying the corresponding visual evoked potential feature.
Type: Application
Filed: July 27, 2022
Publication date: November 17, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Andrew D. WILSON, Hakim SI MOHAMMED, Christian HOLZ, Adrian Kuo Ching LEE, Ivan Jelev TASHEV, Hannes GAMPER, Edward Bryan CUTRELL, David Emerson JOHNSTON, Dimitra EMMANOUILIDOU, Mihai R. JALOBEANU
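A simplified take on recognizing which frequency-modulated stimulus the user attends to: compare the spectral power of a single electrode's signal at the candidate target frequencies and pick the strongest. Real evoked-potential decoders are more robust (e.g., canonical correlation analysis); the sketch below only illustrates the idea and uses made-up signal parameters.

```python
import numpy as np

def detect_attended_stimulus(eeg: np.ndarray, fs: float, target_freqs) -> float:
    """Return the candidate stimulus frequency with the strongest response.

    eeg: single-channel signal from an electrode near the visual cortex.
    fs:  sampling rate in Hz.
    """
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    powers = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in target_freqs}
    return max(powers, key=powers.get)

# Toy signal: a 12 Hz evoked response buried in noise
fs, seconds = 250.0, 4.0
t = np.arange(int(fs * seconds)) / fs
eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t) + np.random.randn(t.size)
print(detect_attended_stimulus(eeg, fs, [8.0, 10.0, 12.0, 15.0]))
```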
-
Publication number: 20220345678
Abstract: Aspects of the present disclosure relate to distributed virtual reality. In examples, a depth buffer and a color buffer are generated at a presenter device as part of rendering a virtual environment. The virtual environment may be perceived by a user in three dimensions (3D), for example via a virtual reality (VR) headset. Virtual environment information comprising the depth buffer and the color buffer may be transmitted to a viewer device, where it is used to render the virtual environment for display to a viewer. For example, the viewer may similarly view the virtual environment in 3D via a VR headset. A viewer perspective (e.g., from which the virtual environment is generated for the viewer) may differ from a presenter perspective and may be manipulated by the viewer, thereby decoupling the viewer's perception of the virtual environment from that of the presenter.
Type: Application
Filed: April 21, 2021
Publication date: October 27, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Andrew D. WILSON, Balasaravanan Thoravi KUMARAVEL
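The viewer-side rendering can be pictured as reprojecting the presenter's color and depth buffers into the viewer's camera. The sketch below does naive forward point splatting with a pinhole model; the intrinsics, transform convention, and absence of hole filling are simplifying assumptions rather than the method claimed in the publication.

```python
import numpy as np

def reproject(color, depth, fx, fy, cx, cy, viewer_from_presenter):
    """Re-render the presenter's color and depth buffers from the viewer's pose.

    viewer_from_presenter: 4x4 rigid transform from presenter-camera coordinates
    to viewer-camera coordinates. Forward point splatting, no hole filling.
    """
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    pts = np.stack([x, y, depth, np.ones_like(depth)], axis=-1) @ viewer_from_presenter.T
    z = pts[..., 2]
    valid = z > 1e-3
    zs = np.where(valid, z, 1.0)                 # avoid divide-by-zero on invalid points
    u2 = np.clip((pts[..., 0] / zs * fx + cx).astype(int), 0, w - 1)
    v2 = np.clip((pts[..., 1] / zs * fy + cy).astype(int), 0, h - 1)
    out = np.zeros_like(color)
    out[v2[valid], u2[valid]] = color[valid]
    return out

# Toy usage: an identity viewer pose roughly reproduces the presenter's view
h, w = 120, 160
color = np.random.randint(0, 255, (h, w, 3), dtype=np.uint8)
depth = np.full((h, w), 2.0, dtype=np.float32)
out = reproject(color, depth, fx=100, fy=100, cx=w / 2, cy=h / 2,
                viewer_from_presenter=np.eye(4))
```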
-
Publication number: 20220253182
Abstract: A computer-implemented method for determining focus of a user is provided. User input is received. An intention image of a scene including a plurality of interactive objects is generated. The intention image includes pixels encoded with intention values determined based on the user input. An intention value indicates a likelihood that the user intends to focus on the pixel. An intention score is determined for each interactive object based on the intention values of pixels that correspond to the interactive object. An interactive object of the plurality of interactive objects is determined to be a focused object that has the user's focus based on the intention scores of the plurality of interactive objects.
Type: Application
Filed: February 8, 2021
Publication date: August 11, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Julia SCHWARZ, Andrew D. WILSON, Sophie STELLMACH, Erian VAZQUEZ, Kristian Jose DAVILA, Adam Edwin BEHRINGER, Jonathan PALMER, Jason Michael RAY, Mathew Julian LAMB
-
Patent number: 11409361
Abstract: A computer device is provided that includes a display device, and a sensor system configured to be mounted adjacent to a user's head and to measure an electrical potential near one or more electrodes of the sensor system. The computer device further includes a processor configured to present a periodic motion-based visual stimulus having a changing motion that is frequency-modulated for a target frequency or code-modulated for a target code, detect changes in the electrical potential via the one or more electrodes, identify a corresponding visual evoked potential feature in the detected changes in electrical potential that corresponds to the periodic motion-based visual stimulus, and recognize a user input to the computing device based on identifying the corresponding visual evoked potential feature.
Type: Grant
Filed: February 3, 2020
Date of Patent: August 9, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Andrew D. Wilson, Hakim Si Mohammed, Christian Holz, Adrian Kuo Ching Lee, Ivan Jelev Tashev, Hannes Gamper, Edward Bryan Cutrell, David Emerson Johnston, Dimitra Emmanouilidou, Mihai R. Jalobeanu
-
Publication number: 20220053174
Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes projector lighting (or light-control point) on-state temporal compression with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
Type: Application
Filed: October 28, 2021
Publication date: February 17, 2022
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
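The timing tradeoff the abstract describes can be sketched as a per-frame schedule: compress and shift the projector's on-state to the start of each frame, leaving the remainder as a camera capture slot during which nothing is projected. The numbers and names below are illustrative defaults, not values from the filing.

```python
def frame_schedule(frame_ms: float = 16.7, on_fraction: float = 0.75):
    """Return (projector_on_interval, camera_capture_interval) in milliseconds
    relative to the frame start.

    A larger on_fraction gives a brighter projected image but a shorter capture
    slot: that is the tradeoff between capture time and projected brightness.
    """
    projector_on = (0.0, on_fraction * frame_ms)
    capture_slot = (on_fraction * frame_ms, frame_ms)
    return projector_on, capture_slot

print(frame_schedule())                    # ~12.5 ms projected, ~4.2 ms capture slot
print(frame_schedule(on_fraction=0.9))     # brighter image, shorter capture slot
```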
-
Patent number: 11190741
Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes projector lighting (or light-control point) on-state temporal compression with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
Type: Grant
Filed: March 23, 2018
Date of Patent: November 30, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
-
Publication number: 20210330325
Abstract: End effector position set point adjustment includes an instrument and a control unit. The instrument includes an end effector and a drive mechanism for actuating the end effector. The control unit is configured to actuate the end effector using the drive mechanism, determine an actual position of the end effector, identify an expected position associated with the determined actual position of the end effector, determine a position offset based on the expected position and the determined actual position of the end effector, adjust a position set point based on the position offset, and actuate the end effector to the adjusted position set point using the drive mechanism.
Type: Application
Filed: July 8, 2021
Publication date: October 28, 2021
Inventors: Andrew D. WILSON, Amir CHAGHAJERDI, David W. WEIR
-
Publication number: 20210240264
Abstract: A computer device is provided that includes a display device, and a sensor system configured to be mounted adjacent to a user's head and to measure an electrical potential near one or more electrodes of the sensor system. The computer device further includes a processor configured to present a periodic motion-based visual stimulus having a changing motion that is frequency-modulated for a target frequency or code-modulated for a target code, detect changes in the electrical potential via the one or more electrodes, identify a corresponding visual evoked potential feature in the detected changes in electrical potential that corresponds to the periodic motion-based visual stimulus, and recognize a user input to the computing device based on identifying the corresponding visual evoked potential feature.
Type: Application
Filed: February 3, 2020
Publication date: August 5, 2021
Applicant: Microsoft Technology Licensing, LLC
Inventors: Andrew D. WILSON, Hakim SI MOHAMMED, Christian HOLZ, Adrian Kuo Ching LEE, Ivan Jelev TASHEV, Hannes GAMPER, Edward Bryan CUTRELL, David Emerson JOHNSTON, Dimitra EMMANOUILIDOU, Mihai R. JALOBEANU