Patents by Inventor Andrew D. Wilson

Andrew D. Wilson has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180005451
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Application
    Filed: September 13, 2017
    Publication date: January 4, 2018
    Inventors: Hrvoje Benko, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian, Mark Hancock
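The retargeting entries in this listing do not spell out how the two warps are combined, so the following is a minimal Python sketch of the body-warping half only, with an assumed perceptual threshold (`max_body_offset`) beyond which the leftover offset would be handed to world warping. All function names and values are illustrative and are not taken from the patent.

```python
import numpy as np

def retargeted_hand(real_hand, start, real_target, virtual_target,
                    max_body_offset=0.05):
    """Body warping: as the physical hand moves from `start` toward the
    physical target, shift the rendered hand by a growing fraction of the
    offset between the physical and virtual targets, so both are reached at
    the same moment.  Any offset beyond `max_body_offset` (an assumed
    threshold, in metres) is returned so it could be absorbed by world
    warping (rotating/translating the virtual scene) instead."""
    real_hand, start = np.asarray(real_hand, float), np.asarray(start, float)
    real_target = np.asarray(real_target, float)
    virtual_target = np.asarray(virtual_target, float)

    offset = virtual_target - real_target
    total = np.linalg.norm(real_target - start)
    progress = 0.0 if total == 0 else min(
        np.linalg.norm(real_hand - start) / total, 1.0)

    # Cap the body-warp contribution; leave the remainder for world warping.
    norm = np.linalg.norm(offset)
    body = offset if norm <= max_body_offset else offset * (max_body_offset / norm)

    virtual_hand = real_hand + progress * body
    return virtual_hand, offset - body   # rendered hand, leftover for world warp
```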
  • Patent number: 9857938
    Abstract: A unique system and method is provided that facilitates pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust a control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize a cursor position at a top middle point of a user's finger in order to mitigate clicking errors when making a selection.
    Type: Grant
    Filed: December 20, 2013
    Date of Patent: January 2, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
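As a rough illustration of the dual-input idea in patent 9857938 (the primary finger steers the cursor while the secondary finger adjusts the control-display ratio), here is a small Python sketch. The class name, thresholds, and the mapping from secondary-finger motion to ratio are assumptions made for illustration only.

```python
class DualFingerCursor:
    """Primary finger steers the cursor; the secondary finger's vertical
    motion scales the control-display (C-D) ratio for finer targeting."""

    def __init__(self):
        self.cursor = (0.0, 0.0)
        self.cd_ratio = 1.0                      # 1.0 = direct 1:1 mapping

    def on_secondary_move(self, dy, sensitivity=0.01, lo=0.1, hi=1.0):
        # Dragging the secondary finger downward slows the cursor.
        self.cd_ratio = min(hi, max(lo, self.cd_ratio - sensitivity * dy))

    def on_primary_move(self, dx, dy):
        x, y = self.cursor
        self.cursor = (x + dx * self.cd_ratio, y + dy * self.cd_ratio)
        return self.cursor
```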
  • Patent number: 9842228
    Abstract: Systems and methods of a personal daemon, executing as a background process on a mobile computing device, for providing personal assistance to an associated user are presented. While the personal daemon maintains personal information corresponding to the associated user, the personal daemon is configured to not share the personal information of the associated user with any entity other than the associated user, except as permitted by rules established by the associated user. The personal daemon monitors and analyzes the actions of the associated user to determine additional personal information of the associated user. Additionally, upon receiving one or more notices of events from a plurality of sensors associated with the mobile computing device, the personal daemon executes a personal assistance action on behalf of the associated user.
    Type: Grant
    Filed: August 10, 2016
    Date of Patent: December 12, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael F. Cohen, Douglas C. Burger, Asta Roseway, Andrew D. Wilson, Blaise Hilary Aguera Y Arcas, Daniel Lee Massey
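A minimal sketch of the rule-gated sharing behaviour described in patent 9842228: personal data stays on the device and a field is released only when a user-authored rule allows the requesting entity. The rule format, field names, and the example assistance action below are hypothetical.

```python
class PersonalDaemon:
    """Background-agent sketch: keep personal data local; share a field only
    when a user-defined rule explicitly allows the requesting entity."""

    def __init__(self, rules):
        self.rules = rules            # e.g. {("location", "spouse"): True}
        self.profile = {}             # personal information, kept on-device

    def observe(self, action):
        # Learn additional personal information from the user's own actions.
        self.profile.setdefault("recent_actions", []).append(action)

    def request(self, field, requester):
        if self.rules.get((field, requester), False):
            return self.profile.get(field)
        return None                   # default: never share

    def on_sensor_event(self, event):
        # Example assistance action triggered by a sensor notice.
        if event == "arrived_home":
            return "switch device to personal profile"
        return None
```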
  • Publication number: 20170337700
    Abstract: A method of registering first and second cameras in a multi-camera imager, comprising generating virtual fiducials at different locations relative to the multi-camera imager and using coordinates of the virtual fiducials to determine a fundamental matrix for the cameras.
    Type: Application
    Filed: May 23, 2016
    Publication date: November 23, 2017
    Inventors: Andrew D. Wilson, Michael Anthony Hall
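Publication 20170337700 boils down to estimating a fundamental matrix from virtual fiducials seen by both cameras. Assuming the fiducials' matched pixel coordinates are already available, the estimation step could look like the following sketch using OpenCV; the RANSAC threshold and confidence values are arbitrary choices, not the patent's.

```python
import numpy as np
import cv2

def fundamental_from_fiducials(pts_cam1, pts_cam2):
    """Estimate the fundamental matrix relating two cameras from matched
    image coordinates of the same virtual fiducials (N >= 8 points).
    pts_cam1, pts_cam2: (N, 2) arrays of pixel coordinates."""
    pts1 = np.asarray(pts_cam1, dtype=np.float32)
    pts2 = np.asarray(pts_cam2, dtype=np.float32)
    # Robust estimate; reprojection threshold (pixels) and confidence are arbitrary.
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    return F, inlier_mask   # mask flags which fiducials were kept as inliers
```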
  • Publication number: 20170330031
    Abstract: The cross-modal sensor fusion technique described herein tracks mobile devices and the users carrying them. The technique matches motion features from sensors on a mobile device to image motion features obtained from images of the device. For example, the acceleration of a mobile device, as measured by an onboard inertial measurement unit, is compared to similar acceleration observed in the color and depth images of a depth camera. The technique does not require a model of the appearance of either the user or the device, nor in many cases a direct line of sight to the device. The technique can operate in real time and can be applied to a wide variety of ubiquitous computing scenarios.
    Type: Application
    Filed: May 11, 2017
    Publication date: November 16, 2017
    Inventors: Andrew D. Wilson, Hrvoje Benko
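The cross-modal fusion described above amounts to matching an IMU's acceleration stream against acceleration inferred from camera tracks. Below is a minimal Python sketch of that matching step using plain normalized correlation over aligned, fixed-length windows; the scoring choice is an assumption, not the patented method.

```python
import numpy as np

def match_score(imu_accel, image_accel):
    """Normalized correlation between the acceleration magnitude reported by a
    device's IMU and the acceleration of one candidate body derived from depth
    images.  Both inputs: (T, 3) arrays sampled on the same time base."""
    a = np.linalg.norm(imu_accel, axis=1)
    b = np.linalg.norm(image_accel, axis=1)
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.dot(a, b) / len(a))

def assign_device_to_user(imu_accel, candidates):
    """candidates: {user_id: (T, 3) image-derived acceleration}.  Returns the
    user whose observed motion best matches the device's IMU stream."""
    return max(candidates, key=lambda uid: match_score(imu_accel, candidates[uid]))
```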
  • Publication number: 20170329487
    Abstract: Different techniques of processing user interactions with a computing system are described. In one implementation, an interactive display is configured to depict a graphical user interface which includes a plurality of different types of user interface elements (e.g., button-type element, scroll bar-type element). A user may use one or more user input objects (e.g., finger, hand, stylus) to simultaneously interact with the interactive display. A plurality of different user input processing methods are used to process user inputs received by the graphical user interface differently and in accordance with the types of the user interface elements which are displayed. The processing of the user inputs is implemented to determine whether the user inputs control the respective user interface elements. In one example, the processing may determine whether the user inputs activate and/or manipulate the displayed user interface elements.
    Type: Application
    Filed: July 17, 2017
    Publication date: November 16, 2017
    Inventor: Andrew D. Wilson
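The per-element-type processing described above can be pictured as a small dispatch: the input is routed to the element under it, and that element's own handler decides whether the input activates or manipulates it. The element classes and return values below are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class TouchInput:
    x: float
    y: float
    dy: float = 0.0            # vertical movement since the last frame

class Button:
    def process(self, touch):
        # A button treats any contact inside its bounds as an activation.
        return "activate"

class ScrollBar:
    def process(self, touch):
        # A scroll bar treats the same contact as a manipulation,
        # converting vertical motion into a scroll offset.
        return f"scroll {touch.dy:+.1f}"

def dispatch(elements, touch):
    """elements: list of ((x0, y0, x1, y1), element).  Route the input to the
    element under it and let that element's processing method interpret it."""
    for (x0, y0, x1, y1), element in elements:
        if x0 <= touch.x <= x1 and y0 <= touch.y <= y1:
            return element.process(touch)
    return None
```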
  • Patent number: 9805514
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Grant
    Filed: April 21, 2016
    Date of Patent: October 31, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Mark Hancock, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian
  • Publication number: 20170309071
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Application
    Filed: April 21, 2016
    Publication date: October 26, 2017
    Inventors: Hrvoje Benko, Mark Hancock, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian
  • Publication number: 20170299957
    Abstract: Techniques and architectures involve operating an array of slide projectors having respective brightnesses controlled by a computer. Such an array of slide projectors may be arranged to project their respective images onto a surface so that the respective images substantially overlap with one another on the surface and produce an integrated image. By judicious selection of slides and by judicious control of the brightness of the respective slide projectors, such an integrated image may render a video or an appearance of motion or other dynamic effect.
    Type: Application
    Filed: April 15, 2016
    Publication date: October 19, 2017
    Inventors: Andrew D. Wilson, Evan Shimizu
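The core of publication 20170299957 is choosing per-projector brightnesses so that the overlapping slides sum to a desired frame. Treating each slide as a fixed basis image, that reduces to a non-negative least-squares fit, sketched below with SciPy; this formulation is an assumption, and the patent may solve the problem differently.

```python
import numpy as np
from scipy.optimize import nnls

def projector_brightnesses(slides, target_frame):
    """slides: list of (H, W) grayscale images loaded in the projectors, all
    registered to the same surface.  target_frame: desired (H, W) image.
    Returns one non-negative brightness weight per projector whose weighted
    sum of slides best approximates the target frame."""
    A = np.stack([s.ravel() for s in slides], axis=1)   # pixels x projectors
    b = target_frame.ravel()
    weights, _residual = nnls(A, b)
    return weights     # drive each projector's brightness with these values
```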
  • Publication number: 20170300170
    Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device are correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
    Type: Application
    Filed: July 1, 2017
    Publication date: October 19, 2017
    Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
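A toy sketch of the grip/touch correlation idea in the pen-and-device entries: roughly concurrent pen-grip and tablet-touch classifications are combined to pick a context-appropriate response. The grip labels, time window, and actions below are all hypothetical.

```python
def infer_context(pen_grip, tablet_touch, dt_seconds):
    """Correlate roughly simultaneous pen-grip and tablet-touch events and
    return a context-appropriate action (labels are illustrative only)."""
    if dt_seconds > 0.25:                      # assumed simultaneity window
        return "treat inputs as unrelated"
    if pen_grip == "tucked" and tablet_touch == "full_hand":
        return "non-dominant hand is holding the tablet: suppress touch"
    if pen_grip == "writing" and tablet_touch == "palm":
        return "reject palm contact while inking"
    if pen_grip == "writing" and tablet_touch == "finger":
        return "open pen-specific tool palette at the finger"
    return "default touch handling"
```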
  • Patent number: 9740364
    Abstract: Different techniques of processing user interactions with a computing system are described. In one implementation, an interactive display is configured to depict a graphical user interface which includes a plurality of different types of user interface elements (e.g., button-type element, scroll bar-type element). A user may use one or more user input objects (e.g., finger, hand, stylus) to simultaneously interact with the interactive display. A plurality of different user input processing methods are used to process user inputs received by the graphical user interface differently and in accordance with the types of the user interface elements which are displayed. The processing of the user inputs is implemented to determine whether the user inputs control the respective user interface elements. In one example, the processing may determine whether the user inputs activate and/or manipulate the displayed user interface elements.
    Type: Grant
    Filed: May 3, 2010
    Date of Patent: August 22, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Andrew D. Wilson
  • Patent number: 9727161
    Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device are correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated. The context can also be used to label metadata.
    Type: Grant
    Filed: June 12, 2014
    Date of Patent: August 8, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
  • Patent number: 9729984
    Abstract: Technologies pertaining to calibration of filters of an audio system are described herein. A mobile computing device is configured to compute values for respective filters, such as equalizer filters, and transmit the values to a receiver device in the audio system. The receiver device causes audio to be emitted from a speaker based upon the values for the filters.
    Type: Grant
    Filed: January 18, 2014
    Date of Patent: August 8, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Desney S. Tan, Daniel Morris, Andrew D. Wilson, Yong Rui, Nikunj Raghuvanshi, Jeannette M. Wing
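In the spirit of patent 9729984 (a mobile device computes equalizer filter values and sends them to a receiver), here is a minimal Python sketch. The per-band correction rule and the comma-separated wire format are assumptions made for illustration, not the patent's protocol.

```python
import numpy as np

def equalizer_gains(measured_db, target_db, limit_db=12.0):
    """Per-band correction gains (dB) that pull the measured room response
    toward the target response, clamped to +/- limit_db."""
    return np.clip(np.asarray(target_db) - np.asarray(measured_db),
                   -limit_db, limit_db)

def send_to_receiver(sock, gains):
    """Transmit the filter values to the receiver, which applies them before
    driving the speaker.  The wire format here is a placeholder."""
    payload = ",".join(f"{g:.1f}" for g in gains).encode("utf-8")
    sock.sendall(payload)
```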
  • Publication number: 20170201722
    Abstract: A tele-immersive environment is described that provides interaction among participants of a tele-immersive session. The environment includes two or more set-ups, each associated with a participant. Each set-up, in turn, includes mirror functionality for presenting a three-dimensional virtual space for viewing by a local participant. The virtual space shows at least some of the participants as if the participants were physically present at a same location and looking into a mirror. The mirror functionality can be implemented as a combination of a semi-transparent mirror and a display device, or just a display device acting alone. According to another feature, the environment may present a virtual object in a manner that allows any of the participants of the tele-immersive session to interact with the virtual object.
    Type: Application
    Filed: March 28, 2017
    Publication date: July 13, 2017
    Inventors: Andrew D. Wilson, Zhengyou Zhang, Philip A. Chou, Neil S. Fishman, Donald M. Gillett, Hrvoje Benko
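One small, concrete piece of the mirror metaphor described above is reflecting the shared 3-D scene across the display plane so participants appear as if seen in a mirror. Below is a sketch of that reflection transform; it is standard plane-reflection math, not taken from the patent.

```python
import numpy as np

def mirror_matrix(plane_point, plane_normal):
    """4x4 homogeneous transform reflecting world geometry across the display
    plane, so local and remote participants render as a shared mirror view."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = -np.dot(n, np.asarray(plane_point, dtype=float))
    M = np.eye(4)
    M[:3, :3] -= 2.0 * np.outer(n, n)   # reflect directions
    M[:3, 3] = -2.0 * d * n             # and positions
    return M
```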
  • Patent number: 9679199
    Abstract: The cross-modal sensor fusion technique described herein tracks mobile devices and the users carrying them. The technique matches motion features from sensors on a mobile device to image motion features obtained from images of the device. For example, the acceleration of a mobile device, as measured by an onboard inertial measurement unit, is compared to similar acceleration observed in the color and depth images of a depth camera. The technique does not require a model of the appearance of either the user or the device, nor in many cases a direct line of sight to the device. The technique can operate in real time and can be applied to a wide variety of ubiquitous computing scenarios.
    Type: Grant
    Filed: December 4, 2013
    Date of Patent: June 13, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Hrvoje Benko
  • Patent number: 9641805
    Abstract: A tele-immersive environment is described that provides interaction among participants of a tele-immersive session. The environment includes two or more set-ups, each associated with a participant. Each set-up, in turn, includes mirror functionality for presenting a three-dimensional virtual space for viewing by a local participant. The virtual space shows at least some of the participants as if the participants were physically present at a same location and looking into a mirror. The mirror functionality can be implemented as a combination of a semi-transparent mirror and a display device, or just a display device acting alone. According to another feature, the environment may present a virtual object in a manner that allows any of the participants of the tele-immersive session to interact with the virtual object.
    Type: Grant
    Filed: March 18, 2016
    Date of Patent: May 2, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Zhengyou Zhang, Philip A. Chou, Neil S. Fishman, Donald M. Gillett, Hrvoje Benko
  • Publication number: 20170099453
    Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
    Type: Application
    Filed: December 15, 2016
    Publication date: April 6, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Sasa Junuzovic, William Thomas Blank, Bruce Arnold Cleary, III, Anoop Gupta, Andrew D. Wilson
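The "preventing visual feedback" point in the shared-surface entries comes down to never re-projecting a site's own projected image back to itself. Here is a minimal compositing sketch under that assumption; the lighten-style blend is an arbitrary choice for illustration.

```python
import numpy as np

def composite_for_site(site_id, captures):
    """N-way surface sharing without visual echo: each site projects a
    composite of every *other* site's capture, never its own projection.
    captures: {site_id: (H, W, 3) float image in [0, 1]}."""
    others = [img for sid, img in captures.items() if sid != site_id]
    if not others:
        return None
    return np.clip(np.max(others, axis=0), 0.0, 1.0)   # lighten-style blend
```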
  • Publication number: 20170055075
    Abstract: Technologies pertaining to calibration of filters of an audio system are described herein. A mobile computing device is configured to compute values for respective filters, such as equalizer filters, and transmit the values to a receiver device in the audio system. The receiver device causes audio to be emitted from a speaker based upon the values for the filters.
    Type: Application
    Filed: November 7, 2016
    Publication date: February 23, 2017
    Inventors: Desney S. Tan, Daniel Morris, Andrew D. Wilson, Yong Rui, Nikunj Raghuvanshi, Jeannette M. Wing
  • Patent number: 9560445
    Abstract: Technologies pertaining to provision of customized audio to each listener in a plurality of listeners are described herein. A sensor outputs data that is indicative of locations of multiple listeners in an environment. The data is processed to determine locations and orientations of the respective heads of the multiple listeners in the environment. Based on the locations and orientations of the heads of the listeners in the environment, respective customized audio signals are generated for each listener. The customized audio signals are transmitted to respective beamforming transducers. The beamforming transducers directionally output customized beams for the respective listeners based upon the customized audio signals and the locations of the heads of the listeners.
    Type: Grant
    Filed: January 18, 2014
    Date of Patent: January 31, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Nikunj Raghuvanshi, Daniel Morris, Andrew D. Wilson, Yong Rui, Desney S. Tan, Jeannette M. Wing
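The directional output in patent 9560445 can be illustrated with the simplest beam-steering approach, delay-and-sum: each transducer element's signal is delayed according to its distance from the tracked head position. The array geometry and delay model below are assumptions, not the patent's implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s

def steering_delays(element_positions, listener_head):
    """Per-element delays (seconds) that steer a transducer array's beam at
    one listener's head.  element_positions: (N, 3); listener_head: (3,)."""
    dists = np.linalg.norm(np.asarray(element_positions, float)
                           - np.asarray(listener_head, float), axis=1)
    return (dists.max() - dists) / SPEED_OF_SOUND   # farthest element: zero delay

def steer_per_listener(element_positions, head_positions):
    """One delay vector per tracked listener, each driving that listener's
    customized audio signal."""
    return [steering_delays(element_positions, head) for head in head_positions]
```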
  • Patent number: 9560314
    Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
    Type: Grant
    Filed: June 14, 2011
    Date of Patent: January 31, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sasa Junuzovic, William Thomas Blank, Bruce Arnold Cleary, III, Anoop Gupta, Andrew D. Wilson