Patents by Inventor Andrew D. Wilson

Andrew D. Wilson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150179181
    Abstract: Technologies pertaining to improving an auditory experience of a listener are described. Audio is modified based upon noise generated by noise sources in an environment. A microphone generates a signal that is representative of noise in the environment, and the signal is processed to identify peak frequencies therein. When a key frequency of the audio is proximate to a peak frequency in the noise, the audio is modified to improve the listener's perception of the audio.
    Type: Application
    Filed: December 20, 2013
    Publication date: June 25, 2015
    Applicant: Microsoft Corporation
    Inventors: Daniel Morris, Andrew D. Wilson, Desney S. Tan, Yong Rui, Nikunj Raghuvanshi, Jeannette M. Wing
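The frequency-masking idea in this abstract can be sketched in a few lines: find the dominant frequencies of a noise signal via an FFT and flag any key audio frequency that falls near one of them. The function names, peak count, and 50 Hz tolerance below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def noise_peak_frequencies(noise, sample_rate, n_peaks=3, rel_min=0.1):
    """Strongest frequency components of a noise signal, ignoring FFT
    bins weaker than rel_min of the largest peak."""
    spectrum = np.abs(np.fft.rfft(noise))
    freqs = np.fft.rfftfreq(len(noise), d=1.0 / sample_rate)
    order = np.argsort(spectrum)[::-1][:n_peaks]
    strong = order[spectrum[order] >= rel_min * spectrum.max()]
    return freqs[strong]

def key_frequency_masked(key_freq, peak_freqs, tolerance_hz=50.0):
    """True when a key frequency of the audio lies near any noise peak,
    i.e. when the audio may need modification to stay audible."""
    return any(abs(key_freq - p) <= tolerance_hz for p in peak_freqs)
```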
  • Publication number: 20150170013
    Abstract: The InfraStructs fabrication and imaging technique described herein uses digital fabrication techniques to embed information inside objects and terahertz (THz) imaging to later decode this information. Information is encoded in a digital model to create structured transitions between materials. Digital fabrication is used to precisely manufacture the digital model with the material transitions enclosed internally. A THz time-domain spectroscopy (TDS) system is used to create a volumetric image of the object's interior. The volumetric image is processed to decode the embedded structures into meaningful information.
    Type: Application
    Filed: December 14, 2013
    Publication date: June 18, 2015
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Karl D.D. Willis
  • Publication number: 20150160784
    Abstract: The subject application relates to a system and/or methodology that facilitates vision-based projection of any image (still or moving) onto any surface. In particular, a front-projected, computer-vision-based interactive surface system is provided that uses newly available commercial projection technology to obtain a compact, self-contained form factor. The subject configuration addresses the installation, calibration, and portability issues that are primary concerns in most vision-based table systems. The subject application also relates to determining whether an object is touching or hovering over an interactive surface based on an analysis of a shadow image.
    Type: Application
    Filed: December 11, 2013
    Publication date: June 11, 2015
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Steven N. Bathiche
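The shadow-based touch/hover test the abstract describes can be approximated as follows: threshold dark pixels as cast shadow, then treat a near-zero gap between the tracked fingertip and the nearest shadow pixel as contact. The thresholds and names below are illustrative assumptions; a real installation would calibrate them.

```python
import numpy as np

def shadow_mask(gray, shadow_level=40):
    """Pixels darker than the lit surface are treated as cast shadow."""
    return gray < shadow_level

def is_touching(gray, fingertip, threshold_px=3.0, shadow_level=40):
    """Classify touch vs hover: the gap between the tracked fingertip
    (x, y) and the nearest shadow pixel shrinks toward zero at contact."""
    ys, xs = np.nonzero(shadow_mask(gray, shadow_level))
    if len(xs) == 0:
        return False                      # no shadow visible at all
    gap = np.hypot(xs - fingertip[0], ys - fingertip[1]).min()
    return float(gap) <= threshold_px
```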
  • Publication number: 20150154447
    Abstract: The cross-modal sensor fusion technique described herein tracks mobile devices and the users carrying them. The technique matches motion features from sensors on a mobile device to image motion features obtained from images of the device. For example, the acceleration of a mobile device, as measured by an onboard inertial measurement unit, is compared to similar acceleration observed in the color and depth images of a depth camera. The technique does not require a model of the appearance of either the user or the device, nor in many cases a direct line of sight to the device. The technique can operate in real time and can be applied to a wide variety of ubiquitous computing scenarios.
    Type: Application
    Filed: December 4, 2013
    Publication date: June 4, 2015
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Hrvoje Benko
  • Publication number: 20150097928
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes projector-lighting (or light-control point) on-state temporal compression with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Application
    Filed: December 15, 2014
    Publication date: April 9, 2015
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
  • Patent number: 8954330
    Abstract: The subject disclosure is directed towards detecting symbolic activity within a given environment using a context-dependent grammar. In response to receiving sets of input data corresponding to one or more input modalities, a context-aware interactive system processes a model associated with interpreting the symbolic activity using context data for the given environment. Based on the model, related sets of input data are determined. The context-aware interactive system uses the input data to interpret user intent with respect to the input and thereby identify one or more commands for a target output mechanism.
    Type: Grant
    Filed: November 28, 2011
    Date of Patent: February 10, 2015
    Assignee: Microsoft Corporation
    Inventors: Michael F. Koenig, Oscar Enrique Murillo, Ira Lynn Snyder, Jr., Andrew D. Wilson, Kenneth P. Hinckley, Ali M. Vassigh
  • Patent number: 8933912
    Abstract: A system and method are disclosed for providing a touch interface for electronic devices. The touch interface can be any surface. As one example, a table top can be used as a touch sensitive interface. In one embodiment, the system determines a touch region of the surface, and correlates that touch region to a display of an electronic device for which input is provided. The system may have a 3D camera that identifies the relative position of a user's hands to the touch region to allow for user input. Note that the user's hands do not occlude the display. The system may render a representation of the user's hand on the display in order for the user to interact with elements on the display screen.
    Type: Grant
    Filed: April 2, 2012
    Date of Patent: January 13, 2015
    Assignee: Microsoft Corporation
    Inventors: Anthony J. Ambrus, Abdulwajid N. Mohamed, Andrew D. Wilson, Brian J. Mount, Jordan D. Andersen
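The described correlation between a surface touch region and a device display amounts to a coordinate mapping. A minimal sketch, with a hypothetical (x, y, width, height) touch-region convention that is not specified in the patent:

```python
def surface_to_display(point, touch_region, display_size):
    """Map a point inside the surface touch region to display
    coordinates. touch_region is (x, y, width, height) in camera or
    surface units; display_size is (width, height) in pixels."""
    x, y = point
    rx, ry, rw, rh = touch_region
    dw, dh = display_size
    return ((x - rx) / rw * dw, (y - ry) / rh * dh)
```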
  • Patent number: 8928735
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes projector-lighting (or light-control point) on-state temporal compression with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Grant
    Filed: June 14, 2011
    Date of Patent: January 6, 2015
    Assignee: Microsoft Corporation
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
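The tradeoff this abstract names, capture-slot length versus projected brightness, follows directly from compressing the projector's on-state into a fraction of each frame. A small illustrative calculation; the parameter names are assumptions, not terms from the patent:

```python
def capture_slot_ms(frame_rate_hz, on_state_fraction):
    """Length (ms) of the per-frame camera capture slot opened by
    compressing the projector's on-state into on_state_fraction of the
    frame, plus the resulting relative brightness of the projection."""
    frame_ms = 1000.0 / frame_rate_hz
    slot_ms = frame_ms * (1.0 - on_state_fraction)
    return slot_ms, on_state_fraction    # brightness scales with on-time
```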
  • Publication number: 20140327784
    Abstract: A computer-implemented method for utilizing a camera device to track an object is presented. As part of the method, a region of interest is determined within an overall image sensing area. A point light source is then tracked within the region of interest. In a particular arrangement, the camera device incorporates CMOS image sensor technology and the point light source is an IR LED. Other embodiments pertain to manipulations of the region of interest to accommodate changes to the status of the point light source.
    Type: Application
    Filed: July 18, 2014
    Publication date: November 6, 2014
    Inventor: Andrew D. Wilson
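The ROI-based point-source tracking could be sketched as: threshold bright pixels inside the current region of interest, take their centroid, and re-center the ROI for the next frame. The names and intensity threshold below are illustrative assumptions.

```python
import numpy as np

def track_point_source(frame, roi, intensity_min=200):
    """Locate a bright point source (e.g. an IR LED) inside a region of
    interest and return its centroid plus a re-centered ROI for the
    next frame. roi = (x, y, w, h) in pixel coordinates."""
    x, y, w, h = roi
    window = frame[y:y + h, x:x + w]
    ys, xs = np.nonzero(window >= intensity_min)
    if len(xs) == 0:
        return None, roi                  # source lost: keep current ROI
    cx, cy = x + xs.mean(), y + ys.mean()
    new_roi = (int(cx - w // 2), int(cy - h // 2), w, h)
    return (cx, cy), new_roi
```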
  • Patent number: 8847739
    Abstract: The claimed subject matter provides a system and/or a method that facilitates detecting and identifying objects within surface computing. An interface component can receive at least one surface input, which relates to at least one of an object, a gesture, or a user. A surface detection component can detect a location of the surface input utilizing a computer-vision-based sensing technique. A Radio Frequency Identification (RFID) tag can transmit a portion of RFID data, wherein the RFID tag is associated with the surface input. An RFID fusion component can utilize the portion of RFID data to identify at least one of a source of the surface input or a portion of data to associate with the surface input.
    Type: Grant
    Filed: August 4, 2008
    Date of Patent: September 30, 2014
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Alex Olwal
  • Publication number: 20140232816
    Abstract: A tele-immersive environment is described that provides interaction among participants of a tele-immersive session. The environment includes two or more set-ups, each associated with a participant. Each set-up, in turn, includes mirror functionality for presenting a three-dimensional virtual space for viewing by a local participant. The virtual space shows at least some of the participants as if the participants were physically present at a same location and looking into a mirror. The mirror functionality can be implemented as a combination of a semi-transparent mirror and a display device, or just a display device acting alone. According to another feature, the environment may present a virtual object in a manner that allows any of the participants of the tele-immersive session to interact with the virtual object.
    Type: Application
    Filed: February 20, 2013
    Publication date: August 21, 2014
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Philip A. Chou, Donald M. Gillett, Hrvoje Benko, Zhengyou Zhang, Neil S. Fishman
  • Patent number: 8745541
    Abstract: A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system includes a 3-D imaging system that performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, performs a lookup for the user command associated with the captured image(s), and executes the user command(s) to effect control of the computer, programs, and connected devices.
    Type: Grant
    Filed: December 1, 2003
    Date of Patent: June 3, 2014
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Nuria M. Oliver
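The lookup step this abstract describes, from a recognized pose to a previously registered per-user command, is essentially a dictionary lookup. A hypothetical sketch; the data shapes and names are assumptions, not from the patent:

```python
def execute_gesture(user_id, pose, mappings, commands):
    """Run the command previously mapped to this user's recognized hand
    pose; returns None when no mapping exists for the user or pose."""
    command_name = mappings.get(user_id, {}).get(pose)
    if command_name is None:
        return None
    return commands[command_name]()       # execute to control the computer
```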
  • Publication number: 20140109017
    Abstract: A unique system and method are provided that facilitate pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust the control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize the cursor position at the top middle point of a user's finger in order to mitigate clicking errors when making a selection.
    Type: Application
    Filed: December 20, 2013
    Publication date: April 17, 2014
    Applicant: Microsoft Corporation
    Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
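The control-display-ratio idea can be sketched as: the primary finger's motion is scaled by a ratio derived from the secondary finger, so the same physical motion can produce coarse or pixel-accurate cursor movement. The scaling rule and constants below are illustrative assumptions.

```python
def cursor_update(cursor, primary_delta, secondary_spread, base_cd=1.0):
    """Move the cursor by the primary finger's delta, scaled by a
    control-display ratio set by the secondary finger's spread (pixels);
    a small spread gives fine, pixel-accurate motion."""
    cd_ratio = base_cd * max(0.1, min(1.0, secondary_spread / 100.0))
    return (cursor[0] + primary_delta[0] * cd_ratio,
            cursor[1] + primary_delta[1] * cd_ratio)
```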
  • Patent number: 8670632
    Abstract: Effects of undesired infrared light are reduced in an imaging system using an infrared light source. The desired infrared light source is activated and a first set of imaging data is captured during a first image capture interval. The desired infrared light source is then deactivated, and a second set of image data is captured during a second image capture interval. A composite set of image data is then generated by subtracting from first values in the first set of image data corresponding second values in the second set of image data. The composite set of image data thus starts from imaging in which all infrared signals are collected, including both signals resulting from the IR source and other IR signals, and subtracts from it imaging in which no signals result from the IR source, leaving image data containing signals resulting only from the IR source.
    Type: Grant
    Filed: March 15, 2012
    Date of Patent: March 11, 2014
    Assignee: Microsoft Corporation
    Inventor: Andrew D. Wilson
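The on/off frame subtraction the abstract describes reduces, per pixel, to subtracting the ambient-only frame from the illuminated frame. A minimal sketch for 8-bit frames:

```python
import numpy as np

def ir_composite(frame_ir_on, frame_ir_off):
    """Subtract the ambient-only frame (IR source off) from the
    illuminated frame (IR source on), leaving only signals produced by
    the controlled IR source. Widen to int16 to avoid uint8 underflow."""
    on = frame_ir_on.astype(np.int16)
    off = frame_ir_off.astype(np.int16)
    return np.clip(on - off, 0, 255).astype(np.uint8)
```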
  • Patent number: 8619052
    Abstract: A unique system and method are provided that facilitate pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust the control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize the cursor position at the top middle point of a user's finger in order to mitigate clicking errors when making a selection.
    Type: Grant
    Filed: November 14, 2011
    Date of Patent: December 31, 2013
    Assignee: Microsoft Corporation
    Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
  • Patent number: 8611667
    Abstract: The subject application relates to a system and/or methodology that facilitates vision-based projection of any image (still or moving) onto any surface. In particular, a front-projected, computer-vision-based interactive surface system is provided that uses newly available commercial projection technology to obtain a compact, self-contained form factor. The subject configuration addresses the installation, calibration, and portability issues that are primary concerns in most vision-based table systems. The subject application also relates to determining whether an object is touching or hovering over an interactive surface based on an analysis of a shadow image.
    Type: Grant
    Filed: May 18, 2011
    Date of Patent: December 17, 2013
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Steven N. Bathiche
  • Patent number: 8576253
    Abstract: The claimed subject matter provides a system and/or a method for simulating grasping of a virtual object. Virtual 3D objects receive simulated user input forces via a 2D input surface adjacent to them. An exemplary method comprises receiving a user input corresponding to a grasping gesture that includes at least two simulated contacts with the virtual object. The grasping gesture is modeled as a simulation of frictional forces on the virtual object. A simulated physical effect on the virtual object by the frictional forces is determined. At least one microprocessor is used to display a visual image of the virtual object moving according to the simulated physical effect.
    Type: Grant
    Filed: April 27, 2010
    Date of Patent: November 5, 2013
    Assignee: Microsoft Corporation
    Inventor: Andrew D. Wilson
  • Patent number: 8560972
    Abstract: Disclosed is a unique system and method that facilitates gesture-based interaction with a user interface. The system involves an object-sensing arrangement configured to include a sensing plane located vertically or horizontally between at least two imaging components on one side and a user on the other. The imaging components can acquire input images taken of a view of and through the sensing plane. The images can include objects that are on the sensing plane and/or in the background scene, as well as the user as he interacts with the sensing plane. By processing the input images, one output image can be returned which shows the user objects that are in contact with the plane. Thus, objects located at a particular depth can be readily determined. Any other objects located beyond that depth can be “removed” and not seen in the output image.
    Type: Grant
    Filed: August 10, 2004
    Date of Patent: October 15, 2013
    Assignee: Microsoft Corporation
    Inventor: Andrew D. Wilson
  • Patent number: 8552976
    Abstract: Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.
    Type: Grant
    Filed: January 9, 2012
    Date of Patent: October 8, 2013
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Michael J. Sinclair
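Isolating independent background areas with hand gestures, e.g. the hole formed by pinching thumb and forefinger, can be approximated with a flood fill that counts background regions not connected to the image border. A small illustrative sketch; the segmentation into a boolean hand mask is assumed to have happened upstream.

```python
import numpy as np

def enclosed_background_regions(hand_mask):
    """Count background regions fully enclosed by the hand mask.
    Background connected to the image border is ignored; uses an
    iterative 4-connectivity flood fill."""
    h, w = hand_mask.shape
    seen = np.zeros_like(hand_mask, dtype=bool)

    def flood(sy, sx):
        # Fill one background region; report whether it touches the border.
        stack, touches_border = [(sy, sx)], False
        while stack:
            y, x = stack.pop()
            if not (0 <= y < h and 0 <= x < w):
                continue
            if seen[y, x] or hand_mask[y, x]:
                continue
            seen[y, x] = True
            if y in (0, h - 1) or x in (0, w - 1):
                touches_border = True
            stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
        return touches_border

    holes = 0
    for y in range(h):
        for x in range(w):
            if not seen[y, x] and not hand_mask[y, x] and not flood(y, x):
                holes += 1
    return holes
```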
  • Publication number: 20130257748
    Abstract: A system and method are disclosed for providing a touch interface for electronic devices. The touch interface can be any surface. As one example, a table top can be used as a touch sensitive interface. In one embodiment, the system determines a touch region of the surface, and correlates that touch region to a display of an electronic device for which input is provided. The system may have a 3D camera that identifies the relative position of a user's hands to the touch region to allow for user input. Note that the user's hands do not occlude the display. The system may render a representation of the user's hand on the display in order for the user to interact with elements on the display screen.
    Type: Application
    Filed: April 2, 2012
    Publication date: October 3, 2013
    Inventors: Anthony J. Ambrus, Abdulwajid N. Mohamed, Andrew D. Wilson, Brian J. Mount, Jordan D. Andersen