Patents by Inventor Andrew D. Wilson

Andrew D. Wilson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130229353
    Abstract: An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.
    Type: Application
    Filed: March 27, 2013
    Publication date: September 5, 2013
    Applicant: Microsoft Corporation
    Inventors: Björn U. Hartmann, Andrew D. Wilson, Hrvoje Benko, Meredith J. Morris
  • Patent number: 8509847
    Abstract: A mobile device connection system is provided. The system includes an input medium to detect a device position or location. An analysis component determines a device type and establishes a connection with the device. The input medium can include vision systems to detect device presence and location where connections are established via wireless technologies.
    Type: Grant
    Filed: December 28, 2012
    Date of Patent: August 13, 2013
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Raman K. Sarin, Kenneth P. Hinckley
  • Patent number: 8493366
    Abstract: A dynamic projected user interface device is disclosed, that includes a projector, a projection controller, and an imaging sensor. The projection controller is configured to receive instructions from a computing device, and to provide display images via the projector onto display surfaces. The display images are indicative of a first set of input controls when the computing device is in a first operating context, and a second set of input controls when the computing device is in a second operating context. The imaging sensor is configured to optically detect physical contacts with the one or more display surfaces.
    Type: Grant
    Filed: July 13, 2011
    Date of Patent: July 23, 2013
    Assignee: Microsoft Corporation
    Inventors: Steven N. Bathiche, Andrew D. Wilson
  • Publication number: 20130138424
    Abstract: The subject disclosure is directed towards detecting symbolic activity within a given environment using a context-dependent grammar. In response to receiving sets of input data corresponding to one or more input modalities, a context-aware interactive system processes a model associated with interpreting the symbolic activity using context data for the given environment. Based on the model, related sets of input data are determined. The context-aware interactive system uses the input data to interpret user intent with respect to the input and thereby, identify one or more commands for a target output mechanism.
    Type: Application
    Filed: November 28, 2011
    Publication date: May 30, 2013
    Applicant: Microsoft Corporation
    Inventors: Michael F. Koenig, Oscar Enrique Murillo, Ira Lynn Snyder, JR., Andrew D. Wilson, Kenneth P. Hinckley, Ali M. Vassigh
  • Patent number: 8427424
    Abstract: An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.
    Type: Grant
    Filed: September 30, 2008
    Date of Patent: April 23, 2013
    Assignee: Microsoft Corporation
    Inventors: Björn U. Hartmann, Andrew D. Wilson, Hrvoje Benko, Meredith J. Morris
  • Publication number: 20130069931
    Abstract: A system is described herein which receives internal-assessed (IA) movement information from a mobile device. The system also receives external-assessed (EA) movement information from at least one monitoring system which captures a scene containing the mobile device. The system then compares the IA movement information with the EA movement information with respect to each candidate object in the scene. If the IA movement information matches the EA movement information for a particular candidate object, the system concludes that the candidate object is associated with the mobile device. For example, the object may correspond to a hand that holds the mobile device. The system can use the correlation results produced in the above-indicated manner to perform various environment-specific actions.
    Type: Application
    Filed: September 15, 2011
    Publication date: March 21, 2013
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Hrvoje Benko
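The matching step this abstract describes — comparing a device's self-reported (IA) motion against each externally observed (EA) candidate track — can be sketched as a correlation search. This is a toy illustration only; the Pearson-correlation score, the 0.9 threshold, and all names are assumptions, not details from the patent.

```python
# Toy sketch: find which externally tracked object moves like the mobile
# device, by correlating the device's internal (IA) motion signal against
# each candidate's external (EA) motion track.
import math

def correlation(a, b):
    """Pearson correlation of two equal-length motion signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def match_device(ia_signal, ea_tracks, threshold=0.9):
    """Return the id of the candidate whose EA track best matches the
    device's IA signal, or None if nothing correlates strongly."""
    best_id, best_score = None, threshold
    for obj_id, track in ea_tracks.items():
        score = correlation(ia_signal, track)
        if score > best_score:
            best_id, best_score = obj_id, score
    return best_id

ia = [0.0, 1.0, 2.0, 1.0, 0.0]            # device accelerometer trace
tracks = {
    "hand_a": [0.1, 1.1, 2.0, 0.9, 0.1],  # moves like the device
    "hand_b": [2.0, 1.0, 0.0, 1.0, 2.0],  # anti-correlated
}
print(match_device(ia, tracks))  # hand_a
```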
  • Patent number: 8380246
    Abstract: A mobile device connection system is provided. The system includes an input medium to detect a device position or location. An analysis component determines a device type and establishes a connection with the device. The input medium can include vision systems to detect device presence and location where connections are established via wireless technologies.
    Type: Grant
    Filed: August 15, 2007
    Date of Patent: February 19, 2013
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Raman K. Sarin, Kenneth P. Hinckley
  • Publication number: 20120320158
    Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
    Type: Application
    Filed: June 14, 2011
    Publication date: December 20, 2012
    Applicant: Microsoft Corporation
    Inventors: Sasa Junuzovic, William Thomas Blank, Bruce Arnold Cleary, III, Anoop Gupta, Andrew D. Wilson
  • Publication number: 20120320157
    Abstract: A “Concurrent Projector-Camera” uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a “capture time slot” for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
    Type: Application
    Filed: June 14, 2011
    Publication date: December 20, 2012
    Applicant: Microsoft Corporation
    Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
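The trade-off this abstract names — compressing the projector's on-state within each frame to open a "capture time slot" at the cost of brightness — is easy to put in numbers. This is a back-of-envelope sketch; the 60 Hz frame rate and the 0.75 on-state fraction are illustrative assumptions, not values from the patent.

```python
# Back-of-envelope sketch: the fraction of each frame the projector is OFF
# is the window available for flicker-free, feedback-free camera capture;
# perceived brightness scales with the ON fraction.

def capture_slot_ms(frame_ms, on_state_fraction):
    """Per-frame capture-slot duration (ms) when the projector's on-state
    is temporally compressed to on_state_fraction of the frame."""
    return frame_ms * (1.0 - on_state_fraction)

frame_ms = 1000 / 60  # ~16.7 ms per frame at 60 Hz
print(round(capture_slot_ms(frame_ms, 0.75), 2))  # ~4.17 ms slot, 75% brightness
```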
  • Publication number: 20120212509
    Abstract: An interaction system is described which uses a depth camera to capture a depth image of a physical object placed on, or in vicinity to, an interactive surface. The interaction system also uses a video camera to capture a video image of the physical object. The interaction system can then generate a 3D virtual object based on the depth image and video image. The interaction system then uses a 3D projector to project the 3D virtual object back onto the interactive surface, e.g., in a mirrored relationship to the physical object. A user may then capture and manipulate the 3D virtual object in any manner. Further, the user may construct a composite model based on smaller component 3D virtual objects. The interaction system uses a projective texturing technique to present a realistic-looking 3D virtual object on a surface having any geometry.
    Type: Application
    Filed: March 29, 2011
    Publication date: August 23, 2012
    Applicant: Microsoft Corporation
    Inventors: Hrvoje Benko, Ricardo Jorge Jota Costa, Andrew D. Wilson
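The "mirrored relationship" placement mentioned in this abstract — projecting the captured 3D virtual object back at the physical object's position reflected across the interactive surface — can be sketched in two dimensions. The axis-aligned vertical mirror line and the coordinates are assumptions for illustration only.

```python
# Tiny sketch: reflect a surface coordinate across a mirror line to place
# the projected virtual copy opposite the physical object.

def mirror_across_line(point, line_x):
    """Reflect an (x, y) surface coordinate across the vertical line x = line_x."""
    x, y = point
    return (2 * line_x - x, y)

# A physical object at (30, 40) mirrored about x = 50 lands at (70, 40).
print(mirror_across_line((30, 40), line_x=50))  # (70, 40)
```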
  • Publication number: 20120169673
    Abstract: Effects of undesired infrared light are reduced in an imaging system using an infrared light source. The desired infrared light source is activated and a first set of imaging data is captured during a first image capture interval. The desired infrared light source is then deactivated, and a second set of image data is captured during a second image capture interval. A composite set of image data is then generated by subtracting from first values in the first set of image data corresponding second values in the second set of image data. The composite set of image data thus includes a set of imaging data in which all infrared signals are collected, including both signals resulting from the IR source and other IR signals, from which is subtracted imaging in which no signals result from the IR source, leaving image data that includes signals resulting only from the IR source.
    Type: Application
    Filed: March 15, 2012
    Publication date: July 5, 2012
    Applicant: Microsoft Corporation
    Inventor: Andrew D. Wilson
  • Patent number: 8184101
    Abstract: The detection of touch on an optical touch-sensitive device is disclosed. For example, one disclosed embodiment comprises a touch-sensitive device including a display screen, a laser, and a scanning mirror configured to scan light from the laser across the screen. The touch-sensitive device also includes a position-sensitive device and optics configured to form an image of at least a portion of the screen on the position-sensitive device. A location of an object relative to the screen may be determined by detecting a location on the position-sensitive device of laser light reflected by the object.
    Type: Grant
    Filed: October 3, 2007
    Date of Patent: May 22, 2012
    Assignee: Microsoft Corporation
    Inventors: Nigel Keam, John Lewis, Andrew D. Wilson
  • Publication number: 20120105315
    Abstract: Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.
    Type: Application
    Filed: January 9, 2012
    Publication date: May 3, 2012
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Michael J. Sinclair
  • Patent number: 8165422
    Abstract: Effects of undesired infrared light are reduced in an imaging system using an infrared light source. The desired infrared light source is activated and a first set of imaging data is captured during a first image capture interval. The desired infrared light source is then deactivated, and a second set of image data is captured during a second image capture interval. A composite set of image data is then generated by subtracting from first values in the first set of image data corresponding second values in the second set of image data. The composite set of image data thus includes a set of imaging data in which all infrared signals are collected, including both signals resulting from the IR source and other IR signals, from which is subtracted imaging in which no signals result from the IR source, leaving image data that includes signals resulting only from the IR source.
    Type: Grant
    Filed: June 26, 2009
    Date of Patent: April 24, 2012
    Assignee: Microsoft Corporation
    Inventor: Andrew D. Wilson
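The frame-differencing technique this abstract describes — capture once with the IR illuminant on, once with it off, and subtract — can be sketched in a few lines. The 2-D list representation of frames and the clamp of negative differences to zero are illustrative assumptions, not details from the patent.

```python
# Sketch of IR-subtraction imaging: (frame with IR source on) minus
# (frame with IR source off) keeps only light contributed by the
# controlled IR source, removing ambient IR.

def subtract_ambient_ir(frame_ir_on, frame_ir_off):
    """Return a composite frame containing only source-IR contributions.

    Each frame is a 2-D list of pixel intensities; negative differences
    (sensor noise) are clamped to zero.
    """
    return [
        [max(on - off, 0) for on, off in zip(row_on, row_off)]
        for row_on, row_off in zip(frame_ir_on, frame_ir_off)
    ]

# Example: ambient IR of 10 everywhere, plus a reflection of 50 at one
# pixel while the illuminant is on.
on_frame  = [[10, 60], [10, 10]]
off_frame = [[10, 10], [10, 10]]
print(subtract_ambient_ir(on_frame, off_frame))  # [[0, 50], [0, 0]]
```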
  • Publication number: 20120056840
    Abstract: A unique system and method are provided that facilitate pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust a control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize a cursor position at a top middle point of a user's finger in order to mitigate clicking errors when making a selection.
    Type: Application
    Filed: November 14, 2011
    Publication date: March 8, 2012
    Applicant: Microsoft Corporation
    Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
  • Patent number: 8115732
    Abstract: Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.
    Type: Grant
    Filed: April 23, 2009
    Date of Patent: February 14, 2012
    Assignee: Microsoft Corporation
    Inventors: Andrew D. Wilson, Michael J. Sinclair
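The "independent areas" idea in this abstract — hand gestures isolating parts of the background, e.g. the hole formed by a thumb-forefinger pinch — reduces to counting background regions fully enclosed by hand pixels in a segmented image. The sketch below uses a flood fill from the image border to mark outer background; whatever background remains is enclosed. The binary mask and grid values are made up for illustration.

```python
# Sketch: count background regions enclosed by the hand in a segmented
# image (1 = hand pixel, 0 = background pixel). Background reachable from
# the border is "outside"; leftover background components are enclosed
# areas that could each be assigned a control parameter.

def enclosed_areas(mask):
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]

    def flood(sr, sc):
        stack = [(sr, sc)]
        while stack:
            r, c = stack.pop()
            if 0 <= r < h and 0 <= c < w and not seen[r][c] and mask[r][c] == 0:
                seen[r][c] = True
                stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]

    # Mark all background reachable from the image border.
    for r in range(h):
        flood(r, 0)
        flood(r, w - 1)
    for c in range(w):
        flood(0, c)
        flood(h - 1, c)

    # Count the remaining (enclosed) background components.
    count = 0
    for r in range(h):
        for c in range(w):
            if mask[r][c] == 0 and not seen[r][c]:
                count += 1
                flood(r, c)
    return count

pinch = [  # one enclosed hole, as in a thumb-forefinger pinch
    [0, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 1, 1, 1, 0],
]
print(enclosed_areas(pinch))  # 1
```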
  • Patent number: 8100756
    Abstract: A system that facilitates enhancing a game, game play or playability of a game may include an experience component, a game component and an alteration component. The experience component can collect a portion of data related to a game in which the portion of data indicates at least one of a tip or a tactic for the game. The game component can dynamically incorporate the portion of data into the game during game play to enhance playability of such game for a user with assistance provided by at least one of the tip or the tactic. The alteration component may alter the game during game play.
    Type: Grant
    Filed: September 28, 2007
    Date of Patent: January 24, 2012
    Assignee: Microsoft Corporation
    Inventors: Bret P. O'Rourke, Eric P. Wilfrid, Nigel S. Keam, Steven Bathiche, James M. Alkove, Zachary L. Russell, Jon Marcus Randall Whitten, Boyd C. Multerer, Andrew D. Wilson
  • Publication number: 20110310232
    Abstract: Described is a multi-view display provided by combining spatial multiplexing (e.g., using a parallax barrier or lenslet) and temporal multiplexing (e.g., using a directed backlight). A scheduling algorithm generates different views by determining which light sources are illuminated at a particular time. Via the temporal multiplexing, different views may be in the same spatial viewing angle (spatial zone). Two of the views may correspond to two eyes of a person, with different video data sent to each eye to provide an autostereoscopic display for that person. Eye (head) tracking may be used to move the view or views with a person as that person moves.
    Type: Application
    Filed: June 21, 2010
    Publication date: December 22, 2011
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Steven Bathiche
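The temporal-multiplexing schedule the abstract mentions — deciding which view's light sources fire in each time slot so viewers in the same spatial zone each receive their own view — can be sketched as a simple round-robin assignment. The view names and slot count are illustrative; the patent's actual scheduling algorithm is more involved.

```python
# Minimal sketch: divide each frame period into time slots and assign views
# to slots round-robin, so e.g. a viewer's left and right eyes receive
# alternating backlight flashes within one spatial zone.

def schedule(views, slots_per_frame):
    """Round-robin assignment of views to backlight time slots."""
    return [views[t % len(views)] for t in range(slots_per_frame)]

# Two views (left/right eye of one person) across four slots per frame:
print(schedule(["left_eye", "right_eye"], 4))
# ['left_eye', 'right_eye', 'left_eye', 'right_eye']
```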
  • Patent number: 8077153
    Abstract: A unique system and method are provided that facilitate pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust a control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize a cursor position at a top middle point of a user's finger in order to mitigate clicking errors when making a selection.
    Type: Grant
    Filed: April 19, 2006
    Date of Patent: December 13, 2011
    Assignee: Microsoft Corporation
    Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
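The control-display-ratio adjustment this abstract describes — the primary finger steers the cursor while a secondary finger switches to a finer ratio for pixel-accurate targeting — can be sketched as a gain applied to finger motion. The 0.25 precision gain is an assumed value, not one from the patent.

```python
# Rough sketch: cursor motion is damped while a secondary finger is down,
# trading speed for pixel-accurate targeting.

def move_cursor(cursor, primary_delta, secondary_down, precise_gain=0.25):
    """Return the new cursor position; primary-finger motion is scaled
    down while the secondary finger is touching."""
    gain = precise_gain if secondary_down else 1.0
    dx, dy = primary_delta
    x, y = cursor
    return (x + dx * gain, y + dy * gain)

print(move_cursor((100, 100), (8, 4), secondary_down=False))  # (108.0, 104.0)
print(move_cursor((100, 100), (8, 4), secondary_down=True))   # (102.0, 101.0)
```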
  • Publication number: 20110285633
    Abstract: A dynamic projected user interface device is disclosed, that includes a projector, a projection controller, and an imaging sensor. The projection controller is configured to receive instructions from a computing device, and to provide display images via the projector onto display surfaces. The display images are indicative of a first set of input controls when the computing device is in a first operating context, and a second set of input controls when the computing device is in a second operating context. The imaging sensor is configured to optically detect physical contacts with the one or more display surfaces.
    Type: Application
    Filed: July 13, 2011
    Publication date: November 24, 2011
    Applicant: Microsoft Corporation
    Inventors: Steven N. Bathiche, Andrew D. Wilson