Patents by Inventor Andrew D. Wilson
Andrew D. Wilson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20130229353
Abstract: An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.
Type: Application
Filed: March 27, 2013
Publication date: September 5, 2013
Applicant: Microsoft Corporation
Inventors: Björn U. Hartmann, Andrew D. Wilson, Hrvoje Benko, Meredith J. Morris
-
Patent number: 8509847
Abstract: A mobile device connection system is provided. The system includes an input medium to detect a device position or location. An analysis component determines a device type and establishes a connection with the device. The input medium can include vision systems to detect device presence and location where connections are established via wireless technologies.
Type: Grant
Filed: December 28, 2012
Date of Patent: August 13, 2013
Assignee: Microsoft Corporation
Inventors: Andrew D. Wilson, Raman K. Sarin, Kenneth P. Hinckley
-
Patent number: 8493366
Abstract: A dynamic projected user interface device is disclosed that includes a projector, a projection controller, and an imaging sensor. The projection controller is configured to receive instructions from a computing device, and to provide display images via the projector onto one or more display surfaces. The display images are indicative of a first set of input controls when the computing device is in a first operating context, and a second set of input controls when the computing device is in a second operating context. The imaging sensor is configured to optically detect physical contacts with the one or more display surfaces.
Type: Grant
Filed: July 13, 2011
Date of Patent: July 23, 2013
Assignee: Microsoft Corporation
Inventors: Steven N. Bathiche, Andrew D. Wilson
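The context-dependent control sets in the abstract above can be illustrated with a minimal sketch. The context names and control labels below are hypothetical placeholders, not taken from the patent.

```python
# Minimal sketch of a projection controller selecting which set of input
# controls to project based on the computing device's operating context.
# Context names and control labels are hypothetical.

CONTROL_SETS = {
    "text_entry": ["qwerty_keyboard", "shift", "enter"],
    "media_playback": ["play_pause", "seek_bar", "volume"],
}

def controls_for_context(context):
    """Return the input controls to render for the given operating context."""
    return CONTROL_SETS.get(context, [])

print(controls_for_context("media_playback"))  # ['play_pause', 'seek_bar', 'volume']
```

Switching contexts simply swaps which image of controls the projector renders; touch detection by the imaging sensor would then be interpreted against the currently projected set.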
-
Publication number: 20130138424
Abstract: The subject disclosure is directed towards detecting symbolic activity within a given environment using a context-dependent grammar. In response to receiving sets of input data corresponding to one or more input modalities, a context-aware interactive system processes a model associated with interpreting the symbolic activity using context data for the given environment. Based on the model, related sets of input data are determined. The context-aware interactive system uses the input data to interpret user intent with respect to the input and thereby identify one or more commands for a target output mechanism.
Type: Application
Filed: November 28, 2011
Publication date: May 30, 2013
Applicant: Microsoft Corporation
Inventors: Michael F. Koenig, Oscar Enrique Murillo, Ira Lynn Snyder, Jr., Andrew D. Wilson, Kenneth P. Hinckley, Ali M. Vassigh
-
Patent number: 8427424
Abstract: An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.
Type: Grant
Filed: September 30, 2008
Date of Patent: April 23, 2013
Assignee: Microsoft Corporation
Inventors: Björn U. Hartmann, Andrew D. Wilson, Hrvoje Benko, Meredith J. Morris
-
Publication number: 20130069931
Abstract: A system is described herein which receives internal-assessed (IA) movement information from a mobile device. The system also receives external-assessed (EA) movement information from at least one monitoring system which captures a scene containing the mobile device. The system then compares the IA movement information with the EA movement information with respect to each candidate object in the scene. If the IA movement information matches the EA movement information for a particular candidate object, the system concludes that the candidate object is associated with the mobile device. For example, the object may correspond to a hand that holds the mobile device. The system can use the correlation results produced in the above-indicated manner to perform various environment-specific actions.
Type: Application
Filed: September 15, 2011
Publication date: March 21, 2013
Applicant: Microsoft Corporation
Inventors: Andrew D. Wilson, Hrvoje Benko
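The IA/EA matching step above can be sketched as picking the candidate object whose externally observed motion trace best agrees with the device's own motion trace. The patent abstract does not specify a matching metric; mean squared difference over aligned samples is used here purely as an illustrative stand-in.

```python
# Sketch of matching a device's internally-assessed (IA) motion trace
# against externally-assessed (EA) traces of candidate objects in a scene.
# Mean squared difference is an assumed, illustrative similarity metric.

def mse(a, b):
    """Mean squared difference over the overlapping samples of two traces."""
    n = min(len(a), len(b))
    return sum((x - y) ** 2 for x, y in zip(a, b)) / n

def match_device_to_object(ia_trace, ea_traces):
    """Return the name of the candidate whose EA trace best matches the IA trace."""
    return min(ea_traces, key=lambda name: mse(ia_trace, ea_traces[name]))

ia = [0.0, 1.0, 2.0, 1.0]           # motion reported by the device itself
candidates = {
    "hand_a": [0.1, 1.1, 1.9, 1.0],  # moves like the device
    "hand_b": [2.0, 2.0, 0.0, 0.0],  # does not
}
print(match_device_to_object(ia, candidates))  # hand_a
```

In a real system the traces would be, for example, accelerometer magnitudes on the IA side and depth-camera-tracked object velocities on the EA side, resampled onto a common clock before comparison.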
-
Patent number: 8380246
Abstract: A mobile device connection system is provided. The system includes an input medium to detect a device position or location. An analysis component determines a device type and establishes a connection with the device. The input medium can include vision systems to detect device presence and location where connections are established via wireless technologies.
Type: Grant
Filed: August 15, 2007
Date of Patent: February 19, 2013
Assignee: Microsoft Corporation
Inventors: Andrew D. Wilson, Raman K. Sarin, Kenneth P. Hinckley
-
Publication number: 20120320158
Abstract: The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
Type: Application
Filed: June 14, 2011
Publication date: December 20, 2012
Applicant: Microsoft Corporation
Inventors: Sasa Junuzovic, William Thomas Blank, Bruce Arnold Cleary, III, Anoop Gupta, Andrew D. Wilson
-
Publication number: 20120320157
Abstract: A "Concurrent Projector-Camera" uses an image projection device in combination with one or more cameras to enable various techniques that provide visually flicker-free projection of images or video, while real-time image or video capture is occurring in that same space. The Concurrent Projector-Camera provides this projection in a manner that eliminates video feedback into the real-time image or video capture. More specifically, the Concurrent Projector-Camera dynamically synchronizes a combination of projector lighting (or light-control points) on-state temporal compression in combination with on-state temporal shifting during each image frame projection to open a "capture time slot" for image capture during which no image is being projected. This capture time slot represents a tradeoff between image capture time and decreased brightness of the projected image.
Type: Application
Filed: June 14, 2011
Publication date: December 20, 2012
Applicant: Microsoft Corporation
Inventors: Sasa Junuzovic, William Thomas Blank, Steven Bathiche, Anoop Gupta, Andrew D. Wilson
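The compression/shifting tradeoff described above can be sketched numerically: the projector's on-time within each frame is compressed and shifted so that a dark "capture time slot" remains for the camera, at the cost of brightness. The schedule below is an assumed illustration (slot placed at the end of the frame), not the patent's actual timing.

```python
# Sketch of on-state temporal compression and shifting: within one frame
# period, the projector is lit only for part of the frame, leaving a dark
# capture slot for the camera. Slot placement and numbers are illustrative.

def frame_schedule(frame_ms, capture_ms):
    """Return (on_start, on_end, slot_start, slot_end) in ms within one frame,
    with the capture slot assumed to sit at the end of the frame."""
    if capture_ms >= frame_ms:
        raise ValueError("capture slot must be shorter than the frame")
    on_end = frame_ms - capture_ms   # compress and shift the on-window earlier
    return (0.0, on_end, on_end, frame_ms)

def brightness_factor(frame_ms, capture_ms):
    """Fraction of full brightness remaining after compressing the on-time."""
    return (frame_ms - capture_ms) / frame_ms

print(frame_schedule(16.7, 4.0))
print(brightness_factor(16.7, 4.0))
```

A longer capture slot gives the camera more exposure time but dims the projected image proportionally, which is exactly the tradeoff the abstract names.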
-
Publication number: 20120212509
Abstract: An interaction system is described which uses a depth camera to capture a depth image of a physical object placed on, or in the vicinity of, an interactive surface. The interaction system also uses a video camera to capture a video image of the physical object. The interaction system can then generate a 3D virtual object based on the depth image and video image. The interaction system then uses a 3D projector to project the 3D virtual object back onto the interactive surface, e.g., in a mirrored relationship to the physical object. A user may then capture and manipulate the 3D virtual object in any manner. Further, the user may construct a composite model based on smaller component 3D virtual objects. The interaction system uses a projective texturing technique to present a realistic-looking 3D virtual object on a surface having any geometry.
Type: Application
Filed: March 29, 2011
Publication date: August 23, 2012
Applicant: Microsoft Corporation
Inventors: Hrvoje Benko, Ricardo Jorge Jota Costa, Andrew D. Wilson
-
Publication number: 20120169673
Abstract: Effects of undesired infrared light are reduced in an imaging system using an infrared light source. The desired infrared light source is activated and a first set of imaging data is captured during a first image capture interval. The desired infrared light source is then deactivated, and a second set of image data is captured during a second image capture interval. A composite set of image data is then generated by subtracting from first values in the first set of image data corresponding second values in the second set of image data. The composite set of image data thus includes imaging data in which all infrared signals are collected, including both signals resulting from the IR source and other IR signals, from which is subtracted imaging data in which no signals result from the IR source, leaving image data including signals resulting only from the IR source.
Type: Application
Filed: March 15, 2012
Publication date: July 5, 2012
Applicant: Microsoft Corporation
Inventor: Andrew D. Wilson
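The subtraction step the abstract describes is a straightforward frame difference, sketched below with plain lists standing in for image buffers; a real system would operate on camera frames.

```python
# Sketch of the ambient-IR rejection described above: capture one frame
# with the IR illuminant on and one with it off, then subtract per pixel.
# Ambient IR appears in both frames and cancels out.

def subtract_ambient(frame_ir_on, frame_ir_off):
    """Per-pixel difference, clamped at zero, leaving only the
    contribution of the system's own IR source."""
    return [max(on - off, 0) for on, off in zip(frame_ir_on, frame_ir_off)]

ir_on  = [50, 200, 90, 40]   # IR source + ambient IR
ir_off = [50,  60, 80, 40]   # ambient IR only
print(subtract_ambient(ir_on, ir_off))  # [0, 140, 10, 0]
```

Clamping at zero is an assumption added here for robustness against sensor noise; the abstract describes only the subtraction itself.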
-
Patent number: 8184101
Abstract: The detection of touch on an optical touch-sensitive device is disclosed. For example, one disclosed embodiment comprises a touch-sensitive device including a display screen, a laser, and a scanning mirror configured to scan light from the laser across the screen. The touch-sensitive device also includes a position-sensitive device and optics configured to form an image of at least a portion of the screen on the position-sensitive device. A location of an object relative to the screen may be determined by detecting a location on the position-sensitive device of laser light reflected by the object.
Type: Grant
Filed: October 3, 2007
Date of Patent: May 22, 2012
Assignee: Microsoft Corporation
Inventors: Nigel Keam, John Lewis, Andrew D. Wilson
-
Publication number: 20120105315
Abstract: Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.
Type: Application
Filed: January 9, 2012
Publication date: May 3, 2012
Applicant: Microsoft Corporation
Inventors: Andrew D. Wilson, Michael J. Sinclair
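The "independent areas" idea above can be sketched with connected-component labeling: once an image is segmented into hand (1) and background (0) pixels, background regions enclosed or split apart by hands become separate components. A tiny binary grid and a 4-connected flood fill stand in here for real camera segmentation.

```python
# Sketch: count independent background areas in a hand/background
# segmentation. 1 = hand pixel, 0 = background pixel; 4-connectivity.

def count_background_areas(grid):
    rows, cols = len(grid), len(grid[0])
    seen = set()

    def fill(r, c):
        # Iterative flood fill over background pixels.
        stack = [(r, c)]
        while stack:
            y, x = stack.pop()
            if 0 <= y < rows and 0 <= x < cols and grid[y][x] == 0 and (y, x) not in seen:
                seen.add((y, x))
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]

    areas = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0 and (r, c) not in seen:
                areas += 1
                fill(r, c)
    return areas

# A vertical "finger" of hand pixels splits the background into two areas.
grid = [
    [0, 1, 0],
    [0, 1, 0],
    [0, 1, 0],
]
print(count_background_areas(grid))  # 2
```

Each resulting area could then be tracked frame to frame and its attributes (position, size, orientation) bound to a control parameter, as the abstract describes.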
-
Patent number: 8165422
Abstract: Effects of undesired infrared light are reduced in an imaging system using an infrared light source. The desired infrared light source is activated and a first set of imaging data is captured during a first image capture interval. The desired infrared light source is then deactivated, and a second set of image data is captured during a second image capture interval. A composite set of image data is then generated by subtracting from first values in the first set of image data corresponding second values in the second set of image data. The composite set of image data thus includes imaging data in which all infrared signals are collected, including both signals resulting from the IR source and other IR signals, from which is subtracted imaging data in which no signals result from the IR source, leaving image data including signals resulting only from the IR source.
Type: Grant
Filed: June 26, 2009
Date of Patent: April 24, 2012
Assignee: Microsoft Corporation
Inventor: Andrew D. Wilson
-
Publication number: 20120056840
Abstract: A unique system and method is provided that facilitates pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust a control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize a cursor position at a top middle point of a user's finger in order to mitigate clicking errors when making a selection.
Type: Application
Filed: November 14, 2011
Publication date: March 8, 2012
Applicant: Microsoft Corporation
Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
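The control-display (CD) ratio adjustment above can be sketched as simple cursor arithmetic: the primary finger's motion is scaled by a ratio that the secondary finger toggles. The specific ratio values are assumed for illustration and are not taken from the patent.

```python
# Sketch of dual-finger precision targeting: the primary finger steers
# the cursor; a secondary finger switches to a reduced control-display
# ratio so the cursor moves more slowly than the finger. Ratio values
# are illustrative assumptions.

def cd_ratio(secondary_down):
    """1:1 tracking normally; quarter-speed precision mode when the
    secondary finger is down."""
    return 0.25 if secondary_down else 1.0

def move_cursor(cursor, finger_delta, secondary_down):
    r = cd_ratio(secondary_down)
    return (cursor[0] + finger_delta[0] * r, cursor[1] + finger_delta[1] * r)

print(move_cursor((100.0, 100.0), (8.0, 4.0), secondary_down=True))  # (102.0, 101.0)
```

In precision mode an 8-pixel finger movement moves the cursor only 2 pixels, which is what makes single-pixel targets reachable on a touch display.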
-
Patent number: 8115732
Abstract: Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.
Type: Grant
Filed: April 23, 2009
Date of Patent: February 14, 2012
Assignee: Microsoft Corporation
Inventors: Andrew D. Wilson, Michael J. Sinclair
-
Patent number: 8100756
Abstract: A system that facilitates enhancing a game, game play or playability of a game may include an experience component, a game component and an alteration component. The experience component can collect a portion of data related to a game in which the portion of data indicates at least one of a tip or a tactic for the game. The game component can dynamically incorporate the portion of data into the game during game play to enhance playability of such game for a user with assistance provided by at least one of the tip or the tactic. The alteration component may alter the game during game play.
Type: Grant
Filed: September 28, 2007
Date of Patent: January 24, 2012
Assignee: Microsoft Corporation
Inventors: Bret P. O'Rourke, Eric P. Wilfrid, Nigel S. Keam, Steven Bathiche, James M. Alkove, Zachary L. Russell, Jon Marcus Randall Whitten, Boyd C. Multerer, Andrew D. Wilson
-
Publication number: 20110310232
Abstract: Described is a multi-view display provided by combining spatial multiplexing (e.g., using a parallax barrier or lenslet) and temporal multiplexing (e.g., using a directed backlight). A scheduling algorithm generates different views by determining which light sources are illuminated at a particular time. Via the temporal multiplexing, different views may be in the same spatial viewing angle (spatial zone). Two of the views may correspond to two eyes of a person, with different video data sent to each eye to provide an autostereoscopic display for that person. Eye (head) tracking may be used to move the view or views with a person as that person moves.
Type: Application
Filed: June 21, 2010
Publication date: December 22, 2011
Applicant: Microsoft Corporation
Inventors: Andrew D. Wilson, Steven Bathiche
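The scheduling idea above can be sketched as temporal multiplexing across views: each time slot lights only the backlight sources that steer light toward one viewer's eye. The viewer names, source indices, and round-robin policy below are hypothetical placeholders; the patent only says a scheduling algorithm decides which sources are illuminated when.

```python
# Sketch of a multi-view backlight schedule: time slots cycle through
# views, and each slot lights only that view's light sources.
# Names and indices are hypothetical.

VIEWER_SOURCES = {
    "viewer1_left_eye":  [0, 1],
    "viewer1_right_eye": [2, 3],
    "viewer2_left_eye":  [4],
}

def schedule(num_slots):
    """Round-robin assignment: slot index -> (view name, lit source indices)."""
    views = list(VIEWER_SOURCES)
    return [(views[i % len(views)], VIEWER_SOURCES[views[i % len(views)]])
            for i in range(num_slots)]

for slot, (view, sources) in enumerate(schedule(3)):
    print(slot, view, sources)
```

Cycling fast enough (above flicker fusion) makes each viewer perceive a steady image built only from their own slots, which is how two views can share one spatial zone.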
-
Patent number: 8077153
Abstract: A unique system and method is provided that facilitates pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust a control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize a cursor position at a top middle point of a user's finger in order to mitigate clicking errors when making a selection.
Type: Grant
Filed: April 19, 2006
Date of Patent: December 13, 2011
Assignee: Microsoft Corporation
Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
-
Publication number: 20110285633
Abstract: A dynamic projected user interface device is disclosed that includes a projector, a projection controller, and an imaging sensor. The projection controller is configured to receive instructions from a computing device, and to provide display images via the projector onto one or more display surfaces. The display images are indicative of a first set of input controls when the computing device is in a first operating context, and a second set of input controls when the computing device is in a second operating context. The imaging sensor is configured to optically detect physical contacts with the one or more display surfaces.
Type: Application
Filed: July 13, 2011
Publication date: November 24, 2011
Applicant: Microsoft Corporation
Inventors: Steven N. Bathiche, Andrew D. Wilson