Patents by Inventor Andrew David Wilson

Andrew David Wilson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8730309
    Abstract: Architecture that combines multiple depth cameras and multiple projectors to cover a specified space (e.g., a room). The cameras and projectors are calibrated, allowing the development of a multi-dimensional (e.g., 3D) model of the objects in the space, as well as the ability to project graphics in a controlled fashion on the same objects. The architecture incorporates the depth data from all depth cameras, as well as color information, into a unified multi-dimensional model in combination with calibrated projectors. In order to provide visual continuity when transferring objects between different locations in the space, the user's body can provide a canvas on which to project this interaction. As the user moves body parts in the space, without any other object, the body parts can serve as temporary “screens” for “in-transit” data.
    Type: Grant
    Filed: June 21, 2010
    Date of Patent: May 20, 2014
    Assignee: Microsoft Corporation
    Inventors: Andrew David Wilson, Hrvoje Benko
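
Illustrative sketch: the calibrated camera/projector pipeline in the abstract above reduces, at its core, to mapping 3D points between coordinate frames. The code below is a minimal illustration, not the patent's implementation; the matrices `cam_to_world`, `world_to_proj`, and `proj_intrinsics` are hypothetical stand-ins for the outputs of a standard calibration step.

```python
import numpy as np

def project_onto_object(point_cam, cam_to_world, world_to_proj, proj_intrinsics):
    """Map a 3D point seen by one depth camera into one projector's pixel
    space, so graphics can be drawn onto the physical object at that point."""
    p = np.append(point_cam, 1.0)            # homogeneous coordinates
    p_world = cam_to_world @ p               # depth-camera frame -> shared world frame
    p_proj = world_to_proj @ p_world         # world frame -> projector frame
    u, v, w = proj_intrinsics @ p_proj[:3]   # pinhole projection
    return u / w, v / w                      # projector pixel coordinates
```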
  • Publication number: 20140114212
    Abstract: The present invention concerns apparatus for the detection and/or assessment of neurodevelopmental disorders of a user, the apparatus comprising: control means for providing a goal-orientated task to a display, said task being formulated to elicit a movement from a user in response to said task; input means via which a user can input a response to said task; wherein said control means records a user's response as multi-dimensional input data, which can be used to profile the user for the presence or absence of a neurodevelopmental disorder, the task being specifically formulated such that indicative characteristics of a user's movement over time can be identified by said multi-dimensional input data.
    Type: Application
    Filed: October 17, 2013
    Publication date: April 24, 2014
    Applicant: University Court of the University of Aberdeen
    Inventors: Mark Arwyn Mon-Williams, Justin Hereward Gwilym Williams, Andrew David Wilson, Mandy Suzanne Plumb
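
As a loose illustration only (the application does not specify this computation), one could imagine the recorded multi-dimensional input being reduced to kinematic features of the user's movement over time. The sketch below assumes `(t, x, y)` samples from a goal-directed pointing task; the feature choices are hypothetical.

```python
import numpy as np

def movement_features(samples):
    """Reduce (t, x, y) samples from a goal-directed task to simple kinematic
    features of the kind a profiling step might examine."""
    t, x, y = (np.asarray(c, dtype=float) for c in zip(*samples))
    step = np.hypot(np.diff(x), np.diff(y))   # distance covered per sample
    speed = step / np.diff(t)                 # instantaneous speed
    return {
        "path_length": float(step.sum()),
        "mean_speed": float(speed.mean()),
        "speed_variability": float(speed.std()),
    }
```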
  • Publication number: 20140111483
    Abstract: One or more techniques and/or systems are provided for monitoring interactions by an input object with an interactive interface projected onto an interface object. That is, an input object (e.g., a finger) and an interface object (e.g., a wall, a hand, a notepad, etc.) may be identified and tracked in real-time using depth data (e.g., depth data extracted from images captured by a depth camera). An interactive interface (e.g., a calculator, an email program, a keyboard, etc.) may be projected onto the interface object, such that the input object may be used to interact with the interactive interface. For example, the input object may be tracked to determine whether the input object is touching or hovering above the interface object and/or a projected portion of the interactive interface. If the input object is in a touch state, then a corresponding event associated with the interactive interface may be invoked.
    Type: Application
    Filed: December 31, 2013
    Publication date: April 24, 2014
    Applicant: Microsoft Corporation
    Inventors: Chris Harrison, Hrvoje Benko, Andrew David Wilson
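
A minimal sketch of the touch-versus-hover decision the abstract describes, under assumed inputs: the tracker has already produced a fingertip depth and the surface depth at the same pixel (both in millimeters), and the band thresholds are invented for illustration.

```python
def classify_contact(finger_depth_mm, surface_depth_mm, touch_band_mm=(4.0, 20.0)):
    """Classify a tracked fingertip as touching, hovering above, or away from
    the interface object, by its height above the surface at the same pixel."""
    height = surface_depth_mm - finger_depth_mm   # fingertip height above the surface
    touch_max, hover_max = touch_band_mm
    if height < touch_max:
        return "touch"    # invoke the corresponding interactive-interface event
    if height < hover_max:
        return "hover"
    return "away"
```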
  • Publication number: 20140005886
    Abstract: Embodiments are directed to automotive systems, including automotive alerting systems and automotive heads-up display projection systems. In one case, an automotive system includes an internal-facing sensor and an external-facing sensor positioned within an automobile. The automotive system further includes an information processing system that performs various operations, including receiving interior sensor input from the internal-facing sensor and exterior sensor input from the external-facing sensor. The interior sensor input indicates information about the actions of the driver and any other automobile occupants. The exterior sensor input indicates information about objects that are external to the automobile. The information processing system then determines, based on both the interior sensor input and the exterior sensor input, an appropriate action to perform, and performs that action. Such actions may include increasing or decreasing the salience of a triggered alert.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Applicant: Microsoft Corporation
    Inventors: Daniel Scott Morris, Hrvoje Benko, Jay P. Kapur, Andrew David Wilson, Kenneth Alan Lobb
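
The interior-plus-exterior decision could be pictured as follows. This is a hedged sketch with invented thresholds, not the patent's logic: `driver_attentive` stands in for whatever the internal-facing sensor reports, and `hazard_distance_m` for the external-facing sensor's nearest-object estimate.

```python
def alert_salience(driver_attentive, hazard_distance_m):
    """Pick an alert salience from interior (driver state) and exterior
    (hazard proximity) sensor input; all thresholds are invented."""
    if hazard_distance_m > 50.0:
        return "none"        # nothing close enough to warrant an alert
    if driver_attentive:
        return "subtle"      # the driver is already attending; decrease salience
    return "urgent" if hazard_distance_m < 15.0 else "prominent"
```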
  • Patent number: 8619049
    Abstract: One or more techniques and/or systems are provided for monitoring interactions by an input object with an interactive interface projected onto an interface object. That is, an input object (e.g., a finger) and an interface object (e.g., a wall, a hand, a notepad, etc.) may be identified and tracked in real-time using depth data (e.g., depth data extracted from images captured by a depth camera). An interactive interface (e.g., a calculator, an email program, a keyboard, etc.) may be projected onto the interface object, such that the input object may be used to interact with the interactive interface. For example, the input object may be tracked to determine whether the input object is touching or hovering above the interface object and/or a projected portion of the interactive interface. If the input object is in a touch state, then a corresponding event associated with the interactive interface may be invoked.
    Type: Grant
    Filed: May 17, 2011
    Date of Patent: December 31, 2013
    Assignee: Microsoft Corporation
    Inventors: Chris Harrison, Hrvoje Benko, Andrew David Wilson
  • Patent number: 8502795
    Abstract: The claimed subject matter provides a system and/or a method that facilitates enhancing interactive surface technologies for data manipulation. A surface detection component can employ a multiple contact surfacing technology to detect a surface input, wherein the detected surface input enables a physical interaction with a portion of displayed data that represents a corporeal object. A physics engine can integrate a portion of Newtonian physics into the interaction with the portion of displayed data in order to model at least one quantity associated with the corporeal object, the quantity being at least one of a force, a mass, a velocity, or a friction.
    Type: Grant
    Filed: February 29, 2012
    Date of Patent: August 6, 2013
    Assignee: Microsoft Corporation
    Inventors: Andrew David Wilson, Shahram Izadi, Armando Garcia-Mendoza, David Kirk, Otmar Hilliges
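
For a flavor of how a physics engine might fold Newtonian quantities into a surface interaction, here is a minimal explicit-Euler sketch; the friction model (a per-step damping factor) and all constants are assumptions, not the patented method.

```python
def step_object(pos, vel, contact_force, mass=1.0, friction=0.9, dt=1.0 / 60.0):
    """One explicit-Euler step for a displayed object: apply the force imparted
    by a surface contact, then damp the velocity with a friction factor."""
    ax, ay = contact_force[0] / mass, contact_force[1] / mass   # F = m * a
    vx = (vel[0] + ax * dt) * friction
    vy = (vel[1] + ay * dt) * friction
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)
```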
  • Publication number: 20130057515
    Abstract: Architecture that employs depth sensing cameras to detect touch on a surface, such as a tabletop. The act of touching is processed using thresholds which are automatically computed from depth image data, and these thresholds are used to generate a touch image. More specifically, the thresholds (near and far, relative to the camera) are used to segment a typical finger that touches a surface. A snapshot image is captured of the scene and a surface histogram is computed from the snapshot over a small range of deviations at each pixel location. The near threshold (nearest to the camera) is computed based on the anthropometry of fingers and hands, and associated posture during touch. After computing the surface histogram, the far threshold values (furthest from the camera) can be stored as an image of thresholds, used in a single pass to classify all pixels in the input depth image.
    Type: Application
    Filed: September 7, 2011
    Publication date: March 7, 2013
    Applicant: Microsoft Corporation
    Inventor: Andrew David Wilson
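
The single-pass, per-pixel classification the abstract describes can be sketched directly. Here `far_thresholds_mm` stands in for the stored threshold image computed from the surface histogram, and the fixed `near_offset_mm` is a hypothetical simplification of the anthropometry-based near threshold.

```python
import numpy as np

def touch_image(depth_mm, far_thresholds_mm, near_offset_mm=10.0):
    """Classify every pixel of a depth image as touch/no-touch in one pass.
    far_thresholds_mm is the stored per-pixel threshold image; the near
    threshold sits roughly a finger's thickness above the surface."""
    near_thresholds_mm = far_thresholds_mm - near_offset_mm   # nearer to the camera
    return (depth_mm > near_thresholds_mm) & (depth_mm < far_thresholds_mm)
```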
  • Publication number: 20120293402
    Abstract: One or more techniques and/or systems are provided for monitoring interactions by an input object with an interactive interface projected onto an interface object. That is, an input object (e.g., a finger) and an interface object (e.g., a wall, a hand, a notepad, etc.) may be identified and tracked in real-time using depth data (e.g., depth data extracted from images captured by a depth camera). An interactive interface (e.g., a calculator, an email program, a keyboard, etc.) may be projected onto the interface object, such that the input object may be used to interact with the interactive interface. For example, the input object may be tracked to determine whether the input object is touching or hovering above the interface object and/or a projected portion of the interactive interface. If the input object is in a touch state, then a corresponding event associated with the interactive interface may be invoked.
    Type: Application
    Filed: May 17, 2011
    Publication date: November 22, 2012
    Applicant: Microsoft Corporation
    Inventors: Chris Harrison, Hrvoje Benko, Andrew David Wilson
  • Publication number: 20120275686
    Abstract: Three-dimensional (3-D) spatial image data may be received that is associated with at least one arm motion of an actor, based on free-form, natural gesture movements of at least one hand of the actor. A plurality of sequential 3-D spatial representations that each include 3-D spatial map data corresponding to a 3-D posture and position of the hand at sequential instances of time during the free-form movements may be determined, based on the received 3-D spatial image data. An integrated 3-D model may be generated, via a spatial object processor, based on incrementally integrating the 3-D spatial map data included in the determined sequential 3-D spatial representations and comparing a threshold time value with model time values indicating the number of instances of time spent by the hand occupying each of a plurality of 3-D spatial regions during the free-form movements.
    Type: Application
    Filed: April 29, 2011
    Publication date: November 1, 2012
    Applicant: Microsoft Corporation
    Inventors: Andrew David Wilson, Christian Holz
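
One way to picture the incremental integration and dwell-time threshold in this abstract is voxel occupancy counting. The sketch below is an assumption-laden illustration (fixed voxel size, frame-count threshold), not the claimed method.

```python
from collections import defaultdict

def integrate_scan(hand_point_frames, voxel_size=0.02, min_frames=10):
    """Count, per voxel, the instances of time the hand occupied it during the
    free-form movements; keep voxels whose dwell count passes the threshold."""
    counts = defaultdict(int)
    for points in hand_point_frames:   # one 3-D point set per instant of time
        voxels = {tuple(int(c // voxel_size) for c in p) for p in points}
        for v in voxels:
            counts[v] += 1
    return {v for v, n in counts.items() if n >= min_frames}
```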
  • Publication number: 20120206377
    Abstract: In embodiments of angular contact geometry, touch input sensor data is recognized as a touch input on a touch-screen display, such as a touch-screen display integrated in a mobile phone or portable computing device. A sensor map is generated from the touch input sensor data, and the sensor map represents the touch input. The sensor map can be generated as a two-dimensional array of elements that correlate to sensed contact from a touch input. An ellipse can then be determined that approximately encompasses elements of the sensor map, and the ellipse represents a contact shape of the touch input.
    Type: Application
    Filed: May 2, 2011
    Publication date: August 16, 2012
    Applicant: Microsoft Corporation
    Inventors: Weidong Zhao, David A. Stevens, Aleksandar Uzelac, Takahiro Shigemitsu, Andrew David Wilson, Nigel Stuart Keam
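
A common way to realize "an ellipse that approximately encompasses elements of the sensor map" is a weighted-covariance fit. The sketch below takes that standard route and should not be read as the patent's exact procedure.

```python
import numpy as np

def contact_ellipse(sensor_map):
    """Fit an ellipse to a 2-D sensor map of contact strengths via the
    weighted covariance of its nonzero elements."""
    ys, xs = np.nonzero(sensor_map)
    w = sensor_map[ys, xs].astype(float)
    cx, cy = np.average(xs, weights=w), np.average(ys, weights=w)
    cov = np.cov(np.vstack([xs - cx, ys - cy]), aweights=w)
    evals, evecs = np.linalg.eigh(cov)             # ascending: minor, major axis
    angle = np.arctan2(evecs[1, 1], evecs[0, 1])   # orientation of the major axis
    return (cx, cy), 2.0 * np.sqrt(evals), angle   # center, axis lengths, angle
```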
  • Publication number: 20120162117
    Abstract: The claimed subject matter provides a system and/or a method that facilitates enhancing interactive surface technologies for data manipulation. A surface detection component can employ a multiple contact surfacing technology to detect a surface input, wherein the detected surface input enables a physical interaction with a portion of displayed data that represents a corporeal object. A physics engine can integrate a portion of Newtonian physics into the interaction with the portion of displayed data in order to model at least one quantity associated with the corporeal object, the quantity being at least one of a force, a mass, a velocity, or a friction.
    Type: Application
    Filed: February 29, 2012
    Publication date: June 28, 2012
    Applicant: Microsoft Corporation
    Inventors: Andrew David Wilson, Shahram Izadi, Armando Garcia-Mendoza, David Kirk, Otmar Hilliges
  • Patent number: 8154524
    Abstract: The claimed subject matter provides a system and/or a method that facilitates enhancing interactive surface technologies for data manipulation. A surface detection component can employ a multiple contact surfacing technology to detect a surface input, wherein the detected surface input enables a physical interaction with a portion of displayed data that represents a corporeal object. A physics engine can integrate a portion of Newtonian physics into the interaction with the portion of displayed data in order to model at least one quantity associated with the corporeal object, the quantity being at least one of a force, a mass, a velocity, or a friction.
    Type: Grant
    Filed: September 3, 2008
    Date of Patent: April 10, 2012
    Assignee: Microsoft Corporation
    Inventors: Andrew David Wilson, Shahram Izadi, Armando Garcia-Mendoza, David Kirk, Otmar Hilliges
  • Patent number: 8131750
    Abstract: Systems (and corresponding methodologies) that annotate experience data in real time are provided. The real-time annotated experience data can be employed in accordance with augmented reality systems, which are capable of overlaying virtual data upon real-world data. The system employs ‘smart-tags’ that are capable of identifying data that relates to or is associated with real-world scenarios and situations.
    Type: Grant
    Filed: December 28, 2007
    Date of Patent: March 6, 2012
    Assignee: Microsoft Corporation
    Inventors: Steven N. Bathiche, Shai Guday, Zachary Lewis Russell, Boyd Cannon Multerer, Jon Marcus Randall Whitten, Andrew David Wilson, Matthew B. MacLaurin
  • Patent number: 8117094
    Abstract: A system to facilitate royalty tracking is provided. The system includes at least one tag to identify a portion of a creative work. A distribution component tracks the portion of the creative work and a crediting component reports usage of the creative work when the portion is detected in a larger body of work.
    Type: Grant
    Filed: June 29, 2007
    Date of Patent: February 14, 2012
    Assignee: Microsoft Corporation
    Inventors: Boyd Cannon Multerer, William T. Flora, Bret P. O'Rourke, John Mark Miller, Eric Peter Wilfrid, Nigel Stuart Keam, Steven N. Bathiche, Oliver Roup, James Morris Alkove, Zachary Lewis Russell, Jon Marcus Randall Whitten, Andrew David Wilson
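
The tag/detect/credit flow is simple enough to sketch; a real system would use robust content fingerprints rather than the naive substring match assumed here, and the `credit` callback is a hypothetical stand-in for the crediting component.

```python
def report_usage(tagged_portions, larger_work, credit):
    """For each tagged portion of a creative work, report usage whenever the
    portion is detected inside a larger body of work."""
    for tag, portion in tagged_portions.items():
        if portion in larger_work:   # naive substring detection, for illustration
            credit(tag)
```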
  • Patent number: 8108398
    Abstract: A system that facilitates data presentation and management includes at least one database to store a corpus of data relating to one or more topics. The system further includes a summarizer component to automatically determine a subset of the corpus of data relating to at least one of the topics, wherein the subset forms a summary of the at least one topic.
    Type: Grant
    Filed: June 29, 2007
    Date of Patent: January 31, 2012
    Assignee: Microsoft Corporation
    Inventors: Shai Guday, Bret P. O'Rourke, John Mark Miller, James Morris Alkove, Andrew David Wilson
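
The abstract does not say how the summarizer ranks data, so the sketch below substitutes the simplest classical approach, frequency-based extractive scoring, purely as an illustration of determining a summary subset.

```python
import re
from collections import Counter

def summarize(corpus_text, max_sentences=3):
    """Select the sentences whose words are most frequent across the corpus:
    one naive way a summarizer component could pick a subset of the data."""
    sentences = re.split(r"(?<=[.!?])\s+", corpus_text.strip())
    freq = Counter(re.findall(r"\w+", corpus_text.lower()))

    def score(sentence):
        words = re.findall(r"\w+", sentence.lower())
        return sum(freq[w] for w in words) / max(len(words), 1)

    return sorted(sentences, key=score, reverse=True)[:max_sentences]
```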
  • Patent number: 8049719
    Abstract: Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.
    Type: Grant
    Filed: October 14, 2010
    Date of Patent: November 1, 2011
    Assignee: Microsoft Corporation
    Inventors: Andrew David Wilson, Michael J. Sinclair
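
The "independent areas" idea, background regions sealed off by hand and finger gestures, maps naturally onto connected-component labeling. The sketch below assumes a binary `hand_mask` from the segmentation step and uses SciPy's labeling; it is an illustration, not the patented pipeline.

```python
import numpy as np
from scipy import ndimage

def independent_areas(hand_mask):
    """Return background regions fully enclosed by hand pixels (for example,
    the hole a thumb-and-forefinger gesture forms); each such region can be
    assigned its own control parameter."""
    labels, n = ndimage.label(~hand_mask)   # connected background regions
    border_labels = set(np.concatenate(
        [labels[0], labels[-1], labels[:, 0], labels[:, -1]]).tolist())
    return [np.argwhere(labels == i)
            for i in range(1, n + 1) if i not in border_labels]
```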
  • Publication number: 20110238612
    Abstract: A multi-factor probabilistic model evaluates user input to determine if the user input was intended for an on-screen user interface control. When user input is received, a probability is computed that the user input was intended for each on-screen user interface control. The user input is then associated with the user interface control that has the highest computed probability. The probability that user input was intended for each user interface control may be computed utilizing a multitude of factors, including the probability that the user input is near each user interface control, the probability that the motion of the user input is consistent with the user interface control, the probability that the shape of the user input is consistent with the user interface control, and the probability that the size of the user input is consistent with the user interface control.
    Type: Application
    Filed: March 26, 2010
    Publication date: September 29, 2011
    Applicant: Microsoft Corporation
    Inventor: Andrew David Wilson
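
The association rule itself is compact: score each control by combining its factor probabilities and take the maximum. The sketch below assumes the per-factor probabilities have already been computed for each control, and assumes a simple product as the combining rule, which the publication does not specify.

```python
import math

def assign_input(factor_probs):
    """factor_probs maps each control to its per-factor probabilities
    (proximity, motion, shape, size); the input is associated with the
    control whose product of factors is highest."""
    return max(factor_probs, key=lambda control: math.prod(factor_probs[control]))
```

For example, `assign_input({"ok_button": [0.9, 0.8, 0.7, 0.9], "slider": [0.6, 0.9, 0.4, 0.5]})` returns `"ok_button"`.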
  • Publication number: 20110205341
    Abstract: Architecture that combines multiple depth cameras and multiple projectors to cover a specified space (e.g., a room). The cameras and projectors are calibrated, allowing the development of a multi-dimensional (e.g., 3D) model of the objects in the space, as well as the ability to project graphics in a controlled fashion on the same objects. The architecture incorporates the depth data from all depth cameras, as well as color information, into a unified multi-dimensional model in combination with calibrated projectors. In order to provide visual continuity when transferring objects between different locations in the space, the user's body can provide a canvas on which to project this interaction. As the user moves body parts in the space, without any other object, the body parts can serve as temporary “screens” for “in-transit” data.
    Type: Application
    Filed: June 21, 2010
    Publication date: August 25, 2011
    Applicant: Microsoft Corporation
    Inventors: Andrew David Wilson, Hrvoje Benko
  • Publication number: 20110205147
    Abstract: Concepts and technologies are described herein for interacting with an omni-directionally projected display. The omni-directionally projected display includes, in some embodiments, visual information projected on a display surface by way of an omni-directional projector. A user is able to interact with the projected visual information using gestures in free space, voice commands, and/or other tools, structures, and commands. The visual information can be projected omni-directionally, to provide a user with an immersive interactive experience with the projected display. The concepts and technologies disclosed herein can support more than one interacting user. Thus, the concepts and technologies disclosed herein may be employed to provide a number of users with immersive interactions with projected visual information.
    Type: Application
    Filed: February 22, 2010
    Publication date: August 25, 2011
    Applicant: Microsoft Corporation
    Inventors: Andrew David Wilson, Hrvoje Benko
  • Patent number: 7978185
    Abstract: The present invention extends to methods, systems, and computer program products for creating virtual replicas of physical objects. A computer system detects that a portion of a physical object has come into the physical vicinity of a portion of a multi-touch input display surface. The computer system accesses object identifying data corresponding to the object. The computer system uses the object identifying data to access image data for the object from a repository of stored image data. The computer system uses at least the accessed image data to generate a virtual replica of the object. The computer system presents the virtual replica of the object at a location on the multi-touch input display surface where the portion of the object was detected.
    Type: Grant
    Filed: February 18, 2011
    Date of Patent: July 12, 2011
    Assignee: Microsoft Corporation
    Inventors: Andrew David Wilson, Daniel Chaim Robbins
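
The abstract's detect/identify/look-up/present flow could be sketched as below; every name here (`detection`, `image_repository`, `display`) is a hypothetical stand-in, since the patent does not expose an API.

```python
def present_replica(detection, image_repository, display):
    """On detecting a physical object near the multi-touch surface: identify
    it, fetch its stored image data, and present a virtual replica at the
    location where the object was detected."""
    image = image_repository[detection["object_id"]]   # stored image data
    display.append({"replica_of": detection["object_id"],
                    "image": image,
                    "position": detection["position"]})
```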