Patents by Inventor David Alexander Butler
David Alexander Butler has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20140208274
Abstract: Methods and systems for controlling a computing-based device using both input received from a traditional input device (e.g. keyboard) and hand gestures made on or near a reference object (e.g. keyboard). In some examples, the hand gestures may comprise one or more hand touch gestures and/or one or more hand air gestures.
Type: Application
Filed: January 18, 2013
Publication date: July 24, 2014
Applicant: MICROSOFT CORPORATION
Inventors: Samuel Gavin Smyth, Peter John Ansell, Christopher Jozef O'Prey, Mitchel Alan Goldberg, Jamie Daniel Joseph Shotton, Toby Sharp, Shahram Izadi, Abigail Jane Sellen, Richard Malcolm Banks, Kenton O'Hara, Richard Harry Robert Harper, Eric John Greveson, David Alexander Butler, Stephen E Hodges
-
Patent number: 8711206
Abstract: Mobile camera localization using depth maps is described for robotics, immersive gaming, augmented reality and other applications. In an embodiment a mobile depth camera is tracked in an environment at the same time as a 3D model of the environment is formed using the sensed depth data. In an embodiment, when camera tracking fails, this is detected and the camera is relocalized either by using previously gathered keyframes or in other ways. In an embodiment, loop closures, in which the mobile camera revisits a location, are detected by comparing features of a current depth map with the 3D model in real time. In embodiments the detected loop closures are used to improve the consistency and accuracy of the 3D model of the environment.
Type: Grant
Filed: January 31, 2011
Date of Patent: April 29, 2014
Assignee: Microsoft Corporation
Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
-
Patent number: 8704822
Abstract: A volumetric display system which enables user interaction is described. In an embodiment, the system consists of a volumetric display and an optical system. The volumetric display creates a 3D light field of an object to be displayed, and the optical system creates a copy of the 3D light field in a position away from the volumetric display where a user can interact with the image of the object displayed. In an embodiment, the optical system involves a pair of parabolic mirror portions.
Type: Grant
Filed: December 17, 2008
Date of Patent: April 22, 2014
Assignee: Microsoft Corporation
Inventors: David Alexander Butler, Stephen E. Hodges, Shahram Izadi, Stuart Taylor, Nicolas Villar
-
Publication number: 20140098018
Abstract: A wearable sensor for tracking articulated body parts is described such as a wrist-worn device which enables 3D tracking of fingers and optionally also the arm and hand without the need to wear a glove or markers on the hand. In an embodiment a camera captures images of an articulated part of a body of a wearer of the device and an articulated model of the body part is tracked in real time to enable gesture-based control of a separate computing device such as a smart phone, laptop computer or other computing device. In examples the device has a structured illumination source and a diffuse illumination source for illuminating the articulated body part.
Type: Application
Filed: October 4, 2012
Publication date: April 10, 2014
Applicant: MICROSOFT CORPORATION
Inventors: David Kim, Shahram Izadi, Otmar Hilliges, David Alexander Butler, Stephen Hodges, Patrick Luke Olivier, Jiawen Chen, Iason Oikonomidis
-
Patent number: 8681127
Abstract: In some implementations, a touch point on a surface of a touchscreen device may be determined. An image of a region of space above the surface and surrounding the touch point may be determined. The image may include a brightness gradient that captures a brightness of objects above the surface. A binary image that includes one or more binary blobs may be created based on a brightness of portions of the image. A determination may be made as to which of the one or more binary blobs are connected to each other to form portions of a particular user. A determination may be made that the particular user generated the touch point.
Type: Grant
Filed: April 22, 2013
Date of Patent: March 25, 2014
Assignee: Microsoft Corporation
Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
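A rough sketch of the blob-grouping step described in this abstract (the thresholding cutoff, function names, and test image are illustrative, not taken from the patent):

```python
# Sketch only: threshold a brightness image into binary blobs, then group
# blob pixels by 4-connectivity so regions belonging to one user's hand
# or arm can be treated together when attributing a touch point.

def threshold(image, cutoff):
    """Turn a brightness image (list of rows) into a binary image."""
    return [[1 if v >= cutoff else 0 for v in row] for row in image]

def label_blobs(binary):
    """4-connected component labeling via iterative flood fill."""
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not labels[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and binary[y][x] and not labels[y][x]:
                        labels[y][x] = count
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return labels, count

# Two separate bright regions above the surface yield two labels; a touch
# point can then be attributed to whichever labeled region abuts it.
image = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 7],
    [0, 0, 0, 0, 7],
]
labels, count = label_blobs(threshold(image, 5))
```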
-
Patent number: 8587583
Abstract: Three-dimensional environment reconstruction is described. In an example, a 3D model of a real-world environment is generated in a 3D volume made up of voxels stored on a memory device. The model is built from data describing a camera location and orientation, and a depth image with pixels indicating a distance from the camera to a point in the environment. A separate execution thread is assigned to each voxel in a plane of the volume. Each thread uses the camera location and orientation to determine a corresponding depth image location for its associated voxel, determines a factor relating to the distance between the associated voxel and the point in the environment at the corresponding location, and updates a stored value at the associated voxel using the factor. Each thread iterates through an equivalent voxel in the remaining planes of the volume, repeating the process to update the stored value.
Type: Grant
Filed: January 31, 2011
Date of Patent: November 19, 2013
Assignee: Microsoft Corporation
Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
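The per-voxel update can be sketched in miniature as follows (an illustrative simplification, not the patented GPU implementation; the truncation distance and function names are assumptions):

```python
# Sketch: each execution thread projects its voxel into the depth image,
# computes a truncated signed distance between the voxel and the observed
# surface point, and folds that factor into the value stored at the voxel
# as a weighted running average.

TRUNCATION = 0.1  # metres; illustrative clamp for the signed distance

def signed_distance_factor(voxel_depth, measured_depth):
    """Truncated signed distance: positive in free space in front of the
    surface, negative behind it."""
    sdf = measured_depth - voxel_depth
    return max(-TRUNCATION, min(TRUNCATION, sdf))

def update_voxel(stored_value, stored_weight, factor, new_weight=1.0):
    """Weighted running average of the stored value, as one thread would
    perform per voxel before moving to the next plane of the volume."""
    total = stored_weight + new_weight
    value = (stored_value * stored_weight + factor * new_weight) / total
    return value, total
```

In the scheme the abstract describes, one thread handles the equivalent voxel in every plane of the volume, repeating an update like this once per plane.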
-
Publication number: 20130290910
Abstract: User interface control using a keyboard is described. In an embodiment, a user interface displayed on a display device is controlled using a computer connected to a keyboard. The keyboard has a plurality of alphanumeric keys that can be used for text entry. The computer receives data comprising a sequence of key-presses from the keyboard, and generates for each key-press a physical location on the keyboard. The relative physical locations of the key-presses are compared to calculate a movement path over the keyboard. The movement path describes the path of a user's digit over the keyboard. The movement path is mapped to a sequence of coordinates in the user interface, and the movement of an object displayed in the user interface is controlled in accordance with the sequence of coordinates.
Type: Application
Filed: June 24, 2013
Publication date: October 31, 2013
Inventors: Harper LaFave, Stephen Hodges, James Scott, Shahram Izadi, David Molyneaux, Nicolas Villar, David Alexander Butler, Mike Hazas
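The path computation this abstract describes can be sketched as follows (the key coordinates and scale factor are invented for illustration):

```python
# Sketch: map a sequence of key-presses to physical keyboard locations,
# form the movement path of the user's digit, and scale that path into
# user-interface coordinates.

# Hypothetical physical layout: (x, y) key positions in key-widths.
KEY_POSITIONS = {
    "a": (0.0, 1.0), "s": (1.0, 1.0), "d": (2.0, 1.0), "f": (3.0, 1.0),
    "q": (0.25, 0.0), "w": (1.25, 0.0), "e": (2.25, 0.0),
}

def movement_path(key_sequence):
    """Physical locations of consecutive key-presses: the path of the
    user's digit over the keyboard."""
    return [KEY_POSITIONS[k] for k in key_sequence]

def to_ui_coords(path, scale=100.0):
    """Map the keyboard-space path to UI coordinates by simple scaling."""
    return [(x * scale, y * scale) for (x, y) in path]

# Sliding a finger along the home row yields a horizontal movement path:
ui_path = to_ui_coords(movement_path("asdf"))
```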
-
Patent number: 8570320
Abstract: Use of a 3D environment model in gameplay is described. In an embodiment, a mobile depth camera is used to capture a series of depth images as it is moved around and a dense 3D model of the environment is generated from this series of depth images. This dense 3D model is incorporated within an interactive application, such as a game. The mobile depth camera is then placed in a static position for an interactive phase, which in some examples is gameplay, and the system detects motion of a user within a part of the environment from a second series of depth images captured by the camera. This motion provides a user input to the interactive application, such as a game. In further embodiments, automatic recognition and identification of objects within the 3D model may be performed and these identified objects then change the way that the interactive application operates.
Type: Grant
Filed: January 31, 2011
Date of Patent: October 29, 2013
Assignee: Microsoft Corporation
Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
-
Publication number: 20130241806
Abstract: An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head mounted display.
Type: Application
Filed: May 6, 2013
Publication date: September 19, 2013
Applicant: Microsoft Corporation
Inventors: Steven N. Bathiche, Hrvoje Benko, Stephen E. Hodges, Shahram Izadi, David Alexander Butler, William Ben Kunz, Shawn R. LeProwse
-
Publication number: 20130244782
Abstract: Real-time camera tracking using depth maps is described. In an embodiment depth map frames are captured by a mobile depth camera at over 20 frames per second and used to dynamically update in real-time a set of registration parameters which specify how the mobile depth camera has moved. In examples the real-time camera tracking output is used for computer game applications and robotics. In an example, an iterative closest point process is used with projective data association and a point-to-plane error metric in order to compute the updated registration parameters. In an example, a graphics processing unit (GPU) implementation is used to optimize the error metric in real-time. In some embodiments, a dense 3D model of the mobile camera environment is used.
Type: Application
Filed: February 23, 2013
Publication date: September 19, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
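The point-to-plane error metric mentioned in the abstract can be written down compactly; the sketch below (not the GPU implementation) assumes correspondences have already been found, e.g. by projective data association:

```python
# Sketch: point-to-plane error for ICP. Each source point is paired with
# a destination point and that point's surface normal; the residual is
# the offset measured along the normal, and the error is the sum of its
# squares over all correspondences.
import numpy as np

def point_to_plane_error(src_pts, dst_pts, dst_normals):
    """Sum of squared distances from source points to the tangent planes
    of their associated destination points."""
    residuals = np.einsum("ij,ij->i", src_pts - dst_pts, dst_normals)
    return float(np.sum(residuals ** 2))

# A source point 0.1 m in front of a plane whose normal faces along +z:
err = point_to_plane_error(
    np.array([[0.0, 0.0, 1.1]]),
    np.array([[0.0, 0.0, 1.0]]),
    np.array([[0.0, 0.0, 1.0]]),
)
```

Minimizing this error over the camera pose, iterating the association and minimization steps, yields the updated registration parameters.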
-
Publication number: 20130234992
Abstract: In some implementations, a touch point on a surface of a touchscreen device may be determined. An image of a region of space above the surface and surrounding the touch point may be determined. The image may include a brightness gradient that captures a brightness of objects above the surface. A binary image that includes one or more binary blobs may be created based on a brightness of portions of the image. A determination may be made as to which of the one or more binary blobs are connected to each other to form portions of a particular user. A determination may be made that the particular user generated the touch point.
Type: Application
Filed: April 22, 2013
Publication date: September 12, 2013
Applicant: Microsoft Corporation
Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
-
Patent number: 8502816
Abstract: A tabletop display providing multiple views to users is described. In an embodiment the display comprises a rotatable view-angle restrictive filter and a display system. The display system displays a sequence of images synchronized with the rotation of the filter to provide multiple views according to viewing angle. These multiple views provide a user with a 3D display or with personalized content which is not visible to a user at a sufficiently different viewing angle. In some embodiments, the display comprises a diffuser layer on which the sequence of images is displayed. In further embodiments, the diffuser is switchable between a diffuse state when images are displayed and a transparent state when imaging beyond the surface can be performed. The device may form part of a tabletop with a touch-sensitive surface. Detected touch events and images captured through the surface may be used to modify the images being displayed.
Type: Grant
Filed: December 2, 2010
Date of Patent: August 6, 2013
Assignee: Microsoft Corporation
Inventors: David Alexander Butler, Stephen Edward Hodges, Shahram Izadi, Nicolas Villar, Stuart Taylor, David Molyneaux, Otmar Hilliges
-
Patent number: 8489569
Abstract: Retrieval and display of digital media items is described. For example, the digital media items may be photographs, videos, audio files, emails, text documents or parts of these. In an embodiment a dedicated apparatus having a touch display screen is provided in a form designed to look like a domestic fish tank. In an embodiment graphical animated agents are depicted on the display as fish whose motion varies according to at least one behavior parameter which is pseudo random. In embodiments, the agents have associated search criteria and when a user selects one or more agents the associated search criteria are used in a retrieval operation to retrieve digital media items from a store. In some embodiments media items are communicated between the apparatus and a portable communications device using a communications link established by tapping the portable device against the media retrieval and display apparatus.
Type: Grant
Filed: December 8, 2008
Date of Patent: July 16, 2013
Assignee: Microsoft Corporation
Inventors: David Kirk, Nicolas Villar, Richard Banks, David Alexander Butler, Shahram Izadi, Abigail Sellen, Stuart Taylor
-
Patent number: 8471814
Abstract: User interface control using a keyboard is described. In an embodiment, a user interface displayed on a display device is controlled using a computer connected to a keyboard. The keyboard has a plurality of alphanumeric keys that can be used for text entry. The computer receives data comprising a sequence of key-presses from the keyboard, and generates for each key-press a physical location on the keyboard. The relative physical locations of the key-presses are compared to calculate a movement path over the keyboard. The movement path describes the path of a user's digit over the keyboard. The movement path is mapped to a sequence of coordinates in the user interface, and the movement of an object displayed in the user interface is controlled in accordance with the sequence of coordinates.
Type: Grant
Filed: February 26, 2010
Date of Patent: June 25, 2013
Assignee: Microsoft Corporation
Inventors: Harper LaFave, Stephen Hodges, James Scott, Shahram Izadi, David Molyneaux, Nicolas Villar, David Alexander Butler, Mike Hazas
-
Patent number: 8436789
Abstract: An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head mounted display.
Type: Grant
Filed: January 16, 2009
Date of Patent: May 7, 2013
Assignee: Microsoft Corporation
Inventors: Steven N. Bathiche, Hrvoje Benko, Stephen E. Hodges, Shahram Izadi, David Alexander Butler, William Ben Kunz, Shawn R. LeProwse
-
Patent number: 8432366
Abstract: The claimed subject matter provides a system and/or a method that facilitates distinguishing input among one or more users in a surface computing environment. A variety of information can be obtained and analyzed to infer an association between a particular input and a particular user. Touch point information can be acquired from a surface wherein the touch point information relates to a touch point. In addition, one or more environmental sensors can monitor the surface computing environment and provide environmental information. The touch point information and the environmental information can be analyzed to determine direction of inputs, location of users, movement of users, and so on. Individual analysis results can be correlated and/or aggregated to generate an inference of association between a touch point and a user.
Type: Grant
Filed: March 3, 2009
Date of Patent: April 30, 2013
Assignee: Microsoft Corporation
Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
-
Patent number: 8432372
Abstract: A device is described which enables users to interact with software running on the device through gestures made in an area adjacent to the device. In an embodiment, a portable computing device has proximity sensors arranged on an area of its surface which is not a display, such as on the sides of the device. These proximity sensors define an area of interaction adjacent to the device. User gestures in this area of interaction are detected by creating sensing images from data received from each of the sensors and then analyzing sequences of these images to detect gestures. The detected gestures may be mapped to particular inputs to a software program running on the device and therefore a user can control the operation of the program through gestures.
Type: Grant
Filed: November 30, 2007
Date of Patent: April 30, 2013
Assignee: Microsoft Corporation
Inventors: David Alexander Butler, Shahram Izadi, Stephen E. Hodges, Malcolm Hall
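One way to picture the sensing-image analysis (a loose sketch only; the sensor layout, frame data, and thresholds below are invented, and real gesture detection would be richer):

```python
# Sketch: treat one row of proximity sensors as a 1D "sensing image" and
# detect a swipe by tracking the activation centroid across a sequence
# of such images.

def centroid(sensing_image):
    """Weighted centre of activation along the sensor strip, or None if
    nothing is sensed."""
    total = sum(sensing_image)
    if total == 0:
        return None
    return sum(i * v for i, v in enumerate(sensing_image)) / total

def detect_swipe(frames):
    """Classify a gesture from how the centroid moves over the frames."""
    cs = [c for c in (centroid(f) for f in frames) if c is not None]
    if len(cs) < 2:
        return "none"
    delta = cs[-1] - cs[0]
    if delta > 1:
        return "swipe-right"
    if delta < -1:
        return "swipe-left"
    return "hold"

# A nearby object moving left-to-right across four sensors:
frames = [[9, 1, 0, 0], [1, 9, 1, 0], [0, 1, 9, 1], [0, 0, 1, 9]]
gesture = detect_swipe(frames)
```

The detected gesture would then be mapped to an input of the software program running on the device.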
-
Patent number: 8401242
Abstract: Real-time camera tracking using depth maps is described. In an embodiment depth map frames are captured by a mobile depth camera at over 20 frames per second and used to dynamically update in real-time a set of registration parameters which specify how the mobile depth camera has moved. In examples the real-time camera tracking output is used for computer game applications and robotics. In an example, an iterative closest point process is used with projective data association and a point-to-plane error metric in order to compute the updated registration parameters. In an example, a graphics processing unit (GPU) implementation is used to optimize the error metric in real-time. In some embodiments, a dense 3D model of the mobile camera environment is used.
Type: Grant
Filed: January 31, 2011
Date of Patent: March 19, 2013
Assignee: Microsoft Corporation
Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
-
Patent number: 8400410
Abstract: Ferromagnetic user interfaces are described. In embodiments, user interface devices are described that can detect the location of movement on a user-touchable portion by sensing movement of a ferromagnetic material. In some embodiments sensors are arranged in a two-dimensional array, and the user interface device can determine the location of the movement in a plane substantially parallel to the two-dimensional array and the acceleration of movement substantially perpendicular to the two-dimensional array. In other embodiments, user interface devices are described that can cause a raised surface region to be formed on a ferrofluid layer of a user-touchable portion, which is detectable by the touch of a user. Embodiments describe how the raised surface region can be moved on the ferrofluid layer. Embodiments also describe how the raised surface region can be caused to vibrate.
Type: Grant
Filed: May 26, 2009
Date of Patent: March 19, 2013
Assignee: Microsoft Corporation
Inventors: Stuart Taylor, Jonathan Hook, Shahram Izadi, Nicolas Villar, David Alexander Butler, Stephen E. Hodges
-
Patent number: 8401225
Abstract: Moving object segmentation using depth images is described. In an example, a moving object is segmented from the background of a depth image of a scene received from a mobile depth camera. A previous depth image of the scene is retrieved, and compared to the current depth image using an iterative closest point algorithm. The iterative closest point algorithm includes a determination of a set of points that correspond between the current depth image and the previous depth image. During the determination of the set of points, one or more outlying points are detected that do not correspond between the two depth images, and the image elements at these outlying points are labeled as belonging to the moving object. In examples, the iterative closest point algorithm is executed as part of an algorithm for tracking the mobile depth camera, and hence the segmentation does not add substantial additional computational complexity.
Type: Grant
Filed: January 31, 2011
Date of Patent: March 19, 2013
Assignee: Microsoft Corporation
Inventors: Richard Newcombe, Shahram Izadi, Otmar Hilliges, David Kim, David Molyneaux, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
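The outlier-labelling idea can be sketched in a few lines (the threshold value and names are illustrative assumptions, not from the patent):

```python
# Sketch: during ICP data association, correspondences whose residual
# exceeds a compatibility threshold are treated as outliers, and their
# image elements are labelled as belonging to the moving object.

OUTLIER_THRESHOLD = 0.05  # metres; illustrative value

def segment_moving(residuals):
    """Label each correspondence: True = moving object, False = static
    background usable for camera tracking."""
    return [abs(r) > OUTLIER_THRESHOLD for r in residuals]

# Residuals already computed by the camera-tracking ICP step are reused,
# which is why segmentation adds little extra computation.
labels = segment_moving([0.001, 0.2, 0.003, 0.15])
```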