Patents by Inventor Shahram Izadi

Shahram Izadi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20140206443
    Abstract: Camera pose estimation for 3D reconstruction is described, for example, to enable position and orientation of a depth camera moving in an environment to be tracked for robotics, gaming and other applications. In various embodiments, depth observations from the mobile depth camera are aligned with surfaces of a 3D model of the environment in order to find an updated position and orientation of the mobile depth camera which facilitates the alignment. For example, the mobile depth camera is moved through the environment in order to build a 3D reconstruction of surfaces in the environment which may be stored as the 3D model. In examples, an initial estimate of the pose of the mobile depth camera is obtained and then updated by using a parallelized optimization process in real time.
    Type: Application
    Filed: January 24, 2013
    Publication date: July 24, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Toby Sharp, Andrew William Fitzgibbon, Shahram Izadi
  • Publication number: 20140208274
    Abstract: Methods and systems for controlling a computing-based device using both input received from a traditional input device (e.g. a keyboard) and hand gestures made on or near a reference object (e.g. the keyboard). In some examples, the hand gestures may comprise one or more hand touch gestures and/or one or more hand air gestures.
    Type: Application
    Filed: January 18, 2013
    Publication date: July 24, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Samuel Gavin Smyth, Peter John Ansell, Christopher Jozef O'Prey, Mitchel Alan Goldberg, Jamie Daniel Joseph Shotton, Toby Sharp, Shahram Izadi, Abigail Jane Sellen, Richard Malcolm Banks, Kenton O'Hara, Richard Harry Robert Harper, Eric John Greveson, David Alexander Butler, Stephen E Hodges
  • Patent number: 8780088
    Abstract: An infrared source is configured to illuminate the underside of one or more objects on or above a touchable surface of a touch panel. Infrared light reflected from the underside of the object(s) is detected by an infrared sensor integrated in the touch panel below the touchable surface.
    Type: Grant
    Filed: April 1, 2013
    Date of Patent: July 15, 2014
    Assignee: Microsoft Corporation
    Inventors: Willem den Boer, Steven N. Bathiche, Stephen Edward Hodges, Shahram Izadi
  • Publication number: 20140192158
    Abstract: The description relates to stereo image matching to determine depth of a scene as captured by images. More specifically, the described implementations can involve a two-stage approach where the first stage can compute depth at highly accurate but sparse feature locations. The second stage can compute a dense depth map using the first stage as initialization. This improves accuracy and robustness of the dense depth map.
    Type: Application
    Filed: January 4, 2013
    Publication date: July 10, 2014
    Applicant: Microsoft Corporation
    Inventors: Oliver Whyte, Adam G. Kirk, Shahram Izadi, Carsten Rother, Michael Bleyer, Christoph Rhemann
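The two-stage idea in this abstract — accurate depth at sparse feature locations, then a dense map seeded from them — can be illustrated with a minimal sketch. The interpolation below is purely illustrative and assumes stage one has already produced disparities at a few feature columns; the patent's actual dense-matching stage is far more involved.

```python
import numpy as np

def sparse_to_dense_init(width, feature_cols, feature_disps):
    """Interpolate sparse per-feature disparities across a scanline to
    initialize a dense disparity map (a stage-two starting point)."""
    cols = np.arange(width)
    return np.interp(cols, feature_cols, feature_disps)

# Stage one (assumed done): accurate disparities at sparse feature columns.
feature_cols = [0, 50, 99]
feature_disps = [10.0, 20.0, 12.0]

# Stage two initialization: a dense map seeded from the sparse estimates.
dense0 = sparse_to_dense_init(100, feature_cols, feature_disps)
print(dense0[50])   # -> 20.0 (exact at a feature column)
```

The dense refinement would then optimize this initialization, which is what gives the second stage its robustness.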
  • Publication number: 20140184749
    Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
    Type: Application
    Filed: December 28, 2012
    Publication date: July 3, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Otmar Hilliges, Malte Hanno Weiss, Shahram Izadi, David Kim, Carsten Curt Eckard Rother
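The photometric relationship this abstract exploits — image intensity as a function of lighting direction, surface normal and albedo — has a classic least-squares form under a Lambertian assumption. A minimal per-pixel sketch of that standard formulation, not the patent's method:

```python
import numpy as np

def photometric_normal(light_dirs, intensities):
    """Least-squares photometric stereo for one pixel: solve
    I = rho * (L @ n) for the scaled normal g = rho * n, given
    known unit lighting directions L (k x 3) and intensities I (k,)."""
    L = np.asarray(light_dirs, dtype=float)
    I = np.asarray(intensities, dtype=float)
    g, *_ = np.linalg.lstsq(L, I, rcond=None)
    rho = np.linalg.norm(g)          # albedo (per-channel, this recovers true color)
    n = g / rho                      # unit surface normal
    return n, rho

# Three known lighting directions and the observed intensities for a
# surface with normal (0, 0, 1) and albedo 0.5:
lights = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
n, rho = photometric_normal(lights, [0.0, 0.0, 0.5])
print(n, rho)   # normal ≈ [0, 0, 1], albedo ≈ 0.5
```

With three or more non-coplanar lighting directions per pixel, this recovers the fine per-patch normals the abstract describes.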
  • Patent number: 8760395
    Abstract: In one or more implementations, a static geometry model is generated, from one or more images of a physical environment captured using a camera, using one or more static objects to model corresponding one or more objects in the physical environment. Interaction of a dynamic object with at least one of the static objects is identified by analyzing at least one image and a gesture is recognized from the identified interaction of the dynamic object with the at least one of the static objects to initiate an operation of the computing device.
    Type: Grant
    Filed: May 31, 2011
    Date of Patent: June 24, 2014
    Assignee: Microsoft Corporation
    Inventors: David Kim, Otmar D. Hilliges, Shahram Izadi, Patrick L. Olivier, Jamie Daniel Joseph Shotton, Pushmeet Kohli, David G. Molyneaux, Stephen E. Hodges, Andrew W. Fitzgibbon
  • Patent number: 8711206
    Abstract: Mobile camera localization using depth maps is described for robotics, immersive gaming, augmented reality and other applications. In an embodiment a mobile depth camera is tracked in an environment at the same time as a 3D model of the environment is formed using the sensed depth data. In an embodiment, when camera tracking fails, this is detected and the camera is relocalized either by using previously gathered keyframes or in other ways. In an embodiment, loop closures are detected in which the mobile camera revisits a location, by comparing features of a current depth map with the 3D model in real time. In embodiments the detected loop closures are used to improve the consistency and accuracy of the 3D model of the environment.
    Type: Grant
    Filed: January 31, 2011
    Date of Patent: April 29, 2014
    Assignee: Microsoft Corporation
    Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
  • Patent number: 8704822
    Abstract: A volumetric display system which enables user interaction is described. In an embodiment, the system consists of a volumetric display and an optical system. The volumetric display creates a 3D light field of an object to be displayed and the optical system creates a copy of the 3D light field in a position away from the volumetric display and where a user can interact with the image of the object displayed. In an embodiment, the optical system involves a pair of parabolic mirror portions.
    Type: Grant
    Filed: December 17, 2008
    Date of Patent: April 22, 2014
    Assignee: Microsoft Corporation
    Inventors: David Alexander Butler, Stephen E. Hodges, Shahram Izadi, Stuart Taylor, Nicolas Villar
  • Publication number: 20140104274
    Abstract: An augmented reality system which enables grasping of virtual objects is described, such as to stack virtual cubes or to manipulate virtual objects in other ways. In various embodiments a user's hand or another real object is tracked in an augmented reality environment. In examples, the shape of the tracked real object is approximated using at least two different types of particles and the virtual objects are updated according to simulated forces exerted between the augmented reality environment and at least some of the particles. In various embodiments 3D positions of a first one of the types of particles, kinematic particles, are updated according to the tracked real object; and passive particles move with linked kinematic particles without penetrating virtual objects. In some examples a real-time optic flow process is used to track motion of the real object.
    Type: Application
    Filed: October 17, 2012
    Publication date: April 17, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, Malte Hanno Weiss
  • Publication number: 20140098018
    Abstract: A wearable sensor for tracking articulated body parts is described, such as a wrist-worn device which enables 3D tracking of fingers and optionally also the arm and hand without the need to wear a glove or markers on the hand. In an embodiment a camera captures images of an articulated part of a body of a wearer of the device and an articulated model of the body part is tracked in real time to enable gesture-based control of a separate computing device such as a smart phone, laptop computer or other computing device. In examples the device has a structured illumination source and a diffuse illumination source for illuminating the articulated body part.
    Type: Application
    Filed: October 4, 2012
    Publication date: April 10, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: David Kim, Shahram Izadi, Otmar Hilliges, David Alexander Butler, Stephen Hodges, Patrick Luke Olivier, Jiawen Chen, Iason Oikonomidis
  • Patent number: 8681127
    Abstract: In some implementations, a touch point on a surface of a touchscreen device may be determined. An image of a region of space above the surface and surrounding the touch point may be determined. The image may include a brightness gradient that captures a brightness of objects above the surface. A binary image that includes one or more binary blobs may be created based on a brightness of portions of the image. A determination may be made as to which of the one or more binary blobs are connected to each other to form portions of a particular user. A determination may be made that the particular user generated the touch point.
    Type: Grant
    Filed: April 22, 2013
    Date of Patent: March 25, 2014
    Assignee: Microsoft Corporation
    Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
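Grouping binary blobs that touch — the step this abstract uses to decide which blobs belong to one user — is a connected-component labeling problem. A minimal pure-Python sketch with 4-connectivity (the patent does not specify the labeling algorithm used):

```python
from collections import deque

def label_blobs(binary):
    """4-connected component labeling of a binary image; blobs that
    touch receive the same label (a proxy for 'same user')."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not labels[y][x]:
                current += 1                      # start a new component
                queue = deque([(y, x)])
                labels[y][x] = current
                while queue:                      # breadth-first flood fill
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

img = [[1, 1, 0, 0],
       [0, 0, 0, 1],
       [0, 0, 1, 1]]
labels, n = label_blobs(img)
print(n)   # -> 2 distinct connected blobs
```

In the patent's setting, a touch point falling inside one component can then be attributed to the user whose connected blobs contain it.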
  • Patent number: 8660303
    Abstract: A system and method for detecting and tracking targets including body parts and props is described. In one aspect, the disclosed technology acquires one or more depth images, generates one or more classification maps associated with one or more body parts and one or more props, tracks the one or more body parts using a skeletal tracking system, tracks the one or more props using a prop tracking system, and reports metrics regarding the one or more body parts and the one or more props. In some embodiments, feedback may occur between the skeletal tracking system and the prop tracking system.
    Type: Grant
    Filed: December 20, 2010
    Date of Patent: February 25, 2014
    Assignee: Microsoft Corporation
    Inventors: Shahram Izadi, Jamie Shotton, John Winn, Antonio Criminisi, Otmar Hilliges, Mat Cook, David Molyneaux
  • Publication number: 20140035901
    Abstract: Methods of animating objects using the human body are described. In an embodiment, a deformation graph is generated from a mesh which describes the object. Tracked skeleton data is received which is generated from sensor data and the tracked skeleton is then embedded in the graph. Subsequent motion captured by the sensor results in motion of the tracked skeleton, and this motion is used to define transformations on the deformation graph. The transformations are then applied to the mesh to generate an animation of the object which corresponds to the captured motion. In various examples, the mesh is generated by scanning an object and the deformation graph is generated using orientation-aware sampling such that nodes can be placed close together within the deformation graph where there are sharp corners or other features with high curvature in the object.
    Type: Application
    Filed: July 31, 2012
    Publication date: February 6, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Jiawen Chen, Shahram Izadi, Andrew William Fitzgibbon
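The core operation described here — applying per-node transformations of a deformation graph to the mesh — resembles embedded deformation, where each vertex blends the affine transforms of nearby graph nodes. A minimal sketch under that assumption (the node transforms and weights below are illustrative, not the patent's values):

```python
def deform_vertex(vertex, nodes, weights):
    """Blend per-node affine transforms (R_j, t_j) at node positions g_j
    onto one mesh vertex: v' = sum_j w_j * (R_j (v - g_j) + g_j + t_j)."""
    vx, vy, vz = vertex
    out = [0.0, 0.0, 0.0]
    for (R, t, g), w in zip(nodes, weights):
        local = [vx - g[0], vy - g[1], vz - g[2]]   # vertex relative to node
        for i in range(3):
            rotated = sum(R[i][k] * local[k] for k in range(3))
            out[i] += w * (rotated + g[i] + t[i])
    return tuple(out)

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# One node at the origin translating by +1 in x, with full weight:
v = deform_vertex((0.5, 0.0, 0.0), [(IDENTITY, (1.0, 0.0, 0.0), (0.0, 0.0, 0.0))], [1.0])
print(v)   # -> (1.5, 0.0, 0.0)
```

In the described system, the skeleton motion would supply the node transforms and each vertex would blend its few nearest nodes, which is why orientation-aware node placement near high-curvature features matters.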
  • Patent number: 8638985
    Abstract: Techniques for human body pose estimation are disclosed herein. Images such as depth images, silhouette images, or volumetric images may be generated and pixels or voxels of the images may be identified. The techniques may process the pixels or voxels to determine a probability that each pixel or voxel is associated with a segment of a body captured in the image or to determine a three-dimensional representation for each pixel or voxel that is associated with a location on a canonical body. These probabilities or three-dimensional representations may then be utilized along with the images to construct a posed model of the body captured in the image.
    Type: Grant
    Filed: March 3, 2011
    Date of Patent: January 28, 2014
    Assignee: Microsoft Corporation
    Inventors: Jamie Daniel Joseph Shotton, Shahram Izadi, Otmar Hilliges, David Kim, David Geoffrey Molyneaux, Matthew Darius Cook, Pushmeet Kohli, Antonio Criminisi, Ross Brook Girshick, Andrew William Fitzgibbon
  • Patent number: 8587583
    Abstract: Three-dimensional environment reconstruction is described. In an example, a 3D model of a real-world environment is generated in a 3D volume made up of voxels stored on a memory device. The model is built from data describing a camera location and orientation, and a depth image with pixels indicating a distance from the camera to a point in the environment. A separate execution thread is assigned to each voxel in a plane of the volume. Each thread uses the camera location and orientation to determine a corresponding depth image location for its associated voxel, determines a factor relating to the distance between the associated voxel and the point in the environment at the corresponding location, and updates a stored value at the associated voxel using the factor. Each thread iterates through an equivalent voxel in the remaining planes of the volume, repeating the process to update the stored value.
    Type: Grant
    Filed: January 31, 2011
    Date of Patent: November 19, 2013
    Assignee: Microsoft Corporation
    Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
  • Patent number: 8581852
    Abstract: Touch detection systems and methods are described. The system comprises a light guiding sheet, a light source, a reflective layer and a detector. When a fingertip or other suitable object is pressed against the light guiding sheet, light which is undergoing total internal reflection within the sheet is scattered. The scattered light is reflected by the reflective layer and detected by the detector. In an embodiment, the light is infra-red light. The touch detection system may, in some embodiments, be placed on a display and the touch events used to control the display.
    Type: Grant
    Filed: November 15, 2007
    Date of Patent: November 12, 2013
    Assignee: Microsoft Corporation
    Inventors: Shahram Izadi, Stuart Taylor, Stephen E. Hodges
  • Publication number: 20130290910
    Abstract: User interface control using a keyboard is described. In an embodiment, a user interface displayed on a display device is controlled using a computer connected to a keyboard. The keyboard has a plurality of alphanumeric keys that can be used for text entry. The computer receives data comprising a sequence of key-presses from the keyboard, and generates for each key-press a physical location on the keyboard. The relative physical locations of the key-presses are compared to calculate a movement path over the keyboard. The movement path describes the path of a user's digit over the keyboard. The movement path is mapped to a sequence of coordinates in the user interface, and the movement of an object displayed in the user interface is controlled in accordance with the sequence of coordinates.
    Type: Application
    Filed: June 24, 2013
    Publication date: October 31, 2013
    Inventors: Harper LaFave, Stephen Hodges, James Scott, Shahram Izadi, David Molyneaux, Nicolas Villar, David Alexander Butler, Mike Hazas
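The key-press-to-path mapping this abstract describes can be sketched with a toy key layout. The key coordinates and scale factor below are hypothetical; a real implementation would use the keyboard's actual physical geometry:

```python
# Hypothetical physical key centers (arbitrary units), including the
# row stagger of a typical keyboard.
KEY_POS = {
    'q': (0.0, 0.0), 'w': (1.0, 0.0), 'e': (2.0, 0.0),
    'a': (0.3, 1.0), 's': (1.3, 1.0), 'd': (2.3, 1.0),
}

def movement_path(key_presses, scale=10.0):
    """Map a sequence of key-presses to physical key locations, then
    scale the relative locations into a path of UI coordinates."""
    return [(x * scale, y * scale) for x, y in (KEY_POS[k] for k in key_presses)]

# Sliding a finger across q-w-e yields a horizontal movement path:
path = movement_path(['q', 'w', 'e'])
print(path)   # -> [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
```

The resulting coordinate sequence would then drive an on-screen object, letting the keyboard double as a coarse pointing surface.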
  • Patent number: 8570320
    Abstract: Use of a 3D environment model in gameplay is described. In an embodiment, a mobile depth camera is used to capture a series of depth images as it is moved around and a dense 3D model of the environment is generated from this series of depth images. This dense 3D model is incorporated within an interactive application, such as a game. The mobile depth camera is then placed in a static position for an interactive phase, which in some examples is gameplay, and the system detects motion of a user within a part of the environment from a second series of depth images captured by the camera. This motion provides a user input to the interactive application, such as a game. In further embodiments, automatic recognition and identification of objects within the 3D model may be performed and these identified objects then change the way that the interactive application operates.
    Type: Grant
    Filed: January 31, 2011
    Date of Patent: October 29, 2013
    Assignee: Microsoft Corporation
    Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
  • Publication number: 20130241806
    Abstract: An image orientation system is provided wherein images (rays of lights) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head mounted display.
    Type: Application
    Filed: May 6, 2013
    Publication date: September 19, 2013
    Applicant: Microsoft Corporation
    Inventors: Steven N. Bathiche, Hrvoje Benko, Stephen E. Hodges, Shahram Izadi, David Alexander Butler, William Ben Kunz, Shawn R. LeProwse
  • Publication number: 20130244782
    Abstract: Real-time camera tracking using depth maps is described. In an embodiment depth map frames are captured by a mobile depth camera at over 20 frames per second and used to dynamically update in real-time a set of registration parameters which specify how the mobile depth camera has moved. In examples the real-time camera tracking output is used for computer game applications and robotics. In an example, an iterative closest point process is used with projective data association and a point-to-plane error metric in order to compute the updated registration parameters. In an example, a graphics processing unit (GPU) implementation is used to optimize the error metric in real-time. In some embodiments, a dense 3D model of the mobile camera environment is used.
    Type: Application
    Filed: February 23, 2013
    Publication date: September 19, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
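The point-to-plane error metric named in this abstract measures, for each associated point pair, the component of the displacement along the destination surface normal. A minimal sketch of the metric itself (the projective data association and the GPU optimization of this error are not shown):

```python
import numpy as np

def point_to_plane_error(src_pts, dst_pts, dst_normals):
    """Sum of squared point-to-plane distances used as the ICP error
    metric: sum_i ((s_i - d_i) . n_i) ** 2."""
    diffs = np.asarray(src_pts, dtype=float) - np.asarray(dst_pts, dtype=float)
    # Per-pair dot product of the displacement with the surface normal:
    residuals = np.einsum('ij,ij->i', diffs, np.asarray(dst_normals, dtype=float))
    return float(np.sum(residuals ** 2))

# A source point 0.2 m above a plane with normal +z, associated with
# the surface point directly beneath it:
err = point_to_plane_error([[0.0, 0.0, 1.2]], [[0.0, 0.0, 1.0]], [[0.0, 0.0, 1.0]])
print(err)   # ≈ 0.04 (0.2 squared)
```

Minimizing this quantity over the camera pose, rather than raw point-to-point distances, lets surfaces slide tangentially during alignment, which typically speeds ICP convergence on planar scenes.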