Patents by Inventor Shahram Izadi

Shahram Izadi has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20120306850
    Abstract: A system and method for providing an augmented reality environment in which the environmental mapping process is decoupled from the localization processes performed by one or more mobile devices is described. In some embodiments, an augmented reality system includes a mapping system with independent sensing devices for mapping a particular real-world environment and one or more mobile devices. Each of the one or more mobile devices utilizes a separate asynchronous computing pipeline for localizing the mobile device and rendering virtual objects from a point of view of the mobile device. This distributed approach provides an efficient way for supporting mapping and localization processes for a large number of mobile devices, which are typically constrained by form factor and battery life limitations.
    Type: Application
    Filed: June 2, 2011
    Publication date: December 6, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Alexandru Balan, Jason Flaks, Steve Hodges, Michael Isard, Oliver Williams, Paul Barham, Shahram Izadi, Otmar Hilliges, David Molyneaux, David Kim
  • Publication number: 20120306734
    Abstract: In one or more implementations, a static geometry model is generated, from one or more images of a physical environment captured using a camera, using one or more static objects to model corresponding one or more objects in the physical environment. Interaction of a dynamic object with at least one of the static objects is identified by analyzing at least one image and a gesture is recognized from the identified interaction of the dynamic object with the at least one of the static objects to initiate an operation of the computing device.
    Type: Application
    Filed: May 31, 2011
    Publication date: December 6, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: David Kim, Otmar D. Hilliges, Shahram Izadi, Patrick L. Olivier, Jamie Daniel Joseph Shotton, Pushmeet Kohli, David G. Molyneaux, Stephen E. Hodges, Andrew W. Fitzgibbon
  • Publication number: 20120306876
    Abstract: Generating computer models of 3D objects is described. In one example, depth images of an object captured by a substantially static depth camera are used to generate the model, which is stored in a memory device in a three-dimensional volume. Portions of the depth image determined to relate to the background are removed to leave a foreground depth image. The position and orientation of the object in the foreground depth image is tracked by comparison to a preceding depth image, and the foreground depth image is integrated into the volume by using the position and orientation to determine where to add data derived from the foreground depth image into the volume. In examples, the object is hand-rotated by a user before the depth camera. Hands that occlude the object are integrated out of the model as they do not move in sync with the object due to re-gripping.
    Type: Application
    Filed: June 6, 2011
    Publication date: December 6, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Jamie Daniel Joseph Shotton, Shahram Izadi, Otmar Hilliges, David Kim, David Molyneaux, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges
  • Patent number: 8325020
    Abstract: Methods and apparatus for uniquely identifying wireless devices in close physical proximity are described. When two wireless devices are brought into close proximity, one of the devices displays an optical indicator, such as a light pattern. This device then sends messages to other devices which are within wireless range to cause them to use any light sensor to detect a signal. In an embodiment, the light sensor is a camera and the detected signal is an image captured by the camera. Each device then sends data identifying what was detected back to the device displaying the pattern. By analyzing this data, the first device can determine which other device detected the indicator that it displayed and therefore determine that this device is within close physical proximity. In an example, the first device is an interactive surface arranged to identify the wireless addresses of devices which are placed on the surface.
    Type: Grant
    Filed: January 31, 2011
    Date of Patent: December 4, 2012
    Assignee: Microsoft Corporation
    Inventors: Shahram Izadi, Malcolm Hall, Stephen Hodges, William A. S. Buxton, David Alexander Butler
  • Publication number: 20120299837
    Abstract: A touch sensor provides frames of touch sensor data, as the touch sensor is sampled over time. Spatial and temporal features of the touch sensor data from a plurality of frames, and contacts and attributes of the contacts in previous frames, are processed to identify contacts and attributes of the contacts in a current frame. Attributes of the contacts can include whether the contact is reliable, shrinking, moving, or related to a fingertip touch. The characteristics of contacts can include information about the shape and rate of change of the contact, including but not limited to a sum of its pixels, its shape, size and orientation, motion, average intensities and aspect ratio.
    Type: Application
    Filed: May 24, 2011
    Publication date: November 29, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Hrvoje Benko, John Miller, Shahram Izadi, Andy Wilson, Peter Ansell, Steve Hodges
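The abstract above describes extracting per-contact attributes (pixel sum, shape, size, aspect ratio) from each sampled frame. A minimal sketch of that idea, assuming a simple thresholded frame and 4-connected component grouping (the threshold, feature names, and connectivity choice are illustrative assumptions, not the patented method):

```python
from collections import deque
import numpy as np

def contact_features(frame, threshold=0.5):
    """Extract per-contact features from one touch-sensor frame.

    Pixels above `threshold` are grouped into 4-connected components
    (one per contact); for each contact we report the pixel sum, area,
    centroid and bounding-box aspect ratio, a few of the per-contact
    attributes the abstract mentions.
    """
    mask = frame > threshold
    seen = np.zeros_like(mask, dtype=bool)
    contacts = []
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if not mask[sy, sx] or seen[sy, sx]:
                continue
            # Breadth-first flood fill to collect this contact's pixels.
            pixels = []
            queue = deque([(sy, sx)])
            seen[sy, sx] = True
            while queue:
                y, x = queue.popleft()
                pixels.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                        seen[ny, nx] = True
                        queue.append((ny, nx))
            ys = np.array([p[0] for p in pixels])
            xs = np.array([p[1] for p in pixels])
            contacts.append({
                "sum": float(frame[ys, xs].sum()),
                "area": len(pixels),
                "centroid": (float(ys.mean()), float(xs.mean())),
                "aspect": (np.ptp(ys) + 1) / (np.ptp(xs) + 1),
            })
    return contacts
```

Temporal attributes such as "shrinking" or "moving" would then follow by comparing these features against the contacts found in the previous frame.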
  • Publication number: 20120242609
    Abstract: Existing tools for organizing family memories offer few possibilities for easily integrating both physical and digital materials in order to produce a single archive for a family (or other group of users). This also applies to archiving of physical objects and digital media in general (even for applications outside the field of family use). An archiving system is described which incorporates at least one image capture device, a display, a sensing apparatus arranged to detect user input associated with the display, a processor and memory, and a receptacle for holding digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. The image capture device is operable to capture digital images of physical objects for archiving. The receptacle comprises a data transmission apparatus for automatically transferring data with the digital media storage devices and optionally also a power charging apparatus.
    Type: Application
    Filed: June 11, 2012
    Publication date: September 27, 2012
    Applicant: Microsoft Corporation
    Inventors: Shahram Izadi, Abigail J. Sellen, Richard M. Banks, Stuart Taylor, Stephen E. Hodges, Alex Butler
  • Patent number: 8272743
    Abstract: The techniques described herein provide a surface computing device that includes a surface layer configured to be in a transparent state and a diffuse state. In the diffuse state, an image can be projected onto the surface. In the transparent state, an image can be projected through the surface.
    Type: Grant
    Filed: October 24, 2011
    Date of Patent: September 25, 2012
    Assignee: Microsoft Corporation
    Inventors: Stuart Taylor, Shahram Izadi, Daniel A. Rosenfeld, Stephen Hodges, David Alexander Butler, James Scott, Nicolas Villar
  • Patent number: 8269746
    Abstract: A touch panel is arranged to enable communication using infrared signals with nearby devices. The touch panel includes an array of infrared sensors, arranged parallel to the touchable surface of the panel and at least one of the sensors is capable of detecting an infrared signal received from a nearby device.
    Type: Grant
    Filed: March 29, 2007
    Date of Patent: September 18, 2012
    Assignee: Microsoft Corporation
    Inventors: Stephen Hodges, Shahram Izadi, David Alexander Butler, Alban Rrustemi
  • Publication number: 20120206330
    Abstract: A multi-touch orientation sensing input device may enhance task performance efficiency. The multi-touch orientation sensing input device may include a device body that is partially or completely enclosed by a multi-touch sensor. The multi-touch orientation sensing input device may further include an inertia measurement unit that is disposed on the device body. The inertia measurement unit may measure a tilt angle of the device body with respect to a horizontal surface, as well as a roll angle of the device body along a length-wise axis of the device body with respect to an initial point on the device body.
    Type: Application
    Filed: February 11, 2011
    Publication date: August 16, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Xiang Cao, Minghui Sun, Shahram Izadi, Hrvoje Benko, Kenneth P. Hinckley
  • Publication number: 20120196679
    Abstract: Real-time camera tracking using depth maps is described. In an embodiment depth map frames are captured by a mobile depth camera at over 20 frames per second and used to dynamically update in real-time a set of registration parameters which specify how the mobile depth camera has moved. In examples the real-time camera tracking output is used for computer game applications and robotics. In an example, an iterative closest point process is used with projective data association and a point-to-plane error metric in order to compute the updated registration parameters. In an example, a graphics processing unit (GPU) implementation is used to optimize the error metric in real-time. In some embodiments, a dense 3D model of the mobile camera environment is used.
    Type: Application
    Filed: January 31, 2011
    Publication date: August 2, 2012
    Applicant: Microsoft Corporation
    Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
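The tracking pipeline in the abstract above hinges on a point-to-plane error metric evaluated over projectively associated correspondences. A minimal sketch of that metric (evaluation only; the function names are illustrative assumptions, and the patented system linearises and minimises this on a GPU rather than merely evaluating it):

```python
import numpy as np

def point_to_plane_error(src_pts, dst_pts, dst_normals):
    """Sum of squared point-to-plane residuals over correspondences.

    For each correspondence (s, d, n) the residual is dot(s - d, n):
    the distance from the source point to the tangent plane at its
    matched destination point. ICP variants like the one described
    above minimise this quantity over a rigid-body transform.
    """
    residuals = np.einsum('ij,ij->i', src_pts - dst_pts, dst_normals)
    return float(np.sum(residuals ** 2))

def apply_rigid(pose, pts):
    """Apply a 4x4 rigid-body transform to an (N, 3) array of points."""
    return pts @ pose[:3, :3].T + pose[:3, 3]
```

In the full algorithm the correspondences come from projective data association (projecting each vertex of the current depth map into the previous frame) and the registration parameters are updated at frame rate.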
  • Publication number: 20120195471
    Abstract: Moving object segmentation using depth images is described. In an example, a moving object is segmented from the background of a depth image of a scene received from a mobile depth camera. A previous depth image of the scene is retrieved, and compared to the current depth image using an iterative closest point algorithm. The iterative closest point algorithm includes a determination of a set of points that correspond between the current depth image and the previous depth image. During the determination of the set of points, one or more outlying points are detected that do not correspond between the two depth images, and the image elements at these outlying points are labeled as belonging to the moving object. In examples, the iterative closest point algorithm is executed as part of an algorithm for tracking the mobile depth camera, and hence the segmentation does not add substantial additional computational complexity.
    Type: Application
    Filed: January 31, 2011
    Publication date: August 2, 2012
    Applicant: Microsoft Corporation
    Inventors: Richard Newcombe, Shahram Izadi, Otmar Hilliges, David Kim, David Molyneaux, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
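The segmentation step this abstract describes falls out of ICP almost for free: points that fail to find a correspondence within tolerance are outliers, and those image elements are labelled as the moving object. A minimal sketch, assuming the two point sets are already in a common frame and a hypothetical residual threshold:

```python
import numpy as np

def label_moving(curr_pts, prev_pts, threshold=0.05):
    """Label ICP outliers as belonging to the moving object.

    curr_pts, prev_pts: (N, 3) arrays of tentatively corresponding
    points from the current and previous depth images. Correspondences
    whose residual exceeds `threshold` (metres here, an illustrative
    value) would be rejected by ICP; per the abstract, those image
    elements are labelled as the moving object rather than discarded.
    """
    residual = np.linalg.norm(curr_pts - prev_pts, axis=1)
    return residual > threshold   # True = moving object, False = background
```

Because the residuals are already computed while tracking the camera, this labelling adds essentially no extra cost, which is the abstract's point about computational complexity.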
  • Publication number: 20120194650
    Abstract: Systems and methods for reducing interference between multiple infra-red depth cameras are described. In an embodiment, the system comprises multiple infra-red sources, each of which projects a structured light pattern into the environment. A controller is used to control the sources in order to reduce the interference caused by overlapping light patterns. Various methods are described including: cycling between the different sources, where the cycle used may be fixed or may change dynamically based on the scene detected using the cameras; setting the wavelength of each source so that overlapping patterns are at different wavelengths; moving source-camera pairs in independent motion patterns; and adjusting the shape of the projected light patterns to minimize overlap. These methods may also be combined in any way. In another embodiment, the system comprises a single source and a mirror system is used to cast the projected structured light pattern around the environment.
    Type: Application
    Filed: January 31, 2011
    Publication date: August 2, 2012
    Applicant: Microsoft Corporation
    Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
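The simplest of the interference-reduction strategies listed above is cycling between sources so only one structured-light pattern is projected at a time. A toy sketch of that fixed round-robin schedule (the function name and slot model are illustrative assumptions):

```python
def round_robin_schedule(num_sources, num_slots):
    """Time-division schedule for multiple IR pattern projectors.

    Returns, for each capture slot, the index of the single source
    switched on during that slot; all other sources are off, so the
    projected patterns never overlap in time. The dynamic variant in
    the abstract would instead reorder or reweight slots based on the
    scene detected by the cameras.
    """
    return [slot % num_sources for slot in range(num_slots)]
```

The other strategies (per-source wavelengths, independently moving source-camera pairs, shaped patterns) trade this temporal multiplexing for spectral or spatial separation.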
  • Publication number: 20120198103
    Abstract: A modular development platform is described which enables creation of reliable, compact, physically robust and power efficient embedded device prototypes. The platform consists of a base module, which holds a processor, and one or more peripheral modules, each having an interface element. The base module and the peripheral modules may be electrically and/or physically connected together. The base module communicates with peripheral modules using packets of data with an addressing portion which identifies the peripheral module that is the intended recipient of the data packet.
    Type: Application
    Filed: April 4, 2012
    Publication date: August 2, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Stephen E. Hodges, David Alexander Butler, Shahram Izadi, Chih-Chieh Han
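The packet addressing scheme described above can be illustrated with a trivial framing sketch. The wire format here (one address byte, one length byte, then the payload) is an assumption for illustration, not the platform's actual protocol:

```python
import struct

def encode_packet(module_addr, payload):
    """Frame a payload with a 1-byte module address and 1-byte length.

    The address byte plays the role of the addressing portion in the
    abstract: it identifies which peripheral module is the intended
    recipient of the data packet.
    """
    if not (0 <= module_addr <= 0xFF and len(payload) <= 0xFF):
        raise ValueError("address or payload out of range")
    return struct.pack("BB", module_addr, len(payload)) + payload

def decode_packet(frame):
    """Inverse of encode_packet; returns (module_addr, payload)."""
    module_addr, length = struct.unpack("BB", frame[:2])
    payload = frame[2:2 + length]
    if len(payload) != length:
        raise ValueError("truncated frame")
    return module_addr, payload
```

A peripheral module would simply ignore any frame whose address byte does not match its own.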
  • Publication number: 20120194516
    Abstract: Three-dimensional environment reconstruction is described. In an example, a 3D model of a real-world environment is generated in a 3D volume made up of voxels stored on a memory device. The model is built from data describing a camera location and orientation, and a depth image with pixels indicating a distance from the camera to a point in the environment. A separate execution thread is assigned to each voxel in a plane of the volume. Each thread uses the camera location and orientation to determine a corresponding depth image location for its associated voxel, determines a factor relating to the distance between the associated voxel and the point in the environment at the corresponding location, and updates a stored value at the associated voxel using the factor. Each thread iterates through an equivalent voxel in the remaining planes of the volume, repeating the process to update the stored value.
    Type: Application
    Filed: January 31, 2011
    Publication date: August 2, 2012
    Applicant: Microsoft Corporation
    Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
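The per-voxel update the abstract describes — each thread projecting its voxel into the depth image, computing a distance factor, and folding it into the stored value — is the classic truncated signed distance function (TSDF) fusion step. A minimal vectorised sketch for the column of voxels one thread would own (parameter values and function name are illustrative assumptions):

```python
import numpy as np

def fuse_tsdf(tsdf, weight, sdf_obs, trunc=0.1, max_weight=64.0):
    """Fuse one signed-distance observation into a column of voxels.

    tsdf, weight: the stored volume values for this thread's voxels.
    sdf_obs: signed distance from the surface for each voxel, obtained
    by projecting the voxel centre into the depth image using the
    camera location and orientation (NaN where the voxel projects
    outside the image). Each valid observation is truncated and folded
    into a running weighted average.
    """
    d = np.clip(sdf_obs / trunc, -1.0, 1.0)           # truncated SDF
    valid = ~np.isnan(sdf_obs) & (sdf_obs > -trunc)   # skip voxels far behind the surface
    new_tsdf = np.where(valid, (tsdf * weight + d) / (weight + 1.0), tsdf)
    new_weight = np.where(valid, np.minimum(weight + 1.0, max_weight), weight)
    return new_tsdf, new_weight
```

Assigning one GPU thread per voxel in a plane and iterating it through the remaining planes, as the abstract describes, makes every column update independent and therefore trivially parallel.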
  • Publication number: 20120194644
    Abstract: Mobile camera localization using depth maps is described for robotics, immersive gaming, augmented reality and other applications. In an embodiment a mobile depth camera is tracked in an environment at the same time as a 3D model of the environment is formed using the sensed depth data. In an embodiment, when camera tracking fails, this is detected and the camera is relocalized either by using previously gathered keyframes or in other ways. In an embodiment, loop closures are detected in which the mobile camera revisits a location, by comparing features of a current depth map with the 3D model in real time. In embodiments the detected loop closures are used to improve the consistency and accuracy of the 3D model of the environment.
    Type: Application
    Filed: January 31, 2011
    Publication date: August 2, 2012
    Applicant: Microsoft Corporation
    Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
  • Publication number: 20120194517
    Abstract: Use of a 3D environment model in gameplay is described. In an embodiment, a mobile depth camera is used to capture a series of depth images as it is moved around and a dense 3D model of the environment is generated from this series of depth images. This dense 3D model is incorporated within an interactive application, such as a game. The mobile depth camera is then placed in a static position for an interactive phase, which in some examples is gameplay, and the system detects motion of a user within a part of the environment from a second series of depth images captured by the camera. This motion provides a user input to the interactive application, such as a game. In further embodiments, automatic recognition and identification of objects within the 3D model may be performed and these identified objects then change the way that the interactive application operates.
    Type: Application
    Filed: January 31, 2011
    Publication date: August 2, 2012
    Applicant: Microsoft Corporation
    Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
  • Publication number: 20120162117
    Abstract: The claimed subject matter provides a system and/or a method that facilitates enhancing interactive surface technologies for data manipulation. A surface detection component can employ a multiple-contact surfacing technology to detect a surface input, wherein the detected surface input enables a physical interaction with a portion of displayed data that represents a corporeal object. A physics engine can integrate a portion of Newtonian physics into the interaction with the portion of displayed data in order to model at least one quantity associated with the corporeal object, the quantity being at least one of a force, a mass, a velocity, or a friction.
    Type: Application
    Filed: February 29, 2012
    Publication date: June 28, 2012
    Applicant: Microsoft Corporation
    Inventors: Andrew David Wilson, Shahram Izadi, Armando Garcia-Mendoza, David Kirk, Otmar Hilliges
  • Patent number: 8199117
    Abstract: Existing tools for organizing family memories offer few possibilities for easily integrating both physical and digital materials in order to produce a single archive for a family (or other group of users). This also applies to archiving of physical objects and digital media in general (even for applications outside the field of family use). An archiving system is described which incorporates at least one image capture device, a display, a sensing apparatus arranged to detect user input associated with the display, a processor and memory, and a receptacle for holding digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. The image capture device is operable to capture digital images of physical objects for archiving. The receptacle comprises a data transmission apparatus for automatically transferring data with the digital media storage devices and optionally also a power charging apparatus.
    Type: Grant
    Filed: May 9, 2007
    Date of Patent: June 12, 2012
    Assignee: Microsoft Corporation
    Inventors: Shahram Izadi, Abigail J. Sellen, Richard M. Banks, Stuart Taylor, Stephen E. Hodges, Alex Butler
  • Publication number: 20120139841
    Abstract: A user interface device with actuated buttons is described. In an embodiment, the user interface device comprises two or more buttons and the motion of the buttons is controlled by actuators under software control such that their motion is inter-related. The position or motion of the buttons may provide a user with feedback about the current state of a software program they are using or provide them with enhanced user input functionality. In another embodiment, the ability to move the buttons is used to reconfigure the user interface buttons and this may be performed dynamically, based on the current state of the software program, or may be performed dependent upon the software program being used. The user interface device may be a peripheral device, such as a mouse or keyboard, or may be integrated within a computing device such as a games device.
    Type: Application
    Filed: December 1, 2010
    Publication date: June 7, 2012
    Applicant: Microsoft Corporation
    Inventors: Stuart Taylor, Jonathan Hook, David Alexander Butler, Shahram Izadi, Nicolas Villar, Stephen Edward Hodges
  • Publication number: 20120139897
    Abstract: A tabletop display providing multiple views to users is described. In an embodiment the display comprises a rotatable view-angle restrictive filter and a display system. The display system displays a sequence of images synchronized with the rotation of the filter to provide multiple views according to viewing angle. These multiple views provide a user with a 3D display or with personalized content which is not visible to a user at a sufficiently different viewing angle. In some embodiments, the display comprises a diffuser layer on which the sequence of images is displayed. In further embodiments, the diffuser is switchable between a diffuse state when images are displayed and a transparent state when imaging beyond the surface can be performed. The device may form part of a tabletop comprising a touch-sensitive surface. Detected touch events and images captured through the surface may be used to modify the images being displayed.
    Type: Application
    Filed: December 2, 2010
    Publication date: June 7, 2012
    Applicant: Microsoft Corporation
    Inventors: David Alexander Butler, Stephen Edward Hodges, Shahram Izadi, Nicolas Villar, Stuart Taylor, David Molyneaux, Otmar Hilliges