Patents by Inventor Shahram Izadi

Shahram Izadi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20120131514
Abstract: Gesture recognition is described. In one example, gestures performed by a user of an input device having a touch-sensitive portion are detected using a definition of a number of regions corresponding to zones on the touch-sensitive portion, each region being associated with a distinct set of gestures. Data describing movement of the user's digits on the touch-sensitive portion is received, and an associated region for the data is determined. The data is compared to the associated region's set of gestures, and a gesture applicable to the data is selected. A command associated with the selected gesture can then be executed. In an example, comparing the data to the set of gestures comprises positioning a threshold for each gesture relative to the start of the digit's movement. The digit's location is compared to each threshold to determine whether a threshold has been crossed, and, if so, the gesture associated with that threshold is selected.
    Type: Application
    Filed: November 19, 2010
    Publication date: May 24, 2012
    Applicant: Microsoft Corporation
    Inventors: Peter John Ansell, Shahram Izadi
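    The threshold-crossing selection described in the abstract above can be sketched in a few lines. The gesture names, axes, and offsets below are illustrative assumptions, not taken from the patent claims:

    ```python
    def select_gesture(start, samples, gestures):
        """Position one threshold per gesture relative to the start of the
        digit's movement, then select the gesture whose threshold the digit
        crosses first."""
        for x, y in samples:
            dx, dy = x - start[0], y - start[1]
            for name, (axis, offset) in gestures.items():
                delta = dx if axis == "x" else dy
                # A threshold is crossed when displacement along its axis
                # reaches the offset, in the offset's direction.
                if (offset > 0 and delta >= offset) or (offset < 0 and delta <= offset):
                    return name
        return None

    # Two illustrative gestures for one region: swipe right and swipe up.
    gestures = {"swipe_right": ("x", 40), "swipe_up": ("y", -40)}
    path = [(5, 0), (18, -3), (45, -5)]  # digit positions after touch-down
    print(select_gesture((0, 0), path, gestures))  # swipe_right
    ```

    Because each region carries its own gesture set, the same `select_gesture` call would simply be given a different `gestures` dictionary depending on which zone the touch started in.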
  • Publication number: 20120113223
    Abstract: Techniques for user-interaction in augmented reality are described. In one example, a direct user-interaction method comprises displaying a 3D augmented reality environment having a virtual object and a real first and second object controlled by a user, tracking the position of the objects in 3D using camera images, displaying the virtual object on the first object from the user's viewpoint, and enabling interaction between the second object and the virtual object when the first and second objects are touching. In another example, an augmented reality system comprises a display device that shows an augmented reality environment having a virtual object and a real user's hand, a depth camera that captures depth images of the hand, and a processor. The processor receives the images, tracks the hand pose in six degrees-of-freedom, and enables interaction between the hand and the virtual object.
    Type: Application
    Filed: November 5, 2010
    Publication date: May 10, 2012
    Applicant: Microsoft Corporation
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
  • Publication number: 20120113140
    Abstract: Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
    Type: Application
    Filed: November 5, 2010
    Publication date: May 10, 2012
    Applicant: Microsoft Corporation
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
  • Publication number: 20120117514
Abstract: Three-dimensional user interaction is described. In one example, a virtual environment having virtual objects and a virtual representation of a user's hand with digits formed from jointed portions is generated, a point on each digit of the user's hand is tracked, and the virtual representation's digits are controlled to correspond to those of the user. An algorithm is used to calculate positions for the jointed portions, and the physical forces acting between the virtual representation and objects are simulated. In another example, an interactive computer graphics system comprises a processor that generates the virtual environment, a display device that displays the virtual objects, and a camera that captures images of the user's hand. The processor uses the images to track the user's digits, computes the algorithm, and controls the display device to update the virtual objects on the display device by simulating the physical forces.
    Type: Application
    Filed: November 4, 2010
    Publication date: May 10, 2012
    Applicant: Microsoft Corporation
    Inventors: David Kim, Otmar Hilliges, Shahram Izadi, David Molyneaux, Stephen Edward Hodges
  • Patent number: 8175099
    Abstract: A modular development platform is described which enables creation of reliable, compact, physically robust and power efficient embedded device prototypes. The platform consists of a base module which holds the processor and one or more peripheral modules each having a peripheral device and an interface element. The modules can be electrically and physically connected together. The base module communicates with peripheral modules using packets of data with an addressing portion which identifies the peripheral module that is the intended recipient of the data packet.
    Type: Grant
    Filed: May 14, 2007
    Date of Patent: May 8, 2012
    Assignee: Microsoft Corporation
    Inventors: Stephen E. Hodges, David Alexander Butler, Shahram Izadi, Chih-Chieh Han
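    The addressed-packet scheme in the abstract above can be sketched as a simple framing format. The field sizes (one byte of address, one byte of length) are illustrative assumptions, not taken from the patent:

    ```python
    def build_packet(address, payload):
        """Frame a payload with the address of the peripheral module that is
        the intended recipient, plus a payload-length byte."""
        assert 0 <= address <= 0xFF and len(payload) <= 0xFF
        return bytes([address, len(payload)]) + payload

    def parse_packet(packet):
        """Split a framed packet back into (address, payload); a peripheral
        module would act on the packet only if the address matches its own."""
        address, length = packet[0], packet[1]
        return address, packet[2:2 + length]

    pkt = build_packet(0x07, b"\x01\x02\x03")
    print(parse_packet(pkt))  # (7, b'\x01\x02\x03')
    ```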
  • Publication number: 20120105312
    Abstract: A user input device is described. In an embodiment the user input device is hand held and comprises a sensing strip to detect one-dimensional motion of a user's finger or thumb along the sensing strip and to detect position of a user's finger or thumb on the sensing strip. In an embodiment the sensed data is used for cursor movement and/or text input at a master device. In an example the user input device has an orientation sensor and orientation of the device influences orientation of a cursor. For example, a user may move the cursor in a straight line in the pointing direction of the cursor by sliding a finger or thumb along the sensing strip. In an example, an alphabetical scale is displayed and a user is able to zoom into the scale and select letters for text input using the sensing strip.
    Type: Application
    Filed: October 29, 2010
    Publication date: May 3, 2012
    Applicant: Microsoft Corporation
    Inventors: John Helmes, Shahram Izadi, Xiang Cao, Nicolas Villar, Richard Banks
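    The cursor scheme in the abstract above — orientation sets the pointing direction, and sliding along the 1-D sensing strip moves the cursor along that direction — can be sketched as a small transfer function. The function name and units are illustrative assumptions:

    ```python
    import math

    def move_cursor(cursor, heading_deg, strip_delta):
        """Translate the cursor along its heading (set by the device's
        orientation sensor) by the 1-D displacement sensed on the strip."""
        rad = math.radians(heading_deg)
        return (cursor[0] + strip_delta * math.cos(rad),
                cursor[1] + strip_delta * math.sin(rad))

    # Cursor pointing along +x: a 25-unit slide moves it 25 units right.
    x, y = move_cursor((100.0, 100.0), 0.0, 25.0)
    print(round(x), round(y))  # 125 100
    ```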
  • Patent number: 8154524
Abstract: The claimed subject matter provides a system and/or a method that facilitates enhancing interactive surface technologies for data manipulation. A surface detection component can employ a multiple contact surfacing technology to detect a surface input, wherein the detected surface input enables a physical interaction with a portion of displayed data that represents a corporeal object. A physics engine can integrate a portion of Newtonian physics into the interaction with the portion of displayed data in order to model at least one quantity associated with the corporeal object, the quantity being at least one of a force, a mass, a velocity, or a friction.
    Type: Grant
    Filed: September 3, 2008
    Date of Patent: April 10, 2012
    Assignee: Microsoft Corporation
    Inventors: Andrew David Wilson, Shahram Izadi, Armando Garcia-Mendoza, David Kirk, Otmar Hilliges
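    The Newtonian modeling the abstract above attributes to the physics engine can be sketched with a minimal integration step relating force, mass, velocity, and friction. The explicit-Euler update and the coefficients are illustrative assumptions, not the patented implementation:

    ```python
    def physics_step(vel, force, mass, friction, dt):
        """Advance velocity by one time step: F = m * a, then damp the
        result with a simple velocity-proportional friction term."""
        accel = force / mass
        vel = vel + accel * dt
        return vel * (1.0 - friction * dt)

    # A drag gesture on the surface applies a steady force to the object.
    v = 0.0
    for _ in range(10):
        v = physics_step(v, force=2.0, mass=0.5, friction=0.2, dt=0.1)
    print(round(v, 3))  # 3.585
    ```

    With friction present the velocity approaches a terminal value rather than growing without bound, which is what makes dragged objects on such surfaces feel physically plausible.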
  • Publication number: 20120075256
    Abstract: A touch panel is described which uses at least one infrared source and an array of infrared sensors to detect objects which are in contact with, or close to, the touchable surface of the panel. The panel may be operated in both reflective and shadow modes, in arbitrary per-pixel combinations which change over time. For example, if the level of ambient infrared is detected and if that level exceeds a threshold, shadow mode is used for detection of touch events over some or all of the display. If the threshold is not exceeded, reflective mode is used to detect touch events. The touch panel includes an infrared source and an array of infrared sensors.
    Type: Application
    Filed: December 7, 2011
    Publication date: March 29, 2012
    Applicant: Microsoft Corporation
    Inventors: Shahram Izadi, Stephen Hodges, David Alexander Butler, Alban Rrustemi
  • Publication number: 20120038891
    Abstract: The techniques described herein provide a surface computing device that includes a surface layer configured to be in a transparent state and a diffuse state. In the diffuse state, an image can be projected onto the surface. In the transparent state, an image can be projected through the surface.
    Type: Application
    Filed: October 24, 2011
    Publication date: February 16, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Stuart Taylor, Shahram Izadi, Daniel A. Rosenfeld, Stephen Hodges, David Alexander Butler, James Scott, Nicolas Villar
  • Patent number: 8094129
Abstract: A touch panel is described which uses at least one infrared source and an array of infrared sensors to detect objects which are in contact with, or close to, the touchable surface of the panel. The panel may be operated in both reflective and shadow modes, in arbitrary per-pixel combinations which change over time. For example, if the level of ambient infrared is detected and if that level exceeds a threshold, shadow mode is used for detection of touch events over some or all of the display. If the threshold is not exceeded, reflective mode is used to detect touch events. The touch panel includes an infrared source and an array of infrared sensors.
    Type: Grant
    Filed: March 29, 2007
    Date of Patent: January 10, 2012
    Assignee: Microsoft Corporation
    Inventors: Shahram Izadi, Stephen Hodges, David Alexander Butler, Alban Rrustemi
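    The per-pixel mode selection the abstract above describes can be sketched directly: above an ambient-IR threshold, touches are detected as shadows; below it, as reflections. The threshold value and sensor readings are illustrative assumptions:

    ```python
    AMBIENT_THRESHOLD = 120  # assumed sensor units

    def detect_touch(ambient_ir, measured_ir):
        """Return (mode, touched) for one sensor pixel."""
        if ambient_ir > AMBIENT_THRESHOLD:
            # Shadow mode: a touching object occludes ambient IR,
            # so the pixel reads darker than the ambient level.
            return "shadow", measured_ir < ambient_ir * 0.5
        # Reflective mode: a touching object reflects the panel's own
        # IR source back, so the pixel reads brighter than ambient.
        return "reflective", measured_ir > AMBIENT_THRESHOLD

    print(detect_touch(200, 60))   # ('shadow', True)
    print(detect_touch(40, 150))   # ('reflective', True)
    ```

    Because the decision is made per pixel, part of the display can run in shadow mode (e.g., under direct sunlight) while the rest runs in reflective mode, matching the "arbitrary per-pixel combinations" in the abstract.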
  • Publication number: 20110296043
    Abstract: Sharing and exchanging sessions between devices and users in a Shared Resource Computing (SRC) environment are disclosed. Example systems include a shared resource computing server and a plurality of peripheral devices. The SRC server (“SRC Box”) may include functionality configured to share and exchange sessions between the peripheral devices and the users, including functionality to map graphical representations of sessions to sessions and to map graphical representations of users to users, and functionality to display the representations of sessions and users within a graphical user interface. Alternate embodiments may also include functionality for transferring a saved session between devices.
    Type: Application
    Filed: June 1, 2010
    Publication date: December 1, 2011
    Applicant: Microsoft Corporation
    Inventors: Paul C. Sutton, Shahram Izadi, Behrooz Chitsaz
  • Patent number: 8042949
    Abstract: A surface computing device is described which has a surface which can be switched between transparent and diffuse states. When the surface is in its diffuse state, an image can be projected onto the surface and when the surface is in its transparent state, an image can be projected through the surface and onto an object. In an embodiment, the image projected onto the object is redirected onto a different face of the object, so as to provide an additional display surface or to augment the appearance of the object. In another embodiment, the image may be redirected onto another object.
    Type: Grant
    Filed: May 2, 2008
    Date of Patent: October 25, 2011
    Assignee: Microsoft Corporation
    Inventors: Stuart Taylor, Shahram Izadi, Daniel A. Rosenfeld, Stephen Hodges, David Alexander Butler, James Scott, Nicolas Villar
  • Publication number: 20110252163
    Abstract: An integrated development environment for rapid device development is described. In an embodiment the integrated development environment provides a number of different views to a user which each relate to a different aspect of device design, such as hardware configuration, software development and physical design. The device, which may be a prototype device, is formed from a number of objects which are selected from a database and the database stores multiple data types for each object, such as a 3D model, software libraries and code-stubs for the object and hardware parameters. A user can design the device by selecting different views in any order and can switch between views as they choose. Changes which are made in one view, such as the selection of a new object, are fed into the other views.
    Type: Application
    Filed: April 9, 2010
    Publication date: October 13, 2011
    Applicant: Microsoft Corporation
    Inventors: Nicolas Villar, James Scott, Stephen Hodges, David Alexander Butler, Shahram Izadi
  • Publication number: 20110239117
    Abstract: Sharing and exchanging information in a Shared Resource Computing (SRC) environment are disclosed. Example systems include a shared resource computing server and a plurality of peripheral devices. The SRC server may include functionality configured to share and exchange information between the peripheral devices, including functionality to determine the physical position of the peripheral devices, functionality to associate avatars to the peripheral devices, and functionality to display the avatars within a representation of the environment. Alternate embodiments may also include functionality for user authentication and functionality for sending a document between peripheral devices.
    Type: Application
    Filed: March 25, 2010
    Publication date: September 29, 2011
    Applicant: Microsoft Corporation
    Inventors: Paul C. Sutton, Shahram Izadi, Behrooz Chitsaz
  • Publication number: 20110227947
    Abstract: Multi-touch user interface interaction is described. In an embodiment, an object in a user interface (UI) is manipulated by a cursor and a representation of a plurality of digits of a user. At least one parameter, which comprises the cursor location in the UI, is used to determine that multi-touch input is to be provided to the object. Responsive to this, the relative movement of the digits is analyzed and the object manipulated accordingly. In another embodiment, an object in a UI is manipulated by a representation of a plurality of digits of a user. Movement of each digit by the user moves the corresponding representation in the UI, and the movement velocity of the representation is a non-linear function of the digit's velocity. After determining that multi-touch input is to be provided to the object, the relative movement of the representations is analyzed and the object manipulated accordingly.
    Type: Application
    Filed: March 16, 2010
    Publication date: September 22, 2011
    Applicant: Microsoft Corporation
    Inventors: Hrvoje Benko, Shahram Izadi, Andrew D. Wilson, Daniel Rosenfeld, Ken Hinckley, Xiang Cao, Nicolas Villar, Stephen Hodges
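    The second embodiment in the abstract above maps each digit's velocity to its on-screen representation's velocity through a non-linear function. A power-law gain curve is one common choice; the specific curve and constants below are illustrative assumptions, not taken from the patent:

    ```python
    def representation_velocity(digit_velocity, gain=0.8, exponent=1.5):
        """Non-linear transfer function: slow digit motion is damped for
        precision, fast motion is amplified for reach."""
        sign = 1.0 if digit_velocity >= 0 else -1.0
        return sign * gain * abs(digit_velocity) ** exponent

    print(representation_velocity(1.0))  # 0.8  (slow: damped)
    print(representation_velocity(4.0))  # 6.4  (fast: amplified)
    ```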
  • Publication number: 20110225366
    Abstract: A dual-mode, dual-display shared resource computing (SRC) device is usable to stream SRC content from a host SRC device while in an on-line mode and maintain functionality with the content during an off-line mode. Such remote SRC devices can be used to maintain multiple user-specific caches and to back-up cached content for multi-device systems.
    Type: Application
    Filed: March 9, 2010
    Publication date: September 15, 2011
    Applicant: Microsoft Corporation
    Inventors: Shahram Izadi, Behrooz Chitsaz
  • Publication number: 20110210915
    Abstract: Techniques for human body pose estimation are disclosed herein. Images such as depth images, silhouette images, or volumetric images may be generated and pixels or voxels of the images may be identified. The techniques may process the pixels or voxels to determine a probability that each pixel or voxel is associated with a segment of a body captured in the image or to determine a three-dimensional representation for each pixel or voxel that is associated with a location on a canonical body. These probabilities or three-dimensional representations may then be utilized along with the images to construct a posed model of the body captured in the image.
    Type: Application
    Filed: March 3, 2011
    Publication date: September 1, 2011
    Applicant: MICROSOFT CORPORATION
    Inventors: Jamie Daniel Joseph Shotton, Shahram Izadi, Otmar Hilliges, David Kim, David Geoffrey Molyneaux, Matthew Darius Cook, Pushmeet Kohli, Antonio Criminisi, Ross Brook Girshick, Andrew William Fitzgibbon
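    The per-pixel step the abstract above describes — each pixel carrying a probability of belonging to each body segment — is often followed by assigning every pixel to its most probable segment. The probabilities and segment names below are made-up illustrations:

    ```python
    def assign_parts(pixel_probs):
        """Map each pixel to the body segment with the highest probability."""
        return {pixel: max(probs, key=probs.get)
                for pixel, probs in pixel_probs.items()}

    probs = {
        (10, 4): {"head": 0.7, "torso": 0.2, "hand": 0.1},
        (12, 9): {"head": 0.1, "torso": 0.8, "hand": 0.1},
    }
    print(assign_parts(probs))  # {(10, 4): 'head', (12, 9): 'torso'}
    ```

    The resulting segment labels (or the probabilities themselves) can then be aggregated to pose a body model, as the abstract outlines.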
  • Publication number: 20110210917
    Abstract: User interface control using a keyboard is described. In an embodiment, a user interface displayed on a display device is controlled using a computer connected to a keyboard. The keyboard has a plurality of alphanumeric keys that can be used for text entry. The computer receives data comprising a sequence of key-presses from the keyboard, and generates for each key-press a physical location on the keyboard. The relative physical locations of the key-presses are compared to calculate a movement path over the keyboard. The movement path describes the path of a user's digit over the keyboard. The movement path is mapped to a sequence of coordinates in the user interface, and the movement of an object displayed in the user interface is controlled in accordance with the sequence of coordinates.
    Type: Application
    Filed: February 26, 2010
    Publication date: September 1, 2011
    Applicant: Microsoft Corporation
    Inventors: Harper LaFave, Stephen Hodges, James Scott, Shahram Izadi, David Molyneaux, Nicolas Villar, David Alexander Butler, Mike Hazas
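    The core idea in the abstract above — turning a sequence of key-presses into physical key locations and hence a movement path — can be sketched with a simplified keyboard layout. The row layout and stagger below are assumptions for illustration:

    ```python
    KEY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

    def key_location(key):
        """Approximate physical (x, y) of a key on the keyboard, in key
        units; rows are horizontally staggered like a real keyboard."""
        for row, keys in enumerate(KEY_ROWS):
            if key in keys:
                return (keys.index(key) + 0.5 * row, row)
        raise ValueError(f"unknown key: {key}")

    def movement_path(key_sequence):
        """Turn a sequence of key-presses into a path of coordinates that
        can be mapped to cursor movement in the user interface."""
        return [key_location(k) for k in key_sequence]

    # Sliding a finger along the home row produces a rightward path:
    print(movement_path("asdf"))  # [(0.5, 1), (1.5, 1), (2.5, 1), (3.5, 1)]
    ```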
  • Publication number: 20110214053
    Abstract: Assisting input from a keyboard is described. In an embodiment, a processor receives a plurality of key-presses from the keyboard comprising alphanumeric data for input to application software executed at the processor. The processor analyzes the plurality of key-presses to detect at least one predefined typing pattern, and, in response, controls a display device to display a representation of at least a portion of the keyboard in association with a user interface of the application software. In another embodiment, a computer device has a keyboard and at least one sensor arranged to monitor at least a subset of keys on the keyboard, and detect an object within a predefined distance of a selected key prior to activation of the selected key. The processor then controls the display device to display a representation of a portion of the keyboard comprising the selected key.
    Type: Application
    Filed: February 26, 2010
    Publication date: September 1, 2011
    Applicant: Microsoft Corporation
    Inventors: James Scott, Shahram Izadi, Nicolas Villar, Ravin Balakrishnan
  • Publication number: 20110169779
    Abstract: An infrared source is configured to illuminate the underside of one or more objects on or above a touchable surface of a touch panel. Infrared light reflected from the underside of the object(s) is detected by an infrared sensor integrated in the touch panel below the touchable surface.
    Type: Application
    Filed: March 11, 2011
    Publication date: July 14, 2011
    Applicant: MICROSOFT CORPORATION
    Inventors: Willem den Boer, Steven N. Bathiche, Stephen Edward Hodges, Shahram Izadi