Patents by Inventor Shahram Izadi
Shahram Izadi has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20110157094
Abstract: An infrared source is configured to illuminate the underside of one or more objects on or above a touchable surface of a touch panel. Infrared light reflected from the underside of the object(s) is detected by an infrared sensor integrated in the touch panel below the touchable surface.
Type: Application
Filed: March 11, 2011
Publication date: June 30, 2011
Applicant: Microsoft Corporation
Inventors: Willem den Boer, Steven N. Bathiche, Stephen Edward Hodges, Shahram Izadi
-
Publication number: 20110121950
Abstract: Methods and apparatus for uniquely identifying wireless devices in close physical proximity are described. When two wireless devices are brought into close proximity, one of the devices displays an optical indicator, such as a light pattern. This device then sends messages to other devices which are within wireless range to cause them to use any light sensor to detect a signal. In an embodiment, the light sensor is a camera and the detected signal is an image captured by the camera. Each device then sends data identifying what was detected back to the device displaying the pattern. By analyzing this data, the first device can determine which other device detected the indicator that it displayed and therefore determine that this device is within close physical proximity. In an example, the first device is an interactive surface arranged to identify the wireless addresses of devices which are placed on the surface.
Type: Application
Filed: January 31, 2011
Publication date: May 26, 2011
Applicant: Microsoft Corporation
Inventors: Shahram Izadi, Malcolm Hall, Stephen E. Hodges, William Buxton, David Alexander Butler
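The pairing flow this abstract describes can be sketched roughly as follows: one device shows a light pattern, devices in wireless range report what their sensors captured, and the first device matches the reports against the pattern it displayed. All names, the hex-token model of the pattern, and the report format below are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of the optical-pairing flow: display a pattern, collect
# sensor reports from devices in wireless range, match reports to the pattern.
import secrets

def make_pattern() -> str:
    """Generate a random light pattern (modelled here as a hex token)."""
    return secrets.token_hex(4)

def identify_nearby(displayed: str, reports: dict) -> list:
    """Return wireless addresses whose sensor report matches the pattern."""
    return [addr for addr, seen in reports.items() if seen == displayed]

pattern = make_pattern()
# Simulated reports: only the device physically on the surface sees the pattern.
reports = {
    "aa:bb:cc:01": pattern,   # device placed on the interactive surface
    "aa:bb:cc:02": None,      # in wireless range, but not in view
}
print(identify_nearby(pattern, reports))  # -> ['aa:bb:cc:01']
```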
-
Publication number: 20110085705
Abstract: A system and method for detecting and tracking targets including body parts and props is described. In one aspect, the disclosed technology acquires one or more depth images, generates one or more classification maps associated with one or more body parts and one or more props, tracks the one or more body parts using a skeletal tracking system, tracks the one or more props using a prop tracking system, and reports metrics regarding the one or more body parts and the one or more props. In some embodiments, feedback may occur between the skeletal tracking system and the prop tracking system.
Type: Application
Filed: December 20, 2010
Publication date: April 14, 2011
Applicant: Microsoft Corporation
Inventors: Shahram Izadi, Jamie Shotton, John Winn, Antonio Criminisi, Otmar Hilliges, Mat Cook, David Molyneaux
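The "classification map" step in this abstract can be illustrated with a toy example: each depth pixel is labelled as background, body part, or prop, and the resulting map is what downstream skeletal and prop trackers would consume. The thresholds, labels, and prop-mask input below are invented for illustration and are not the patented method.

```python
# Toy per-pixel classification of a depth image into background/body/prop.
BACKGROUND, BODY, PROP = 0, 1, 2

def classify_depth(depth, prop_mask, near=500, far=3000):
    """Label each pixel of a depth image (millimetres) as background, body, or prop."""
    labels = []
    for row, mask_row in zip(depth, prop_mask):
        labels.append([
            PROP if m else (BODY if near <= d <= far else BACKGROUND)
            for d, m in zip(row, mask_row)
        ])
    return labels

depth = [[400, 800], [1200, 4000]]           # 2x2 depth image in mm
prop_mask = [[False, True], [False, False]]  # pixels a prop detector flagged
print(classify_depth(depth, prop_mask))      # -> [[0, 2], [1, 0]]
```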
-
Patent number: 7924272
Abstract: An infrared source is configured to illuminate the underside of one or more objects on or above a touchable surface of a touch panel. Infrared light reflected from the underside of the object(s) is detected by an infrared sensor integrated in the touch panel below the touchable surface.
Type: Grant
Filed: November 27, 2006
Date of Patent: April 12, 2011
Assignee: Microsoft Corporation
Inventors: Willem den Boer, Steven N. Bathiche, Stephen Edward Hodges, Shahram Izadi
-
Publication number: 20110080341
Abstract: Indirect multi-touch interaction is described. In an embodiment, a user interface is controlled using a cursor and a touch region comprising a representation of one or more digits of a user. The cursor and the touch region are moved together in the user interface in accordance with data received from a cursor control device, such that the relative location of the touch region and the cursor is maintained. The representations of the digits of the user are moved in the touch region in accordance with data describing movement of the user's digits. In another embodiment, a user interface is controlled in a first mode of operation using an aggregate cursor, and switched to a second mode of operation in which the aggregate cursor is divided into separate portions, each of which can be independently controlled by the user.
Type: Application
Filed: October 1, 2009
Publication date: April 7, 2011
Applicant: Microsoft Corporation
Inventors: John Helmes, Nicolas Villar, Hrvoje Benko, Shahram Izadi, Daniel Rosenfeld, Stephen Hodges, David Alexander Butler, Xiang Cao, Richard Banks
-
Patent number: 7904720
Abstract: System and method for providing secure resource management. The system includes a first device that creates a secure, shared resource space and a corresponding root certificate for the shared space. The first device associates one or more resources that it can access with the shared space. The first device invites one or more other devices to join as members of the space, and establishes secure communication channels with the devices that accept this invitation. The first device generates a member certificate for each accepting device, and sends the root certificate and the generated member certificate to the device through the secure channel. These devices may then access resources associated with the shared space by presenting their member certificates. Further, members of the shared space may invite other devices to join the space, and may create member certificates in the same manner as the first device.
Type: Grant
Filed: November 6, 2002
Date of Patent: March 8, 2011
Assignee: Palo Alto Research Center Incorporated
Inventors: Diana Kathryn Smetters, Warren Keith Edwards, Dirk Balfanz, Hao-Chi Wong, Mark Webster Newman, Jana Zdislava Sedivy, Trevor Smith, Shahram Izadi
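The certificate flow in this abstract (an owner issues member credentials derived from a root secret, and resources are released only to credential holders) can be sketched in simplified form. Real systems of this kind would use X.509 certificates and public-key cryptography; the HMAC tags, class, and method names below are a deliberate simplification for illustration only.

```python
# Simplified model of a shared resource space: the owner holds a root key,
# issues member "certificates" (HMAC tags over a device identity), and checks
# a presented certificate before granting access to a resource.
import hashlib
import hmac
import secrets

class SharedSpace:
    def __init__(self):
        self._root_key = secrets.token_bytes(32)  # stands in for the root certificate's key
        self.resources = {}

    def issue_member_cert(self, device_id: str) -> bytes:
        """Sign a device identity, as the first device does for accepted invitees."""
        return hmac.new(self._root_key, device_id.encode(), hashlib.sha256).digest()

    def access(self, device_id: str, cert: bytes, resource: str):
        """Release the resource only if the presented certificate verifies."""
        expected = self.issue_member_cert(device_id)
        if not hmac.compare_digest(cert, expected):
            raise PermissionError("invalid member certificate")
        return self.resources[resource]

space = SharedSpace()
space.resources["printer"] = "ipp://printer.local"
cert = space.issue_member_cert("laptop-1")
print(space.access("laptop-1", cert, "printer"))  # -> ipp://printer.local
```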
-
Patent number: 7884734
Abstract: Methods and apparatus for uniquely identifying wireless devices in close physical proximity are described. When two wireless devices are brought into close proximity, one of the devices displays an optical indicator, such as a light pattern. This device then sends messages to other devices which are within wireless range to cause them to use any light sensor to detect a signal. In an embodiment, the light sensor is a camera and the detected signal is an image captured by the camera. Each device then sends data identifying what was detected back to the device displaying the pattern. By analyzing this data, the first device can determine which other device detected the indicator that it displayed and therefore determine that this device is in close physical proximity to it. In an example, the first device is an interactive surface arranged to identify the wireless addresses of devices which are placed on the surface.
Type: Grant
Filed: January 31, 2008
Date of Patent: February 8, 2011
Assignee: Microsoft Corporation
Inventors: Shahram Izadi, Malcolm Hall, Stephen E. Hodges, William Buxton, David Alexander Butler
-
Publication number: 20100315413
Abstract: Surface computer user interaction is described. In an embodiment, an image of a user's hand interacting with a user interface displayed on a surface layer of a surface computing device is captured. The image is used to render a corresponding representation of the hand. The representation is displayed in the user interface such that it is geometrically aligned with the user's hand. In embodiments, the representation is a representation of a shadow or a reflection. The process is performed in real time, such that movement of the hand causes the representation to move correspondingly. In some embodiments, a separation distance between the hand and the surface is determined and used to control the display of an object rendered in a 3D environment on the surface layer. In some embodiments, at least one parameter relating to the appearance of the object is modified depending on the separation distance.
Type: Application
Filed: June 16, 2009
Publication date: December 16, 2010
Applicant: Microsoft Corporation
Inventors: Shahram Izadi, Nicolas Villar, Otmar Hilliges, Stephen E. Hodges, Armando Garcia-Mendoza, Andrew David Wilson
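The idea of modifying an appearance parameter with hand-to-surface distance can be illustrated with a minimal example: fading a rendered shadow as the hand lifts away. The linear mapping, the 200 mm range, and the function name are assumptions for illustration, not values from the patent.

```python
# Minimal sketch: an appearance parameter (shadow opacity) driven by the
# measured separation between the user's hand and the surface.
def shadow_opacity(separation_mm: float, max_mm: float = 200.0) -> float:
    """Fade a rendered shadow out linearly as the hand lifts off the surface."""
    clamped = min(max(separation_mm, 0.0), max_mm)
    return 1.0 - clamped / max_mm

print(shadow_opacity(0))    # hand touching the surface -> 1.0
print(shadow_opacity(100))  # halfway through the range -> 0.5
print(shadow_opacity(300))  # beyond the range, fully faded -> 0.0
```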
-
Publication number: 20100315336
Abstract: A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement relative to the pointing device of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device is generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device.
Type: Application
Filed: June 16, 2009
Publication date: December 16, 2010
Applicant: Microsoft Corporation
Inventors: David Alexander Butler, Nicolas Villar, John Helmes, Shahram Izadi, Stephen E. Hodges, Daniel Rosenfeld, Hrvoje Benko
-
Publication number: 20100315335
Abstract: A pointing device with independently movable portions is described. In an embodiment, a pointing device comprises a base unit and a satellite portion. The base unit is arranged to be located under a palm of a user's hand and be movable over a supporting surface. The satellite portion is arranged to be located under a digit of the user's hand and be independently movable over the supporting surface relative to the base unit. In embodiments, data from at least one sensing device is read, and movement of both the base unit and the independently movable satellite portion of the pointing device is calculated from the data. The movement of the base unit and the satellite portion is analyzed to detect a user gesture.
Type: Application
Filed: June 16, 2009
Publication date: December 16, 2010
Applicant: Microsoft Corporation
Inventors: Nicolas Villar, John Helmes, Shahram Izadi, Daniel Rosenfeld, Stephen E. Hodges, David Alexander Butler, Xiang Cao, Otmar Hilliges, Richard Banks, Benjamin David Eidelson, Hrvoje Benko
-
Publication number: 20100313150
Abstract: The claimed subject matter relates to a display that is physically separable and to an associated architecture that can facilitate data mobility or collaboration in connection with the separable display. In particular, the separable display can be configured as an apparently unitary or singular UI for an associated multi-node computer, yet portions of the separable display can be physically decoupled. The multi-node computer can include a set of computing nodes, each of which can potentially operate autonomously, yet also in unison with other nodes to form a collective multiprocessor computing platform. Moreover, each of the computing nodes can be embedded in and distributed throughout the separable display. Accordingly, when a portion of the separable display is decoupled from the remainder of the separable display, both the portion and the remainder can include some subset of the computing nodes, and can therefore maintain the UI.
Type: Application
Filed: June 3, 2009
Publication date: December 9, 2010
Applicant: Microsoft Corporation
Inventors: Meredith J. Morris, Steven N. Bathiche, Stephen Edward Hodges, Ian C. LeGrow, Victor Kevin Russ, Ian M. Sands, William J. Westerinen, John Christopher Whytock, Andrew D. Wilson, David Alexander Butler, Shahram Izadi
-
Publication number: 20100302199
Abstract: Ferromagnetic user interfaces are described. In embodiments, user interface devices are described that can detect the location of movement on a user-touchable portion by sensing movement of a ferromagnetic material. In some embodiments, sensors are arranged in a two-dimensional array, and the user interface device can determine the location of the movement in a plane substantially parallel to the two-dimensional array and the acceleration of movement substantially perpendicular to the two-dimensional array. In other embodiments, user interface devices are described that can cause a raised surface region to be formed on a ferrofluid layer of a user-touchable portion, which is detectable by the touch of a user. Embodiments describe how the raised surface region can be moved on the ferrofluid layer. Embodiments also describe how the raised surface region can be caused to vibrate.
Type: Application
Filed: May 26, 2009
Publication date: December 2, 2010
Applicant: Microsoft Corporation
Inventors: Stuart Taylor, Jonathan Hook, Shahram Izadi, Nicolas Villar, David Alexander Butler, Stephen E. Hodges
-
Publication number: 20100265178
Abstract: Technologies for a camera-based multi-touch input device operable to provide conventional mouse movement data as well as three-dimensional multi-touch data. Such a device is based on an internal camera focused on a mirror or set of mirrors enabling the camera to image the inside of a working surface of the device. The working surface allows light to pass through. An internal light source illuminates the inside of the working surface and reflects off of any objects proximate to the outside of the device. This reflected light is received by the mirror and then directed to the camera. Imaging from the camera can be processed to extract touch points corresponding to the position of one or more objects outside the working surface as well as to detect gestures performed by the objects. Thus the device can provide conventional mouse functionality as well as three-dimensional multi-touch functionality.
Type: Application
Filed: April 17, 2009
Publication date: October 21, 2010
Applicant: Microsoft Corporation
Inventors: Hrvoje Benko, Daniel Allen Rosenfeld, Eyal Ofek, Billy Chen, Shahram Izadi, Nicolas Villar, John Helmes
-
Publication number: 20100242274
Abstract: Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. For example, in one disclosed embodiment, a method of making a multi-touch input device having a curved touch-sensitive surface comprises forming on a substrate an array of sensor elements defining a plurality of pixels of the multi-touch sensor, forming the substrate into a shape that conforms to a surface of the curved geometric feature of the body of the input device, and fixing the substrate to the curved geometric feature of the body of the input device.
Type: Application
Filed: June 19, 2009
Publication date: September 30, 2010
Applicant: Microsoft Corporation
Inventors: Daniel Rosenfeld, Jonathan Westhues, Shahram Izadi, Nicolas Villar, Hrvoje Benko, John Helmes, Kurt Allen Jenkins
-
Publication number: 20100245246
Abstract: Embodiments are disclosed herein that are related to input devices with curved multi-touch surfaces. One disclosed embodiment comprises a touch-sensitive input device having a curved geometric feature comprising a touch sensor, the touch sensor comprising an array of sensor elements integrated into the curved geometric feature and being configured to detect a location of a touch made on a surface of the curved geometric feature.
Type: Application
Filed: June 19, 2009
Publication date: September 30, 2010
Applicant: Microsoft Corporation
Inventors: Daniel Rosenfeld, Jonathan Westhues, Shahram Izadi, Nicolas Villar, Hrvoje Benko, John Helmes, Kurt Allen Jenkins
-
Publication number: 20100225595
Abstract: The claimed subject matter provides a system and/or a method that facilitates distinguishing input among one or more users in a surface computing environment. A variety of information can be obtained and analyzed to infer an association between a particular input and a particular user. Touch point information can be acquired from a surface, wherein the touch point information relates to a touch point. In addition, one or more environmental sensors can monitor the surface computing environment and provide environmental information. The touch point information and the environmental information can be analyzed to determine the direction of inputs, the location of users, the movement of users, and so on. Individual analysis results can be correlated and/or aggregated to generate an inference of association between a touch point and a user.
Type: Application
Filed: March 3, 2009
Publication date: September 9, 2010
Applicant: Microsoft Corporation
Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
-
Publication number: 20100218249
Abstract: The claimed subject matter provides a system and/or a method that facilitates authentication of a user in a surface computing environment. A device or authentication object can be carried by a user and employed to retain authentication information. An authentication component can obtain the authentication information from the device and analyze the information to verify an identity of the user. A touch input component can ascertain whether a touch input is authenticated by associating the touch input with the user. In addition, authentication information can be employed to establish a secure communications channel for transfer of user data.
Type: Application
Filed: February 25, 2009
Publication date: August 26, 2010
Applicant: Microsoft Corporation
Inventors: Andrew D. Wilson, Stephen E. Hodges, Peter B. Thompson, Meredith June Morris, Paul Armistead Hoover, William J. Westerinen, Steven N. Bathiche, Ian M. Sands, Shahram Izadi, David Alexander Butler, Matthew B. MacLaurin, Arthur T. Whitten, William Ben Kunz, Shawn R. LeProwse, Hrvoje Benko
-
Publication number: 20100182220
Abstract: An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head-mounted display.
Type: Application
Filed: January 16, 2009
Publication date: July 22, 2010
Applicant: Microsoft Corporation
Inventors: Steven N. Bathiche, Hrvoje Benko, Stephen E. Hodges, Shahram Izadi, David Alexander Butler, William Ben Kunz, Shawn R. LeProwse
-
Publication number: 20100149182
Abstract: A volumetric display system which enables user interaction is described. In an embodiment, the system consists of a volumetric display and an optical system. The volumetric display creates a 3D light field of an object to be displayed, and the optical system creates a copy of the 3D light field in a position away from the volumetric display, where a user can interact with the image of the object displayed. In an embodiment, the optical system involves a pair of parabolic mirror portions.
Type: Application
Filed: December 17, 2008
Publication date: June 17, 2010
Applicant: Microsoft Corporation
Inventors: David Alexander Butler, Stephen E. Hodges, Shahram Izadi, Stuart Taylor, Nicolas Villar
-
Publication number: 20100149090
Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in the intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty, or engagement of users through signaling and/or information gathering, can occur.
Type: Application
Filed: December 15, 2008
Publication date: June 17, 2010
Applicant: Microsoft Corporation
Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz