Patents by Inventor Arjun Dayal

Arjun Dayal has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8920241
    Abstract: A computing system translates a world space position of a hand of a human target to a screen space position of a user interface and locks the hand to a handle of the user interface if world space parameters of the hand overcome a grab threshold of the handle. When the hand is locked to the handle, the computing system translates a world space position of the hand to a screen space handle position that is constrained along one or more interface guides. The hand is unlocked from the handle at a release position of the handle if world space parameters of the hand overcome a release threshold of the handle. The handle is retained at the release position after the hand is unlocked from the handle.
    Type: Grant
    Filed: December 15, 2010
    Date of Patent: December 30, 2014
    Assignee: Microsoft Corporation
    Inventors: Brendan Reville, Jack Bridges, Andy Mattingly, Jordan Andersen, Christian Klein, Arjun Dayal
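The grab/release mechanics in the abstract above amount to a hysteresis state machine: a tight threshold to lock the hand to the handle, a looser one to release it, and a handle position clamped to an interface guide and retained after release. A minimal illustrative sketch (all names, thresholds, and the 1-D guide are hypothetical assumptions, not taken from the patent claims):

```python
# Hypothetical sketch of the grab/release hysteresis described in the abstract.
# GRAB_THRESHOLD < RELEASE_THRESHOLD gives hysteresis, so the handle is not
# grabbed and dropped repeatedly when the hand hovers near the boundary.
GRAB_THRESHOLD = 0.05     # screen-space distance required to lock the hand
RELEASE_THRESHOLD = 0.20  # distance the hand must drift away to unlock

class Handle:
    def __init__(self, position, guide_start, guide_end):
        self.position = position        # retained between interactions
        self.guide_start = guide_start  # endpoints of a 1-D interface guide
        self.guide_end = guide_end
        self.locked = False

    def update(self, hand_screen_pos):
        """Feed one translated screen-space hand position per frame."""
        if not self.locked:
            if abs(hand_screen_pos - self.position) < GRAB_THRESHOLD:
                self.locked = True      # hand overcomes the grab threshold
        else:
            # While locked, the handle position is constrained along the guide.
            self.position = min(max(hand_screen_pos, self.guide_start),
                                self.guide_end)
            if abs(hand_screen_pos - self.position) > RELEASE_THRESHOLD:
                self.locked = False     # handle retained at the release position
        return self.position
```

The asymmetric thresholds are the key design point: without them, sensor noise at the grab boundary would toggle the lock every frame.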
  • Patent number: 8499257
    Abstract: A system is disclosed for providing on-screen graphical handles to control interaction between a user and on-screen objects. A handle defines what actions a user may perform on the object, such as for example scrolling through a textual or graphical navigation menu. Affordances are provided to guide the user through the process of interacting with a handle.
    Type: Grant
    Filed: February 9, 2010
    Date of Patent: July 30, 2013
    Assignee: Microsoft Corporation
    Inventors: Andrew Mattingly, Jeremy Hill, Arjun Dayal, Brian Kramp, Ali Vassigh, Christian Klein, Adam Poulos, Alex Kipman, Jeffrey Margolis
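The abstract above describes a handle that defines which actions a user may perform on an object, with affordances guiding the interaction. One way to picture that coupling, as a hypothetical sketch (the action names and glyph hints are invented for illustration):

```python
# Hypothetical sketch: a handle declares which actions it supports, and the
# UI derives affordances (visual hints) for exactly those permitted actions.
class GraphicalHandle:
    def __init__(self, target, actions):
        self.target = target           # the on-screen object it controls
        self.actions = set(actions)    # e.g. {"scroll_up", "scroll_down"}

    def affordances(self):
        # One hint per permitted action, guiding the user through interaction.
        hints = {"scroll_up": "▲", "scroll_down": "▼", "select": "●"}
        return [hints[a] for a in sorted(self.actions) if a in hints]

    def perform(self, action):
        # The handle, not the object, gates which actions are allowed.
        if action not in self.actions:
            raise ValueError(f"{action!r} not permitted on {self.target}")
        return f"{action} on {self.target}"
```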
  • Publication number: 20120157208
    Abstract: A computing system translates a world space position of a hand of a human target to a screen space position of a user interface and locks the hand to a handle of the user interface if world space parameters of the hand overcome a grab threshold of the handle. When the hand is locked to the handle, the computing system translates a world space position of the hand to a screen space handle position that is constrained along one or more interface guides. The hand is unlocked from the handle at a release position of the handle if world space parameters of the hand overcome a release threshold of the handle. The handle is retained at the release position after the hand is unlocked from the handle.
    Type: Application
    Filed: December 15, 2010
    Publication date: June 21, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Brendan Reville, Jack Bridges, Andy Mattingly, Jordan Andersen, Christian Klein, Arjun Dayal
  • Publication number: 20110289455
    Abstract: Symbolic gestures and associated recognition technology are provided for controlling a system user-interface, such as that provided by the operating system of a general computing system or multimedia console. The symbolic gesture movements in mid-air are performed by a user with or without the aid of an input device. A capture device is provided to generate depth images for three-dimensional representation of a capture area including a human target. The human target is tracked using skeletal mapping to capture the mid-air motion of the user. The skeletal mapping data is used to identify movements corresponding to pre-defined gestures using gesture filters that set forth parameters for determining when a target's movement indicates a viable gesture. When a gesture is detected, one or more pre-defined user-interface control actions are performed.
    Type: Application
    Filed: May 18, 2010
    Publication date: November 24, 2011
    Applicant: MICROSOFT CORPORATION
    Inventors: Brendan Reville, Ali Vassigh, Arjun Dayal, Christian Klein, Adam Poulos, Andy Mattingly
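The gesture filters in the abstract above "set forth parameters for determining when a target's movement indicates a viable gesture." A minimal sketch of that idea, assuming a single tracked hand joint and a swipe gesture (the filter parameters and sample format are illustrative, not from the patent):

```python
# Hypothetical gesture-filter sketch: each filter carries parameters that
# decide when tracked joint motion counts as a viable gesture (here, a swipe).
class SwipeFilter:
    def __init__(self, min_distance=0.4, max_duration=0.5):
        self.min_distance = min_distance  # metres the hand must travel
        self.max_duration = max_duration  # seconds allowed for the motion

    def matches(self, samples):
        """samples: list of (timestamp, hand_x) pairs from skeletal mapping."""
        if len(samples) < 2:
            return False
        (t0, x0), (t1, x1) = samples[0], samples[-1]
        return (t1 - t0) <= self.max_duration and abs(x1 - x0) >= self.min_distance

def recognize(samples, filters):
    # Return the names of all filters the captured motion satisfies; each
    # detected gesture would then trigger its pre-defined UI control action.
    return [name for name, f in filters.items() if f.matches(samples)]
```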
  • Publication number: 20110279368
    Abstract: Techniques are provided for inferring a user's intent to interact with an application run by a motion capture system. Deliberate user gestures to interact with the motion capture system are disambiguated from unrelated user motions within the system's field of view. An algorithm may be used to determine the user's aggregated level of intent to engage the system. Parameters in the algorithm may include posture and motion of the user's body, as well as the state of the system. The system may develop a skeletal model to determine the various parameters. If the system determines that the parameters strongly indicate an intent to engage the system, then the system may react quickly. However, if the parameters only weakly indicate an intent to engage the system, it may take longer for the user to engage the system.
    Type: Application
    Filed: May 12, 2010
    Publication date: November 17, 2011
    Applicant: MICROSOFT CORPORATION
    Inventors: Christian Klein, Andrew Mattingly, Ali Vassigh, Chen Li, Arjun Dayal
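The abstract above describes aggregating posture, motion, and system state into a level of intent, then reacting quickly for strong intent and more slowly for weak intent. A toy sketch of that aggregation (the cues, weights, and delay rule are assumptions made up for illustration):

```python
# Hypothetical sketch of aggregating engagement intent from several cues.
def intent_level(facing_screen, hand_raised, motion_toward_sensor, system_idle):
    # Each boolean cue contributes a weighted share of the aggregated level.
    weights = {
        "facing_screen": 0.4,
        "hand_raised": 0.3,
        "motion_toward_sensor": 0.2,
        "system_idle": 0.1,   # system state also feeds the algorithm
    }
    cues = {
        "facing_screen": facing_screen,
        "hand_raised": hand_raised,
        "motion_toward_sensor": motion_toward_sensor,
        "system_idle": system_idle,
    }
    return sum(w for name, w in weights.items() if cues[name])

def frames_to_engage(level, strong=0.8, weak=0.3):
    # Strong intent -> react quickly; weak intent -> require sustained evidence.
    if level >= strong:
        return 1        # engage almost immediately
    if level >= weak:
        return 30       # wait ~1 s at 30 fps before engaging
    return None         # treat the motion as unrelated; do not engage
```

The variable engagement delay is what disambiguates deliberate gestures from incidental motion: ambiguous evidence is not rejected outright, it just has to persist longer.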
  • Publication number: 20110246383
    Abstract: Summary presentation of media consumption is described herein. An exemplary method for generating a personal highlight reel includes receiving personal consumption data indicating one or more media units consumed by a user computing device, and storing the personal consumption data in association with a user identifier. The method further includes identifying one or more relevant personal media units based on the personal consumption data. The method further includes generating a personal highlight reel including one or more personal media events representative of the one or more relevant personal media units, and outputting the personal highlight reel.
    Type: Application
    Filed: March 30, 2010
    Publication date: October 6, 2011
    Applicant: MICROSOFT CORPORATION
    Inventors: Chad Gibson, Arjun Dayal
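The pipeline in the abstract above (receive consumption data, store it per user, identify relevant media units, emit a highlight reel) can be sketched in a few lines. The relevance rule here, most-consumed-first, is an illustrative assumption; the patent does not specify one:

```python
# Hypothetical sketch of the highlight-reel pipeline: record media consumption
# keyed by user identifier, then surface the most relevant units as a reel.
from collections import Counter, defaultdict

consumption = defaultdict(Counter)   # user_id -> Counter of media units

def record(user_id, media_unit):
    # Personal consumption data stored in association with the user identifier.
    consumption[user_id][media_unit] += 1

def highlight_reel(user_id, size=3):
    # Most-consumed units stand in for "relevant personal media units".
    relevant = consumption[user_id].most_common(size)
    # Each entry is a personal media event representing a relevant unit.
    return [f"highlight: {unit} (played {n}x)" for unit, n in relevant]
```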
  • Publication number: 20110197161
    Abstract: A system is disclosed for providing on-screen graphical handles to control interaction between a user and on-screen objects. A handle defines what actions a user may perform on the object, such as for example scrolling through a textual or graphical navigation menu. Affordances are provided to guide the user through the process of interacting with a handle.
    Type: Application
    Filed: February 9, 2010
    Publication date: August 11, 2011
    Applicant: MICROSOFT CORPORATION
    Inventors: Andrew Mattingly, Jeremy Hill, Arjun Dayal, Brian Kramp, Ali Vassigh, Christian Klein, Adam Poulos, Alex Kipman, Jeffrey Margolis