Patents by Inventor Alex A. Kipman

Alex A. Kipman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20100302015
    Abstract: A system that presents the user a 3-D virtual environment as well as non-visual sensory feedback for interactions the user makes with virtual objects in that environment is disclosed. In an exemplary embodiment, the system comprises a depth camera that captures user position and movement, a three-dimensional (3-D) display device that presents the user a virtual environment in 3-D, and a haptic feedback device that provides haptic feedback to the user as he interacts with a virtual object in the virtual environment. As the user moves through his physical space, he is captured by the depth camera. Data from the depth camera is parsed to correlate a user position with a position in the virtual environment. When the user's position or movement causes the user to touch the virtual object, that contact is detected and corresponding haptic feedback is provided to the user.
    Type: Application
    Filed: July 12, 2010
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Alex Kipman, Kudo Tsunoda, Todd Eric Holmdahl, John Clavin, Kathryn Stone Perez
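The pipeline in the abstract above (depth-camera position, correlated to a virtual position, with haptic feedback on contact) can be sketched roughly as follows. This is a minimal illustration, not the patented implementation: the linear physical-to-virtual mapping, the bounding-sphere touch test, and all names are assumptions.

```python
import math

def physical_to_virtual(pos, scale=2.0, offset=(0.0, 0.0, 0.0)):
    """Map an (x, y, z) position in physical space into virtual coordinates."""
    return tuple(scale * p + o for p, o in zip(pos, offset))

def touches(hand, obj_center, obj_radius):
    """A touch occurs when the hand enters the object's bounding sphere."""
    return math.dist(hand, obj_center) <= obj_radius

def update(hand_physical, obj_center, obj_radius, haptics):
    """Correlate one depth-camera sample with the virtual scene."""
    hand_virtual = physical_to_virtual(hand_physical)
    if touches(hand_virtual, obj_center, obj_radius):
        haptics.append("pulse")  # stand-in for driving a haptic device
    return hand_virtual

haptics = []
update((0.5, 0.5, 0.5), (1.0, 1.0, 1.0), 0.5, haptics)
```

A real system would run this per frame against parsed skeletal data rather than a single hand point.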
  • Publication number: 20100303289
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three-dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Application
    Filed: May 29, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Publication number: 20100302257
    Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
    Type: Application
    Filed: May 29, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
  • Publication number: 20100302253
    Abstract: Techniques for generating an avatar model during the runtime of an application are herein disclosed. The avatar model can be generated from an image captured by a capture device. End-effectors can be positioned and inverse kinematics can be used to determine positions of other nodes in the avatar model.
    Type: Application
    Filed: August 26, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Alex A. Kipman, Kudo Tsunoda, Jeffrey N. Margolis, Scott W. Sims, Nicholas D. Burton, Andrew Wilson
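The inverse-kinematics step in the abstract above can be illustrated with the classic two-bone case: given a desired end-effector (e.g. hand) position, recover the angles of the intermediate joints. This planar, law-of-cosines sketch is an assumed simplification, not the patented solver; bone lengths and the 2-D restriction are illustrative.

```python
import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Solve shoulder and elbow angles so a 2-bone chain reaches the target.

    Returns (shoulder_angle, elbow_interior_angle) in radians.
    """
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2)  # clamp unreachable targets to full extension
    # Law of cosines for the elbow's interior angle.
    cos_elbow = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to target minus the triangle's correction.
    cos_corr = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_corr)))
    return shoulder, elbow
```

With the end-effector placed from the captured image, a full solver would repeat this kind of solve across the skeleton to position the remaining nodes.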
  • Publication number: 20100306712
    Abstract: A capture device may capture a user's motion and a display device may display a model that maps to the user's motion, including gestures that are applicable for control. A user may be unfamiliar with a system that maps the user's motions or not know what gestures are applicable for an executing application. A user may not understand or know how to perform gestures that are applicable for the executing application. User motion data and/or outputs of filters corresponding to gestures may be analyzed to determine those cases where assistance to the user on performing the gesture is appropriate.
    Type: Application
    Filed: May 29, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Gregory N. Snook, Stephen Latta, Kevin Geisner, Darren Alexander Bennett, Kudo Tsunoda, Alex Kipman, Kathryn Stone Perez
  • Publication number: 20100303302
    Abstract: A depth image of a scene may be received, observed, or captured by a device. The depth image may include a human target that may have, for example, a portion thereof non-visible or occluded. For example, a user may be turned such that a body part may not be visible to the device, may have one or more body parts partially outside a field of view of the device, may have a body part or a portion of a body part behind another body part or object, or the like such that the human target associated with the user may also have a body part or a portion of a body part non-visible or occluded in the depth image. A position or location of the non-visible or occluded portion or body part of the human target associated with the user may then be estimated.
    Type: Application
    Filed: June 26, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Alex A. Kipman, Kathryn Stone Perez, Mark J. Finocchio, Ryan Michael Geiss, Kudo Tsunoda
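One simple way to estimate an occluded joint, in the spirit of the abstract above, is to carry its last known position forward by its recent velocity when it drops out of the depth image. This is purely illustrative and assumed; a real tracker would also fuse skeletal constraints and neighboring joints.

```python
def estimate_joint(prev, last, visible, observed=None):
    """Return the joint position for the current frame.

    prev, last: (x, y, z) positions from the two most recent frames.
    If the joint is visible, trust the observation; otherwise extrapolate
    last-known position by last-known velocity (constant-velocity model).
    """
    if visible:
        return observed
    velocity = tuple(l - p for l, p in zip(last, prev))
    return tuple(l + v for l, v in zip(last, velocity))
```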
  • Publication number: 20100306714
    Abstract: Systems, methods and computer readable media are disclosed for gesture shortcuts. A user's movement or body position is captured by a capture device of a system, and is used as input to control the system. For a system-recognized gesture, there may be a full version of the gesture and a shortcut of the gesture. Where the system recognizes that either the full version of the gesture or the shortcut of the gesture has been performed, it sends an indication that the system-recognized gesture was observed to a corresponding application. Where the shortcut comprises a subset of the full version of the gesture, and both the shortcut and the full version of the gesture are recognized as the user performs the full version of the gesture, the system recognizes that only a single performance of the gesture has occurred, and indicates to the application as such.
    Type: Application
    Filed: May 29, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Stephen Latta, Kevin Geisner, John Clavin, Kudo Tsunoda, Kathryn Stone Perez, Alex Kipman, Relja Markovic, Gregory N. Snook
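The de-duplication rule described above (when the shortcut is a subset of the full gesture and both fire during one performance, the application is told about a single gesture) can be sketched as follows. The naming convention and set-based recognizer output are assumptions for illustration.

```python
def gestures_observed(fired):
    """Collapse shortcut/full pairs that fired together into one indication.

    fired: set of recognizer outputs, e.g. {"wave_shortcut", "wave_full"}.
    """
    observed = set()
    for name in fired:
        full_name = name.replace("_shortcut", "_full")
        if name.endswith("_shortcut") and full_name in fired:
            continue  # subsumed by the full version: report once
        observed.add(name)
    return observed
```

A shortcut performed on its own is still reported; only the case where both recognizers trip during one performance is collapsed.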
  • Publication number: 20100302247
    Abstract: Techniques may comprise identifying surfaces, textures, and object dimensions from unorganized point clouds derived from a capture device, such as a depth sensing device. Employing target digitization may comprise surface extraction, identifying points in a point cloud, labeling surfaces, computing object properties, tracking changes in object properties over time, and increasing confidence in the object boundaries and identity as additional frames are captured. If the point cloud data includes an object, a model of the object may be generated. Feedback of the model associated with a particular object may be generated and provided in real time to the user. Further, the model of the object may be tracked in response to any movement of the object in the physical space such that the model may be adjusted to mimic changes or movement of the object, or increase the fidelity of the target's characteristics.
    Type: Application
    Filed: May 29, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas Burton, Andrew Wilson, Diego Fernandes Nehab
  • Publication number: 20100281436
    Abstract: Techniques for managing a set of states associated with a capture device are disclosed herein. The capture device may detect and bind to users, and may provide feedback about whether the capture device is bound to, or detecting a user. Techniques are also disclosed wherein virtual ports may be associated with users bound to a capture device and feedback about the state of virtual ports may be provided.
    Type: Application
    Filed: May 1, 2009
    Publication date: November 4, 2010
    Applicant: Microsoft Corporation
    Inventors: Alex Kipman, Kathryn Stone Perez, R. Stephen Polzin, William Guo
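The binding and virtual-port state management described above can be sketched as a small bookkeeping class: a detected user is bound to the first free virtual port, and feedback is emitted on each state change. The port count, messages, and API are all invented for illustration.

```python
feedback = []  # stand-in for on-screen or device feedback about port state

class VirtualPorts:
    """Associate users bound to the capture device with virtual ports."""

    def __init__(self, n_ports=2):
        self.ports = {i: None for i in range(n_ports)}

    def bind(self, user):
        for port, owner in self.ports.items():
            if owner is None:
                self.ports[port] = user
                feedback.append(f"user {user} bound to port {port}")
                return port
        feedback.append(f"no free port for user {user}")
        return None

    def unbind(self, user):
        for port, owner in self.ports.items():
            if owner == user:
                self.ports[port] = None
                feedback.append(f"user {user} unbound from port {port}")

ports = VirtualPorts(n_ports=1)
ports.bind("A")
ports.bind("B")  # no free port: feedback reflects the state
```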
  • Publication number: 20100195867
    Abstract: A method of tracking a target includes receiving an observed depth image of the target from a source and analyzing the observed depth image with a prior-trained collection of known poses to find an exemplar pose that represents an observed pose of the target. The method further includes rasterizing a model of the target into a synthesized depth image having a rasterized pose and adjusting the rasterized pose of the model into a model-fitting pose based, at least in part, on differences between the observed depth image and the synthesized depth image. Either the exemplar pose or the model-fitting pose is then selected to represent the target.
    Type: Application
    Filed: February 6, 2009
    Publication date: August 5, 2010
    Applicant: Microsoft Corporation
    Inventors: Alex Kipman, Mark Finocchio, Ryan M. Geiss, Johnny Chung Lee, Charles Claudius Marais, Zsolt Mathe
  • Publication number: 20100199229
    Abstract: Systems and methods for mapping natural input devices to legacy system inputs are disclosed. One example system may include a computing device having an algorithmic preprocessing module configured to receive input data containing a natural user input and to identify the natural user input in the input data. The computing device may further include a gesture module coupled to the algorithmic preprocessing module, the gesture module being configured to associate the natural user input to a gesture in a gesture library. The computing device may also include a mapping module to map the gesture to a legacy controller input, and to send the legacy controller input to a legacy system in response to the natural user input.
    Type: Application
    Filed: March 25, 2009
    Publication date: August 5, 2010
    Applicant: Microsoft Corporation
    Inventors: Alex Kipman, Stephen Polzin, Kudo Tsunoda, Darren Bennett, Stephen Latta, Mark Finocchio, Gregory G. Snook, Relja Markovic
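The three-stage pipeline described above (identify the natural user input, associate it with a gesture in a gesture library, map the gesture to a legacy controller input) can be sketched as two lookups. The gesture names and button codes below are invented examples, not anything from the patent.

```python
# Hypothetical gesture library: natural input -> recognized gesture.
GESTURE_LIBRARY = {"arm_raise": "jump_gesture", "lean_left": "steer_left_gesture"}

# Hypothetical mapping module: gesture -> legacy controller input.
LEGACY_MAPPING = {"jump_gesture": "BUTTON_A", "steer_left_gesture": "DPAD_LEFT"}

def to_legacy_input(natural_input):
    """Translate a natural user input into a legacy controller input."""
    gesture = GESTURE_LIBRARY.get(natural_input)
    if gesture is None:
        return None  # unrecognized movement: nothing sent to the legacy system
    return LEGACY_MAPPING.get(gesture)
```

The indirection through the gesture library is the point: the legacy system only ever sees controller inputs it already understands.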
  • Publication number: 20080183591
    Abstract: A developer of digital products can provide one or more digital products for sale to a user, some of which are not purchased or active. Subsequently, when a user desires to purchase such digital products, only a license file needs to be provided to the user. To encourage such bundling, the bundler, such as a hardware manufacturer can include identifiers to enable a referral payment to be paid to the manufacturer. Similarly, identifiers of a merchant of record can be included to direct the user to a specific merchant for the sale. To streamline the management of digital licenses, an authorized merchant can be used to provide digital licenses upon purchase, even if the sale was completed with a merchant of record.
    Type: Application
    Filed: January 31, 2007
    Publication date: July 31, 2008
    Applicant: Microsoft Corporation
    Inventors: Hakan Olsson, Alex A. Kipman, Joshua Kriesberg, Mark Svancarek
  • Publication number: 20060212857
    Abstract: An “out-of-the-box” automated build process application capable of executing a build process without any human intervention. The automated build process application may be configured to be installed and executed without any intervening manual coding of the build process, and may be capable of being configured through a user interface. The automated build application may be integrated within a software development environment, eliminating the need to independently create and use non-integrated software tools and scripts to automate aspects of the build process. Embodiments of the invention may be implemented using a workflow engine configured to execute a build process. A workflow engine (e.g., the MSBuild engine available from Microsoft Corporation) can be configured to perform all of the acts involved in a build process. The build process may be defined by one or more files formatted in accordance with a markup language such as, for example, XML or HTML.
    Type: Application
    Filed: March 21, 2005
    Publication date: September 21, 2006
    Applicant: Microsoft Corporation
    Inventors: Douglas Neumann, Brian Harry, Sam Guckenheimer, Alex Kipman
  • Publication number: 20060048094
    Abstract: Decoupling inputs and outputs in a workflow process may be accomplished by adding a level of indirection. Steps in a workflow can associate their outputs with both a primary identification and a secondary identification. Each step can be configured to accept files or other data associated with particular secondary identifications as input, regardless of the primary identification. Thus, while the output, and thus the primary identification of a step may change, the secondary identification need not change. This reduces the chance of breaking or degrading subsequent downstream steps in a workflow process by modifying an upstream step. The secondary identification may be further associated with metadata, which allows for more sophisticated, input-specific control of the steps in a workflow. A list of the steps in a workflow can be created that incorporates the secondary identification and allows for high-performance integration of build process control into an Integrated Development Environment (IDE).
    Type: Application
    Filed: August 26, 2004
    Publication date: March 2, 2006
    Applicant: Microsoft Corporation
    Inventors: Alex Kipman, Sumedh Kanetkar, Rajeev Goel
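The level of indirection described above can be sketched as follows: each step publishes outputs under both a primary identification (e.g. a concrete file name, which is free to change) and a stable secondary identification, and downstream steps consume by secondary identification only. The identifiers and data shown are invented examples.

```python
outputs = []  # (primary_id, secondary_id, data) triples produced so far

def publish(primary_id, secondary_id, data):
    """A step registers one of its outputs under both identifications."""
    outputs.append((primary_id, secondary_id, data))

def consume(secondary_id):
    """Collect inputs by secondary id, regardless of primary id."""
    return [data for _, sec, data in outputs if sec == secondary_id]

publish("app_v1.dll", "CompiledAssembly", b"...")
# The primary id changed between runs; consumers keyed on the
# secondary id are unaffected, which is the decoupling the abstract describes.
publish("app_v2.dll", "CompiledAssembly", b"...")
```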
  • Publication number: 20050210448
    Abstract: Architecture that facilitates management of a build process according to a level of trust of a build entity. The build process processes one or more build entities, each of which is associated with a level of trust. These associations are stored in a policy file that is run against the one or more entities at the start of the build process. The build process runs at a permission level that is representative of the lowest level of trust of the build entities. The levels of trust include at least trusted, semi-trusted, and untrusted levels. If the lowest level is untrusted, the build process fails, and the user is notified.
    Type: Application
    Filed: March 17, 2004
    Publication date: September 22, 2005
    Inventors: Alex Kipman, Rajeev Goel, Jomo Fisher, Christopher Flaat, Chad Royal
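The policy described above (run the build at the lowest trust level among its entities; fail if any entity is untrusted) reduces to a minimum over an ordered set of levels. The numeric ordering and the exception type are assumptions for this sketch.

```python
TRUST = {"untrusted": 0, "semi-trusted": 1, "trusted": 2}

def build_permission_level(entity_levels):
    """Return the build's permission level, or raise if any entity is untrusted."""
    lowest = min(entity_levels, key=lambda lvl: TRUST[lvl])
    if lowest == "untrusted":
        raise PermissionError("build failed: untrusted entity in policy file")
    return lowest
```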
  • Publication number: 20050177824
    Abstract: Re-useable build tasks are analyzed and tasks with identical attribute values are grouped together into batches, and the task is executed once for each batch, eliminating the need for consideration of looping constructs in the build process.
    Type: Application
    Filed: February 5, 2004
    Publication date: August 11, 2005
    Inventors: Alex Kipman, Rajeev Goel
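The batching rule described above (group task invocations whose attribute values are identical, then execute the task once per batch instead of once per item) can be sketched with a simple group-by. The attribute names and items are invented examples.

```python
from collections import defaultdict

def batch(items, attrs):
    """Group items by the tuple of the given attribute values."""
    batches = defaultdict(list)
    for item in items:
        key = tuple(item[a] for a in attrs)
        batches[key].append(item)
    return batches

items = [
    {"culture": "en", "file": "a.resx"},
    {"culture": "en", "file": "b.resx"},
    {"culture": "fr", "file": "c.resx"},
]
# Two batches -> the task executes twice, not three times, and no
# looping construct is needed in the build definition itself.
batches = batch(items, ["culture"])
```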