Patents by Inventor Alex A. Kipman

Alex A. Kipman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190340317
    Abstract: Systems and methods are disclosed for using a synthetic world interface to model environments, sensors, and platforms, such as for computer vision sensor platform design. Digital models may be passed through a simulation service to generate synthetic experiment data. Systematic sweeps of parameters for various components of the sensor or platform design under test, under multiple environmental conditions, can facilitate time- and cost-efficient engineering efforts by revealing parameter sensitivities and environmental effects for multiple proposed configurations. Searches through the generated synthetic experimental data results can permit rapid identification of desirable design configuration candidates.
    Type: Application
    Filed: May 7, 2018
    Publication date: November 7, 2019
    Inventors: Jonathan Chi Hang CHAN, Michael EBSTYNE, Alex A. KIPMAN, Pedro U. ESCOS, Andrew C. GORIS
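The systematic parameter sweep described in the abstract above can be sketched as a small driver loop. This is an illustrative sketch only, not the patented implementation: `sweep_configurations`, `toy_simulate`, and all parameter names are hypothetical stand-ins for the simulation service and sensor design under test.

```python
from itertools import product

def sweep_configurations(parameter_grid, environments, simulate):
    """Run a simulation over every combination of sensor parameters
    and environmental conditions, collecting synthetic results."""
    results = []
    names = sorted(parameter_grid)
    for values in product(*(parameter_grid[n] for n in names)):
        config = dict(zip(names, values))
        for env in environments:
            results.append({"config": config, "env": env,
                            "score": simulate(config, env)})
    return results

# Toy stand-in for the simulation service: score a depth-camera design.
def toy_simulate(config, env):
    return config["fov_deg"] / (1.0 + config["noise"] * env["fog"])

grid = {"fov_deg": [60, 90], "noise": [0.1, 0.5]}
envs = [{"fog": 0.0}, {"fog": 1.0}]
data = sweep_configurations(grid, envs, toy_simulate)
best = max(data, key=lambda r: r["score"])
```

Searching the collected `data` for high-scoring rows is what lets desirable design candidates surface quickly, as the abstract suggests.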
  • Patent number: 10398972
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Grant
    Filed: September 16, 2016
    Date of Patent: September 3, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar E. Murillo, Andy D. Wilson, Alex A. Kipman, Janet E. Galore
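The dictionary-assignment idea in the abstract above can be sketched as selecting, from candidate dictionaries, the one whose profile best matches compiled calibration data. The matching metric (mean gesture speed) and all names here are hypothetical, not taken from the patent.

```python
def assign_gesture_dictionary(calibration_samples, dictionaries):
    """Pick the gesture dictionary whose nominal motion profile best
    matches the compiled calibration data (illustrative metric:
    mean gesture speed across the short calibration tests)."""
    mean_speed = sum(calibration_samples) / len(calibration_samples)
    # Choose the dictionary whose nominal speed is closest to the user's.
    return min(dictionaries,
               key=lambda d: abs(d["nominal_speed"] - mean_speed))

dictionaries = [
    {"name": "broad",  "nominal_speed": 2.0},  # large, sweeping gestures
    {"name": "subtle", "nominal_speed": 0.5},  # small, precise gestures
]
# Speeds observed during a set of short calibration tests (made-up values).
chosen = assign_gesture_dictionary([0.4, 0.6, 0.5], dictionaries)
```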
  • Patent number: 9943755
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Grant
    Filed: April 19, 2017
    Date of Patent: April 17, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
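The multi-point skeletal model the abstract above says is delivered to applications might look something like the following data structure. This is a minimal sketch under assumed names (`Joint`, `Skeleton`, `person_id`), not the actual representation.

```python
from dataclasses import dataclass, field

@dataclass
class Joint:
    name: str
    x: float
    y: float
    z: float  # tracked in three-dimensional space

@dataclass
class Skeleton:
    person_id: int  # unique identity so multiple users can interact
    joints: dict = field(default_factory=dict)

    def add(self, joint):
        self.joints[joint.name] = joint

# One frame of a real-time skeletal model for one identified person.
skel = Skeleton(person_id=1)
skel.add(Joint("head", 0.0, 1.7, 2.5))
skel.add(Joint("left_hand", -0.4, 1.1, 2.3))
```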
  • Patent number: 9861886
    Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
    Type: Grant
    Filed: July 14, 2014
    Date of Patent: January 9, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
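The per-portion blending described in the abstract above can be sketched as routing each joint of the character to either the live pose or the pre-recorded pose. All names are illustrative assumptions, not the patented method.

```python
def animate(character_joints, live_pose, prerecorded_pose, live_portion):
    """Build one animation frame: joints in live_portion follow the
    captured user; the remaining joints follow the pre-recorded motion."""
    frame = {}
    for joint in character_joints:
        source = live_pose if joint in live_portion else prerecorded_pose
        frame[joint] = source[joint]
    return frame

joints = ["head", "left_arm", "right_arm", "legs"]
live = {j: ("live", j) for j in joints}          # captured user motion
recorded = {j: ("recorded", j) for j in joints}  # artist-generated motion
frame = animate(joints, live, recorded, live_portion={"head", "left_arm"})
```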
  • Publication number: 20170216718
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Application
    Filed: April 19, 2017
    Publication date: August 3, 2017
    Inventors: R. STEPHEN POLZIN, ALEX A. KIPMAN, MARK J. FINOCCHIO, RYAN MICHAEL GEISS, KATHRYN STONE PEREZ, KUDO TSUNODA, DARREN ALEXANDER BENNETT
  • Publication number: 20170144067
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Application
    Filed: September 16, 2016
    Publication date: May 25, 2017
    Inventors: Oscar E. Murillo, Andy D. Wilson, Alex A. Kipman, Janet E. Galore
  • Patent number: 9656162
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Grant
    Filed: April 14, 2014
    Date of Patent: May 23, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Patent number: 9468848
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Grant
    Filed: December 12, 2013
    Date of Patent: October 18, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
  • Publication number: 20150363005
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture unaware program.
    Type: Application
    Filed: August 24, 2015
    Publication date: December 17, 2015
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
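One way to read the abstract above is that recognized gestures are translated into ordinary input events, so a gesture-unaware program simply sees keyboard or mouse input. The mapping below is a hypothetical sketch; the gesture names and event shapes are assumptions.

```python
# Hypothetical mapping from recognized gestures to synthetic input events.
GESTURE_TO_EVENT = {
    "swipe_left":  {"type": "key", "key": "PageUp"},
    "swipe_right": {"type": "key", "key": "PageDown"},
    "push":        {"type": "mouse", "button": "left", "action": "click"},
}

def translate(gesture):
    """Return the synthetic input event for a gesture, or None if unmapped."""
    return GESTURE_TO_EVENT.get(gesture)

event = translate("swipe_right")
```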
  • Patent number: 9182814
    Abstract: A depth image of a scene may be received, observed, or captured by a device. The depth image may include a human target that may have, for example, a portion thereof non-visible or occluded. For example, a user may be turned such that a body part may not be visible to the device, may have one or more body parts partially outside a field of view of the device, may have a body part or a portion of a body part behind another body part or object, or the like such that the human target associated with the user may also have a portion body part or a body part non-visible or occluded in the depth image. A position or location of the non-visible or occluded portion or body part of the human target associated with the user may then be estimated.
    Type: Grant
    Filed: June 26, 2009
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex A. Kipman, Kathryn Stone Perez, Mark J. Finocchio, Ryan Michael Geiss, Kudo Tsunoda
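The estimation step in the abstract above can be sketched with a simple skeletal constraint: keep an occluded joint's last observed direction from its visible parent, re-scaled to the known bone length. This is an illustrative stand-in, not the patented estimator.

```python
import math

def estimate_occluded(parent, last_known, bone_length):
    """Estimate an occluded joint's 3D position from a visible parent
    joint, the joint's last known position, and the fixed bone length."""
    dx, dy, dz = (last_known[i] - parent[i] for i in range(3))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
    scale = bone_length / norm
    return tuple(parent[i] + (last_known[i] - parent[i]) * scale
                 for i in range(3))

# Elbow visible at the origin; hand last seen at (2,0,0), but the
# forearm bone is known to be 1.0 long, so the estimate is pulled in.
hand = estimate_occluded((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), 1.0)
```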
  • Patent number: 9141193
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture unaware program.
    Type: Grant
    Filed: August 31, 2009
    Date of Patent: September 22, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
  • Publication number: 20140320508
    Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
    Type: Application
    Filed: July 14, 2014
    Publication date: October 30, 2014
    Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
  • Publication number: 20140228123
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Application
    Filed: April 14, 2014
    Publication date: August 14, 2014
    Applicant: Microsoft Corporation
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Patent number: 8803889
    Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: August 12, 2014
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
  • Patent number: 8744121
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: June 3, 2014
    Assignee: Microsoft Corporation
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Publication number: 20140109023
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Application
    Filed: December 12, 2013
    Publication date: April 17, 2014
    Applicant: Microsoft Corporation
    Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
  • Patent number: 8631355
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Grant
    Filed: January 8, 2010
    Date of Patent: January 14, 2014
    Assignee: Microsoft Corporation
    Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
  • Publication number: 20110173204
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Application
    Filed: January 8, 2010
    Publication date: July 14, 2011
    Applicant: Microsoft Corporation
    Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
  • Patent number: 7950000
    Abstract: Architecture that facilitates management of a build process according to a level of trust of a build entity. The build process processes one or more build entities, each of which is associated with a level of trust. These associations are stored in a policy file that is run against the one or more entities at the start of the build process. The build process runs at a permission level that is representative of the lowest level of trust of the build entities. The levels of trust include at least trusted, semi-trusted, and untrusted levels. If the lowest level is untrusted, the build process fails, and the user is notified.
    Type: Grant
    Filed: March 17, 2004
    Date of Patent: May 24, 2011
    Assignee: Microsoft Corporation
    Inventors: Alex A. Kipman, Rajeev Goel, Jomo A. Fisher, Christopher A. Flaat, Chad W. Royal
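The trust policy in the abstract above reduces to a minimum over the build entities' trust levels, with an outright failure when any entity is untrusted. The sketch below assumes a simple policy map; names and level encodings are hypothetical.

```python
# Trust levels in increasing order; the build runs at the lowest level
# found among its entities, and fails if any entity is untrusted.
UNTRUSTED, SEMI_TRUSTED, TRUSTED = 0, 1, 2

def build_permission_level(policy, entities):
    """Return the permission level for the build, raising if any entity
    is untrusted. `policy` maps entity name -> trust level; entities
    absent from the policy file default to untrusted."""
    lowest = min(policy.get(e, UNTRUSTED) for e in entities)
    if lowest == UNTRUSTED:
        raise PermissionError("build failed: untrusted entity present")
    return lowest

policy = {"compiler_task": TRUSTED, "custom_task": SEMI_TRUSTED}
level = build_permission_level(policy, ["compiler_task", "custom_task"])
```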
  • Publication number: 20110055846
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture unaware program.
    Type: Application
    Filed: August 31, 2009
    Publication date: March 3, 2011
    Applicant: Microsoft Corporation
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda