Patents by Inventor Alex A. Kipman
Alex A. Kipman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20190340317
Abstract: Systems and methods are disclosed for using a synthetic world interface to model environments, sensors, and platforms, such as for computer vision sensor platform design. Digital models may be passed through a simulation service to generate synthetic experiment data. Systematic sweeps of parameters for various components of the sensor or platform design under test, under multiple environmental conditions, can facilitate time- and cost-efficient engineering efforts by revealing parameter sensitivities and environmental effects for multiple proposed configurations. Searches through the generated synthetic experimental data can permit rapid identification of desirable design configuration candidates.
Type: Application
Filed: May 7, 2018
Publication date: November 7, 2019
Inventors: Jonathan Chi Hang CHAN, Michael EBSTYNE, Alex A. KIPMAN, Pedro U. ESCOS, Andrew C. GORIS
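The abstract describes systematic parameter sweeps across sensor configurations and environmental conditions, followed by a search over the resulting synthetic data. A minimal sketch of that workflow, assuming an invented `simulate` scoring function, parameter names, and environments rather than the patented simulation service, is below.

```python
# Illustrative sketch: a systematic parameter sweep over hypothetical sensor
# configurations and environments, producing synthetic "experiment" records
# that can then be searched for promising design candidates.
from itertools import product

def simulate(config, environment):
    """Hypothetical stand-in for a simulation service call; returns a quality score."""
    # Toy metric: wider field of view helps, adverse environments hurt.
    penalty = {"clear": 0.0, "fog": 0.3, "night": 0.5}[environment]
    return config["fov_degrees"] / 120.0 * (1.0 - penalty)

fovs = [60, 90, 120]
exposures = [1, 2, 4]
environments = ["clear", "fog", "night"]

results = []
for fov, exposure, env in product(fovs, exposures, environments):
    config = {"fov_degrees": fov, "exposure_ms": exposure}
    results.append({"config": config, "environment": env,
                    "score": simulate(config, env)})

# Search the synthetic experiment data for the best-performing configurations.
best = sorted(results, key=lambda r: r["score"], reverse=True)[:3]
for record in best:
    print(record)
```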
-
Patent number: 10398972
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Grant
Filed: September 16, 2016
Date of Patent: September 3, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oscar E. Murillo, Andy D. Wilson, Alex A. Kipman, Janet E. Galore
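At its core, the gesture-dictionary assignment described above matches observed calibration gestures against candidate dictionaries. The sketch below assumes a simple feature-averaging comparison; the profile format, feature names, and dictionary labels are hypothetical and not taken from the patent.

```python
# Illustrative sketch: assigning a gesture dictionary by comparing a user's
# calibration gestures against stored style profiles.
def assign_dictionary(calibration_samples, profiles):
    """Pick the dictionary whose profile is closest to the user's average gesture style."""
    avg = {k: sum(s[k] for s in calibration_samples) / len(calibration_samples)
           for k in calibration_samples[0]}

    def distance(profile):
        return sum((avg[k] - profile["features"][k]) ** 2 for k in avg)

    return min(profiles, key=distance)["dictionary"]

profiles = [
    {"dictionary": "broad_gestures",  "features": {"speed": 0.9, "amplitude": 0.8}},
    {"dictionary": "subtle_gestures", "features": {"speed": 0.3, "amplitude": 0.2}},
]
calibration = [{"speed": 0.35, "amplitude": 0.25}, {"speed": 0.30, "amplitude": 0.30}]
print(assign_dictionary(calibration, profiles))  # -> subtle_gestures
```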
-
Patent number: 9943755
Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
Type: Grant
Filed: April 19, 2017
Date of Patent: April 17, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
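An application consuming such a system would receive a multi-point skeletal model per frame, with a stable identity per tracked person. The sketch below shows what such a per-frame payload and callback might look like; the joint names, dataclass layout, and identification scheme are assumptions made for illustration only.

```python
# Illustrative sketch: a minimal multi-point skeletal model of the kind an
# application might receive each frame from a tracking device.
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    x: float
    y: float
    z: float

@dataclass
class Skeleton:
    person_id: int   # stable ID so the same individual is recognized across frames
    joints: list     # list of Joint positions in 3-D space

def on_frame(skeletons):
    """Application callback: react to the latest skeletal data."""
    for s in skeletons:
        head = next(j for j in s.joints if j.name == "head")
        print(f"person {s.person_id}: head at ({head.x:.2f}, {head.y:.2f}, {head.z:.2f})")

on_frame([Skeleton(person_id=1,
                   joints=[Joint("head", 0.1, 1.7, 2.4),
                           Joint("hand_right", 0.4, 1.1, 2.2)])])
```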
-
Patent number: 9861886
Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist-generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
Type: Grant
Filed: July 14, 2014
Date of Patent: January 9, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
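The key idea is applying live motion to one portion of the character and pre-recorded motion to another. A minimal sketch of that split follows, assuming a simple dictionary-of-joint-rotations pose format and an invented joint grouping rather than the patented animation pipeline.

```python
# Illustrative sketch: animating a character by applying live motion to one
# portion of the skeleton and a pre-recorded motion clip to another.
LIVE_PORTION = {"arm_left", "arm_right"}         # driven by the captured user
PRERECORDED_PORTION = {"leg_left", "leg_right"}  # driven by a stored clip

def blend_pose(live_pose, prerecorded_pose):
    """Return a combined pose: live joints for one portion, clip joints for the other."""
    combined = {}
    for joint, rotation in live_pose.items():
        if joint in LIVE_PORTION:
            combined[joint] = rotation
    for joint, rotation in prerecorded_pose.items():
        if joint in PRERECORDED_PORTION:
            combined[joint] = rotation
    return combined

live = {"arm_left": 30.0, "arm_right": -15.0, "leg_left": 5.0, "leg_right": -5.0}
clip = {"leg_left": 45.0, "leg_right": -45.0}
print(blend_pose(live, clip))  # arms follow the user, legs follow the recorded clip
```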
-
Publication number: 20170216718
Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
Type: Application
Filed: April 19, 2017
Publication date: August 3, 2017
Inventors: R. STEPHEN POLZIN, ALEX A. KIPMAN, MARK J. FINOCCHIO, RYAN MICHAEL GEISS, KATHRYN STONE PEREZ, KUDO TSUNODA, DARREN ALEXANDER BENNETT
-
Publication number: 20170144067
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Application
Filed: September 16, 2016
Publication date: May 25, 2017
Inventors: Oscar E. Murillo, Andy D. Wilson, Alex A. Kipman, Janet E. Galore
-
Patent number: 9656162
Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
Type: Grant
Filed: April 14, 2014
Date of Patent: May 23, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
-
Patent number: 9468848
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Grant
Filed: December 12, 2013
Date of Patent: October 18, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
-
Publication number: 20150363005
Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
Type: Application
Filed: August 24, 2015
Publication date: December 17, 2015
Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
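Controlling a gesture-unaware program typically means translating recognized gestures into ordinary input the program already understands. The sketch below assumes an invented gesture-to-keystroke mapping and a stand-in `send_key` function rather than any real input-injection API.

```python
# Illustrative sketch: translating recognized gestures into ordinary input
# events so a program with no gesture support can still be controlled.
GESTURE_TO_KEY = {
    "swipe_left": "PAGE_UP",
    "swipe_right": "PAGE_DOWN",
    "push": "ENTER",
}

def send_key(key):
    """Stand-in for an OS-level synthetic keystroke API."""
    print(f"injecting keystroke: {key}")

def handle_gesture(gesture):
    key = GESTURE_TO_KEY.get(gesture)
    if key is not None:
        send_key(key)  # the target program only ever sees normal keyboard input

for g in ["swipe_right", "push", "wave"]:  # "wave" has no mapping and is ignored
    handle_gesture(g)
```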
-
Patent number: 9182814
Abstract: A depth image of a scene may be received, observed, or captured by a device. The depth image may include a human target that may have, for example, a portion thereof non-visible or occluded. For example, a user may be turned such that a body part may not be visible to the device, may have one or more body parts partially outside a field of view of the device, may have a body part or a portion of a body part behind another body part or object, or the like, such that the human target associated with the user may also have a body part, or a portion of a body part, non-visible or occluded in the depth image. A position or location of the non-visible or occluded portion or body part of the human target associated with the user may then be estimated.
Type: Grant
Filed: June 26, 2009
Date of Patent: November 10, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Alex A. Kipman, Kathryn Stone Perez, Mark J. Finocchio, Ryan Michael Geiss, Kudo Tsunoda
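Estimating an occluded body part can be sketched as filling in a missing joint from its last known offset relative to a still-visible neighbor. The joint graph, coordinate format, and estimation rule below are simplifications invented for the example, not the patented method.

```python
# Illustrative sketch: estimating the position of an occluded joint from the
# previous frame and the current position of a visible neighboring joint.
def estimate_occluded(current, previous, neighbors):
    """Fill in missing joints by reusing the last known offset from a visible neighbor."""
    estimated = dict(current)
    for joint, neighbor in neighbors.items():
        if joint not in current and joint in previous and neighbor in current:
            # Carry the old joint-to-neighbor offset onto the neighbor's new position.
            offset = tuple(p - q for p, q in zip(previous[joint], previous[neighbor]))
            estimated[joint] = tuple(n + o for n, o in zip(current[neighbor], offset))
    return estimated

previous = {"hand_left": (0.2, 1.0, 2.0), "elbow_left": (0.1, 1.2, 2.0)}
current = {"elbow_left": (0.3, 1.2, 2.1)}  # the hand is occluded this frame
print(estimate_occluded(current, previous, {"hand_left": "elbow_left"}))
```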
-
Patent number: 9141193
Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
Type: Grant
Filed: August 31, 2009
Date of Patent: September 22, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
-
Publication number: 20140320508
Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist-generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
Type: Application
Filed: July 14, 2014
Publication date: October 30, 2014
Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffery Margolis
-
Publication number: 20140228123
Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
Type: Application
Filed: April 14, 2014
Publication date: August 14, 2014
Applicant: Microsoft Corporation
Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
-
Patent number: 8803889
Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist-generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
Type: Grant
Filed: May 29, 2009
Date of Patent: August 12, 2014
Assignee: Microsoft Corporation
Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
-
Patent number: 8744121
Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
Type: Grant
Filed: May 29, 2009
Date of Patent: June 3, 2014
Assignee: Microsoft Corporation
Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
-
Publication number: 20140109023
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Application
Filed: December 12, 2013
Publication date: April 17, 2014
Applicant: Microsoft Corporation
Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
-
Patent number: 8631355
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Grant
Filed: January 8, 2010
Date of Patent: January 14, 2014
Assignee: Microsoft Corporation
Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
-
Publication number: 20110173204
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Application
Filed: January 8, 2010
Publication date: July 14, 2011
Applicant: Microsoft Corporation
Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
-
Patent number: 7950000
Abstract: Architecture that facilitates management of a build process according to a level of trust of a build entity. The build process processes one or more build entities, each of which is associated with a level of trust. These associations are stored in a policy file that is run against the one or more entities at the start of the build process. The build process runs at a permission level that is representative of the lowest level of trust of the build entities. The levels of trust include at least trusted, semi-trusted, and untrusted levels. If the lowest level is untrusted, the build process fails, and the user is notified.
Type: Grant
Filed: March 17, 2004
Date of Patent: May 24, 2011
Assignee: Microsoft Corporation
Inventors: Alex A. Kipman, Rajeev Goel, Jomo A. Fisher, Christopher A. Flaat, Chad W. Royal
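The abstract's rule reduces to taking the minimum trust level across all build entities and failing the build when that minimum is untrusted. A minimal sketch follows, assuming an invented policy mapping and entity names; the level names come from the abstract.

```python
# Illustrative sketch: gating a build on the lowest trust level among its
# entities, as looked up in a policy mapping.
TRUST_ORDER = {"untrusted": 0, "semi-trusted": 1, "trusted": 2}

def effective_trust(entities, policy):
    """The build runs at the permission level of its least-trusted entity."""
    return min((policy.get(entity, "untrusted") for entity in entities),
               key=lambda level: TRUST_ORDER[level])

policy = {"core.proj": "trusted", "thirdparty.targets": "semi-trusted"}
level = effective_trust(["core.proj", "thirdparty.targets"], policy)
if level == "untrusted":
    raise SystemExit("build failed: an untrusted build entity is present")
print(f"running build at permission level: {level}")
```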
-
Publication number: 20110055846
Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
Type: Application
Filed: August 31, 2009
Publication date: March 3, 2011
Applicant: Microsoft Corporation
Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda