Patents by Inventor Alex A. Kipman

Alex A. Kipman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150363005
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
    Type: Application
    Filed: August 24, 2015
    Publication date: December 17, 2015
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
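    A minimal sketch of the idea this abstract describes, assuming invented gesture names and a caller-supplied send_key callback rather than any real input-injection API: recognized gestures are translated into ordinary input events, so a program with no knowledge of gestures can still be driven by them.

    ```python
    # Sketch only: map recognized gestures to synthetic key events so an
    # unmodified ("gesture-unaware") program sees ordinary input. The gesture
    # names and send_key callback are illustrative assumptions.
    from typing import Callable, Dict

    GESTURE_TO_KEY: Dict[str, str] = {
        "swipe_left": "LEFT_ARROW",
        "swipe_right": "RIGHT_ARROW",
        "push": "ENTER",
    }

    def dispatch_gesture(gesture: str, send_key: Callable[[str], None]) -> bool:
        """Translate a recognized gesture into a key event; True if handled."""
        key = GESTURE_TO_KEY.get(gesture)
        if key is None:
            return False          # unrecognized gesture: ignore it
        send_key(key)             # the target program sees a normal key press
        return True

    if __name__ == "__main__":
        dispatch_gesture("swipe_left", send_key=lambda k: print(f"inject {k}"))
    ```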
  • Publication number: 20150325054
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: July 22, 2015
    Publication date: November 12, 2015
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
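    One way to read this abstract's "indication of positional information", sketched with assumed 2D geometry and invented names: test each tracked object against the user's field of view and, for objects outside it, compute the signed turn angle to surface as a directional cue.

    ```python
    # Sketch only: decide whether an object is outside the field of view and,
    # if so, return the angle the user could turn to bring it into view.
    import math

    def bearing_to(object_xy, user_xy, user_heading_rad):
        """Signed angle (radians) from the user's gaze direction to the object."""
        dx = object_xy[0] - user_xy[0]
        dy = object_xy[1] - user_xy[1]
        angle = math.atan2(dy, dx) - user_heading_rad
        return math.atan2(math.sin(angle), math.cos(angle))  # wrap to (-pi, pi]

    def out_of_view_indicator(object_xy, user_xy, heading_rad, fov_rad):
        """Return None if the object is visible, else a turn angle to display."""
        angle = bearing_to(object_xy, user_xy, heading_rad)
        if abs(angle) <= fov_rad / 2:
            return None           # inside the field of view: no cue needed
        return angle              # e.g. render an arrow pointing this way

    print(out_of_view_indicator((0.0, 2.0), (0.0, 0.0), 0.0, math.radians(60)))
    ```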
  • Patent number: 9182814
    Abstract: A depth image of a scene may be received, observed, or captured by a device. The depth image may include a human target that may have, for example, a portion thereof non-visible or occluded. For example, a user may be turned such that a body part may not be visible to the device, may have one or more body parts partially outside a field of view of the device, or may have a body part or a portion of a body part behind another body part or object, such that the human target associated with the user may have a body part or a portion of a body part non-visible or occluded in the depth image. A position or location of the non-visible or occluded portion or body part of the human target associated with the user may then be estimated.
    Type: Grant
    Filed: June 26, 2009
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex A. Kipman, Kathryn Stone Perez, Mark J. Finocchio, Ryan Michael Geiss, Kudo Tsunoda
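    A sketch of one plausible estimation strategy consistent with this abstract, not the patented method itself: when a joint of the skeletal model is occluded, fall back to its last observed position, optionally shifted by the frame-to-frame motion of a visible neighboring joint.

    ```python
    # Sketch only: estimate an occluded joint from history plus a visible
    # neighbor. The joint naming and fallback policy are assumptions.
    from typing import Dict, Optional, Tuple

    Joint = Tuple[float, float, float]

    def estimate_joint(
        name: str,
        observed: Dict[str, Joint],    # joints visible in the current frame
        previous: Dict[str, Joint],    # full skeleton from the last frame
        parent: Optional[str] = None,  # a neighboring joint, if tracked
    ) -> Optional[Joint]:
        if name in observed:
            return observed[name]      # visible: nothing to estimate
        if name not in previous:
            return None                # never seen: cannot estimate
        last = previous[name]
        if parent and parent in observed and parent in previous:
            # Translate the stale joint by the parent's frame-to-frame motion.
            dx = tuple(o - p for o, p in zip(observed[parent], previous[parent]))
            return (last[0] + dx[0], last[1] + dx[1], last[2] + dx[2])
        return last                    # no neighbor info: hold last position
    ```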
  • Publication number: 20150317831
    Abstract: Various embodiments relating to controlling a see-through display are disclosed. In one embodiment, virtual objects may be displayed on the see-through display. The virtual objects transition between having a position that is body-locked and a position that is world-locked based on various transition events.
    Type: Application
    Filed: May 1, 2014
    Publication date: November 5, 2015
    Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Stephen Latta, Paul Albert Lalonde, Drew Steedly, Alex Kipman, Ethan Eade
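    A minimal sketch of the body-locked/world-locked distinction in this abstract, assuming a simple 2D pose model and invented transition-event names (the patent does not enumerate events this way): the object's rendered position is either fixed in the scene or a fixed offset from the user, and transition events flip between the two.

    ```python
    # Sketch only: a virtual object that switches between world-locked and
    # body-locked positioning on transition events.
    from dataclasses import dataclass

    @dataclass
    class VirtualObject:
        world_pos: tuple          # position in world coordinates
        body_offset: tuple        # offset from the user when body-locked
        body_locked: bool = False

        def on_transition_event(self, event: str) -> None:
            if event == "user_started_walking":   # illustrative event names
                self.body_locked = True
            elif event == "user_stopped":
                self.body_locked = False

        def render_position(self, user_pos: tuple) -> tuple:
            if self.body_locked:
                return (user_pos[0] + self.body_offset[0],
                        user_pos[1] + self.body_offset[1])
            return self.world_pos
    ```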
  • Publication number: 20150317833
    Abstract: An augmented reality device including a plurality of sensors configured to output pose information indicating a pose of the augmented reality device. The augmented reality device further includes a band-agnostic filter and a band-specific filter. The band-specific filter includes an error correction algorithm configured to receive pose information as filtered by the band-agnostic filter and reduce a tracking error of the pose information in a selected frequency band. The augmented reality device further includes a display engine configured to position a virtual object on a see-through display as a function of the pose information as filtered by the band-agnostic filter and the band-specific filter.
    Type: Application
    Filed: May 1, 2014
    Publication date: November 5, 2015
    Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Drew Steedly, Calvin Chan, Ethan Eade, Alex Kipman, Georg Klein
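    A crude stand-in for the two-stage filtering this abstract describes, with the caveat that the actual error-correction algorithm is not specified here: a band-agnostic exponential smoother runs first, and a short moving average sized to attenuate a chosen jitter band approximates the band-specific stage.

    ```python
    # Sketch only: band-agnostic smoothing followed by an approximate
    # band-specific correction. Both stages are illustrative assumptions.
    from collections import deque

    class TwoStagePoseFilter:
        def __init__(self, alpha: float = 0.5, band_window: int = 4):
            self.alpha = alpha                       # band-agnostic gain
            self.window = deque(maxlen=band_window)  # band-specific history
            self.state = None

        def update(self, pose: float) -> float:
            # Stage 1: band-agnostic exponential smoothing of the raw pose.
            if self.state is None:
                self.state = pose
            self.state = self.alpha * pose + (1 - self.alpha) * self.state
            # Stage 2: band-specific correction applied to the smoothed signal.
            self.window.append(self.state)
            return sum(self.window) / len(self.window)
    ```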
  • Patent number: 9159151
    Abstract: Data captured with respect to a human may be analyzed and applied to a visual representation of a user such that the visual representation begins to reflect the behavioral characteristics of the user. For example, a system may have a capture device that captures data about the user in the physical space. The system may identify the user's characteristics, tendencies, voice patterns, behaviors, gestures, etc. Over time, the system may learn a user's tendencies and intelligently apply animations to the user's avatar such that the avatar behaves and responds in accordance with the identified behaviors of the user. The animations applied to the avatar may be animations selected from a library of pre-packaged animations, or the animations may be entered and recorded by the user into the avatar's avatar library.
    Type: Grant
    Filed: July 13, 2009
    Date of Patent: October 13, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
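    A minimal sketch of the learning loop this abstract implies, with an invented structure (the patent does not specify the model at this level): accumulate counts of observed user behaviors and sample avatar animations in proportion to those tendencies, so the avatar gradually mirrors the user.

    ```python
    # Sketch only: frequency-weighted animation selection as a stand-in for
    # learning a user's tendencies over time.
    import random
    from collections import Counter

    class AvatarBehaviorModel:
        def __init__(self):
            self.tendencies = Counter()

        def observe(self, behavior: str) -> None:
            """Record one observation of the user's behavior (e.g. 'wave')."""
            self.tendencies[behavior] += 1

        def pick_idle_animation(self) -> str:
            """Sample an animation weighted by how often the user did it."""
            if not self.tendencies:
                return "default_idle"
            behaviors, counts = zip(*self.tendencies.items())
            return random.choices(behaviors, weights=counts, k=1)[0]
    ```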
  • Patent number: 9141193
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
    Type: Grant
    Filed: August 31, 2009
    Date of Patent: September 22, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
  • Patent number: 9129430
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: September 8, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Patent number: 9098873
    Abstract: An on-screen shopping application which reacts to a human target user's motions to provide a shopping experience to the user is provided. A tracking system captures user motions and executes a shopping application allowing a user to manipulate an on-screen representation of the user. The on-screen representation has a likeness of the user or another individual, and movements of the user in the on-screen interface allow the user to interact with virtual articles that represent real-world articles. User movements which are recognized as article manipulation or transaction control gestures are translated into commands for the shopping application.
    Type: Grant
    Filed: April 1, 2010
    Date of Patent: August 4, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kudo Tsunoda, Darren Bennett, Brian S. Murphy, Stephen G. Latta, Relja Markovic, Alex Kipman
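    The abstract distinguishes article manipulation gestures from transaction control gestures; a sketch of that routing, with invented gesture and command names:

    ```python
    # Sketch only: route recognized shopping gestures to either article
    # manipulation or transaction control commands. All names are assumptions.
    MANIPULATION = {"grab": "PICK_UP_ITEM", "rotate": "ROTATE_ITEM"}
    TRANSACTION = {"nod": "CONFIRM_PURCHASE", "swipe_up": "ADD_TO_CART"}

    def translate_gesture(gesture: str) -> tuple:
        """Return (category, command) for a recognized shopping gesture."""
        if gesture in MANIPULATION:
            return ("manipulation", MANIPULATION[gesture])
        if gesture in TRANSACTION:
            return ("transaction", TRANSACTION[gesture])
        return ("unhandled", None)

    print(translate_gesture("swipe_up"))  # ('transaction', 'ADD_TO_CART')
    ```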
  • Patent number: 9026596
    Abstract: Embodiments are disclosed that relate to sharing media streams capturing different perspectives of an event. For example, one embodiment provides, on a computing device, a method including storing an event definition for an event, receiving from each capture device of a plurality of capture devices a request to share a media stream provided by the capture device, receiving a media stream from each capture device of the plurality of capture devices, and associating a subset of media streams from the plurality of capture devices with the event based upon the event definition. The method further includes receiving a request for transmission of a selected media stream associated with the event, and sending the selected media stream associated with the event to the requesting capture device.
    Type: Grant
    Filed: June 16, 2011
    Date of Patent: May 5, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Alex Kipman, Andrew Fuller
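    A minimal sketch of associating shared streams with an event, assuming an event definition made of a time window plus a location radius (the abstract does not fix the definition's contents, and the field names here are invented):

    ```python
    # Sketch only: select the subset of shared media streams whose capture
    # metadata match a stored event definition.
    from dataclasses import dataclass

    @dataclass
    class EventDefinition:
        start: float          # epoch seconds
        end: float
        center: tuple         # (x, y) venue location
        radius: float

    @dataclass
    class MediaStream:
        device_id: str
        timestamp: float
        location: tuple

    def streams_for_event(event: EventDefinition, streams: list) -> list:
        """Associate streams with the event based on its definition."""
        def near(loc):
            dx, dy = loc[0] - event.center[0], loc[1] - event.center[1]
            return (dx * dx + dy * dy) ** 0.5 <= event.radius
        return [s for s in streams
                if event.start <= s.timestamp <= event.end and near(s.location)]
    ```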
  • Patent number: 9015638
    Abstract: Techniques for managing a set of states associated with a capture device are disclosed herein. The capture device may detect and bind to users, and may provide feedback about whether the capture device is bound to, or detecting a user. Techniques are also disclosed wherein virtual ports may be associated with users bound to a capture device and feedback about the state of virtual ports may be provided.
    Type: Grant
    Filed: May 1, 2009
    Date of Patent: April 21, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex Kipman, Kathryn Stone Perez, R. Stephen Polzin, William Guo
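    A sketch of the state handling this abstract implies, with invented states and feedback messages: the capture device detects users, binds them to free virtual ports, and emits feedback on every state change.

    ```python
    # Sketch only: virtual-port binding with user feedback. The port count,
    # state names, and print-based feedback are illustrative assumptions.
    class VirtualPortManager:
        def __init__(self, num_ports: int = 4):
            self.free_ports = list(range(1, num_ports + 1))
            self.bindings = {}                     # user_id -> port

        def on_user_detected(self, user_id: str) -> None:
            print(f"feedback: user {user_id} detected")

        def bind(self, user_id: str) -> int:
            if user_id in self.bindings:
                return self.bindings[user_id]      # already bound
            if not self.free_ports:
                raise RuntimeError("no free virtual ports")
            port = self.free_ports.pop(0)
            self.bindings[user_id] = port
            print(f"feedback: user {user_id} bound to port {port}")
            return port

        def unbind(self, user_id: str) -> None:
            port = self.bindings.pop(user_id, None)
            if port is not None:
                self.free_ports.append(port)
                print(f"feedback: user {user_id} unbound from port {port}")
    ```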
  • Publication number: 20150035861
    Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a selected geo-located data item and provides a minimum visual information density level for the item to a head-mounted display device. The program receives via the head-mounted display device a user input corresponding to the selected geo-located data item. Based on the input, the program provides an increasing visual information density level for the selected item to the head-mounted display device for display within the mixed reality environment.
    Type: Application
    Filed: July 31, 2013
    Publication date: February 5, 2015
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
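    A minimal sketch of graduated information density as this abstract describes it, with invented level contents: a geo-located data item starts at its minimum density level and steps up one level on each user input directed at it.

    ```python
    # Sketch only: ordered visual information density levels for one item.
    class GeoLocatedItem:
        def __init__(self, name: str, levels: list):
            self.name = name
            self.levels = levels          # ordered, minimum density first
            self.current = 0

        def visible_info(self) -> str:
            return self.levels[self.current]

        def on_user_input(self) -> str:
            """Escalate density, clamped at the richest level."""
            self.current = min(self.current + 1, len(self.levels) - 1)
            return self.visible_info()

    cafe = GeoLocatedItem("cafe", ["icon", "name + rating", "menu + hours"])
    print(cafe.visible_info())        # 'icon' (minimum density level)
    print(cafe.on_user_input())       # 'name + rating'
    ```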
  • Publication number: 20140375683
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20140320508
    Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist-generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
    Type: Application
    Filed: July 14, 2014
    Publication date: October 30, 2014
    Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
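    A minimal sketch of the per-portion blending this abstract describes: live-captured joints drive one portion of the character while a pre-recorded clip drives the rest. The particular split between portions is an assumption for illustration.

    ```python
    # Sketch only: merge live and pre-recorded joint transforms according to
    # an assumed portion split.
    LIVE_PORTION = {"head", "left_arm", "right_arm"}     # driven by the user
    PRERECORDED_PORTION = {"left_leg", "right_leg"}      # driven by the clip

    def compose_pose(live: dict, prerecorded: dict) -> dict:
        """Merge two joint->transform dicts according to the portion split."""
        pose = {}
        for joint in LIVE_PORTION:
            if joint in live:
                pose[joint] = live[joint]
        for joint in PRERECORDED_PORTION:
            if joint in prerecorded:
                pose[joint] = prerecorded[joint]
        return pose
    ```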
  • Publication number: 20140228123
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Application
    Filed: April 14, 2014
    Publication date: August 14, 2014
    Applicant: Microsoft Corporation
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
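    The abstract's key delivery contract is a multi-point skeletal model presented to applications in real time; a sketch of that data shape, with invented field and class names:

    ```python
    # Sketch only: a per-frame skeletal model feed keyed by persistent person
    # identity, so multiple tracked people stay distinguishable across frames.
    from dataclasses import dataclass, field
    from typing import Dict, Tuple

    @dataclass
    class Skeleton:
        person_id: str                               # stable across frames
        joints: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)

    class SkeletonFeed:
        def __init__(self):
            self.subscribers = []

        def subscribe(self, callback):
            """Applications register to receive skeletons each frame."""
            self.subscribers.append(callback)

        def publish_frame(self, skeletons: list):
            for cb in self.subscribers:
                cb(skeletons)

    feed = SkeletonFeed()
    feed.subscribe(lambda sks: print([s.person_id for s in sks]))
    feed.publish_frame([Skeleton("alice", {"head": (0.0, 1.7, 2.0)})])
    ```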
  • Patent number: 8803889
    Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist-generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: August 12, 2014
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
  • Publication number: 20140160001
    Abstract: Embodiments that relate to presenting a mixed reality environment via a mixed reality display device are disclosed. For example, one disclosed embodiment provides a method for presenting a mixed reality environment via a head-mounted display device. The method includes using head pose data to generally identify one or more gross selectable targets within a sub-region of a spatial region occupied by the mixed reality environment. The method further includes specifically identifying a fine selectable target from among the gross selectable targets based on eye-tracking data. Gesture data is then used to identify a gesture, and an operation associated with the identified gesture is performed on the fine selectable target.
    Type: Application
    Filed: December 6, 2012
    Publication date: June 12, 2014
    Inventors: Peter Tobias Kinnebrew, Alex Kipman
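    A minimal sketch of the coarse-to-fine selection this abstract describes, with assumed 2D coordinates: head pose narrows candidates to a sub-region (gross targets), eye tracking picks one target within it (the fine target), and a recognized gesture would then trigger that target's operation.

    ```python
    # Sketch only: head-pose pass selects gross targets; eye-tracking pass
    # picks the fine target nearest the gaze point.
    def select_target(targets, head_region, gaze_point):
        """targets: list of (name, (x, y)); head_region: ((x0,y0),(x1,y1))."""
        (x0, y0), (x1, y1) = head_region
        gross = [(n, p) for n, p in targets
                 if x0 <= p[0] <= x1 and y0 <= p[1] <= y1]
        if not gross:
            return None
        def dist2(p):
            return (p[0] - gaze_point[0]) ** 2 + (p[1] - gaze_point[1]) ** 2
        return min(gross, key=lambda t: dist2(t[1]))[0]

    targets = [("button_a", (1.0, 1.0)), ("button_b", (1.2, 1.1))]
    print(select_target(targets, ((0.5, 0.5), (1.5, 1.5)), (1.19, 1.1)))
    ```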
  • Patent number: 8744121
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: June 3, 2014
    Assignee: Microsoft Corporation
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Publication number: 20140109023
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Application
    Filed: December 12, 2013
    Publication date: April 17, 2014
    Applicant: Microsoft Corporation
    Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
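    A sketch of dictionary assignment under invented scoring, since the abstract does not give the selection criteria: each candidate gesture dictionary rates how well the user's calibration samples match its expected style, and the best-scoring dictionary is assigned.

    ```python
    # Sketch only: assign the gesture dictionary whose scoring function best
    # fits the calibration samples. The dictionaries and the speed-based
    # scoring are hypothetical.
    def assign_dictionary(calibration_samples, dictionaries):
        """dictionaries: name -> scoring function over the sample list."""
        scores = {name: score(calibration_samples)
                  for name, score in dictionaries.items()}
        return max(scores, key=scores.get)

    dictionaries = {
        "broad_slow": lambda s: -abs(sum(s) / len(s) - 0.3),
        "quick_small": lambda s: -abs(sum(s) / len(s) - 0.9),
    }
    print(assign_dictionary([0.8, 1.0, 0.9], dictionaries))  # 'quick_small'
    ```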
  • Patent number: 8631355
    Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
    Type: Grant
    Filed: January 8, 2010
    Date of Patent: January 14, 2014
    Assignee: Microsoft Corporation
    Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore