Patents by Inventor Kudo Tsunoda

Kudo Tsunoda has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10486065
    Abstract: A system that presents the user with a 3-D virtual environment as well as non-visual sensory feedback for the interactions the user makes with virtual objects in that environment is disclosed. In an exemplary embodiment, the system comprises a depth camera that captures user position and movement, a three-dimensional (3-D) display device that presents the virtual environment to the user in 3-D, and a haptic feedback device that provides haptic feedback to the user as he interacts with a virtual object in the virtual environment. As the user moves through his physical space, he is captured by the depth camera. Data from that depth camera is parsed to correlate a user position with a position in the virtual environment. Where the user's position or movement causes him to touch the virtual object, that touch is determined and corresponding haptic feedback is provided to the user.
    Type: Grant
    Filed: July 27, 2011
    Date of Patent: November 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex Kipman, Kudo Tsunoda, Todd Eric Holmdahl, John Clavin, Kathryn Stone Perez
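    Note: an illustrative code sketch of the physical-to-virtual mapping and haptic trigger described here appears after this listing.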
  • Patent number: 9943755
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them, and tracks them in three-dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human musculoskeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body, as well as via voice commands and responses.
    Type: Grant
    Filed: April 19, 2017
    Date of Patent: April 17, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
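    Note: an illustrative code sketch of the multi-point skeletal model described here (and in the related entries below that share this abstract) appears after this listing.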
  • Patent number: 9821224
    Abstract: Depth-image analysis is performed with a device that analyzes a human target within an observed scene by capturing depth images that include depth information from the observed scene. The human target is modeled with a virtual skeleton including a plurality of joints. The virtual skeleton is used as an input for controlling a driving simulation.
    Type: Grant
    Filed: December 21, 2010
    Date of Patent: November 21, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen Latta, Darren Bennett, Kevin Geisner, Relja Markovic, Kudo Tsunoda, Rhett Mathis, Matthew Monson, David Gierok, William Paul Giese, Darrin Brown, Cam McRae, David Seymour, William Axel Olsen, Matthew Searcy
  • Publication number: 20170216718
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them, and tracks them in three-dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human musculoskeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body, as well as via voice commands and responses.
    Type: Application
    Filed: April 19, 2017
    Publication date: August 3, 2017
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Patent number: 9656162
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them, and tracks them in three-dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human musculoskeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body, as well as via voice commands and responses.
    Type: Grant
    Filed: April 14, 2014
    Date of Patent: May 23, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Patent number: 9646340
    Abstract: A method to help a user visualize how a wearable article will look on the user's body. Enacted on a computing system, the method includes receiving an image of the user's body from an image-capture component. Based on the image, a posable, three-dimensional, virtual avatar is constructed to substantially resemble the user. In this example method, data is obtained that identifies the wearable article as being selected for the user. This data includes a plurality of metrics that at least partly define the wearable article. Then, a virtualized form of the wearable article is attached to the avatar, which is provided to a display component for the user to review.
    Type: Grant
    Filed: August 2, 2012
    Date of Patent: May 9, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jay Kapur, Sheridan Jones, Kudo Tsunoda
  • Patent number: 9400559
    Abstract: Systems, methods, and computer-readable media are disclosed for gesture shortcuts. A user's movement or body position is captured by a capture device of a system and is used as input to control the system. For a system-recognized gesture, there may be a full version of the gesture and a shortcut of the gesture. Where the system recognizes that either the full version of the gesture or the shortcut of the gesture has been performed, it sends, to a corresponding application, an indication that the system-recognized gesture was observed. Where the shortcut comprises a subset of the full version of the gesture, and both the shortcut and the full version are recognized as the user performs the full version, the system recognizes that only a single performance of the gesture has occurred and indicates as much to the application.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: July 26, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen Latta, Kevin Geisner, John Clavin, Kudo Tsunoda, Kathryn Stone Perez, Alex Kipman, Relja Markovic, Gregory N. Snook
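    Note: an illustrative code sketch of the shortcut/full-gesture deduplication described here appears after this listing.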
  • Patent number: 9367136
    Abstract: Methods for providing real-time feedback to an end user of a mobile device as they are interacting with or manipulating one or more virtual objects within an augmented reality environment are described. The real-time feedback may comprise visual feedback, audio feedback, and/or haptic feedback. In some embodiments, a mobile device, such as a head-mounted display device (HMD), may determine an object classification associated with a virtual object within an augmented reality environment, detect an object manipulation gesture performed by an end user of the mobile device, detect an interaction with the virtual object based on the object manipulation gesture, determine a magnitude of a virtual force associated with the interaction, and provide real-time feedback to the end user of the mobile device based on the interaction, the magnitude of the virtual force applied to the virtual object, and the object classification associated with the virtual object.
    Type: Grant
    Filed: April 12, 2013
    Date of Patent: June 14, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Adam G. Poulos, Cameron G. Brown, Daniel J. McCulloch, Matthew Kaplan, Arnulfo Zepeda Navratil, Jon Paulovich, Kudo Tsunoda
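    Note: an illustrative code sketch of the force-and-classification feedback selection described here appears after this listing.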
  • Publication number: 20150363005
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
    Type: Application
    Filed: August 24, 2015
    Publication date: December 17, 2015
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
  • Patent number: 9182814
    Abstract: A depth image of a scene may be received, observed, or captured by a device. The depth image may include a human target that may have, for example, a portion thereof non-visible or occluded. For example, a user may be turned such that a body part may not be visible to the device, may have one or more body parts partially outside a field of view of the device, may have a body part or a portion of a body part behind another body part or object, or the like, such that the human target associated with the user may also have a body part or a portion of a body part non-visible or occluded in the depth image. A position or location of the non-visible or occluded portion or body part of the human target associated with the user may then be estimated.
    Type: Grant
    Filed: June 26, 2009
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex A. Kipman, Kathryn Stone Perez, Mark J. Finocchio, Ryan Michael Geiss, Kudo Tsunoda
  • Patent number: 9141193
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
    Type: Grant
    Filed: August 31, 2009
    Date of Patent: September 22, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
  • Patent number: 9098873
    Abstract: An on-screen shopping application which reacts to a human target user's motions to provide a shopping experience to the user is provided. A tracking system captures user motions and executes a shopping application, allowing a user to manipulate an on-screen representation of the user. The on-screen representation has a likeness of the user or another individual, and movements of the user in the on-screen interface allow the user to interact with virtual articles that represent real-world articles. User movements which are recognized as article-manipulation or transaction-control gestures are translated into commands for the shopping application.
    Type: Grant
    Filed: April 1, 2010
    Date of Patent: August 4, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kudo Tsunoda, Darren Bennett, Brian S. Murphy, Stephen G. Latta, Relja Markovic, Alex Kipman
  • Patent number: 9030495
    Abstract: A system and related methods for an augmented reality help system in a head-mounted display device are provided. In one example, the head-mounted display device includes a plurality of sensors and a display system for presenting holographic objects. An augmented reality help program is configured to receive one or more user biometric parameters from the plurality of sensors. Based on the user biometric parameters, the program determines that the user is experiencing a stress response, and presents help content to the user via the head-mounted display device.
    Type: Grant
    Filed: November 21, 2012
    Date of Patent: May 12, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel McCulloch, Kudo Tsunoda, Abby Lin Lee, Ryan Hastings, Jason Scott
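    Note: an illustrative code sketch of the biometric stress check described here appears after this listing.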
  • Publication number: 20140306891
    Abstract: Methods for providing real-time feedback to an end user of a mobile device as they are interacting with or manipulating one or more virtual objects within an augmented reality environment are described. The real-time feedback may comprise visual feedback, audio feedback, and/or haptic feedback. In some embodiments, a mobile device, such as a head-mounted display device (HMD), may determine an object classification associated with a virtual object within an augmented reality environment, detect an object manipulation gesture performed by an end user of the mobile device, detect an interaction with the virtual object based on the object manipulation gesture, determine a magnitude of a virtual force associated with the interaction, and provide real-time feedback to the end user of the mobile device based on the interaction, the magnitude of the virtual force applied to the virtual object, and the object classification associated with the virtual object.
    Type: Application
    Filed: April 12, 2013
    Publication date: October 16, 2014
    Inventors: Stephen G. Latta, Adam G. Poulos, Cameron G. Brown, Daniel J. McCulloch, Matthew Kaplan, Arnulfo Zepeda Navratil, Jon Paulovich, Kudo Tsunoda
  • Publication number: 20140228123
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them, and tracks them in three-dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human musculoskeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body, as well as via voice commands and responses.
    Type: Application
    Filed: April 14, 2014
    Publication date: August 14, 2014
    Applicant: Microsoft Corporation
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Patent number: 8744121
    Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them, and tracks them in three-dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human musculoskeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body, as well as via voice commands and responses.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: June 3, 2014
    Assignee: Microsoft Corporation
    Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
  • Publication number: 20140139551
    Abstract: A system and related methods for an augmented reality help system in a head-mounted display device are provided. In one example, the head-mounted display device includes a plurality of sensors and a display system for presenting holographic objects. An augmented reality help program is configured to receive one or more user biometric parameters from the plurality of sensors. Based on the user biometric parameters, the program determines that the user is experiencing a stress response, and presents help content to the user via the head-mounted display device.
    Type: Application
    Filed: November 21, 2012
    Publication date: May 22, 2014
    Inventors: Daniel McCulloch, Kudo Tsunoda, Abby Lin Lee, Ryan Hastings, Jason Scott
  • Publication number: 20140125698
    Abstract: A computing system comprises a see-through display device, a logic subsystem, and a storage subsystem storing instructions. When executed by the logic subsystem, the instructions display on the see-through display device a virtual arena, a user-controlled avatar, and an opponent avatar. The virtual arena appears to be integrated within a physical space when the physical space is viewed through the see-through display device. In response to receiving a user input, the instructions may also display on the see-through display device an updated user-controlled avatar.
    Type: Application
    Filed: November 5, 2012
    Publication date: May 8, 2014
    Inventors: Stephen Latta, Daniel McCulloch, Kudo Tsunoda, Aaron Krauss
  • Patent number: 8487938
    Abstract: Systems, methods, and computer-readable media are disclosed for grouping complementary sets of standard gestures into gesture libraries. The gestures may be complementary in that they are frequently used together in a context or in that their parameters are interrelated. Where a parameter of a gesture is set with a first value, all other parameters of the gesture, and of other gestures in the gesture package, that depend on the first value may be set with their own values, which are determined using the first value.
    Type: Grant
    Filed: February 23, 2009
    Date of Patent: July 16, 2013
    Assignee: Microsoft Corporation
    Inventors: Stephen G. Latta, Kudo Tsunoda, Kevin Geisner, Relja Markovic, Darren Alexander Bennett
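    Note: an illustrative code sketch of the interrelated gesture parameters described here appears after this listing.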
  • Patent number: 8451278
    Abstract: It may be desirable to apply corrective data to aspects of a captured image or a user-performed gesture for display of a visual representation that corresponds to the corrective data. The captured motion may be any motion in the physical space that is captured by the capture device, such as a camera. Aspects of a skeletal or mesh model of a person, generated based on the image data captured by the capture device, may be modified prior to animation. The modification may be made to the model generated from image data that represents a target or a target's motion, including user gestures, in the physical space. For example, certain joints of a skeletal model may be readjusted or realigned. A model of a target may be modified by applying differential correction, magnetism principles, binary snapping, confining virtual movement to defined spaces, or the like.
    Type: Grant
    Filed: August 3, 2012
    Date of Patent: May 28, 2013
    Assignee: Microsoft Corporation
    Inventors: Kevin Geisner, Relja Markovic, Stephen Gilchrist Latta, Gregory Nelson Snook, Kudo Tsunoda, Darren Alexander Bennett
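
Illustrative code sketches

The abstracts above describe each invention only at a high level. The short Python sketches that follow illustrate a few of those mechanisms in simplified form; they are not code from the patents, from Microsoft, or from any shipping SDK, and every class, function, threshold, and constant in them is a hypothetical stand-in.

For patent 10486065, the abstract describes correlating a depth-camera reading of the user with a position in the virtual environment and providing haptic feedback when that position touches a virtual object. A minimal sketch of that loop, assuming a hypothetical read_hand() camera reading, a simple affine physical-to-virtual mapping, a sphere contact test, and a trigger_haptics() actuator callback:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def to_virtual_space(p: Vec3, scale: float = 1.0, offset: Vec3 = Vec3(0.0, 0.0, 0.0)) -> Vec3:
    """Correlate a physical-space position (from the depth camera) with a
    position in the virtual environment via a simple affine mapping."""
    return Vec3(p.x * scale + offset.x, p.y * scale + offset.y, p.z * scale + offset.z)

def touches(hand: Vec3, center: Vec3, radius: float) -> bool:
    """Treat the virtual object as a sphere and test whether the hand contacts it."""
    dx, dy, dz = hand.x - center.x, hand.y - center.y, hand.z - center.z
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= radius

def feedback_loop(read_hand, trigger_haptics, center: Vec3, radius: float, frames: int) -> None:
    """Per frame: read the tracked hand, map it into the virtual scene, and
    fire haptic feedback whenever it contacts the virtual object."""
    for _ in range(frames):
        hand = to_virtual_space(read_hand())
        if touches(hand, center, radius):
            trigger_haptics(1.0)

if __name__ == "__main__":
    # Stub devices, purely for illustration.
    samples = iter([Vec3(0.0, 0.0, 2.0), Vec3(0.2, 0.1, 1.0), Vec3(0.5, 0.5, 0.5)])
    feedback_loop(
        read_hand=lambda: next(samples),
        trigger_haptics=lambda intensity: print(f"haptic pulse, intensity={intensity}"),
        center=Vec3(0.5, 0.5, 0.5),
        radius=0.2,
        frames=3,
    )
```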
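For patent 9943755 and the related entries that share its abstract (9656162, 8744121, and publications 20170216718 and 20140228123), the central idea is a multi-point skeletal model, one per uniquely identified person, delivered directly to applications in real time. A minimal sketch of such a model and a publish/subscribe delivery path, using an invented joint list:

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

# Hypothetical joint names; the patents do not enumerate a specific joint set.
JOINTS = ("head", "shoulder_l", "shoulder_r", "elbow_l", "elbow_r",
          "hand_l", "hand_r", "spine", "hip_l", "hip_r", "knee_l", "knee_r")

@dataclass
class Skeleton:
    """Multi-point skeletal model for one uniquely identified person."""
    person_id: int
    joints: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)

    def update(self, joint: str, position: Tuple[float, float, float]) -> None:
        if joint not in JOINTS:
            raise ValueError(f"unknown joint: {joint}")
        self.joints[joint] = position

class SkeletonStream:
    """Delivers per-frame skeletons directly to subscribing applications,
    mimicking the 'presented directly to applications in real time' idea."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback) -> None:
        self._subscribers.append(callback)

    def publish(self, skeleton: Skeleton) -> None:
        for callback in self._subscribers:
            callback(skeleton)

if __name__ == "__main__":
    stream = SkeletonStream()
    stream.subscribe(lambda s: print(f"person {s.person_id}: {len(s.joints)} joints tracked"))
    s = Skeleton(person_id=1)
    s.update("head", (0.0, 1.7, 2.0))
    s.update("hand_r", (0.3, 1.2, 1.8))
    stream.publish(s)
```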
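For patent 9400559, the abstract explains that when a shortcut gesture is a subset of the full gesture and both are recognized during one performance, only a single "gesture observed" indication is sent to the application. A minimal sketch of that deduplication, assuming the shortcut is a prefix of the full gesture's pose sequence and using invented pose names:

```python
class GestureRecognizer:
    """Sends a single 'gesture observed' indication to the application even when
    both the shortcut and the full version fire during one performance.
    Assumes the shortcut poses are a prefix of the full pose sequence."""

    def __init__(self, notify_app, shortcut_poses, full_poses):
        self.notify_app = notify_app
        self.shortcut_poses = shortcut_poses      # subset of the full version
        self.full_poses = full_poses
        self._progress = 0
        self._already_reported = False

    def on_pose(self, pose: str) -> None:
        # Advance through the full gesture's pose sequence; reset on a mismatch.
        if self._progress < len(self.full_poses) and pose == self.full_poses[self._progress]:
            self._progress += 1
        else:
            self._progress = 0
            self._already_reported = False
            return

        completed_shortcut = self._progress == len(self.shortcut_poses)
        completed_full = self._progress == len(self.full_poses)
        if (completed_shortcut or completed_full) and not self._already_reported:
            self.notify_app("gesture observed")
            self._already_reported = True
        if completed_full:
            self._progress = 0
            self._already_reported = False

if __name__ == "__main__":
    r = GestureRecognizer(print, shortcut_poses=["arm_raised"],
                          full_poses=["arm_raised", "arm_forward", "arm_down"])
    for pose in ["arm_raised", "arm_forward", "arm_down"]:
        r.on_pose(pose)   # prints "gesture observed" exactly once
```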
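For patent 9367136 (and publication 20140306891, which shares its abstract), feedback is chosen from the magnitude of a virtual force applied during an interaction and from the virtual object's classification. A minimal sketch with an invented force model (mass times hand speed) and invented feedback rules:

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    classification: str   # e.g. "rigid" or "soft" -- invented categories
    mass: float           # arbitrary units

def virtual_force(obj: VirtualObject, hand_speed: float) -> float:
    """Toy stand-in for the 'magnitude of a virtual force associated with the
    interaction': scale the hand speed by the object's mass."""
    return obj.mass * hand_speed

def feedback_for(obj: VirtualObject, force: float) -> dict:
    """Pick visual/audio/haptic feedback from the force magnitude and the
    object classification, as the abstract describes at a high level."""
    strong = force > 5.0                      # arbitrary threshold
    return {
        "visual": "deform" if obj.classification == "soft" else "highlight",
        "audio": "thud" if strong else "tap",
        "haptic_intensity": min(1.0, force / 10.0),
    }

if __name__ == "__main__":
    ball = VirtualObject("ball", classification="soft", mass=2.0)
    print(feedback_for(ball, virtual_force(ball, hand_speed=4.0)))
```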
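For patent 9030495 (and publication 20140139551), an augmented reality help program decides from biometric parameters that the user is experiencing a stress response and then presents help content. A minimal sketch using an invented heart-rate and skin-response heuristic over a short sample window; the patent itself does not specify which biometrics or thresholds are used:

```python
from statistics import mean

class HelpSystem:
    """Watches a stream of biometric samples and shows help content when a
    simple stress heuristic trips. The heart-rate/skin-response fields and the
    thresholds are invented for illustration."""

    def __init__(self, show_help, baseline_heart_rate: float = 70.0):
        self.show_help = show_help
        self.baseline = baseline_heart_rate
        self.recent = []

    def on_sample(self, heart_rate: float, skin_response: float) -> None:
        self.recent.append((heart_rate, skin_response))
        self.recent = self.recent[-10:]                 # keep a short window
        avg_hr = mean(hr for hr, _ in self.recent)
        avg_sr = mean(sr for _, sr in self.recent)
        if avg_hr > 1.2 * self.baseline and avg_sr > 0.7:
            self.show_help("Having trouble? Here is a walkthrough hologram.")

if __name__ == "__main__":
    hs = HelpSystem(show_help=print)
    for hr, sr in [(72, 0.2), (95, 0.9), (100, 0.95), (102, 0.95)]:
        hs.on_sample(hr, sr)   # help is shown once the rolling averages cross the thresholds
```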
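For patent 8487938, gestures grouped into a library have interrelated parameters: setting one parameter to a first value also sets every dependent parameter whose definition uses that value. A minimal sketch with invented gesture names and dependency rules:

```python
class GestureLibrary:
    """Complementary gestures packaged together; setting one 'base' parameter
    also sets the dependent parameters defined in terms of it. The gesture
    names, parameters, and ratios are invented for illustration."""

    def __init__(self):
        self.params = {"throw.arm_speed_threshold": None,
                       "pitch.arm_speed_threshold": None,
                       "catch.reaction_window_s": None}
        # Dependent parameter -> function of the base value.
        self.derived = {
            "pitch.arm_speed_threshold": lambda v: 1.5 * v,
            "catch.reaction_window_s": lambda v: 2.0 / v,
        }

    def set_base(self, value: float) -> None:
        """Set the base parameter and derive every interrelated parameter from it."""
        self.params["throw.arm_speed_threshold"] = value
        for name, rule in self.derived.items():
            self.params[name] = rule(value)

if __name__ == "__main__":
    lib = GestureLibrary()
    lib.set_base(4.0)
    print(lib.params)   # pitch threshold 6.0, catch window 0.5
```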
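For patent 8451278, joints of a captured skeletal or mesh model may be readjusted, snapped, or confined to defined spaces before animation. A minimal sketch with two invented corrective rules, clamping hands into an interaction volume and snapping near-floor feet onto the floor plane:

```python
from typing import Dict, Tuple

Vec = Tuple[float, float, float]

def clamp(v: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, v))

def correct_model(joints: Dict[str, Vec]) -> Dict[str, Vec]:
    """Apply simple corrective rules to a captured skeletal model before it is
    animated. These particular rules are invented stand-ins for the patent's
    'differential correction, magnetism principles, binary snapping, confining
    virtual movement to defined spaces, or the like'."""
    corrected = {}
    for name, (x, y, z) in joints.items():
        if name.startswith("hand"):
            # Confine hand movement to a defined interaction volume.
            x, y, z = clamp(x, -1.0, 1.0), clamp(y, 0.0, 2.2), clamp(z, 0.5, 3.0)
        if name.startswith("foot") and abs(y) < 0.05:
            # Snap a foot that is nearly on the floor exactly onto the floor plane.
            y = 0.0
        corrected[name] = (x, y, z)
    return corrected

if __name__ == "__main__":
    raw = {"hand_r": (1.4, 2.5, 0.2), "foot_l": (0.1, 0.03, 1.0)}
    print(correct_model(raw))  # hand clamped into the volume, foot snapped to y = 0
```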