Patents by Inventor Nicholas D. Burton

Nicholas D. Burton has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9519989
Abstract: Using facial recognition and gesture/body posture recognition techniques, a system can naturally convey the emotions and attitudes of a user via the user's visual representation. Techniques may comprise customizing a visual representation of a user based on detectable characteristics, deducing a user's temperament from the detectable characteristics, and applying attributes indicative of the temperament to the visual representation in real time. Techniques may also comprise processing changes to the user's characteristics in the physical space and updating the visual representation in real time. For example, the system may track a user's facial expressions and body movements to identify a temperament and then apply attributes indicative of that temperament to the visual representation. Thus, a visual representation of a user, which may be, for example, an avatar or fanciful character, can reflect the user's expressions and moods in real time.
    Type: Grant
    Filed: March 4, 2013
    Date of Patent: December 13, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
  • Patent number: 9159151
    Abstract: Data captured with respect to a human may be analyzed and applied to a visual representation of a user such that the visual representation begins to reflect the behavioral characteristics of the user. For example, a system may have a capture device that captures data about the user in the physical space. The system may identify the user's characteristics, tendencies, voice patterns, behaviors, gestures, etc. Over time, the system may learn a user's tendencies and intelligently apply animations to the user's avatar such that the avatar behaves and responds in accordance with the identified behaviors of the user. The animations applied to the avatar may be animations selected from a library of pre-packaged animations, or the animations may be entered and recorded by the user into the avatar's avatar library.
    Type: Grant
    Filed: July 13, 2009
    Date of Patent: October 13, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
  • Patent number: 8390680
Abstract: Using facial recognition and gesture/body posture recognition techniques, a system can naturally convey the emotions and attitudes of a user via the user's visual representation. Techniques may comprise customizing a visual representation of a user based on detectable characteristics, deducing a user's temperament from the detectable characteristics, and applying attributes indicative of the temperament to the visual representation in real time. Techniques may also comprise processing changes to the user's characteristics in the physical space and updating the visual representation in real time. For example, the system may track a user's facial expressions and body movements to identify a temperament and then apply attributes indicative of that temperament to the visual representation. Thus, a visual representation of a user, such as an avatar or fanciful character, can reflect the user's expressions and moods in real time.
    Type: Grant
    Filed: July 9, 2009
    Date of Patent: March 5, 2013
    Assignee: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
  • Publication number: 20110025689
    Abstract: Techniques for auto-generating the target's visual representation may reduce or eliminate the manual input required for the generation of the target's visual representation. For example, a system having a capture device may detect various features of a user in the physical space and make feature selections from a library of visual representation feature options based on the detected features. The system can automatically apply the selections to the visual representation of the user based on the detected features. Alternately, the system may make selections that narrow the number of options for features from which the user chooses. The system may apply the selections to the user in real time as well as make updates to the features selected and applied to the target's visual representation in real time.
    Type: Application
    Filed: July 29, 2009
    Publication date: February 3, 2011
    Applicant: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
  • Publication number: 20110007142
Abstract: Using facial recognition and gesture/body posture recognition techniques, a system can naturally convey the emotions and attitudes of a user via the user's visual representation. Techniques may comprise customizing a visual representation of a user based on detectable characteristics, deducing a user's temperament from the detectable characteristics, and applying attributes indicative of the temperament to the visual representation in real time. Techniques may also comprise processing changes to the user's characteristics in the physical space and updating the visual representation in real time. For example, the system may track a user's facial expressions and body movements to identify a temperament and then apply attributes indicative of that temperament to the visual representation. Thus, a visual representation of a user, such as an avatar or fanciful character, can reflect the user's expressions and moods in real time.
    Type: Application
    Filed: July 9, 2009
    Publication date: January 13, 2011
    Applicant: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
  • Publication number: 20110007079
    Abstract: Data captured with respect to a human may be analyzed and applied to a visual representation of a user such that the visual representation begins to reflect the behavioral characteristics of the user. For example, a system may have a capture device that captures data about the user in the physical space. The system may identify the user's characteristics, tendencies, voice patterns, behaviors, gestures, etc. Over time, the system may learn a user's tendencies and intelligently apply animations to the user's avatar such that the avatar behaves and responds in accordance with the identified behaviors of the user. The animations applied to the avatar may be animations selected from a library of pre-packaged animations, or the animations may be entered and recorded by the user into the avatar's avatar library.
    Type: Application
    Filed: July 13, 2009
    Publication date: January 13, 2011
    Applicant: Microsoft Corporation
    Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
  • Publication number: 20100302253
Abstract: Techniques for generating an avatar model during the runtime of an application are herein disclosed. The avatar model can be generated from an image captured by a capture device. End-effectors can be positioned, and inverse kinematics can be used to determine positions of other nodes in the avatar model.
    Type: Application
    Filed: August 26, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Alex A. Kipman, Kudo Tsunoda, Jeffrey N. Margolis, Scott W. Sims, Nicholas D. Burton, Andrew Wilson
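To make the temperament-mapping idea in patents 9519989 and 8390680 concrete, here is a minimal, hypothetical sketch of that kind of logic: detected facial/body cues are scored against temperament profiles, and attributes for the winning temperament are applied to the avatar. The cue names, weights, and attribute tables below are invented for illustration and do not come from the patent text.

```python
# Illustrative cue -> temperament evidence weights (invented values).
TEMPERAMENT_CUES = {
    "smile": {"happy": 2.0},
    "slumped_posture": {"sad": 1.5},
    "crossed_arms": {"angry": 1.0},
    "raised_brows": {"surprised": 2.0},
}

# Attributes an avatar might take on for each temperament (invented).
TEMPERAMENT_ATTRIBUTES = {
    "happy": {"mouth": "smile", "posture": "upright"},
    "sad": {"mouth": "frown", "posture": "slumped"},
    "angry": {"brows": "furrowed", "posture": "tense"},
    "surprised": {"brows": "raised", "mouth": "open"},
}

def deduce_temperament(detected_cues):
    """Accumulate evidence for each temperament; return the best match."""
    scores = {}
    for cue in detected_cues:
        for temperament, weight in TEMPERAMENT_CUES.get(cue, {}).items():
            scores[temperament] = scores.get(temperament, 0.0) + weight
    return max(scores, key=scores.get) if scores else "neutral"

def apply_to_avatar(avatar, temperament):
    """Apply the attributes indicative of the temperament to the avatar."""
    avatar.update(TEMPERAMENT_ATTRIBUTES.get(temperament, {}))
    return avatar
```

In a real-time system this pair of steps would run on every capture frame, so the avatar tracks the user's current expression rather than a one-time setting.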
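The behavior-learning loop described in patent 9159151 and publication 20110007079 can likewise be sketched in a few lines: observed user gestures are tallied over time, and the avatar is animated with the clip matching the user's most frequent habit. The class, gesture names, and animation library below are hypothetical illustrations, not the patent's implementation.

```python
from collections import Counter

class AvatarBehaviorModel:
    """Tally observed gestures and pick an avatar animation accordingly."""

    def __init__(self, animation_library):
        # Maps a learned gesture name to a pre-packaged animation clip.
        self.animation_library = animation_library
        self.observations = Counter()

    def observe(self, gesture):
        """Record one instance of a gesture captured in the physical space."""
        self.observations[gesture] += 1

    def idle_animation(self, default="stand"):
        """Return the clip for the user's most frequent observed gesture
        that has a match in the library, falling back to a default."""
        for gesture, _count in self.observations.most_common():
            if gesture in self.animation_library:
                return self.animation_library[gesture]
        return default
```

Over many sessions the tallies come to reflect the user's tendencies, which is the sense in which the avatar "behaves and responds in accordance with the identified behaviors of the user."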
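The inverse-kinematics step in publication 20100302253 can be illustrated with standard two-bone analytic IK: given an end-effector target (e.g. a hand position taken from the captured image), solve the two joint angles of an upper-arm/forearm chain rooted at the origin. This is a textbook formulation chosen for illustration, not the method disclosed in the application.

```python
import math

def two_bone_ik(l1, l2, target_x, target_y):
    """Return (shoulder, elbow) angles placing a two-bone chain of
    lengths l1, l2 (rooted at the origin) so its end-effector reaches
    the target, clamping unreachable targets to full extension."""
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2)  # clamp so the law-of-cosines stays valid
    # Elbow: relative rotation of the second bone (law of cosines).
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: direction to target minus the triangle's interior angle.
    cos_inner = (d * d + l1 * l1 - l2 * l2) / (2 * l1 * d) if d else 1.0
    inner = math.acos(max(-1.0, min(1.0, cos_inner)))
    shoulder = math.atan2(target_y, target_x) - inner
    return shoulder, elbow
```

Positioning each tracked end-effector (hands, feet, head) and solving the intermediate joints this way determines the positions of the remaining nodes in the avatar model, as the abstract describes.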