Patents by Inventor Arthur Charles Tomlin

Arthur Charles Tomlin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9280203
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data. (See the illustrative sketch following this entry.)
    Type: Grant
    Filed: August 2, 2011
    Date of Patent: March 8, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
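
    A minimal, hypothetical Python sketch of the filter-and-confidence architecture described in the abstract above: a recognizer engine fans motion data out to per-gesture filters, each of which reports a tunable confidence that its gesture occurred. The class names, the ThrowFilter parameter, and the dictionary-based motion samples are illustrative assumptions, not details taken from the patent.

        # Hypothetical sketch of a recognizer engine that routes motion data to
        # per-gesture filters, each returning a tunable confidence score.
        from dataclasses import dataclass, field


        @dataclass
        class ThrowFilter:
            """Filter for a throwing gesture; the acceleration threshold is the kind
            of parameter an application could tune for its own context."""
            min_arm_acceleration: float = 9.0  # m/s^2, illustrative default

            def evaluate(self, motion_sample: dict) -> float:
                """Return a confidence in [0, 1] that a throw occurred."""
                accel = motion_sample.get("arm_acceleration", 0.0)
                if accel <= 0:
                    return 0.0
                return min(accel / self.min_arm_acceleration, 1.0)


        @dataclass
        class RecognizerEngine:
            """Fans motion data out to every registered filter."""
            filters: dict = field(default_factory=dict)

            def register(self, name: str, gesture_filter) -> None:
                self.filters[name] = gesture_filter

            def process(self, motion_sample: dict) -> dict:
                # Each filter reports how confident it is that its gesture occurred.
                return {name: f.evaluate(motion_sample)
                        for name, f in self.filters.items()}


        if __name__ == "__main__":
            engine = RecognizerEngine()
            # An application tunes the throw parameters for its own context.
            engine.register("throw", ThrowFilter(min_arm_acceleration=12.0))
            print(engine.process({"arm_acceleration": 10.5}))  # {'throw': 0.875}
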
  • Patent number: 9256282
    Abstract: Systems, methods and computer readable media are disclosed for manipulating virtual objects. A user may utilize a controller in physical space, such as his hand, which is associated with a cursor in a virtual environment. As the user manipulates the controller in physical space, the motion is captured by a depth camera. The image data from the depth camera is parsed to determine how the controller is manipulated, and a corresponding manipulation of the cursor is performed in virtual space. Where the cursor interacts with a virtual object in the virtual space, that virtual object is manipulated by the cursor. (See the illustrative sketch following this entry.)
    Type: Grant
    Filed: March 20, 2009
    Date of Patent: February 9, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Kevin Geisner, Relja Markovic, Darren Alexander Bennett, Arthur Charles Tomlin
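
    The cursor mapping described in the abstract above can be pictured with this minimal Python sketch: a tracked hand position is scaled into virtual-space coordinates, and any virtual object within a grab radius of the cursor follows it. The coordinate scale, grab radius, and class names are assumptions made purely for illustration.

        # Hypothetical sketch: map a hand position parsed from depth-camera data
        # to a cursor in virtual space, and let the cursor "grab" a nearby object.
        import math
        from dataclasses import dataclass


        @dataclass
        class VirtualObject:
            name: str
            position: tuple  # (x, y, z) in virtual-space units
            held: bool = False


        def hand_to_cursor(hand_pos, scale=2.0):
            """Scale a tracked hand position (metres) into virtual-space coordinates."""
            return tuple(scale * c for c in hand_pos)


        def update_scene(hand_pos, objects, grab_radius=0.5):
            """Move the cursor to follow the hand and grab any object within reach."""
            cursor = hand_to_cursor(hand_pos)
            for obj in objects:
                obj.held = math.dist(cursor, obj.position) <= grab_radius
                if obj.held:
                    obj.position = cursor  # a held object follows the cursor
            return cursor


        if __name__ == "__main__":
            scene = [VirtualObject("ball", (0.4, 0.2, 1.0))]
            cursor = update_scene(hand_pos=(0.2, 0.1, 0.5), objects=scene)
            print(cursor, scene[0])  # the ball is within reach, so it is held
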
  • Publication number: 20150302867
    Abstract: Various embodiments are disclosed relating to detecting a conversation during presentation of content on a computing device and taking one or more actions in response to detecting the conversation. In one example, an audio data stream is received from one or more sensors, a conversation between a first user and a second user is detected based on the audio data stream, and presentation of a digital content item is modified by the computing device in response to detecting the conversation. (See the illustrative sketch following this entry.)
    Type: Application
    Filed: April 17, 2014
    Publication date: October 22, 2015
    Inventors: Arthur Charles Tomlin, Jonathan Paulovich, Evan Michael Keibler, Jason Scott, Cameron Brown, Jonathan William Plumb
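
    The conversation-detection behavior described in the abstract above might look roughly like the following Python sketch, which treats alternating speech segments from two diarized speakers as a conversation and pauses the content item in response. The window size, turn count, and speaker labels are invented for illustration and are not taken from the application.

        # Hypothetical sketch: flag a conversation when two distinct speakers
        # alternate within a short window of the audio stream, then pause content.
        from collections import deque


        class ConversationDetector:
            def __init__(self, window=6, min_speakers=2, min_turns=3):
                self.recent = deque(maxlen=window)  # recent speaker labels
                self.min_speakers = min_speakers
                self.min_turns = min_turns

            def observe(self, speaker_id: str) -> bool:
                """Feed one diarized speech segment; return True once a conversation
                between at least two users appears to be under way."""
                self.recent.append(speaker_id)
                labels = list(self.recent)
                turns = sum(1 for a, b in zip(labels, labels[1:]) if a != b)
                return len(set(labels)) >= self.min_speakers and turns >= self.min_turns


        def present(segments):
            detector = ConversationDetector()
            for speaker in segments:
                if detector.observe(speaker):
                    print("Conversation detected: pausing content")
                    return
            print("No conversation detected: content keeps playing")


        if __name__ == "__main__":
            present(["user_a", "user_a", "user_b", "user_a", "user_b"])
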
  • Publication number: 20150302869
    Abstract: Various embodiments are disclosed relating to detecting at least one of a conversation, the presence, and the identity of others during presentation of digital content on a computing device. When another person is detected, one or more actions may be taken with respect to the digital content. For example, the digital content may be minimized, moved, resized or otherwise modified. (See the illustrative sketch following this entry.)
    Type: Application
    Filed: January 16, 2015
    Publication date: October 22, 2015
    Inventors: Arthur Charles Tomlin, Dave Hill, Jonathan Paulovich, Evan Michael Keibler, Jason Scott, Cameron G. Brown, Thomas Forsythe, Jeffrey A. Kohler, Brian Murphy
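
    A small Python sketch of the behavior described in the abstract above: when another person is detected near the display, the content item is minimized or resized depending on whether the newcomer is recognized. The policy and the ContentItem fields are assumptions, not the application's actual logic.

        # Hypothetical sketch: modify a content item when another person appears.
        from dataclasses import dataclass
        from typing import Optional


        @dataclass
        class ContentItem:
            title: str
            width: int
            height: int
            minimized: bool = False


        def on_person_detected(item: ContentItem, identity: Optional[str]) -> None:
            """Apply an illustrative policy based on who walked in."""
            if identity is None:
                item.minimized = True         # unknown person: hide the content
            else:
                item.width = item.width // 2  # known person: shrink it out of the way
                item.height = item.height // 2


        if __name__ == "__main__":
            doc = ContentItem("holiday plans", 1280, 720)
            on_person_detected(doc, identity=None)
            print(doc)  # minimized=True
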
  • Publication number: 20150009135
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Application
    Filed: September 26, 2014
    Publication date: January 8, 2015
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
  • Publication number: 20140380254
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is then parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of that change to the gesture recognizer engine and application, where the change takes effect. (See the illustrative sketch following this entry.)
    Type: Application
    Filed: September 4, 2014
    Publication date: December 25, 2014
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
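
    The gesture-tool workflow in the abstract above could be sketched roughly as follows in Python: the tool displays each filter's parameters and forwards a parameter change to both the recognizer engine and the application. All class names and the callback mechanism are illustrative assumptions rather than details from the publication.

        # Hypothetical sketch of a gesture tool that displays filter parameters and
        # pushes a developer's change to the recognizer engine and the application.

        class GestureFilter:
            def __init__(self, name, **params):
                self.name = name
                self.params = params


        class RecognizerEngine:
            def __init__(self, filters):
                self.filters = {f.name: f for f in filters}

            def apply_change(self, filter_name, param, value):
                self.filters[filter_name].params[param] = value


        class GestureTool:
            def __init__(self, engine, app_callback):
                self.engine = engine
                self.app_callback = app_callback  # the application also hears about changes

            def display(self):
                for name, f in self.engine.filters.items():
                    print(f"{name}: {f.params}")

            def change_parameter(self, filter_name, param, value):
                # Push the change to the engine and notify the application.
                self.engine.apply_change(filter_name, param, value)
                self.app_callback(filter_name, param, value)


        if __name__ == "__main__":
            engine = RecognizerEngine([GestureFilter("throw", min_arm_acceleration=9.0)])
            tool = GestureTool(engine, lambda *change: print("app notified:", change))
            tool.display()
            tool.change_parameter("throw", "min_arm_acceleration", 12.0)
            tool.display()
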
  • Patent number: 8869072
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Grant
    Filed: August 2, 2011
    Date of Patent: October 21, 2014
    Assignee: Microsoft Corporation
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
  • Patent number: 8856691
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is then parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of that change to the gesture recognizer engine and application, where the change takes effect.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: October 7, 2014
    Assignee: Microsoft Corporation
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
  • Patent number: 8782567
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Grant
    Filed: November 4, 2011
    Date of Patent: July 15, 2014
    Assignee: Microsoft Corporation
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
  • Publication number: 20120050157
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Application
    Filed: November 4, 2011
    Publication date: March 1, 2012
    Applicant: Microsoft Corporation
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
  • Publication number: 20110285626
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Application
    Filed: August 2, 2011
    Publication date: November 24, 2011
    Applicant: Microsoft Corporation
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
  • Publication number: 20110285620
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Application
    Filed: August 2, 2011
    Publication date: November 24, 2011
    Applicant: Microsoft Corporation
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
  • Patent number: 7996793
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Grant
    Filed: April 13, 2009
    Date of Patent: August 9, 2011
    Assignee: Microsoft Corporation
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
  • Publication number: 20110099476
    Abstract: Disclosed herein are systems and methods for decorating a display environment. In one embodiment, a user may decorate a display environment by making one or more gestures, using voice commands, using a suitable interface device, and/or combinations thereof. A voice command can be detected for user selection of an artistic feature, such as a color, a texture, an object, or a visual effect for decorating the display environment. The user can also gesture to select a portion of the display environment for decoration. The selected portion of the display environment can then be altered based on the selected artistic feature. The user's motions can be reflected in the display environment by an avatar. In addition, a virtual canvas or three-dimensional object can be displayed in the display environment for decoration by the user. (See the illustrative sketch following this entry.)
    Type: Application
    Filed: October 23, 2009
    Publication date: April 28, 2011
    Applicant: Microsoft Corporation
    Inventors: Gregory N. Snook, Relja Markovic, Stephen G. Latta, Kevin Geisner, Christopher Vuchetich, Darren Alexander Bennett, Arthur Charles Tomlin, Joel Deaguero, Matt Puls, Matt Coohill, Ryan Hastings, Kate Kolesar, Brian Scott Murphy
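
    As a rough Python illustration of the decoration flow described in the abstract above, the sketch below maps a recognized voice command to an artistic feature and a pointing gesture to a region of the display environment, then alters that region. The command vocabulary and the environment representation are invented for illustration.

        # Hypothetical sketch: a voice command picks an artistic feature and a
        # gesture picks a region; the region is then decorated with that feature.

        FEATURES = {"paint it red": ("color", "red"),
                    "make it brick": ("texture", "brick")}


        def parse_voice_command(utterance):
            """Map a recognized utterance to an artistic feature, if any."""
            return FEATURES.get(utterance.lower())


        def region_from_gesture(pointing_at):
            """Stand-in for gesture processing: the tracked hand points at a region."""
            return pointing_at


        def decorate(environment, utterance, pointing_at):
            feature = parse_voice_command(utterance)
            region = region_from_gesture(pointing_at)
            if feature and region in environment:
                kind, value = feature
                environment[region][kind] = value
            return environment


        if __name__ == "__main__":
            env = {"wall": {}, "floor": {}}
            print(decorate(env, "Paint it red", pointing_at="wall"))
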
  • Publication number: 20100306713
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is then parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of that change to the gesture recognizer engine and application, where the change takes effect.
    Type: Application
    Filed: May 29, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
  • Publication number: 20100281438
    Abstract: Disclosed herein are systems and methods for altering a view perspective within a display environment. For example, gesture data corresponding to a plurality of inputs may be stored. The inputs may be provided to a game or application implemented by a computing device. Images of a user of the game or application may be captured. For example, a suitable capture device may capture several images of the user over a period of time. The images may be analyzed and processed to detect a user's gesture. Aspects of the user's gesture may be compared to the stored gesture data to determine an intended gesture input for the user. The comparison may be part of an analysis for determining which inputs correspond to the gesture data, where one or more of those inputs are provided to the game or application and cause a view perspective within the display environment to be altered. (See the illustrative sketch following this entry.)
    Type: Application
    Filed: May 29, 2009
    Publication date: November 4, 2010
    Applicant: Microsoft Corporation
    Inventors: Stephen G. Latta, Gregory N. Snook, Justin McBride, Arthur Charles Tomlin, Peter Sarrett, Kevin Geisner, Relja Markovic, Christopher Vuchetich
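
    The gesture-to-view-change flow described in the abstract above might be sketched as follows in Python: a captured motion vector is compared against stored gesture templates, and the closest match (if close enough) applies its mapped input to alter the view perspective. The template format, distance threshold, and yaw/zoom parameters are assumptions made for illustration.

        # Hypothetical sketch: match a captured motion against stored gesture data
        # and apply the corresponding input to a simple camera view (yaw/zoom).
        import math

        STORED_GESTURES = {
            "swipe_left":  ((-1.0, 0.0), {"yaw": -30.0}),
            "swipe_right": ((+1.0, 0.0), {"yaw": +30.0}),
            "push":        ((0.0, -1.0), {"zoom": +0.5}),
        }


        def match_gesture(motion, threshold=0.5):
            """Return the stored gesture whose template is nearest to the motion."""
            best, best_dist = None, float("inf")
            for name, (template, _inputs) in STORED_GESTURES.items():
                dist = math.dist(motion, template)
                if dist < best_dist:
                    best, best_dist = name, dist
            return best if best_dist <= threshold else None


        def apply_to_view(view, motion):
            name = match_gesture(motion)
            if name is not None:
                for key, delta in STORED_GESTURES[name][1].items():
                    view[key] = view.get(key, 0.0) + delta
            return view


        if __name__ == "__main__":
            view = {"yaw": 0.0, "zoom": 1.0}
            print(apply_to_view(view, motion=(-0.9, 0.1)))  # matches "swipe_left"
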
  • Publication number: 20100241998
    Abstract: Systems, methods and computer readable media are disclosed for manipulating virtual objects. A user may utilize a controller in physical space, such as his hand, which is associated with a cursor in a virtual environment. As the user manipulates the controller in physical space, the motion is captured by a depth camera. The image data from the depth camera is parsed to determine how the controller is manipulated, and a corresponding manipulation of the cursor is performed in virtual space. Where the cursor interacts with a virtual object in the virtual space, that virtual object is manipulated by the cursor.
    Type: Application
    Filed: March 20, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: Stephen G. Latta, Kevin Geisner, Relja Markovic, Darren Alexander Bennett, Arthur Charles Tomlin
  • Publication number: 20100199230
    Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
    Type: Application
    Filed: April 13, 2009
    Publication date: August 5, 2010
    Applicant: Microsoft Corporation
    Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook