Patents by Inventor Kevin Geisner

Kevin Geisner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8942428
    Abstract: A system may receive image data, capture motion with respect to a target in a physical space, and recognize a gesture from the captured motion. It may be desirable to isolate aspects of the captured motion to differentiate them from random and extraneous motions. For example, a gesture may comprise motion of a user's right arm, and it may be desirable to isolate the motion of the right arm and exclude any other motion from interpretation. The isolated aspect may thus be the focus of the received data for gesture recognition. Alternatively, an isolated aspect may be one that is removed from consideration when identifying a gesture from the captured motion. For example, gesture filters may be modified to correspond to the user's natural lean, eliminating the effect the lean has on the registration of a motion with a gesture filter.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: January 27, 2015
    Assignee: Microsoft Corporation
    Inventors: Gregory Nelson Snook, Relja Markovic, Stephen Gilchrist Latta, Kevin Geisner
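The lean compensation described in the abstract above can be illustrated with a minimal sketch. All names, the two helper functions, and the 2-D joint representation are invented for this illustration and are not taken from the patent: the idea is to estimate the user's natural lean from two spine joints, then rotate captured joint positions by that angle so the lean itself does not register as gesture motion.

```python
import math

def estimate_lean(spine_base, spine_top):
    """Estimate the user's lean as the angle (radians) of the spine
    from vertical, using two hypothetical skeleton joints (x, y)."""
    dx = spine_top[0] - spine_base[0]
    dy = spine_top[1] - spine_base[1]
    return math.atan2(dx, dy)

def compensate_lean(points, lean):
    """Rotate captured joint positions by the lean angle, so a natural
    lean is factored out before the data reaches a gesture filter."""
    c, s = math.cos(lean), math.sin(lean)
    return [(x * c - y * s, x * s + y * c) for x, y in points]
```

For a user leaning 45 degrees to the right, `compensate_lean` maps the tilted spine back onto the vertical axis, so only motion relative to the corrected pose is considered for gesture recognition.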
  • Publication number: 20140380254
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of both. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of the change to the gesture recognizer engine and the application, where the change takes effect.
    Type: Application
    Filed: September 4, 2014
    Publication date: December 25, 2014
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
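As a rough illustration of how such a tunable gesture filter might work, consider the sketch below. The `GestureFilter` class, the `min_rise` parameter, and the sample data are all assumptions made for this example, not details from the publication; the point is that a developer tool can adjust a filter parameter and the recognizer immediately applies the change.

```python
from dataclasses import dataclass

@dataclass
class GestureFilter:
    """One hypothetical filter: the gesture matches when the tracked
    hand rises by at least `min_rise` metres over the sampled frames."""
    name: str
    min_rise: float

def matches(filter_, hand_heights):
    """Parse captured hand-height samples against a filter."""
    return max(hand_heights) - hand_heights[0] >= filter_.min_rise

raise_hand = GestureFilter("raise_hand", min_rise=0.30)
samples = [1.0, 1.1, 1.2]  # captured hand heights: the hand rose 0.2 m
```

With the filter as enrolled, the 0.2 m rise does not register; after the gesture tool relaxes `min_rise` to 0.15, the same captured data satisfies the filter.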
  • Patent number: 8894484
    Abstract: A system and related methods for inviting a potential player to participate in a multiplayer game via a user head-mounted display device are provided. In one example, a potential player invitation program receives user voice data and determines that the user voice data is an invitation to participate in a multiplayer game. The program receives eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data. The program associates the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data. The program matches a potential player account with the potential player. The program receives an acceptance response from the potential player, and joins the potential player account with a user account in participating in the multiplayer game.
    Type: Grant
    Filed: January 30, 2012
    Date of Patent: November 25, 2014
    Assignee: Microsoft Corporation
    Inventors: Stephen Latta, Kevin Geisner, Brian Mount, Jonathan Steed, Tony Ambrus, Arnulfo Zepeda, Aaron Krauss
  • Patent number: 8884968
    Abstract: A method for modeling an object from video image data comprises identifying, in an image from the video, a set of reference points on the object and, for each reference point identified, observing the displacement of that reference point in response to a motion of the object. The method further comprises grouping together those reference points for which a common translational or rotational motion of the object results in the observed displacement, and fitting the grouped-together reference points to a shape.
    Type: Grant
    Filed: December 15, 2010
    Date of Patent: November 11, 2014
    Assignee: Microsoft Corporation
    Inventors: Relja Markovic, Stephen Latta, Kevin Geisner
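The grouping step above, in which reference points whose displacements are consistent with one common motion end up in the same group, might look like the following minimal sketch. The function name, tolerance parameter, and 2-D displacement representation are assumptions for illustration, and only the simpler translational case is shown:

```python
def group_by_translation(displacements, tol=1e-3):
    """Group reference points (by index) whose observed displacements
    are consistent with a single common translation of the object."""
    groups = []  # each entry: (representative displacement, [point indices])
    for i, (dx, dy) in enumerate(displacements):
        for rep, members in groups:
            if abs(dx - rep[0]) <= tol and abs(dy - rep[1]) <= tol:
                members.append(i)
                break
        else:
            # No existing group explains this displacement: start a new one.
            groups.append(((dx, dy), [i]))
    return [members for _, members in groups]
```

Points 0 and 1 below moved by the same translation and are grouped together; point 2 moved differently and forms its own group, which could then be fitted to a separate shape.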
  • Publication number: 20140320389
    Abstract: Embodiments that relate to interacting with a physical object in a mixed reality environment via a head-mounted display are disclosed. In one embodiment a mixed reality interaction program identifies an object based on an image captured by the display. An interaction context for the object is determined based on an aspect of the mixed reality environment. A profile for the physical object is queried to determine interaction modes for the object. An interaction mode is programmatically selected based on the interaction context. A user input directed at the object is received via the display and interpreted as a virtual action based on the selected interaction mode. The virtual action is executed with respect to a virtual object associated with the physical object to modify an appearance of the virtual object. The modified virtual object is then displayed via the display.
    Type: Application
    Filed: April 29, 2013
    Publication date: October 30, 2014
    Inventors: Michael Scavezze, Jonathan Steed, Stephen Latta, Kevin Geisner, Daniel McCulloch, Brian Mount, Ryan Hastings, Phillip Charles Heckinger
  • Patent number: 8872853
    Abstract: A head-mounted display system includes a see-through display that is configured to visually augment an appearance of a physical environment to a user viewing the physical environment through the see-through display. Graphical content presented via the see-through display is created by modeling the ambient lighting conditions of the physical environment.
    Type: Grant
    Filed: December 1, 2011
    Date of Patent: October 28, 2014
    Assignee: Microsoft Corporation
    Inventors: Ben Sugden, Darren Bennett, Brian Mount, Sebastian Sylvan, Arthur Tomlin, Ryan Hastings, Daniel McCulloch, Kevin Geisner, Robert Crocco, Jr.
  • Patent number: 8856691
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of both. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of the change to the gesture recognizer engine and the application, where the change takes effect.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: October 7, 2014
    Assignee: Microsoft Corporation
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
  • Publication number: 20140267311
    Abstract: Embodiments are disclosed that relate to interacting with a user interface via feedback provided by an avatar. One embodiment provides a method comprising receiving depth data, locating a person in the depth data, and mapping a physical space in front of the person to a screen space of a display device. The method further comprises forming an image of an avatar representing the person, outputting to a display an image of a user interface comprising an interactive user interface control, and outputting to the display device the image of the avatar such that the avatar faces the user interface control. The method further comprises detecting a motion of the person via the depth data, forming an animated representation of the avatar interacting with the user interface control based upon the motion of the person, and outputting the animated representation of the avatar interacting with the control.
    Type: Application
    Filed: May 29, 2014
    Publication date: September 18, 2014
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Evertt, Joel Deaguero, Darren Bennett, Dylan Vance, David Galloway, Relja Markovic, Stephen Latta, Oscar Omar Garza Santos, Kevin Geisner
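The step of mapping a physical space in front of the person to the screen space of the display can be sketched as a simple normalized interpolation. The function name and the region/screen representations below are invented for this illustration and are not from the publication:

```python
def map_to_screen(point, region, screen):
    """Map a point in the physical interaction region in front of the
    person to display coordinates.

    point:  (x, y) position of e.g. a tracked hand, in metres
    region: (x0, y0, x1, y1) bounds of the physical interaction region
    screen: (width, height) of the display in pixels
    """
    (px, py), (rx0, ry0, rx1, ry1) = point, region
    sw, sh = screen
    u = (px - rx0) / (rx1 - rx0)  # normalize to 0..1 within the region
    v = (py - ry0) / (ry1 - ry0)
    return (u * sw, v * sh)
```

A hand at the centre of a one-metre-square region in front of the person would map to the centre of the screen, where the avatar can be shown reaching toward the user interface control.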
  • Patent number: 8814693
    Abstract: In accordance with one or more aspects, one or more other users associated with a particular (first) user are identified based on that user's social graph. An avatar of at least one of the other users (a second user) is obtained and included as a non-player character in a game being played by the first user. The first user can provide requests to interact with the second user's avatar (e.g., calling out the name of the second user, tapping the avatar on the shoulder, etc.), these requests being invitations for the second user to join in a game with the first user. An indication of such an invitation is presented to the second user, who can, for example, accept the invitation to join in a game with the first user.
    Type: Grant
    Filed: May 27, 2011
    Date of Patent: August 26, 2014
    Assignee: Microsoft Corporation
    Inventors: Brian Scott Murphy, Stephen G. Latta, Darren Alexander Bennett, Kevin Geisner, Shawn C. Wright, Relja Markovic, Joel B. Deaguero, Christopher H. Willoughby, Ryan Lucas Hastings
  • Publication number: 20140168075
    Abstract: Systems, methods and computer readable media are disclosed for controlling perspective of a camera-controlled computer. A capture device captures user gestures and sends corresponding data to a recognizer engine. The recognizer engine analyzes the data with a plurality of filters, each filter corresponding to a gesture. Based on the output of those filters, a perspective control is determined, and a display device displays a new perspective corresponding to the perspective control.
    Type: Application
    Filed: January 21, 2014
    Publication date: June 19, 2014
    Inventors: Relja Markovic, Gregory N. Snook, Stephen Latta, Kevin Geisner, Johnny Lee, Adam Jethro Langridge
  • Patent number: 8749557
    Abstract: Embodiments are disclosed that relate to interacting with a user interface via feedback provided by an avatar. One embodiment provides a method comprising receiving depth data, locating a person in the depth data, and mapping a physical space in front of the person to a screen space of a display device. The method further comprises forming an image of an avatar representing the person, outputting to a display an image of a user interface comprising an interactive user interface control, and outputting to the display device the image of the avatar such that the avatar faces the user interface control. The method further comprises detecting a motion of the person via the depth data, forming an animated representation of the avatar interacting with the user interface control based upon the motion of the person, and outputting the animated representation of the avatar interacting with the control.
    Type: Grant
    Filed: June 11, 2010
    Date of Patent: June 10, 2014
    Assignee: Microsoft Corporation
    Inventors: Jeffrey Evertt, Joel Deaguero, Darren Bennett, Dylan Vance, David Galloway, Relja Markovic, Stephen Latta, Oscar Omar Garza Santos, Kevin Geisner
  • Publication number: 20140125574
    Abstract: Embodiments are disclosed that relate to authenticating a user of a display device. For example, one disclosed embodiment includes displaying one or more virtual images on the display device, wherein the one or more virtual images include a set of augmented reality features. The method further includes identifying one or more movements of the user via data received from a sensor of the display device, and comparing the identified movements of the user to a predefined set of authentication information for the user that links user authentication to a predefined order of the augmented reality features. If the identified movements indicate that the user selected the augmented reality features in the predefined order, then the user is authenticated, and if the identified movements indicate that the user did not select the augmented reality features in the predefined order, then the user is not authenticated.
    Type: Application
    Filed: November 5, 2012
    Publication date: May 8, 2014
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta, Kevin Geisner, Brian Mount
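The order-based check described above can be sketched minimally as follows. The event format, function names, and the handling of repeated selections are assumptions made for this illustration, not details from the publication:

```python
def selection_order(events):
    """Reduce a stream of (timestamp, feature_id) selection events,
    e.g. derived from gaze or gesture data, to the order in which the
    augmented reality features were first chosen."""
    order = []
    for _, feature in sorted(events):
        if feature not in order:  # ignore repeated selections of a feature
            order.append(feature)
    return order

def authenticate(events, enrolled_order):
    """Authenticate only if the features were selected in the
    predefined (enrolled) order."""
    return selection_order(events) == enrolled_order
```

A user who glances at the features in the enrolled order is authenticated; any other order fails, regardless of which features were selected.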
  • Publication number: 20140128161
    Abstract: A plurality of game sessions are hosted at a server system. A first computing device of a first user is joined to a first multiplayer gaming session, the first computing device including a see-through display. Augmentation information is sent to the first computing device for the first multiplayer gaming session to provide an augmented reality experience to the first user. A second computing device of a second user is joined to the first multiplayer gaming session. Experience information is sent to the second computing device for the first multiplayer gaming session to provide a cross-platform representation of the augmented reality experience to the second user.
    Type: Application
    Filed: November 6, 2012
    Publication date: May 8, 2014
    Inventors: Stephen Latta, Daniel McCulloch, Jason Scott, Kevin Geisner
  • Patent number: 8696461
    Abstract: A method of matching a player of a multi-player game with a remote participant includes recognizing the player, automatically identifying an observer within a threshold proximity to the player, using an identity of the observer to find one or more candidates to play as the remote participant of the multi-player game, and, when selecting the remote participant, choosing a candidate from the one or more candidates over a non-candidate if the candidate satisfies a matching criterion.
    Type: Grant
    Filed: June 1, 2011
    Date of Patent: April 15, 2014
    Assignee: Microsoft Corporation
    Inventors: Relja Markovic, Stephen Latta, Kevin Geisner, A. Dylan Vance, Brian Scott Murphy, Matt Coohill
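The candidate-over-non-candidate selection rule might be sketched as below. The player representation, the `criteria` predicate, and the fallback behavior are all invented for this illustration:

```python
def pick_remote_participant(candidates, non_candidates, criteria):
    """Prefer a candidate (found via a nearby observer's identity) over
    a non-candidate, but only if that candidate satisfies the matching
    criteria; otherwise fall back to the general matchmaking pool."""
    for candidate in candidates:
        if criteria(candidate):
            return candidate
    return non_candidates[0] if non_candidates else None

# Hypothetical matchmaking pool: candidates are linked to the observer,
# non-candidates come from general matchmaking.
candidates = [{"name": "amy", "skill": 5}, {"name": "bo", "skill": 12}]
non_candidates = [{"name": "cy", "skill": 20}]
```

With a minimum-skill criterion of 10, the observer-linked player "bo" is chosen over the stranger "cy"; if no candidate qualifies, the method falls back to the non-candidate.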
  • Patent number: 8665479
    Abstract: Three-dimensional printing techniques are described. In one or more implementations, a system includes a three-dimensional printer and a computing device. The three-dimensional printer has a three-dimensional printing mechanism that is configured to form a physical object in three dimensions. The computing device is communicatively coupled to the three-dimensional printer and includes a three-dimensional printing module implemented at least partially in hardware to cause the three-dimensional printer to form the physical object in three dimensions as having functionality configured to communicate with a computing device.
    Type: Grant
    Filed: February 21, 2012
    Date of Patent: March 4, 2014
    Assignee: Microsoft Corporation
    Inventors: Desney S. Tan, Hrvoje Benko, Stephen G. Latta, Steven Nabil Bathiche, Kevin Geisner, Kenneth P. Hinckley
  • Publication number: 20140049558
    Abstract: Embodiments for providing instructional information for control devices are disclosed. In one example, a method on a see-through display device comprising a see-through display and an outward-facing image sensor includes acquiring an image of a scene viewable through the see-through display and detecting a control device in the scene. The method also includes retrieving information pertaining to a function of an interactive element of the control device and displaying an image on the see-through display augmenting an appearance of the interactive element of the control device with image data related to the function of the interactive element.
    Type: Application
    Filed: August 14, 2012
    Publication date: February 20, 2014
    Inventors: Aaron Krauss, Stephen Latta, Mike Scavezze, Daniel McCulloch, Brian Mount, Kevin Geisner
  • Patent number: 8649554
    Abstract: Systems, methods and computer readable media are disclosed for controlling perspective of a camera-controlled computer. A capture device captures user gestures and sends corresponding data to a recognizer engine. The recognizer engine analyzes the data with a plurality of filters, each filter corresponding to a gesture. Based on the output of those filters, a perspective control is determined, and a display device displays a new perspective corresponding to the perspective control.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: February 11, 2014
    Assignee: Microsoft Corporation
    Inventors: Relja Markovic, Gregory N. Snook, Stephen Latta, Kevin Geisner, Johnny Lee, Adam Jethro Langridge
  • Publication number: 20130335435
    Abstract: Embodiments related to improving a color-resolving ability of a user of a see-thru display device are disclosed. For example, one disclosed embodiment includes, on a see-thru display device, constructing and displaying virtual imagery to superpose onto real imagery sighted by the user through the see-thru display device. The virtual imagery is configured to accentuate a locus of the real imagery of a color poorly distinguishable by the user. Such virtual imagery is then displayed by superposing it onto the real imagery, in registry with the real imagery, in a field of view of the user.
    Type: Application
    Filed: June 18, 2012
    Publication date: December 19, 2013
    Inventors: Tony Ambrus, Adam Smith-Kipnis, Stephen Latta, Daniel McCulloch, Brian Mount, Kevin Geisner, Ian McIntyre
  • Publication number: 20130335594
    Abstract: Captured data is obtained, including various types of captured or recorded data (e.g., image data, audio data, video data, etc.) and/or metadata describing various aspects of the capture device and/or the manner in which the data is captured. One or more elements of the captured data that can be replaced by one or more substitute elements are determined, the replaceable elements are removed from the captured data, and links to the substitute elements are associated with the captured data. Links to additional elements to enhance the captured data are also associated with the captured data. Enhanced content can subsequently be constructed based on the captured data as well as the links to the substitute elements and additional elements.
    Type: Application
    Filed: June 18, 2012
    Publication date: December 19, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Hrvoje Benko, Paul Henry Dietz, Stephen G. Latta, Kevin Geisner, Steven Nabil Bathiche
  • Publication number: 20130286223
    Abstract: Photos are shared among devices that are in close proximity to one another and for which there is a connection among the devices. The photos can be shared automatically, or alternatively based on various user inputs. Various different controls can also be placed on sharing photos to restrict the other devices with which photos can be shared, the manner in which photos can be shared, and/or how the photos are shared.
    Type: Application
    Filed: April 25, 2012
    Publication date: October 31, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Stephen G. Latta, Kenneth P. Hinckley, Kevin Geisner, Steven Nabil Bathiche, Hrvoje Benko, Vivek Pradeep