Patents by Inventor Kevin A Geisner

Kevin A Geisner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9161012
    Abstract: Optical sensor information captured via one or more optical sensors imaging a scene that includes a human subject is received by a computing device. The optical sensor information is processed by the computing device to model the human subject with a virtual skeleton, and to obtain surface information representing the human subject. The virtual skeleton is transmitted by the computing device to a remote computing device at a higher frame rate than the surface information. Virtual skeleton frames are used by the remote computing device to estimate surface information for frames that have not been transmitted by the computing device.
    Type: Grant
    Filed: November 17, 2011
    Date of Patent: October 13, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Mihelich, Kevin Geisner, Mike Scavezze, Stephen Latta, Daniel McCulloch, Brian Mount
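A minimal sketch of the idea in the abstract above: skeleton data is sent every frame, surface data only every few frames, and the receiver estimates the missing surface frames from the skeleton. The frame ratio, the pose format, and the simple offset-based estimate are illustrative assumptions, not the patented method.

```python
SURFACE_INTERVAL = 4  # send surface data once per 4 skeleton frames (assumed ratio)

def sender_stream(frames):
    """Yield (skeleton, surface_or_None) pairs at the full capture frame rate."""
    for i, frame in enumerate(frames):
        surface = frame.surface if i % SURFACE_INTERVAL == 0 else None
        yield frame.skeleton, surface

class SurfaceEstimator:
    """Receiver-side helper that fills in surface data for skeleton-only frames."""

    def __init__(self):
        self.last_surface = None
        self.last_skeleton = None

    def receive(self, skeleton, surface):
        if surface is not None:
            # Frame with real surface data: cache it with the skeleton it arrived with.
            self.last_surface, self.last_skeleton = surface, skeleton
            return surface
        if self.last_surface is None:
            return None  # nothing received yet to estimate from
        # Crude estimate: shift the cached surface by the root joint's motion.
        dx, dy, dz = (skeleton["spine"][k] - self.last_skeleton["spine"][k] for k in range(3))
        return [(x + dx, y + dy, z + dz) for (x, y, z) in self.last_surface]

est = SurfaceEstimator()
est.receive({"spine": (0.0, 1.0, 2.0)}, [(0.0, 1.0, 2.0)])    # frame carrying surface data
print(est.receive({"spine": (0.1, 1.0, 2.0)}, None))          # estimated: [(0.1, 1.0, 2.0)]
```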
  • Patent number: 9153195
    Abstract: The technology provides contextual personal information by a mixed reality display device system being worn by a user. A user inputs person selection criteria, and the display system sends a request for data identifying at least one person in the user's location who satisfies the person selection criteria to a cloud-based application with access to user profile data for multiple users. Upon receiving data identifying the at least one person, the display system outputs data identifying the person if he or she is within the field of view; if not, an identifier and a position indicator of the person in the location are output. Directional sensors on the display device may also be used for determining a position of the person. Cloud-based software can identify and track the positions of people based on image and non-image data from display devices in the location.
    Type: Grant
    Filed: January 30, 2012
    Date of Patent: October 6, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Darren Bennett, Relja Markovic, Stephen G. Latta, Daniel J. McCulloch, Jason Scott, Ryan L. Hastings, Alex Aben-Athar Kipman, Andrew John Fuller, Jeffrey Neil Margolis, Kathryn Stone Perez, Sheridan Martin Small
  • Publication number: 20150271449
    Abstract: Techniques for implementing an integrative interactive space are described. In implementations, video cameras that are positioned to capture video at different locations are synchronized such that aspects of the different locations can be used to generate an integrated interactive space. The integrated interactive space can enable users at the different locations to interact, such as via video interaction, audio interaction, and so on. In at least some embodiments, techniques can be implemented to adjust an image of a participant during a video session such that the participant appears to maintain eye contact with other video session participants at other locations. Techniques can also be implemented to provide a virtual shared space that can enable users to interact with the space, and can also enable users to interact with one another and/or objects that are displayed in the virtual shared space.
    Type: Application
    Filed: June 3, 2015
    Publication date: September 24, 2015
    Inventors: Vivek Pradeep, Stephen G. Latta, Steven Nabil Bathiche, Kevin Geisner, Alice Jane Bernheim Brush
  • Patent number: 9141193
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
    Type: Grant
    Filed: August 31, 2009
    Date of Patent: September 22, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
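A minimal sketch of the idea above: recognized gestures are translated into ordinary input events so a program with no gesture support can still be driven by them. The gesture names, key bindings, and the send_key stub are assumptions for illustration.

```python
GESTURE_TO_KEY = {
    "swipe_left": "LEFT_ARROW",
    "swipe_right": "RIGHT_ARROW",
    "push": "ENTER",
    "raise_both_hands": "ESCAPE",
}

def send_key(key_name):
    # Placeholder for an OS-level synthetic keypress (e.g. via a UI-automation API).
    print(f"synthesizing keypress: {key_name}")

def forward_gesture(gesture_name):
    """Route one recognized gesture to the gesture-unaware program as a keypress."""
    key = GESTURE_TO_KEY.get(gesture_name)
    if key is not None:
        send_key(key)

forward_gesture("swipe_right")  # the foreground program just sees a RIGHT_ARROW press
```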
  • Patent number: 9128520
    Abstract: A collaborative on-demand system allows a user of a head-mounted display device (HMDD) to obtain assistance with an activity from a qualified service provider. In a session, the user and service provider exchange camera-captured images and augmented reality images. A gaze-detection capability of the HMDD allows the user to mark areas of interest in a scene. The service provider can similarly mark areas of the scene, as well as provide camera-captured images of the service provider's hand or arm pointing to or touching an object of the scene. The service provider can also select an animation or text to be displayed on the HMDD. A server can match user requests with qualified service providers who meet parameters regarding fee, location, rating, and other preferences. Alternatively, service providers can review open requests, select appropriate ones, and initiate contact with a user.
    Type: Grant
    Filed: March 30, 2012
    Date of Patent: September 8, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J Sugden, Benjamin I Vaught, Jeffrey B Cole, Alex Aben-Athar Kipman, Ian D McIntyre, Daniel McCulloch
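A minimal sketch of the matching step described above: candidate service providers are filtered by hard constraints and ranked by a simple score over fee, distance, and rating. The fields, weights, and scoring formula are assumptions, not the patented matching logic.

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    fee: float          # hourly fee charged by the provider
    distance_km: float  # distance from the requesting user
    rating: float       # average rating, 0..5

def score(provider, max_fee, max_distance_km):
    """Return a match score, or None if the provider fails the hard constraints."""
    if provider.fee > max_fee or provider.distance_km > max_distance_km:
        return None
    # Higher rating is better; lower fee and distance are better.
    return provider.rating - 0.05 * provider.fee - 0.1 * provider.distance_km

def best_match(providers, max_fee=50.0, max_distance_km=25.0):
    scored = [(score(p, max_fee, max_distance_km), p) for p in providers]
    scored = [(s, p) for s, p in scored if s is not None]
    return max(scored, key=lambda sp: sp[0])[1] if scored else None

candidates = [Provider("A", 40, 5, 4.8), Provider("B", 30, 30, 4.9), Provider("C", 20, 10, 4.2)]
print(best_match(candidates).name)  # "A": within both limits and best overall score
```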
  • Patent number: 9122053
    Abstract: Technology is described for providing realistic occlusion between a virtual object displayed by a head mounted, augmented reality display system and a real object visible to the user's eyes through the display. A spatial occlusion in a user field of view of the display is typically a three dimensional occlusion determined based on a three dimensional space mapping of real and virtual objects. An occlusion interface between a real object and a virtual object can be modeled at a level of detail determined based on criteria such as distance within the field of view, display size or position with respect to a point of gaze. Technology is also described for providing three dimensional audio occlusion based on an occlusion between a real object and a virtual object in the user environment.
    Type: Grant
    Filed: April 10, 2012
    Date of Patent: September 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Brian J. Mount, Stephen G. Latta, Daniel J. McCulloch, Kyungsuk David Lee, Ben J. Sugden, Jeffrey N. Margolis, Kathryn Stone Perez, Sheridan Martin Small, Mark J. Finocchio, Robert L. Crocco, Jr.
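A minimal sketch of the level-of-detail selection described above: the occlusion interface is modeled more finely when it is close to the user and near the point of gaze. The thresholds and tier names are illustrative assumptions.

```python
def occlusion_detail_level(distance_m, degrees_from_gaze):
    """Return a level-of-detail tier for modeling an occlusion interface."""
    if degrees_from_gaze < 10 and distance_m < 2.0:
        return "per_pixel"        # near the point of gaze and close: finest boundary
    if degrees_from_gaze < 30 and distance_m < 5.0:
        return "polygon_outline"  # moderate detail for mid-range occlusions
    return "bounding_volume"      # coarse occlusion far from gaze or far away

print(occlusion_detail_level(1.5, 4))    # per_pixel
print(occlusion_detail_level(8.0, 45))   # bounding_volume
```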
  • Patent number: 9122321
    Abstract: A see-through, near-eye, mixed reality display device and system for collaboration among various users of other such devices and of personal audio/visual devices with more limited capabilities. One or more wearers of a see-through head-mounted display apparatus define a collaboration environment. For the collaboration environment, a selection of collaboration data and the scope of the environment are determined. Virtual representations of the collaboration data are rendered in the field of view of the wearer and of other device users. The wearer defines which persons in the field of view are included in the collaboration environment and are entitled to share information within it. If permitted, input on the virtual object from other users in the collaboration environment may be received and used to change the virtual object.
    Type: Grant
    Filed: May 4, 2012
    Date of Patent: September 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
  • Patent number: 9098873
    Abstract: An on-screen shopping application that reacts to a human target user's motions to provide a shopping experience to the user is provided. A tracking system captures user motions and executes a shopping application allowing a user to manipulate an on-screen representation of the user. The on-screen representation has a likeness of the user or another individual, and movements of the user in the on-screen interface allow the user to interact with virtual articles that represent real-world articles. User movements that are recognized as article manipulation or transaction control gestures are translated into commands for the shopping application.
    Type: Grant
    Filed: April 1, 2010
    Date of Patent: August 4, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kudo Tsunoda, Darren Bennett, Brian S. Murphy, Stephen G. Latta, Relja Markovic, Alex Kipman
  • Publication number: 20150212585
    Abstract: A virtual skeleton includes a plurality of joints and provides a machine readable representation of a human target observed with a three-dimensional depth camera. A relative position of a hand joint of the virtual skeleton is translated as a gestured control, and a three-dimensional virtual world is controlled responsive to the gestured control.
    Type: Application
    Filed: February 26, 2015
    Publication date: July 30, 2015
    Inventors: Stephen Latta, Darren Bennett, Kevin Geisner, Relja Markovic
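A minimal sketch of the idea above: the hand joint's position relative to the shoulder is read as a continuous control signal for a three-dimensional virtual world (here reduced to a 2D steering vector). The joint names, normalization constant, and dead zone are assumptions.

```python
DEAD_ZONE = 0.05   # ignore tiny offsets, in metres (assumed)
ARM_REACH = 0.6    # rough arm length used for normalization (assumed)

def hand_control(skeleton):
    """Map the hand-relative-to-shoulder offset to a steering vector in [-1, 1]^2."""
    hx, hy, _ = skeleton["right_hand"]
    sx, sy, _ = skeleton["right_shoulder"]
    dx, dy = hx - sx, hy - sy
    if abs(dx) < DEAD_ZONE and abs(dy) < DEAD_ZONE:
        return 0.0, 0.0  # hand at rest: no control input
    clamp = lambda v: max(-1.0, min(1.0, v / ARM_REACH))
    return clamp(dx), clamp(dy)

pose = {"right_hand": (0.45, 0.10, 1.9), "right_shoulder": (0.20, 0.00, 2.0)}
print(hand_control(pose))  # roughly (0.42, 0.17): steer right and slightly up
```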
  • Patent number: 9092600
    Abstract: Embodiments are disclosed that relate to authenticating a user of a display device. For example, one disclosed embodiment includes displaying one or more virtual images on the display device, wherein the one or more virtual images include a set of augmented reality features. The method further includes identifying one or more movements of the user via data received from a sensor of the display device, and comparing the identified movements of the user to a predefined set of authentication information for the user that links user authentication to a predefined order of the augmented reality features. If the identified movements indicate that the user selected the augmented reality features in the predefined order, then the user is authenticated, and if the identified movements indicate that the user did not select the augmented reality features in the predefined order, then the user is not authenticated.
    Type: Grant
    Filed: November 5, 2012
    Date of Patent: July 28, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta, Kevin Geisner, Brian Mount
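A minimal sketch of the authentication check described above: the user is authenticated only if the augmented reality features were selected in the enrolled order. The feature names are assumptions; a real system would also handle enrollment, retries, and lockout.

```python
def authenticate(selected_sequence, enrolled_order):
    """Return True only when the user's selections match the enrolled order exactly."""
    return list(selected_sequence) == list(enrolled_order)

enrolled = ["red_sphere", "blue_cube", "green_pyramid"]   # order set at enrollment
print(authenticate(["red_sphere", "blue_cube", "green_pyramid"], enrolled))  # True
print(authenticate(["blue_cube", "red_sphere", "green_pyramid"], enrolled))  # False
```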
  • Patent number: 9077846
    Abstract: Techniques for implementing an integrative interactive space are described. In implementations, video cameras that are positioned to capture video at different locations are synchronized such that aspects of the different locations can be used to generate an integrated interactive space. The integrated interactive space can enable users at the different locations to interact, such as via video interaction, audio interaction, and so on. In at least some embodiments, techniques can be implemented to adjust an image of a participant during a video session such that the participant appears to maintain eye contact with other video session participants at other locations. Techniques can also be implemented to provide a virtual shared space that can enable users to interact with the space, and can also enable users to interact with one another and/or objects that are displayed in the virtual shared space.
    Type: Grant
    Filed: February 6, 2012
    Date of Patent: July 7, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Vivek Pradeep, Stephen G. Latta, Steven Nabil Bathiche, Kevin Geisner, Alice Jane Bernheim Brush
  • Patent number: 9069381
    Abstract: A computing system runs an application (e.g., a video game) that interacts with one or more actively engaged users. One or more physical properties of a group are sensed. The group may include the one or more actively engaged users and/or one or more entities not actively engaged with the application. The computing system determines that the group (or the one or more entities not actively engaged with the application) has performed a predetermined action. A runtime condition of the application is changed in response to determining that the group (or the one or more entities not actively engaged with the computer-based application) has performed the predetermined action. Examples of changing a runtime condition include moving an object, changing a score, or changing an environmental condition of a video game.
    Type: Grant
    Filed: March 2, 2012
    Date of Patent: June 30, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin Geisner, Relja Markovic, Stephen G. Latta, Mark T. Mihelich, Christopher Willoughby, Jonathan T. Steed, Darren Bennett, Shawn C. Wright, Matt Coohill
  • Patent number: 9063566
    Abstract: Various embodiments are provided for a shared collaboration system and related methods for enabling an active user to interact with one or more additional users and with collaboration items. In one embodiment a head-mounted display device is operatively connected to a computing device that includes a collaboration engine program. The program receives observation information of a physical space from the head-mounted display device along with a collaboration item. The program visually augments an appearance of the physical space as seen through the head-mounted display device to include an active user collaboration item representation of the collaboration item. The program populates the active user collaboration item representation with additional user collaboration item input from an additional user.
    Type: Grant
    Filed: November 30, 2011
    Date of Patent: June 23, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel McCulloch, Stephen Latta, Darren Bennett, Ryan Hastings, Jason Scott, Relja Markovic, Kevin Geisner, Jonathan Steed
  • Patent number: 9053483
    Abstract: A system provides a recommendation of food items to a user based on nutritional preferences of the user, using a head-mounted display device (HMDD) worn by the user. In a store, a forward-facing camera of the HMDD captures an image of a food item. The food item can be identified by the image, such as based on packaging of the food item. Nutritional parameters of the food item are compared to nutritional preferences of the user to determine whether the food item is recommended. The HMDD displays an augmented reality image to the user indicating whether the food item is recommended. If the food item is not recommended, a substitute food item can be identified. The nutritional preferences can indicate food allergies, preferences for low calorie foods and so forth. In a restaurant, the HMDD can recommend menu selections for a user.
    Type: Grant
    Filed: March 30, 2012
    Date of Patent: June 9, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A Geisner, Kathryn Stone Perez, Stephen G Latta, Ben J Sugden, Benjamin I Vaught, Alex Aben-Athar Kipman, Cameron G Brown, Holly A Hirzel, Brian J Mount, Daniel McCulloch
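A minimal sketch of the comparison described above: an identified food item is checked against the user's nutritional preferences, and the result drives the recommendation. The field names, allergen handling, and calorie limit are illustrative assumptions.

```python
def check_food(item, prefs):
    """Return (recommended, reason) for a food item given the user's preferences."""
    allergens = set(item["allergens"]) & set(prefs["allergies"])
    if allergens:
        return False, f"contains allergen: {', '.join(sorted(allergens))}"
    if item["calories_per_serving"] > prefs["max_calories_per_serving"]:
        return False, "exceeds calorie limit"
    return True, "meets preferences"

prefs = {"allergies": ["peanut"], "max_calories_per_serving": 250}
snack = {"name": "trail mix", "allergens": ["peanut"], "calories_per_serving": 190}
ok, why = check_food(snack, prefs)
print(ok, why)  # False contains allergen: peanut -> suggest a substitute item
```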
  • Publication number: 20150154782
    Abstract: In applications that display a representation of a user, it may be reasonable to insert a pre-canned animation rather than animating a user's captured motion. For example, in a tennis swing, the ball toss and take back in a serve could be a pre-canned animation, whereas the actual forward swing may be mapped from the user's gestures. An animation of a user's gestures can be chained together into sequences with pre-canned animations, where animation blending techniques can provide for a smoother transition between the animation types. Techniques for blending animations, which may comprise determining boundaries and transition points between pre-canned animations and animations based on captured motion, may improve animation efficiency. Gesture history, including joint position, velocity, and acceleration, can be used to determine user intent, seed parameters for subsequent animations and game control, and determine the subsequent gestures to initiate.
    Type: Application
    Filed: February 9, 2015
    Publication date: June 4, 2015
    Inventors: Kevin Geisner, Relja Markovic, Stephen Gilchrist Latta
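A minimal sketch of the blending step described above: the displayed pose is cross-faded from a pre-canned animation into the pose mapped from captured motion over a short transition window. The pose format (joint name to position) and the window length are assumptions.

```python
BLEND_FRAMES = 10  # length of the transition window, in frames (assumed)

def blend_pose(canned_pose, captured_pose, frames_into_transition):
    """Linearly interpolate joint positions; the weight shifts toward captured motion."""
    t = min(1.0, frames_into_transition / BLEND_FRAMES)
    return {
        joint: tuple(c * (1 - t) + m * t for c, m in zip(canned_pose[joint], captured_pose[joint]))
        for joint in canned_pose
    }

canned = {"right_hand": (0.0, 1.2, 0.3)}    # pose from the pre-canned serve animation
captured = {"right_hand": (0.2, 1.0, 0.5)}  # pose mapped from the user's captured motion
print(blend_pose(canned, captured, 5))       # halfway through the transition: (0.1, 1.1, 0.4)
```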
  • Patent number: 9041739
    Abstract: Embodiments for matching participants in a virtual multiplayer entertainment experience are provided. For example, one embodiment provides a method including receiving from each user of a plurality of users a request to join the virtual multiplayer entertainment experience, receiving from each user of the plurality of users information regarding characteristics of a physical space in which each user is located, and matching two or more users of the plurality of users for participation in the virtual multiplayer entertainment experience based on the characteristics of the physical space of each of the two or more users.
    Type: Grant
    Filed: January 31, 2012
    Date of Patent: May 26, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen Latta, Kevin Geisner, Brian Mount, Daniel McCulloch, Cameron Brown, Jeffrey Alan Kohler, Wei Zhang, Ryan Hastings, Darren Bennett, Ian McIntyre
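A minimal sketch of the matching described above: players are paired when their reported play spaces are similar enough in size for a shared experience. The space descriptor and the similarity threshold are illustrative assumptions.

```python
from itertools import combinations

def space_area(space):
    return space["width_m"] * space["depth_m"]

def match_players(requests, max_area_ratio=1.5):
    """Return pairs of users whose play-space areas differ by at most the given ratio."""
    pairs = []
    for a, b in combinations(requests, 2):
        big, small = max(space_area(a), space_area(b)), min(space_area(a), space_area(b))
        if small > 0 and big / small <= max_area_ratio:
            pairs.append((a["user"], b["user"]))
    return pairs

requests = [
    {"user": "alice", "width_m": 3.0, "depth_m": 4.0},
    {"user": "bob", "width_m": 3.5, "depth_m": 3.5},
    {"user": "carol", "width_m": 1.5, "depth_m": 2.0},
]
print(match_players(requests))  # [('alice', 'bob')]: carol's room is too small to pair
```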
  • Publication number: 20150130689
    Abstract: Embodiments for interacting with an executable virtual object associated with a real object are disclosed. In one example, a method for interacting with an executable virtual object associated with a real object includes receiving sensor input from one or more sensors attached to a portable see-through display device, and obtaining information regarding a location of the user based on the sensor input. The method also includes, if the location includes a real object comprising an associated executable virtual object, then determining an intent of the user to interact with the executable virtual object, and if the intent to interact is determined, then interacting with the executable virtual object.
    Type: Application
    Filed: January 22, 2015
    Publication date: May 14, 2015
    Inventors: Ben Sugden, John Clavin, Ben Vaught, Stephen Latta, Kathryn Stone Perez, Daniel McCulloch, Jason Scott, Wei Zhang, Darren Bennett, Ryan Hastings, Arthur Tomlin, Kevin Geisner
  • Patent number: 9025832
    Abstract: A method of finding a new social network service friend for a player belonging to a social network service and having a friend group including one or more player-accepted friends includes recognizing the player, automatically identifying an observer within a threshold proximity to the player, and adding the observer to the friend group of the player in the social network service if the observer satisfies a friending criterion of the player.
    Type: Grant
    Filed: June 1, 2011
    Date of Patent: May 5, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen Latta, Relja Markovic, Kevin Geisner, A. Dylan Vance, Brian Scott Murphy, Matt Coohill
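A minimal sketch of the method described above: an observer recognized within a threshold distance of the player is added to the player's friend group if the player's friending criteria are satisfied. The threshold, criteria fields, and identifiers are assumptions.

```python
import math

PROXIMITY_THRESHOLD_M = 3.0  # assumed threshold distance

def distance(a, b):
    return math.dist(a["position"], b["position"])

def maybe_friend(player, observer):
    """Add the observer to the player's friend group if nearby and eligible."""
    if distance(player, observer) > PROXIMITY_THRESHOLD_M:
        return False
    criteria = player["friending_criteria"]
    if observer["age"] < criteria["min_age"] or observer["id"] in player["blocked"]:
        return False
    player["friends"].add(observer["id"])
    return True

player = {"position": (0, 0), "friends": set(), "blocked": set(),
          "friending_criteria": {"min_age": 13}}
observer = {"id": "obs42", "position": (1.0, 2.0), "age": 21}
print(maybe_friend(player, observer), player["friends"])  # True {'obs42'}
```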
  • Publication number: 20150110354
    Abstract: A system may receive image data and capture motion with respect to a target in a physical space and recognize a gesture from the captured motion. It may be desirable to isolate aspects of captured motion to differentiate random and extraneous motions. For example, a gesture may comprise motion of a user's right arm, and it may be desirable to isolate the motion of the user's right arm and exclude an interpretation of any other motion. Thus, the isolated aspect may be the focus of the received data for gesture recognition. Alternatively, the isolated aspects may be an aspect of the captured motion that is removed from consideration when identifying a gesture from the captured motion. For example, gesture filters may be modified to correspond to the user's natural lean to eliminate the effect the lean has on the registry of a motion with a gesture filter.
    Type: Application
    Filed: December 22, 2014
    Publication date: April 23, 2015
    Inventors: Gregory Nelson Snook, Relja Markovic, Stephen Gilchrist Latta, Kevin Geisner
  • Patent number: 9005029
    Abstract: One or more physical characteristics of each of multiple users are detected. These physical characteristics of a user can include physical attributes of the user (e.g., the user's height, length of the user's legs) and/or physical skills of the user (e.g., how high the user can jump). Based on these detected one or more physical characteristics of the users, two or more of the multiple users to share an online experience (e.g., play a multi-player game) are identified.
    Type: Grant
    Filed: September 14, 2012
    Date of Patent: April 14, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Brian Scott Murphy, Stephen G. Latta, Darren Alexander Bennett, Pedro Perez, Shawn C. Wright, Relja Markovic, Joel B. Deaguero, Christopher H. Willoughby, Ryan Lucas Hastings, Kevin Geisner