Patents by Inventor Kevin A Geisner

Kevin A Geisner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9141193
    Abstract: A capture device can detect gestures made by a user. The gestures can be used to control a gesture-unaware program.
    Type: Grant
    Filed: August 31, 2009
    Date of Patent: September 22, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn S. Perez, Kevin A. Geisner, Alex A. Kipman, Kudo Tsunoda
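    As a rough illustration of how detected gestures might drive a gesture-unaware program, the following Python sketch maps recognized gestures to the conventional key events such a program already understands. The gesture names and key codes are assumptions for illustration, not terms from the patent.

    ```python
    # Minimal sketch: routing recognized gestures to a gesture-unaware program
    # by synthesizing the conventional input events it already understands.
    # Gesture names and key codes here are illustrative, not from the patent.

    GESTURE_TO_KEY = {
        "swipe_left": "LEFT_ARROW",
        "swipe_right": "RIGHT_ARROW",
        "push": "ENTER",
        "raise_hand": "SPACE",
    }

    def translate_gesture(gesture: str) -> str | None:
        """Map a detected gesture to a synthetic key event, or None if unmapped."""
        return GESTURE_TO_KEY.get(gesture)

    def feed_program(gestures: list[str]) -> list[str]:
        """Produce the key-event stream a gesture-unaware program would receive."""
        return [key for g in gestures if (key := translate_gesture(g)) is not None]

    if __name__ == "__main__":
        print(feed_program(["swipe_left", "wave", "push"]))  # ['LEFT_ARROW', 'ENTER']
    ```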
  • Patent number: 9128520
    Abstract: A collaborative on-demand system allows a user of a head-mounted display device (HMDD) to obtain assistance with an activity from a qualified service provider. In a session, the user and service provider exchange camera-captured images and augmented reality images. A gaze-detection capability of the HMDD allows the user to mark areas of interest in a scene. The service provider can similarly mark areas of the scene, as well as provide camera-captured images of the service provider's hand or arm pointing to or touching an object of the scene. The service provider can also select an animation or text to be displayed on the HMDD. A server can match user requests with qualified service providers that meet parameters regarding fee, location, rating, and other preferences. Alternatively, service providers can review open requests, self-select appropriate ones, and initiate contact with a user.
    Type: Grant
    Filed: March 30, 2012
    Date of Patent: September 8, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Jeffrey B. Cole, Alex Aben-Athar Kipman, Ian D. McIntyre, Daniel McCulloch
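    The server-side matching step described above could look roughly like the following sketch, which filters providers by a request's fee ceiling, maximum distance, and minimum rating. The Provider fields and parameter names are assumptions, not the patent's data model.

    ```python
    # Illustrative sketch of the server-side matching step: filter providers by
    # a request's fee ceiling, maximum distance, and minimum rating. Field
    # names are assumptions for illustration, not the patent's data model.

    from dataclasses import dataclass

    @dataclass
    class Provider:
        name: str
        fee: float        # hourly fee in dollars
        distance_km: float
        rating: float     # 0.0-5.0

    def match_providers(providers, max_fee, max_distance_km, min_rating):
        """Return providers meeting every request parameter, best-rated first."""
        qualified = [p for p in providers
                     if p.fee <= max_fee
                     and p.distance_km <= max_distance_km
                     and p.rating >= min_rating]
        return sorted(qualified, key=lambda p: p.rating, reverse=True)

    if __name__ == "__main__":
        pool = [Provider("A", 40, 5, 4.8), Provider("B", 90, 2, 4.9)]
        print(match_providers(pool, max_fee=50, max_distance_km=10, min_rating=4.5))
    ```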
  • Patent number: 9122321
    Abstract: A see-through, near-eye, mixed reality display device and system for collaboration amongst various users of other such devices and personal audio/visual devices of more limited capabilities. One or more wearers of a see-through head-mounted display apparatus define a collaboration environment. For the collaboration environment, a selection of collaboration data and the scope of the environment are determined. Virtual representations of the collaboration data are rendered in the field of view of the wearer and of other device users. The wearer defines which persons in the wearer's field of view are included in the collaboration environment and entitled to share information within it. If permitted, input on a virtual object from other users in the collaboration environment may be received and used to manipulate the virtual object.
    Type: Grant
    Filed: May 4, 2012
    Date of Patent: September 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
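    A minimal sketch of the gating described in this abstract, in which only persons the wearer has included in the collaboration environment may manipulate a shared virtual object. The class and method names are invented for illustration.

    ```python
    # Hedged sketch of the gating the abstract describes: only persons the
    # wearer has included in the collaboration environment may submit changes
    # to a shared virtual object. Names are invented for illustration.

    class CollaborationEnvironment:
        def __init__(self, wearer: str):
            self.wearer = wearer
            self.members: set[str] = {wearer}

        def include(self, person: str) -> None:
            """Wearer adds a person in view to the collaboration environment."""
            self.members.add(person)

        def apply_input(self, person: str, change: str, state: list[str]) -> bool:
            """Accept a manipulation of the virtual object only from members."""
            if person not in self.members:
                return False
            state.append(change)
            return True

    if __name__ == "__main__":
        env = CollaborationEnvironment("wearer")
        env.include("alice")
        history: list[str] = []
        print(env.apply_input("alice", "rotate 90", history))  # True
        print(env.apply_input("bob", "delete", history))       # False
    ```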
  • Patent number: 9122053
    Abstract: Technology is described for providing realistic occlusion between a virtual object displayed by a head-mounted, augmented reality display system and a real object visible to the user's eyes through the display. A spatial occlusion in a user field of view of the display is typically a three-dimensional occlusion determined based on a three-dimensional space mapping of real and virtual objects. An occlusion interface between a real object and a virtual object can be modeled at a level of detail determined based on criteria such as distance within the field of view, display size, or position with respect to a point of gaze. Technology is also described for providing three-dimensional audio occlusion based on an occlusion between a real object and a virtual object in the user environment.
    Type: Grant
    Filed: April 10, 2012
    Date of Patent: September 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Brian J. Mount, Stephen G. Latta, Daniel J. McCulloch, Kyungsuk David Lee, Ben J. Sugden, Jeffrey N. Margolis, Kathryn Stone Perez, Sheridan Martin Small, Mark J. Finocchio, Robert L. Crocco, Jr.
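    One idea in this abstract, choosing a level of detail for the occlusion interface from distance and angular offset from the point of gaze, might be sketched as follows. The thresholds and detail levels are illustrative assumptions, not values from the patent.

    ```python
    # Sketch of one idea in the abstract: choosing a level of detail for the
    # occlusion interface from distance and angular offset from the point of
    # gaze. Thresholds are illustrative assumptions, not values from the patent.

    def occlusion_lod(distance_m: float, gaze_offset_deg: float) -> str:
        """Return a model detail level for the real/virtual occlusion boundary."""
        if distance_m > 10.0 or gaze_offset_deg > 30.0:
            return "bounding_box"       # coarse: far away or peripheral
        if distance_m > 3.0 or gaze_offset_deg > 10.0:
            return "convex_hull"        # medium detail
        return "full_mesh"              # fine: near the point of gaze

    if __name__ == "__main__":
        print(occlusion_lod(1.5, 4.0))   # full_mesh
        print(occlusion_lod(12.0, 2.0))  # bounding_box
    ```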
  • Patent number: 9098873
    Abstract: An on-screen shopping application that reacts to a human target user's motions to provide a shopping experience to the user is provided. A tracking system captures user motions and executes a shopping application allowing a user to manipulate an on-screen representation of the user. The on-screen representation has a likeness of the user or another individual, and movements of the user in the on-screen interface allow the user to interact with virtual articles that represent real-world articles. User movements that are recognized as article manipulation or transaction control gestures are translated into commands for the shopping application.
    Type: Grant
    Filed: April 1, 2010
    Date of Patent: August 4, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kudo Tsunoda, Darren Bennett, Brian S. Murphy, Stephen G. Latta, Relja Markovic, Alex Kipman
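    A rough sketch of the final translation step described above, dispatching recognized manipulation and transaction-control gestures to shopping-application commands. The gesture names and cart operations are illustrative assumptions.

    ```python
    # Rough sketch of translating recognized manipulation/transaction gestures
    # into shopping-application commands acting on a cart. Gesture names and
    # commands are illustrative assumptions.

    def handle_gesture(gesture: str, cart: list[str], focused_item: str) -> None:
        """Dispatch a recognized gesture to a cart operation."""
        if gesture == "grab":
            cart.append(focused_item)          # article manipulation: pick up item
        elif gesture == "toss":
            if focused_item in cart:
                cart.remove(focused_item)      # put item back
        elif gesture == "checkout_wave":
            print(f"checking out {len(cart)} item(s)")  # transaction control

    if __name__ == "__main__":
        cart: list[str] = []
        handle_gesture("grab", cart, "red shoes")
        handle_gesture("checkout_wave", cart, "red shoes")  # checking out 1 item(s)
    ```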
  • Patent number: 9053483
    Abstract: A system provides a recommendation of food items to a user based on nutritional preferences of the user, using a head-mounted display device (HMDD) worn by the user. In a store, a forward-facing camera of the HMDD captures an image of a food item. The food item can be identified by the image, such as based on packaging of the food item. Nutritional parameters of the food item are compared to nutritional preferences of the user to determine whether the food item is recommended. The HMDD displays an augmented reality image to the user indicating whether the food item is recommended. If the food item is not recommended, a substitute food item can be identified. The nutritional preferences can indicate food allergies, preferences for low calorie foods and so forth. In a restaurant, the HMDD can recommend menu selections for a user.
    Type: Grant
    Filed: March 30, 2012
    Date of Patent: June 9, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Cameron G. Brown, Holly A. Hirzel, Brian J. Mount, Daniel McCulloch
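    The recommendation check described above might be sketched as follows: compare a scanned item's nutritional parameters with the user's preferences and propose a substitute when the item fails. The field names are assumptions made for illustration.

    ```python
    # Illustrative sketch of the recommendation check: compare a scanned item's
    # nutritional parameters with the user's preferences, and propose a
    # substitute when the item fails. Field names are assumed for illustration.

    from dataclasses import dataclass

    @dataclass
    class FoodItem:
        name: str
        calories: int
        allergens: frozenset[str]

    @dataclass
    class Preferences:
        max_calories: int
        allergies: frozenset[str]

    def is_recommended(item: FoodItem, prefs: Preferences) -> bool:
        """True if the item meets the calorie limit and avoids all allergens."""
        return item.calories <= prefs.max_calories and not (item.allergens & prefs.allergies)

    def substitute(item: FoodItem, catalog: list[FoodItem], prefs: Preferences):
        """Pick any recommended alternative from the catalog, if one exists."""
        return next((c for c in catalog if c.name != item.name and is_recommended(c, prefs)), None)

    if __name__ == "__main__":
        prefs = Preferences(max_calories=200, allergies=frozenset({"peanut"}))
        bar = FoodItem("snack bar", 250, frozenset({"peanut"}))
        catalog = [bar, FoodItem("fruit cup", 90, frozenset())]
        print(is_recommended(bar, prefs))       # False
        print(substitute(bar, catalog, prefs))  # fruit cup
    ```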
  • Publication number: 20150049114
    Abstract: The technology described herein includes a see-through, near-eye, mixed reality display device for providing customized experiences for a user. The personal A/V apparatus serves as an exercise program that is always with the user, provides motivation for the user, visually shows the user how to exercise, and lets the user exercise with other people who are not present.
    Type: Application
    Filed: August 28, 2014
    Publication date: February 19, 2015
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman
  • Patent number: 8933884
    Abstract: In a motion capture system, a unitary input is provided to an application based on detected movement and/or location of a group of people. Audio information from the group can also be used as an input. The application can provide real-time feedback to the person or group via a display and audio output. The group can control the movement of an avatar in a virtual space based on the movement of each person in the group, such as in a steering or balancing game. To avoid a discontinuous or confusing output by the application, missing data can be generated for a person who is occluded or partially out of the field of view. A wait time can be set for activating a new person and deactivating a currently-active person. The wait time can be adaptive based on a first detected position or a last detected position of the person.
    Type: Grant
    Filed: January 15, 2010
    Date of Patent: January 13, 2015
    Assignee: Microsoft Corporation
    Inventors: Relja Markovic, Stephen G. Latta, Kevin A. Geisner, David Hill, Darren A. Bennett, David C. Haley, Jr., Brian S. Murphy, Shawn C. Wright
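    The adaptive wait-time idea in this abstract could be sketched like this: a person first detected near the edge of the field of view gets a longer activation delay than one detected near the center, reducing spurious joins. The constants are assumptions.

    ```python
    # Sketch of the adaptive wait-time idea: a person first detected near the
    # edge of the field of view gets a longer activation delay than one
    # detected near the center. Constants are assumptions.

    def activation_wait_s(first_x_norm: float) -> float:
        """Wait time before activating a new person.

        first_x_norm: horizontal position of the first detection, normalized so
        0.0 is the center of the field of view and 1.0 is the edge.
        """
        base, extra = 0.5, 2.0
        edge_factor = min(max(abs(first_x_norm), 0.0), 1.0)
        return base + extra * edge_factor   # 0.5 s at center, 2.5 s at the edge

    if __name__ == "__main__":
        print(activation_wait_s(0.05))  # near center: ~0.6 s
        print(activation_wait_s(0.95))  # near edge:   ~2.4 s
    ```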
  • Patent number: 8847988
    Abstract: The technology described herein includes a see-through, near-eye, mixed reality display device for providing customized experiences for a user. The personal A/V apparatus serves as an exercise program that is always with the user, provides motivation for the user, visually shows the user how to exercise, and lets the user exercise with other people who are not present.
    Type: Grant
    Filed: March 30, 2012
    Date of Patent: September 30, 2014
    Assignee: Microsoft Corporation
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman
  • Patent number: 8752963
    Abstract: The technology provides various embodiments for controlling brightness of a see-through, near-eye mixed display device based on light intensity of what the user is gazing at. The opacity of the display can be altered, such that external light is reduced if the wearer is looking at a bright object. The wearer's pupil size may be determined and used to adjust the brightness used to display images, as well as the opacity of the display. A suitable balance between opacity and brightness used to display images may be determined that allows real and virtual objects to be seen clearly, while not causing damage or discomfort to the wearer's eyes.
    Type: Grant
    Filed: November 4, 2011
    Date of Patent: June 17, 2014
    Assignee: Microsoft Corporation
    Inventors: Daniel J. McCulloch, Ryan L. Hastings, Kevin A. Geisner, Robert L. Crocco, Alexandru O. Balan, Derek L. Knee, Michael J. Scavezze, Stephen G. Latta, Brian J. Mount
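    A toy sketch of the balancing act described above: raise display opacity as the gazed-at region gets brighter, and scale display brightness with the wearer's pupil size. The linear mappings and limits are illustrative assumptions, not the patent's method.

    ```python
    # Sketch of the balancing act in the abstract: raise display opacity as
    # the gazed-at region gets brighter, and scale display brightness with
    # pupil size. The linear mappings and limits are illustrative assumptions.

    def display_settings(scene_nits: float, pupil_mm: float) -> tuple[float, float]:
        """Return (opacity 0-1, brightness 0-1) for the see-through display."""
        # Brighter gazed-at regions -> more opacity so virtual imagery stays visible.
        opacity = min(scene_nits / 1000.0, 0.9)
        # A dilated pupil suggests a dim scene; dim the display to avoid discomfort.
        brightness = min(max((8.0 - pupil_mm) / 6.0, 0.1), 1.0)
        return opacity, brightness

    if __name__ == "__main__":
        print(display_settings(scene_nits=800.0, pupil_mm=2.5))  # bright scene
        print(display_settings(scene_nits=50.0, pupil_mm=6.0))   # dim scene
    ```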
  • Publication number: 20140002444
    Abstract: Technology is described for automatically determining placement of one or more interaction zones in an augmented reality environment in which one or more virtual features are added to a real environment. An interaction zone includes at least one virtual feature and is associated with a space within the augmented reality environment, with boundaries of the space determined based on the one or more real environment features. A plurality of activation criteria may be available for an interaction zone, and at least one may be selected based on at least one real environment feature. The technology also describes controlling activation of an interaction zone within the augmented reality environment. In some examples, at least some behavior of a virtual object is controlled by emergent behavior criteria that define an action independently of the type of object in the real-world environment.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Darren Bennett, Brian J. Mount, Michael J. Scavezze, Daniel J. McCulloch, Anthony J. Ambrus, Jonathan T. Steed, Arthur C. Tomlin, Kevin A. Geisner
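    The criteria-selection step described above might look like the following sketch, which picks an activation criterion for an interaction zone from the real-environment feature the zone is anchored to. Both the feature names and the criteria are invented for illustration.

    ```python
    # Hedged sketch: pick an activation criterion for an interaction zone from
    # the real-environment feature the zone is anchored to. Feature names and
    # criteria are invented for illustration.

    ACTIVATION_BY_FEATURE = {
        "doorway": "user_crosses_boundary",
        "table":   "user_within_1m",
        "wall":    "user_gazes_at_zone",
    }

    def select_activation(feature: str) -> str:
        """Choose an activation criterion, with a conservative default."""
        return ACTIVATION_BY_FEATURE.get(feature, "user_within_1m")

    if __name__ == "__main__":
        print(select_activation("doorway"))  # user_crosses_boundary
        print(select_activation("rug"))      # default: user_within_1m
    ```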
  • Publication number: 20130328927
    Abstract: A system for generating a virtual gaming environment based on features identified within a real-world environment, and adapting the virtual gaming environment over time as the features identified within the real-world environment change is described. Utilizing the technology described, a person wearing a head-mounted display device (HMD) may walk around a real-world environment and play a virtual game that is adapted to that real-world environment. For example, the HMD may identify environmental features within a real-world environment such as five grassy areas and two cars, and then spawn virtual monsters based on the location and type of the environmental features identified. The location and type of the environmental features identified may vary depending on the particular real-world environment in which the HMD is used, so each virtual game may look different from one environment to the next.
    Type: Application
    Filed: November 29, 2012
    Publication date: December 12, 2013
    Inventors: Brian J. Mount, Jason Scott, Ryan L. Hastings, Darren Bennett, Stephen G. Latta, Daniel J. McCulloch, Kevin A. Geisner, Jonathan T. Steed, Michael J. Scavezze
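    The spawning rule this abstract describes could be sketched as a simple mapping from identified environmental features to monster types, with one spawn placed per matching feature. The feature-to-monster table is an invented example.

    ```python
    # Sketch of the spawning rule the abstract describes: map each identified
    # environmental feature to a monster type and place one spawn per feature.
    # The feature-to-monster table is an invented example.

    SPAWN_TABLE = {"grass": "swamp_beast", "car": "metal_golem"}

    def spawn_monsters(features: list[tuple[str, tuple[float, float]]]):
        """features: (type, (x, z)) pairs from the HMD's environment scan."""
        return [(SPAWN_TABLE[kind], pos) for kind, pos in features if kind in SPAWN_TABLE]

    if __name__ == "__main__":
        scan = [("grass", (1.0, 2.0)), ("car", (5.0, 0.0)), ("tree", (3.0, 3.0))]
        print(spawn_monsters(scan))
        # [('swamp_beast', (1.0, 2.0)), ('metal_golem', (5.0, 0.0))]
    ```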
  • Publication number: 20130293577
    Abstract: A see-through, near-eye, mixed reality display apparatus for providing translations of real-world data for a user. A wearer's location and orientation with the apparatus are determined, and input data for translation is selected using sensors of the apparatus. Input data can be audio or visual in nature and is selected by reference to the gaze of the wearer. The input data is translated for the user using user profile information that bears on the accuracy of a translation, determining from the input data whether a linguistic translation, knowledge-addition translation, or context translation is useful.
    Type: Application
    Filed: May 4, 2012
    Publication date: November 7, 2013
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
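    The decision among translation types described above might be sketched as follows. The profile fields and the rules are assumptions made up for this sketch, not the patent's logic.

    ```python
    # Illustrative decision step for the abstract above: pick which kind of
    # translation to render from what is known about the wearer. The profile
    # fields and rules are assumptions made up for this sketch.

    def choose_translation(input_language: str, profile: dict) -> str:
        """Return 'linguistic', 'knowledge_addition', or 'context'."""
        if input_language not in profile.get("languages", set()):
            return "linguistic"            # wearer cannot read/hear the source
        if profile.get("expertise_level", 0) < 2:
            return "knowledge_addition"    # define jargon the wearer lacks
        return "context"                   # add situational context only

    if __name__ == "__main__":
        prof = {"languages": {"en"}, "expertise_level": 1}
        print(choose_translation("fr", prof))  # linguistic
        print(choose_translation("en", prof))  # knowledge_addition
    ```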
  • Publication number: 20130293468
    Abstract: A see-through, near-eye, mixed reality display device and system for collaboration amongst various users of other such devices and personal audio/visual devices of more limited capabilities. One or more wearers of a see-through head-mounted display apparatus define a collaboration environment. For the collaboration environment, a selection of collaboration data and the scope of the environment are determined. Virtual representations of the collaboration data are rendered in the field of view of the wearer and of other device users. The wearer defines which persons in the wearer's field of view are included in the collaboration environment and entitled to share information within it. If permitted, input on a virtual object from other users in the collaboration environment may be received and used to manipulate the virtual object.
    Type: Application
    Filed: May 4, 2012
    Publication date: November 7, 2013
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
  • Publication number: 20130293530
    Abstract: An augmented reality system that provides augmented product and environment information to a wearer of a see-through head-mounted display. The augmentation information may include advertising, inventory, pricing, and other information about products a wearer may be interested in. Interest is determined from wearer actions and a wearer profile. The information may be used to incentivize purchases of real-world products by a wearer, or allow the wearer to make better purchasing decisions. The augmentation information may enhance a wearer's shopping experience by allowing the wearer easy access to important product information while shopping in a retail establishment. Through virtual rendering, a wearer may be provided with feedback on how an item would appear in a wearer environment, such as the wearer's home.
    Type: Application
    Filed: May 4, 2012
    Publication date: November 7, 2013
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
  • Publication number: 20130286004
    Abstract: Technology is described for displaying a collision between objects by an augmented reality display device system. A collision between a real object and a virtual object is identified based on three-dimensional space position data of the objects. At least one effect on at least one physical property of the real object, such as a change in surface shape, is determined based on the physical properties of the real object and the physical interaction characteristics of the collision. Simulation image data simulating the effect on the real object is generated and displayed by the augmented reality display. Virtual objects under control of different executing applications can also interact with one another in collisions.
    Type: Application
    Filed: April 27, 2012
    Publication date: October 31, 2013
    Inventors: Daniel J. McCulloch, Stephen G. Latta, Brian J. Mount, Kevin A. Geisner, Roger Sebastian Kevin Sylvan, Arnulfo Zepeda Navratil, Jason Scott, Jonathan T. Steed, Ben J. Sugden, Britta Silke Hummel, Kyungsuk David Lee, Mark J. Finocchio, Alex Aben-Athar Kipman, Jeffrey N. Margolis
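    A minimal sketch of the collision-identification step: axis-aligned bounding boxes for the three-dimensional space positions of a real and a virtual object, plus a toy effect on a physical property (dent depth). The geometry and the dent model are assumptions.

    ```python
    # Sketch of the collision identification step: axis-aligned boxes for the
    # 3-D space positions of a real and a virtual object, plus a toy "effect"
    # on a physical property. Geometry and the dent model are assumptions.

    from dataclasses import dataclass

    @dataclass
    class Box:
        min_pt: tuple[float, float, float]
        max_pt: tuple[float, float, float]

    def collides(a: Box, b: Box) -> bool:
        """True if two axis-aligned bounding boxes overlap on every axis."""
        return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
                   for i in range(3))

    def surface_effect(relative_speed: float, stiffness: float) -> float:
        """Toy dent depth (meters): softer surfaces and faster hits dent more."""
        return relative_speed / (1.0 + stiffness)

    if __name__ == "__main__":
        real = Box((0, 0, 0), (1, 1, 1))
        virtual = Box((0.5, 0.5, 0.5), (2, 2, 2))
        if collides(real, virtual):
            print(f"dent depth: {surface_effect(2.0, stiffness=10.0):.3f} m")
    ```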
  • Publication number: 20130177296
    Abstract: A system and method for efficiently managing life experiences captured by one or more sensors (e.g., video or still camera, image sensors including RGB sensors and depth sensors). A "life recorder" is a recording device that continuously captures life experiences, including unanticipated life experiences, in image, video, and/or audio recordings. In some embodiments, video and/or audio recordings captured by a life recorder are automatically analyzed, tagged with a set of one or more metadata items, indexed, and stored for future use. By tagging and indexing life recordings, a life recorder may search for and acquire life recordings generated by itself or another life recorder, thereby allowing life experiences to be shared minutes or even years later.
    Type: Application
    Filed: November 29, 2012
    Publication date: July 11, 2013
    Inventors: Kevin A. Geisner, Relja Markovic, Stephen G. Latta, Daniel McCulloch
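    The tag-and-index pipeline described above might be sketched with an inverted index: each recording is stored under its metadata tags so later searches are cheap. The metadata vocabulary is illustrative.

    ```python
    # Minimal sketch of the tag-and-index pipeline: each recording is stored
    # with a metadata set, and an inverted index makes later searches cheap.
    # The metadata vocabulary is illustrative.

    from collections import defaultdict

    class LifeRecorderIndex:
        def __init__(self):
            self._by_tag: dict[str, set[str]] = defaultdict(set)

        def add(self, recording_id: str, tags: set[str]) -> None:
            """Index a recording under each of its metadata tags."""
            for tag in tags:
                self._by_tag[tag].add(recording_id)

        def search(self, *tags: str) -> set[str]:
            """Recordings carrying all the requested tags."""
            sets = [self._by_tag[t] for t in tags]
            return set.intersection(*sets) if sets else set()

    if __name__ == "__main__":
        idx = LifeRecorderIndex()
        idx.add("rec-001", {"beach", "family", "2012"})
        idx.add("rec-002", {"beach", "sunset"})
        print(idx.search("beach"))            # {'rec-001', 'rec-002'}
        print(idx.search("beach", "family"))  # {'rec-001'}
    ```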
  • Patent number: 8465108
    Abstract: Techniques for enhancing the use of a motion capture system are provided. A motion capture system tracks movement and audio inputs from a person in a physical space, and provides the inputs to an application, which displays a virtual space on a display. Bodily movements can be used to define traits of an avatar in the virtual space. The person can be directed to perform the movements by a coaching avatar, or visual or audio cues in the virtual space. The application can respond to the detected movements and voice commands or voice volume of the person to define avatar traits and initiate pre-scripted audio-visual events in the virtual space to provide an entertaining experience. A performance in the virtual space can be captured and played back with automatic modifications, such as alterations to the avatar's voice or appearance, or modifications made by another person.
    Type: Grant
    Filed: September 5, 2012
    Date of Patent: June 18, 2013
    Assignee: Microsoft Corporation
    Inventors: Relja Markovic, Stephen G. Latta, Kevin A. Geisner, Christopher Vuchetich, Darren A. Bennett, Brian S. Murphy, Shawn C. Wright
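    The trait-definition idea in this abstract could be sketched by deriving avatar traits from tracked movement energy and voice volume. The trait names and mappings are invented for illustration, not the patent's scheme.

    ```python
    # Sketch of the trait-definition idea: derive avatar traits from tracked
    # movement energy and voice volume. The trait names and mappings are
    # invented for illustration, not the patent's scheme.

    def avatar_traits(movement_energy: float, voice_volume: float) -> dict:
        """Map normalized (0-1) body and voice inputs to avatar traits."""
        return {
            "stride_length": 0.5 + 0.5 * movement_energy,   # livelier body, longer stride
            "voice_pitch_shift": 1.0 + 0.3 * voice_volume,  # louder voice, higher playback pitch
            "glow_intensity": max(movement_energy, voice_volume),
        }

    if __name__ == "__main__":
        print(avatar_traits(movement_energy=0.8, voice_volume=0.2))
    ```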
  • Publication number: 20130137076
    Abstract: Technology disclosed herein provides for the use of HMDs in a classroom setting, including for holographic instruction. In one embodiment, the HMD is used for social coaching. User profile information may be used to tailor instruction to a specific user based on known skills, learning styles, and/or characteristics. One or more individuals may be monitored based on sensor data. The sensor data may come from an HMD. The monitored data may be analyzed to determine how to enhance an experience. The experience may be enhanced by presenting an image in at least one head-mounted display worn by the one or more individuals.
    Type: Application
    Filed: November 30, 2011
    Publication date: May 30, 2013
    Inventors: Kathryn Stone Perez, Kevin A. Geisner, Ben J. Sugden, Daniel J. McCulloch, John Clavin
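    The profile-driven tailoring described above might be sketched as follows: pick a presentation mode from the user's learning style and skip steps already mastered. The profile fields and modes are invented for illustration.

    ```python
    # Sketch of tailoring holographic instruction to a profile: pick a lesson
    # presentation from the user's learning style and skip steps already
    # mastered. Profile fields and modes are invented for illustration.

    def tailor_lesson(steps: list[str], profile: dict) -> list[tuple[str, str]]:
        """Return (step, presentation_mode) pairs for the HMD to render."""
        mode = {"visual": "hologram", "auditory": "narration"}.get(
            profile.get("learning_style", "visual"), "hologram")
        known = profile.get("mastered", set())
        return [(s, mode) for s in steps if s not in known]

    if __name__ == "__main__":
        prof = {"learning_style": "auditory", "mastered": {"setup"}}
        print(tailor_lesson(["setup", "measure", "assemble"], prof))
        # [('measure', 'narration'), ('assemble', 'narration')]
    ```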
  • Publication number: 20130114043
    Abstract: The technology provides various embodiments for controlling brightness of a see-through, near-eye mixed display device based on light intensity of what the user is gazing at. The opacity of the display can be altered, such that external light is reduced if the wearer is looking at a bright object. The wearer's pupil size may be determined and used to adjust the brightness used to display images, as well as the opacity of the display. A suitable balance between opacity and brightness used to display images may be determined that allows real and virtual objects to be seen clearly, while not causing damage or discomfort to the wearer's eyes.
    Type: Application
    Filed: November 4, 2011
    Publication date: May 9, 2013
    Inventors: Alexandru O. Balan, Ryan L. Hastings, Stephen G. Latta, Michael J. Scavezze, Daniel J. McCulloch, Derek L. Knee, Brian J. Mount, Kevin A. Geisner, Robert L. Crocco