Patents by Inventor Benjamin I. Vaught

Benjamin I. Vaught has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10132633
    Abstract: The technology causes disappearance of a real object in a field of view of a see-through, mixed reality display device system based on user disappearance criteria. Image data is tracked to the real object in the field of view of the see-through display for implementing an alteration technique on the real object causing its disappearance from the display. A real object may satisfy user disappearance criteria by being associated with subject matter that the user does not wish to see or by not satisfying relevance criteria for a current subject matter of interest to the user. In some embodiments, based on a 3D model of a location of the display device system, an alteration technique may be selected for a real object based on a visibility level associated with the position within the location. Image data for alteration may be prefetched based on a location of the display device system.
    Type: Grant
    Filed: December 29, 2015
    Date of Patent: November 20, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James C. Liu, Stephen G. Latta, Benjamin I. Vaught, Christopher M. Novak, Darren Bennett
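
A minimal sketch of the kind of logic the abstract above describes: checking a real object against user disappearance criteria and choosing an alteration technique by visibility level. All names and thresholds (DisappearanceFilter, the blocked-subject set, the 0.5 visibility cutoff) are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch only; names and thresholds are assumptions, not from the patent.
from dataclasses import dataclass, field

@dataclass
class RealObject:
    subject: str          # subject matter the object is associated with
    visibility: float     # 0.0 (barely visible) .. 1.0 (prominent) at its position in the location

@dataclass
class DisappearanceFilter:
    blocked_subjects: set = field(default_factory=set)   # subjects the user does not want to see
    current_interest: str | None = None                  # relevance criterion, if any

    def should_disappear(self, obj: RealObject) -> bool:
        if obj.subject in self.blocked_subjects:
            return True
        return self.current_interest is not None and obj.subject != self.current_interest

    def pick_alteration(self, obj: RealObject) -> str:
        # More visible objects warrant a heavier-weight technique (e.g. replacement with
        # prefetched image data); low-visibility objects can simply be blurred out.
        return "replace_with_prefetched_imagery" if obj.visibility > 0.5 else "blur_out"

f = DisappearanceFilter(blocked_subjects={"advertising"}, current_interest="wayfinding")
sign = RealObject(subject="advertising", visibility=0.8)
if f.should_disappear(sign):
    print(f.pick_alteration(sign))   # replace_with_prefetched_imagery
```
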
  • Patent number: 10019962
    Abstract: A user interface includes a virtual object having an appearance in context with a real environment of a user using a see-through, near-eye augmented reality display device system. A virtual type of object and at least one real world object are selected based on compatibility criteria for forming a physical connection like attachment, supporting or integration of the virtual object with the at least one real object. Other appearance characteristics, e.g. color, size or shape, of the virtual object are selected for satisfying compatibility criteria with the selected at least one real object. Additionally, a virtual object type and appearance characteristics of the virtual object may be selected based on a social context of the user, a personal context of the user or both.
    Type: Grant
    Filed: August 17, 2011
    Date of Patent: July 10, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James C. Liu, Anton O. Andrews, Benjamin I. Vaught, Craig R. Maitlen, Christopher M. Novak, Sheridan Martin Small
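
As a rough illustration of the compatibility-based selection described in the abstract above, the sketch below filters candidate virtual object types against real objects by a simple connection rule. The object types, surface labels, and rules are hypothetical, not drawn from the patent.

```python
# Hypothetical compatibility table: which virtual object types can form a physical
# connection (rest on, attach to) with which real-world surfaces.
COMPATIBILITY = {
    "virtual_vase":   {"supports": {"table", "shelf"}},
    "virtual_poster": {"attaches_to": {"wall"}},
}

def compatible_pairs(virtual_types, real_objects):
    """Yield (virtual_type, real_object_name) pairs whose connection rules are satisfied."""
    for vtype in virtual_types:
        rules = COMPATIBILITY.get(vtype, {})
        for robj in real_objects:
            surface = robj["surface"]
            if surface in rules.get("supports", set()) or surface in rules.get("attaches_to", set()):
                yield vtype, robj["name"]

real = [{"name": "coffee table", "surface": "table"}, {"name": "north wall", "surface": "wall"}]
print(list(compatible_pairs(["virtual_vase", "virtual_poster"], real)))
# [('virtual_vase', 'coffee table'), ('virtual_poster', 'north wall')]
```
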
  • Patent number: 9767524
    Abstract: Technology is provided for transferring a right to a digital content item based on one or more physical actions detected in data captured by a see-through, augmented reality display device system. A digital content item may be represented by a three-dimensional (3D) virtual object displayed by the device system. A user can hold the virtual object in some examples, and transfer a right to the content item the object represents by handing the object to another user within a defined distance, who indicates acceptance of the right based upon one or more physical actions including taking hold of the transferred object. Other examples of physical actions performed by a body part of a user may also indicate offer and acceptance in the right transfer. Content may be transferred from display device to display device while rights data is communicated via a network with a service application executing remotely.
    Type: Grant
    Filed: May 18, 2015
    Date of Patent: September 19, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ryan L. Hastings, Stephen G. Latta, Benjamin I. Vaught, Darren Bennett
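
The offer-and-acceptance flow in the abstract above can be pictured as a small state machine driven by detected physical actions. This is a toy sketch; the event names and the distance threshold are assumptions for illustration only.

```python
# Toy offer/accept state machine for a rights transfer driven by detected gestures.
MAX_TRANSFER_DISTANCE_M = 2.0   # assumed "defined distance" between users

class RightsTransfer:
    def __init__(self):
        self.state = "idle"

    def on_event(self, event: str, distance_m: float) -> str:
        if self.state == "idle" and event == "hand_object_toward_user" and distance_m <= MAX_TRANSFER_DISTANCE_M:
            self.state = "offered"
        elif self.state == "offered" and event == "take_hold_of_object":
            self.state = "accepted"          # a remote rights service would be notified here
        elif event == "withdraw":
            self.state = "idle"
        return self.state

t = RightsTransfer()
print(t.on_event("hand_object_toward_user", 1.2))  # offered
print(t.on_event("take_hold_of_object", 1.2))      # accepted
```
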
  • Patent number: 9606992
    Abstract: Technology is described for resource management based on data including image data of a resource captured by at least one capture device of at least one personal audiovisual (A/V) apparatus including a near-eye, augmented reality (AR) display. A resource is automatically identified from image data captured by at least one capture device of at least one personal A/V apparatus and object reference data. A location in which the resource is situated and a 3D space position or volume of the resource in the location is tracked. A property of the resource is also determined from the image data and tracked. A function of a resource may also be stored for determining whether the resource is usable for a task. Responsive to notification criteria for the resource being satisfied, image data related to the resource is displayed on the near-eye AR display.
    Type: Grant
    Filed: June 27, 2012
    Date of Patent: March 28, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Jeffrey A. Kohler, Daniel J. McCulloch
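
A brief sketch of the resource tracking and notification check described in the abstract above. The record fields (location, 3D position, quantity) and the low-quantity criterion are assumed for illustration.

```python
# Illustrative resource record and notification check; field names are assumptions.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    location: str
    position_3d: tuple        # tracked 3D space position within the location
    quantity: float           # an example of a tracked property

def notify_if_needed(resource: Resource, min_quantity: float) -> str | None:
    """Return a message for the near-eye AR display when notification criteria are met."""
    if resource.quantity < min_quantity:
        return f"{resource.name} is running low in {resource.location}"
    return None

milk = Resource("milk", "kitchen", (1.2, 0.9, 0.4), quantity=0.2)
print(notify_if_needed(milk, min_quantity=0.5))
```
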
  • Patent number: 9498720
    Abstract: A game can be created, shared and played using a personal audio/visual apparatus such as a head-mounted display device (HMDD). Rules of the game, and a configuration of the game space, can be standard or custom. Boundary points of the game can be defined by a gaze direction of the HMDD, by the user's location, by a model of a physical game space such as an instrumented court or by a template. Players can be identified and notified of the availability of a game using a server push technology. For example, a user in a particular location may be notified of the availability of a game at that location. A server manages the game, including storing the rules, boundaries and a game state. The game state can identify players and their scores. Real world objects can be imaged and provided as virtual objects in the game space.
    Type: Grant
    Filed: April 12, 2012
    Date of Patent: November 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Kathryn Stone Perez, Ryan L. Hastings, Jason Scott, Darren A. Bennett, John Clavin, Daniel McCulloch
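
The abstract above describes a server that stores rules, boundaries, and game state, and notifies nearby players. The sketch below shows one plausible shape for that server-side record; all keys, the court coordinates, and the notification preferences are hypothetical.

```python
# Hypothetical server-side game record: rules, boundary points, players, and scores.
game_state = {
    "game_id": "pickup-42",
    "rules": {"points_to_win": 11},
    "boundary_points": [(0.0, 0.0), (0.0, 15.0), (28.0, 15.0), (28.0, 0.0)],  # e.g. court corners
    "players": {"alice": {"score": 7}, "bob": {"score": 9}},
}

def notify_nearby_players(location: str, registered_players: dict) -> list:
    """Return players who opted in to push notifications for games at this location."""
    return [name for name, prefs in registered_players.items()
            if location in prefs.get("notify_locations", [])]

registered = {"carol": {"notify_locations": ["downtown court"]}, "dave": {"notify_locations": []}}
print(notify_nearby_players("downtown court", registered))   # ['carol']
```
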
  • Publication number: 20160292850
    Abstract: The technology described herein includes a see-through, near-eye, mixed reality display device for providing customized experiences for a user. The system can be used in various entertainment, sports, shopping and theme-park situations to provide a mixed reality experience.
    Type: Application
    Filed: March 4, 2016
    Publication date: October 6, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Kevin A. Geisner, Alex Aben-Athar Kipman, Jennifer A. Karr
  • Patent number: 9395811
    Abstract: A see-through head-mounted display (HMD) device, e.g., in the form of glasses, provides a view of an augmented reality image including text, such as in an electronic book or magazine, word processing document, email, karaoke, teleprompter or other public speaking assistance application. The presentation of text and/or graphics can be adjusted based on sensor inputs indicating a gaze direction, focal distance and/or biological metric of the user. A current state of the text can be bookmarked when the user looks away from the image and subsequently resumed from the bookmarked state. A forward facing camera can adjust the text if a real world object passes in front of it, or adjust the appearance of the text based on a color or pattern of a real world background object. In a public speaking or karaoke application, information can be displayed regarding a level of interest of the audience and names of audience members.
    Type: Grant
    Filed: May 23, 2014
    Date of Patent: July 19, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Benjamin I. Vaught, Alex Aben-Athar Kipman, Robert Crocco
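
A minimal sketch of the bookmark-on-look-away behavior described in the abstract above: when gaze leaves the text region the current position is remembered, and reading resumes from that position when gaze returns. The class and its fields are assumptions for illustration.

```python
# Sketch of bookmarking the reading position when gaze leaves the text region.
class TextReader:
    def __init__(self, lines):
        self.lines = lines
        self.current_line = 0
        self.bookmark = 0

    def on_gaze(self, gaze_on_text: bool):
        if gaze_on_text:
            self.current_line = self.bookmark        # resume from the bookmarked state
        else:
            self.bookmark = self.current_line        # user looked away: remember position

    def advance(self):
        self.current_line = min(self.current_line + 1, len(self.lines) - 1)

reader = TextReader(["line 1", "line 2", "line 3"])
reader.advance()
reader.on_gaze(False)                     # user looks away while on line 2
reader.on_gaze(True)                      # gaze returns to the text
print(reader.lines[reader.current_line])  # line 2
```
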
  • Patent number: 9355583
    Abstract: The technology described herein includes a see-through, near-eye, mixed reality display device for providing customized experiences for a user. The personal A/V apparatus serves as an exercise program that is always with the user, provides motivation for the user, visually tells the user how to exercise, and lets the user exercise with other people who are not present.
    Type: Grant
    Filed: August 28, 2014
    Date of Patent: May 31, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman
  • Patent number: 9345957
    Abstract: Technology is described for providing a personalized sport performance experience with three dimensional (3D) virtual data displayed by a near-eye, augmented reality display of a personal audiovisual (A/V) apparatus. A physical movement recommendation is determined for the user performing a sport based on skills data for the user for the sport, physical characteristics of the user, and 3D space positions for at least one or more sport objects. 3D virtual data depicting one or more visual guides for assisting the user in performing the physical movement recommendation may be displayed from a user perspective associated with a display field of view of the near-eye AR display. An avatar may also be displayed by the near-eye AR display performing a sport. The avatar may perform the sport interactively with the user or be displayed performing a prior performance of an individual represented by the avatar.
    Type: Grant
    Filed: September 28, 2012
    Date of Patent: May 24, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, John Clavin
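
The abstract above combines skills data, physical characteristics, and sport object positions into a movement recommendation. The rule below is a deliberately coarse, made-up example of that kind of combination; the thresholds and the returned strings are not from the patent.

```python
# Illustrative recommendation rule combining skill data, user traits, and ball position.
def recommend_movement(skill_level: float, height_m: float, ball_position: tuple) -> str:
    """Return a coarse movement recommendation; thresholds are invented for illustration."""
    _, _, ball_height = ball_position
    if ball_height > height_m * 1.2:
        return "step back and play the bounce"
    return "attack the ball early" if skill_level > 0.7 else "take a controlled swing"

print(recommend_movement(skill_level=0.8, height_m=1.75, ball_position=(3.0, 1.0, 2.4)))
```
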
  • Publication number: 20160117861
    Abstract: The technology causes disappearance of a real object in a field of view of a see-through, mixed reality display device system based on user disappearance criteria. Image data is tracked to the real object in the field of view of the see-through display for implementing an alteration technique on the real object causing its disappearance from the display. A real object may satisfy user disappearance criteria by being associated with subject matter that the user does not wish to see or by not satisfying relevance criteria for a current subject matter of interest to the user. In some embodiments, based on a 3D model of a location of the display device system, an alteration technique may be selected for a real object based on a visibility level associated with the position within the location. Image data for alteration may be prefetched based on a location of the display device system.
    Type: Application
    Filed: December 29, 2015
    Publication date: April 28, 2016
    Inventors: James C. Liu, Stephen G. Latta, Benjamin I. Vaught, Christopher M. Novak, Darren Bennett
  • Patent number: 9323325
    Abstract: Technology is disclosed for enhancing the experience of a user wearing a see-through, near-eye mixed reality display device. Based on an arrangement of gaze detection elements on each display optical system for each eye of the display device, a respective gaze vector is determined and a current user focal region is determined based on the gaze vectors. Virtual objects are displayed at their respective focal regions in a user field of view for a natural sight view. Additionally, one or more objects of interest to a user may be identified. The identification may be based on a user intent to interact with the object. For example, the intent may be determined based on a gaze duration. Augmented content may be projected over or next to an object, real or virtual. Additionally, a real or virtual object intended for interaction may be zoomed in or out.
    Type: Grant
    Filed: August 30, 2011
    Date of Patent: April 26, 2016
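
A small sketch of inferring intent to interact from gaze duration, as the abstract above suggests: accumulate dwell time per gazed-at object and pick the one exceeding a threshold. The sample format and the 1.5-second threshold are assumptions.

```python
# Sketch of inferring intent to interact from how long gaze dwells on one object.
DWELL_THRESHOLD_S = 1.5   # illustrative threshold

def intended_object(gaze_samples):
    """gaze_samples: list of (timestamp_s, object_id) from the gaze detection elements."""
    dwell = {}
    for (t0, obj0), (t1, obj1) in zip(gaze_samples, gaze_samples[1:]):
        if obj0 == obj1:
            dwell[obj0] = dwell.get(obj0, 0.0) + (t1 - t0)
    best = max(dwell, key=dwell.get, default=None)
    return best if best is not None and dwell[best] >= DWELL_THRESHOLD_S else None

samples = [(0.0, "lamp"), (0.5, "lamp"), (1.0, "lamp"), (2.0, "lamp"), (2.1, "window")]
print(intended_object(samples))   # lamp
```
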
  • Patent number: 9285871
    Abstract: A system for generating an augmented reality environment in association with one or more attractions or exhibits is described. In some cases, a see-through head-mounted display device (HMD) may acquire one or more virtual objects from a supplemental information provider associated with a particular attraction. The one or more virtual objects may be based on whether an end user of the HMD is waiting in line for the particular attraction or is on (or in) the particular attraction. The supplemental information provider may vary the one or more virtual objects based on the end user's previous experiences with the particular attraction. The HMD may adapt the one or more virtual objects based on physiological feedback from the end user (e.g., if a child is scared). The supplemental information provider may also provide and automatically update a task list associated with the particular attraction.
    Type: Grant
    Filed: March 27, 2012
    Date of Patent: March 15, 2016
  • Patent number: 9286711
    Abstract: Technology is described for representing a physical location at a previous time period with three dimensional (3D) virtual data displayed by a near-eye, augmented reality display of a personal audiovisual (A/V) apparatus. The personal A/V apparatus is identified as being within the physical location, and one or more objects in a display field of view of the near-eye, augmented reality display are automatically identified based on a three dimensional mapping of objects in the physical location. User input, which may be natural user interface (NUI) input, indicates a previous time period, and one or more 3D virtual objects associated with the previous time period are displayed from a user perspective associated with the display field of view. An object may be erased from the display field of view, and a camera effect may be applied when changing between display fields of view.
    Type: Grant
    Filed: June 27, 2012
    Date of Patent: March 15, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, John Clavin, Jonathan T. Steed, Jason Scott
  • Patent number: 9288468
    Abstract: Techniques are provided for viewing windows for video streams. A video stream from a video capture device is accessed. Data that describes movement or position of a person is accessed. A viewing window is placed in the video stream based on the data that describes movement or position of the person. The viewing window is provided to a display device in accordance with the placement of the viewing window in the video stream. Motion sensors can detect motion of the person carrying the video capture device in order to dampen the motion such that the video on the remote display does not suffer from motion artifacts. Sensors can also track the eye gaze of either the person carrying the mobile video capture device or the remote display device to enable control of the spatial region of the video stream shown at the display device.
    Type: Grant
    Filed: June 29, 2011
    Date of Patent: March 15, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Arthur C. Tomlin, Relja Markovic, Darren Bennett, Stephen G. Latta
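
A compact sketch of the viewing-window idea in the abstract above: smooth the measured center (to dampen carrier motion) and clamp the window so it stays inside the captured frame. The smoothing factor and window size are illustrative choices, not values from the patent.

```python
# Sketch of placing and smoothing a viewing window inside a wider video frame.
def smooth(previous: float, measured: float, alpha: float = 0.2) -> float:
    """Exponential smoothing to dampen head/body motion before moving the window."""
    return previous + alpha * (measured - previous)

def place_window(frame_w, frame_h, center_x, center_y, win_w, win_h):
    """Clamp the viewing window so it stays within the captured frame."""
    x = min(max(center_x - win_w / 2, 0), frame_w - win_w)
    y = min(max(center_y - win_h / 2, 0), frame_h - win_h)
    return int(x), int(y), win_w, win_h

cx, cy = 960.0, 540.0
cx = smooth(cx, measured=1100.0)        # jittery motion/gaze sample
cy = smooth(cy, measured=560.0)
print(place_window(1920, 1080, cx, cy, 640, 360))
```
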
  • Patent number: 9268406
    Abstract: Technology is described for providing a virtual spectator experience for a user of a personal A/V apparatus including a near-eye, augmented reality (AR) display. A position volume of an event object participating in an event in a first 3D coordinate system for a first location is received and mapped to a second position volume in a second 3D coordinate system at a second location remote from where the event is occurring. A display field of view of the near-eye AR display at the second location is determined, and real-time 3D virtual data representing the one or more event objects which are positioned within the display field of view are displayed in the near-eye AR display. A user may select a viewing position from which to view the event. Additionally, virtual data of a second user may be displayed at a position relative to a first user.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: February 23, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kevin A. Geisner, Kathryn Stone Perez, Stephen G. Latta, Ben J. Sugden, Benjamin I. Vaught, Alex Aben-Athar Kipman, Michael J. Scavezze, Daniel J. McCulloch, Darren Bennett, Jason Scott, Ryan L. Hastings, Brian E. Keane, Christopher E. Miles, Robert L. Crocco, Jr., Mathew J. Lamb
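
The abstract above maps an event object's position from the venue's coordinate system into a remote viewer's local system and renders it only if it falls in the display field of view. The sketch below uses a simple uniform scale plus translation as a stand-in for that mapping; the scale, offset, and field-of-view bounds are assumptions.

```python
# Sketch of remapping an event object's position from the venue's coordinate frame
# into a remote viewer's local frame (uniform scale + translation for simplicity).
def map_position(p_event, scale, offset):
    """p_event: (x, y, z) in the venue frame; returns (x, y, z) in the local frame."""
    return tuple(scale * c + o for c, o in zip(p_event, offset))

def in_display_field_of_view(p_local, fov_min, fov_max):
    return all(lo <= c <= hi for c, lo, hi in zip(p_local, fov_min, fov_max))

ball_venue = (40.0, 20.0, 1.0)                       # meters on the real field
ball_local = map_position(ball_venue, scale=0.02, offset=(0.5, 0.0, 0.8))
if in_display_field_of_view(ball_local, (-2, -2, 0), (2, 2, 2)):
    print("render virtual ball at", ball_local)
```
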
  • Patent number: 9255813
    Abstract: The technology causes disappearance of a real object in a field of view of a see-through, mixed reality display device system based on user disappearance criteria. Image data is tracked to the real object in the field of view of the see-through display for implementing an alteration technique on the real object causing its disappearance from the display. A real object may satisfy user disappearance criteria by being associated with subject matter that the user does not wish to see or by not satisfying relevance criteria for a current subject matter of interest to the user. In some embodiments, based on a 3D model of a location of the display device system, an alteration technique may be selected for a real object based on a visibility level associated with the position within the location. Image data for alteration may be prefetched based on a location of the display device system.
    Type: Grant
    Filed: October 14, 2011
    Date of Patent: February 9, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James C. Liu, Stephen G. Latta, Benjamin I. Vaught, Christopher M. Novak, Darren Bennett
  • Patent number: 9229231
    Abstract: The technology provides for updating printed content with personalized virtual data using a see-through, near-eye, mixed reality display device system. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. Virtual data is selected from available virtual data for the printed content selection based on user profile data, and the display device system displays the selected virtual data in a position registered to the position of the printed content selection. In some examples, a task related to the printed content item is determined based on physical action user input, and personalized virtual data is displayed registered to the printed content item in accordance with the task.
    Type: Grant
    Filed: January 9, 2012
    Date of Patent: January 5, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
  • Patent number: 9213163
    Abstract: The technology provides for automatic alignment of a see-through near-eye, mixed reality device with an inter-pupillary distance (IPD). A determination is made as to whether a see-through, near-eye, mixed reality display device is aligned with an IPD of a user. If the display device is not aligned with the IPD, the display device is automatically adjusted. In some examples, the alignment determination is based on determinations of whether an optical axis of each display optical system positioned to be seen through by a respective eye is aligned with a pupil of the respective eye in accordance with an alignment criteria. The pupil alignment may be determined based on an arrangement of gaze detection elements for each display optical system including at least one sensor for capturing data of the respective eye and the captured data. The captured data may be image data, image and glint data, and glint data only.
    Type: Grant
    Filed: August 30, 2011
    Date of Patent: December 15, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: John R. Lewis, Yichen Wei, Robert L. Crocco, Benjamin I. Vaught, Kathryn Stone Perez, Alex Aben-Athar Kipman
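
A minimal sketch of the per-eye alignment check described in the abstract above: compare each display optical axis with the detected pupil position and adjust when the offset exceeds a tolerance. The 1 mm tolerance and the sample coordinates are assumptions for illustration.

```python
# Sketch of a per-eye IPD alignment check: compare each display optical axis with the
# detected pupil position and adjust the display optical system if outside tolerance.
ALIGNMENT_TOLERANCE_MM = 1.0   # illustrative alignment criterion

def alignment_offsets(optical_axes_mm, pupil_positions_mm):
    """Return horizontal offsets (left, right) between optical axis and pupil, in mm."""
    return tuple(p - a for a, p in zip(optical_axes_mm, pupil_positions_mm))

def adjust_if_needed(offsets):
    for eye, offset in zip(("left", "right"), offsets):
        if abs(offset) > ALIGNMENT_TOLERANCE_MM:
            print(f"move {eye} display optical system by {offset:+.1f} mm")

# Pupil centers estimated from image and/or glint data captured by each eye's sensor.
adjust_if_needed(alignment_offsets((-32.0, 32.0), (-33.5, 32.4)))
```
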
  • Patent number: 9183807
    Abstract: The technology provides embodiments for displaying virtual data as printed content by a see-through, near-eye, mixed reality display device system. One or more literary content items registered to a reading object in a field of view of the display device system are displayed with print layout characteristics. Print layout characteristics from a publisher of each literary content item are selected if available. The reading object has a type like a magazine, book, journal or newspaper and may be a real object or a virtual object displayed by the display device system. The reading object type of the virtual object is based on a reading object type associated with a literary content item to be displayed. Virtual augmentation data registered to a literary content item is displayed responsive to detecting user physical action in image data. An example of a physical action is a page flipping gesture.
    Type: Grant
    Filed: January 10, 2012
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez
  • Patent number: 9182815
    Abstract: The technology provides embodiments for making static printed content being viewed through a see-through, mixed reality display device system more dynamic with display of virtual data. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. A task in relation to the printed content selection can also be determined based on physical action user input. Virtual data for the printed content selection is displayed in accordance with the task. Additionally, virtual data can be linked to a work embodied in a printed content item. Furthermore, a virtual version of the printed material may be displayed at a more comfortable reading position and with improved visibility of the content.
    Type: Grant
    Filed: December 7, 2011
    Date of Patent: November 10, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sheridan Martin Small, Alex Aben-Athar Kipman, Benjamin I. Vaught, Kathryn Stone Perez