Patents by Inventor Jonathan Steed

Jonathan Steed has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150324562
    Abstract: Embodiments are disclosed that relate to authenticating a user of a display device. For example, one disclosed embodiment includes displaying one or more virtual images on the display device, wherein the one or more virtual images include a set of augmented reality features. The method further includes identifying one or more movements of the user via data received from a sensor of the display device, and comparing the identified movements of the user to a predefined set of authentication information for the user that links user authentication to a predefined order of the augmented reality features. If the identified movements indicate that the user selected the augmented reality features in the predefined order, then the user is authenticated, and if the identified movements indicate that the user did not select the augmented reality features in the predefined order, then the user is not authenticated.
    Type: Application
    Filed: July 22, 2015
    Publication date: November 12, 2015
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta, Kevin Geisner, Brian Mount
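    The authentication described here reduces to an ordered-selection check: the device infers which augmented reality features the user selected from tracked movements and compares that sequence against a stored, per-user order. The sketch below is illustrative only; the feature identifiers, the profile structure, and the movement-to-selection step are assumptions, not details from the filing.

```python
from dataclasses import dataclass, field

@dataclass
class AuthenticationProfile:
    """Per-user authentication info: the required selection order of AR features (hypothetical)."""
    user_id: str
    required_order: list[str] = field(default_factory=list)

def authenticate(profile: AuthenticationProfile, observed_selections: list[str]) -> bool:
    """Return True only if the user selected the AR features in the predefined order.

    `observed_selections` is the sequence of feature IDs inferred from the user's
    tracked movements (e.g. gaze dwell or hand gestures on each displayed feature).
    """
    return observed_selections == profile.required_order

# Hypothetical usage: three virtual features shown on the display device.
profile = AuthenticationProfile("user-42", required_order=["red_cube", "blue_sphere", "green_cone"])
print(authenticate(profile, ["red_cube", "blue_sphere", "green_cone"]))  # True  -> authenticated
print(authenticate(profile, ["blue_sphere", "red_cube", "green_cone"]))  # False -> not authenticated
```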
  • Patent number: 9092896
    Abstract: Embodiments are disclosed that relate to augmenting an appearance of a surface via a see-through display device. For example, one disclosed embodiment provides, on a computing device comprising a see-through display device, a method of augmenting an appearance of a surface. The method includes acquiring, via an outward-facing image sensor, image data of a first scene viewable through the display. The method further includes recognizing a surface viewable through the display based on the image data and, in response to recognizing the surface, acquiring a representation of a second scene comprising one or more of a scene located physically behind the surface viewable through the display and a scene located behind a surface contextually related to the surface viewable through the display. The method further includes displaying the representation via the see-through display.
    Type: Grant
    Filed: August 7, 2012
    Date of Patent: July 28, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta
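    At its core, the method pairs a recognized surface with a representation of whatever lies behind it and draws that representation over the surface on the see-through display. A rough sketch follows; the surface identifiers, the lookup table of behind-the-surface scenes, and the stubbed recognition step are all hypothetical.

```python
# Hypothetical store mapping recognized surface identifiers to imagery of the scene behind them.
SCENES_BEHIND = {
    "office_door_301": "camera_feed://hallway_3f",
    "refrigerator_door": "inventory_render://fridge_contents",
}

def recognize_surface(image_data: bytes) -> str | None:
    """Stand-in for the vision step that recognizes a surface in outward-facing image data."""
    return "office_door_301"  # a real system would classify the image contents

def augment_surface(image_data: bytes) -> str | None:
    """Return the representation to display over the recognized surface, if one exists."""
    surface_id = recognize_surface(image_data)
    if surface_id is None:
        return None
    # Scene located physically behind the surface (or behind a contextually related surface).
    return SCENES_BEHIND.get(surface_id)

print(augment_surface(b"raw-camera-frame"))  # -> camera_feed://hallway_3f
```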
  • Patent number: 9092600
    Abstract: Embodiments are disclosed that relate to authenticating a user of a display device. For example, one disclosed embodiment includes displaying one or more virtual images on the display device, wherein the one or more virtual images include a set of augmented reality features. The method further includes identifying one or more movements of the user via data received from a sensor of the display device, and comparing the identified movements of the user to a predefined set of authentication information for the user that links user authentication to a predefined order of the augmented reality features. If the identified movements indicate that the user selected the augmented reality features in the predefined order, then the user is authenticated, and if the identified movements indicate that the user did not select the augmented reality features in the predefined order, then the user is not authenticated.
    Type: Grant
    Filed: November 5, 2012
    Date of Patent: July 28, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta, Kevin Geisner, Brian Mount
  • Patent number: 9063566
    Abstract: Various embodiments are provided for a shared collaboration system and related methods for enabling an active user to interact with one or more additional users and with collaboration items. In one embodiment a head-mounted display device is operatively connected to a computing device that includes a collaboration engine program. The program receives observation information of a physical space from the head-mounted display device along with a collaboration item. The program visually augments an appearance of the physical space as seen through the head-mounted display device to include an active user collaboration item representation of the collaboration item. The program populates the active user collaboration item representation with additional user collaboration item input from an additional user.
    Type: Grant
    Filed: November 30, 2011
    Date of Patent: June 23, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Daniel McCulloch, Stephen Latta, Darren Bennett, Ryan Hastings, Jason Scott, Relja Markovic, Kevin Geisner, Jonathan Steed
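    The collaboration engine described above can be thought of as a shared item that the active user sees anchored in their physical space and that additional users populate remotely. The following sketch is a simplification under that reading; the class name, user identifiers, and text-based rendering are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class CollaborationItem:
    """A shared item (e.g. a virtual planning board) anchored in the active user's physical space."""
    name: str
    contributions: dict[str, list[str]] = field(default_factory=dict)

    def add_input(self, user: str, content: str) -> None:
        """Populate the active user's representation with an additional user's input."""
        self.contributions.setdefault(user, []).append(content)

    def render(self) -> str:
        """Stand-in for visually augmenting the physical space seen through the HMD."""
        lines = [f"[{self.name}]"]
        for user, items in self.contributions.items():
            lines += [f"  {user}: {item}" for item in items]
        return "\n".join(lines)

board = CollaborationItem("sprint-planning-board")
board.add_input("alice", "Add login screen mockup")
board.add_input("bob", "Estimate: 3 days")
print(board.render())
```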
  • Patent number: 8894484
    Abstract: A system and related methods for inviting a potential player to participate in a multiplayer game via a user head-mounted display device are provided. In one example, a potential player invitation program receives user voice data and determines that the user voice data is an invitation to participate in a multiplayer game. The program receives eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data. The program associates the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data. The program matches a potential player account with the potential player. The program receives an acceptance response from the potential player, and joins the potential player account with a user account in participating in the multiplayer game.
    Type: Grant
    Filed: January 30, 2012
    Date of Patent: November 25, 2014
    Assignee: Microsoft Corporation
    Inventors: Stephen Latta, Kevin Geisner, Brian Mount, Jonathan Steed, Tony Ambrus, Arnulfo Zepeda, Aaron Krauss
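    The invitation flow amounts to: detect that the user's speech is an invitation, associate it with a nearby potential player using whichever signals are available (gaze, face recognition, the other player's device, voice), resolve that player to an account, and join the accounts in the game once the invitation is accepted. A minimal sketch of that flow, with hypothetical signals, account directory, and speech check:

```python
from dataclasses import dataclass

@dataclass
class Signals:
    """Signals used to associate an invitation with a nearby potential player (all hypothetical)."""
    gaze_target: str | None = None        # eye-tracking: who the user was looking at
    recognized_face: str | None = None    # facial recognition result
    nearby_device: str | None = None      # potential player's head-mounted display identifier

ACCOUNTS = {"face:carol": "carol#1234", "device:hmd-77": "dave#5678"}  # hypothetical directory

def is_invitation(voice_text: str) -> bool:
    """Very rough stand-in for recognizing an invitation in the user's voice data."""
    return "join my game" in voice_text.lower()

def invite(voice_text: str, signals: Signals, game: str) -> str | None:
    if not is_invitation(voice_text):
        return None
    # Associate the invitation with a potential player via whichever signal is available.
    key = (f"face:{signals.recognized_face}" if signals.recognized_face
           else f"device:{signals.nearby_device}")
    account = ACCOUNTS.get(key)
    if account is None:
        return None
    # A real system would wait for an acceptance response before joining the session.
    return f"{account} joined '{game}' with the inviting user"

print(invite("Hey Carol, join my game!", Signals(recognized_face="carol"), "space-race"))
```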
  • Publication number: 20140320389
    Abstract: Embodiments that relate to interacting with a physical object in a mixed reality environment via a head-mounted display are disclosed. In one embodiment a mixed reality interaction program identifies an object based on an image captured by the display. An interaction context for the object is determined based on an aspect of the mixed reality environment. A profile for the physical object is queried to determine interaction modes for the object. An interaction mode is programmatically selected based on the interaction context. A user input directed at the object is received via the display and interpreted to correspond to a virtual action based on the selected interaction mode. The virtual action is executed with respect to a virtual object associated with the physical object to modify an appearance of the virtual object. The modified virtual object is then displayed via the display.
    Type: Application
    Filed: April 29, 2013
    Publication date: October 30, 2014
    Inventors: Michael Scavezze, Jonathan Steed, Stephen Latta, Kevin Geisner, Daniel McCulloch, Brian Mount, Ryan Hastings, Phillip Charles Heckinger
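    The selection logic here is essentially a two-level lookup: the object's profile maps an interaction context to an interaction mode, and the chosen mode maps a user input to a virtual action. The sketch below assumes hypothetical object profiles, contexts, and actions; none of the names come from the filing.

```python
# Hypothetical profile: interaction modes each physical object supports,
# keyed by the mixed reality context in which they apply.
OBJECT_PROFILES = {
    "coffee_table": {"game": "battlefield_surface", "work": "document_canvas"},
}

ACTIONS = {
    "battlefield_surface": {"tap": "deploy_unit"},
    "document_canvas": {"tap": "open_document"},
}

def select_interaction_mode(object_id: str, context: str) -> str | None:
    """Query the object's profile and programmatically pick a mode for this context."""
    return OBJECT_PROFILES.get(object_id, {}).get(context)

def interpret_input(object_id: str, context: str, user_input: str) -> str | None:
    """Interpret a user input directed at the object as a virtual action under the selected mode."""
    mode = select_interaction_mode(object_id, context)
    if mode is None:
        return None
    return ACTIONS.get(mode, {}).get(user_input)

# Tapping the same table means different things in different contexts.
print(interpret_input("coffee_table", "game", "tap"))  # -> deploy_unit
print(interpret_input("coffee_table", "work", "tap"))  # -> open_document
```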
  • Publication number: 20140125574
    Abstract: Embodiments are disclosed that relate to authenticating a user of a display device. For example, one disclosed embodiment includes displaying one or more virtual images on the display device, wherein the one or more virtual images include a set of augmented reality features. The method further includes identifying one or more movements of the user via data received from a sensor of the display device, and comparing the identified movements of the user to a predefined set of authentication information for the user that links user authentication to a predefined order of the augmented reality features. If the identified movements indicate that the user selected the augmented reality features in the predefined order, then the user is authenticated, and if the identified movements indicate that the user did not select the augmented reality features in the predefined order, then the user is not authenticated.
    Type: Application
    Filed: November 5, 2012
    Publication date: May 8, 2014
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta, Kevin Geisner, Brian Mount
  • Publication number: 20140125668
    Abstract: Embodiments related to efficiently constructing an augmented reality environment with global illumination effects are disclosed. For example, one disclosed embodiment provides a method of displaying an augmented reality image via a display device. The method includes receiving image data, the image data capturing an image of a local environment of the display device, and identifying a physical feature of the local environment via the image data. The method further includes constructing an augmented reality image of a virtual structure for display over the physical feature in spatial registration with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure segments arranged in adjacent locations to form the virtual structure, each modular virtual structure segment comprising a pre-computed global illumination effect, and outputting the augmented reality image to the display device.
    Type: Application
    Filed: November 5, 2012
    Publication date: May 8, 2014
    Inventors: Jonathan Steed, Aaron Krauss, Mike Scavezze, Wei Zhang, Arthur Tomlin, Tony Ambrus, Brian Mount, Stephen Latta, Ryan Hastings
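    The efficiency claim rests on reuse: each modular segment carries a global illumination result that was computed ahead of time, so assembling a virtual structure at runtime is just placing segments in adjacent, spatially registered slots. The following sketch illustrates that idea with a scalar standing in for a full pre-computed lightmap; the segment library and layout are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SegmentTemplate:
    """A modular virtual structure segment with a pre-computed (baked) illumination value."""
    name: str
    baked_illumination: float  # stand-in for a full pre-computed global illumination result

# Hypothetical library of segments whose global illumination was solved offline.
LIBRARY = {
    "wall": SegmentTemplate("wall", baked_illumination=0.62),
    "window": SegmentTemplate("window", baked_illumination=0.91),
}

def build_structure(layout: list[str], anchor: tuple[float, float, float]) -> list[dict]:
    """Place segments in adjacent locations along a physical feature (e.g. a real wall)."""
    x, y, z = anchor
    placed = []
    for i, kind in enumerate(layout):
        seg = LIBRARY[kind]
        placed.append({
            "segment": seg.name,
            "position": (x + i, y, z),               # adjacent slots in spatial registration
            "illumination": seg.baked_illumination,  # reused at runtime, not recomputed
        })
    return placed

for piece in build_structure(["wall", "window", "wall"], anchor=(0.0, 0.0, 2.0)):
    print(piece)
```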
  • Publication number: 20140043433
    Abstract: Embodiments are disclosed that relate to augmenting an appearance of a surface via a see-through display device. For example, one disclosed embodiment provides, on a computing device comprising a see-through display device, a method of augmenting an appearance of a surface. The method includes acquiring, via an outward-facing image sensor, image data of a first scene viewable through the display. The method further includes recognizing a surface viewable through the display based on the image data and, in response to recognizing the surface, acquiring a representation of a second scene comprising one or more of a scene located physically behind the surface viewable through the display and a scene located behind a surface contextually related to the surface viewable through the display. The method further includes displaying the representation via the see-through display.
    Type: Application
    Filed: August 7, 2012
    Publication date: February 13, 2014
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta
  • Publication number: 20140044305
    Abstract: Embodiments are disclosed herein that relate to the automatic tracking of objects. For example, one disclosed embodiment provides a method of operating a mobile computing device having an image sensor. The method includes acquiring image data, identifying an inanimate moveable object in the image data, determining whether the inanimate moveable object is a tracked object, and if the inanimate moveable object is a tracked object, then storing information regarding a state of the inanimate moveable object, detecting a trigger to provide a notification of the state of the inanimate moveable object, and providing an output of the notification of the state of the inanimate moveable object.
    Type: Application
    Filed: August 7, 2012
    Publication date: February 13, 2014
    Inventors: Mike Scavezze, Jason Scott, Jonathan Steed, Ian McIntyre, Aaron Krauss, Daniel McCulloch, Stephen Latta
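    The tracking loop described above is straightforward to outline: record the latest observed state of any inanimate moveable object that is on the tracked list, and surface that state when a trigger (here, a spoken query) is detected. The object names, trigger handling, and storage format in this sketch are assumptions for illustration.

```python
import time

TRACKED_OBJECTS = {"car_keys", "wallet"}   # hypothetical set of objects the user tracks
object_states: dict[str, dict] = {}        # last known state per tracked object

def observe(object_id: str, location: str) -> None:
    """Record the state of an inanimate moveable object if it is a tracked object."""
    if object_id in TRACKED_OBJECTS:
        object_states[object_id] = {"location": location, "seen_at": time.time()}

def notify(trigger_query: str) -> str:
    """Produce a notification when a trigger is detected (here, a 'where is X' style query)."""
    for object_id, state in object_states.items():
        if object_id.replace("_", " ") in trigger_query.lower():
            return f"{object_id.replace('_', ' ')}: last seen on the {state['location']}"
    return "No tracked object matches that request."

observe("car_keys", "kitchen counter")   # object identified in image data from the device camera
print(notify("Where are my car keys?"))  # trigger -> notification of the stored state
```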
  • Publication number: 20130196757
    Abstract: A system and related methods for inviting a potential player to participate in a multiplayer game via a user head-mounted display device are provided. In one example, a potential player invitation program receives user voice data and determines that the user voice data is an invitation to participate in a multiplayer game. The program receives eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data. The program associates the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data. The program matches a potential player account with the potential player. The program receives an acceptance response from the potential player, and joins the potential player account with a user account in participating in the multiplayer game.
    Type: Application
    Filed: January 30, 2012
    Publication date: August 1, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Stephen Latta, Kevin Geisner, Brian Mount, Jonathan Steed, Tony Ambrus, Arnulfo Zepeda, Aaron Krauss
  • Publication number: 20130194259
    Abstract: A system and related methods for visually augmenting an appearance of a physical environment as seen by a user through a head-mounted display device are provided. In one embodiment, a virtual environment generating program receives eye-tracking information, lighting information, and depth information from the head-mounted display. The program generates a virtual environment that models the physical environment and is based on the lighting information and the distance of a real-world object from the head-mounted display. The program visually augments a virtual object representation in the virtual environment based on the eye-tracking information, and renders the virtual object representation on a transparent display of the head-mounted display device.
    Type: Application
    Filed: January 27, 2012
    Publication date: August 1, 2013
    Inventors: Darren Bennett, Brian Mount, Stephen Latta, Alex Kipman, Ryan Hastings, Arthur Tomlin, Sebastian Sylvan, Daniel McCulloch, Jonathan Steed, Jason Scott, Mathew Lamb
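    One way to read the augmentation step is as a function from sensed inputs (ambient lighting, the real object's distance, the current gaze target) to rendering parameters for the virtual object. The heuristic below is purely illustrative; the filing does not specify these formulas.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    distance_m: float      # distance of the corresponding real-world object from the HMD
    brightness: float = 1.0
    highlighted: bool = False

def augment(obj: VirtualObject, ambient_lux: float, gaze_target: str) -> VirtualObject:
    """Adjust a virtual object's rendering using lighting, depth, and eye-tracking information."""
    # Dim the rendering in bright environments and for distant objects (rough heuristic only).
    obj.brightness = max(0.2, 1.0 - ambient_lux / 1000.0) / max(1.0, obj.distance_m)
    # Visually emphasize the object the user is currently looking at.
    obj.highlighted = (gaze_target == obj.name)
    return obj

lamp = augment(VirtualObject("virtual_lamp", distance_m=2.0), ambient_lux=300.0, gaze_target="virtual_lamp")
print(lamp)  # brightness reduced for distance and ambient light; highlighted because it is gazed at
```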
  • Publication number: 20130141419
    Abstract: A head-mounted display device is configured to visually augment an observed physical space to a user. The head-mounted display device includes a see-through display and is configured to receive augmented display information, such as a virtual object with occlusion relative to a real world object from a perspective of the see-through display.
    Type: Application
    Filed: December 1, 2011
    Publication date: June 6, 2013
    Inventors: Brian Mount, Stephen Latta, Daniel McCulloch, Kevin Geisner, Jason Scott, Jonathan Steed, Arthur Tomlin, Mark Mihelich
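    Occlusion relative to a real-world object typically comes down to a depth comparison: a virtual object's pixels are drawn only where the virtual object is closer to the viewer than the real surface measured at that pixel. The tiny depth buffer and values below are hypothetical, a minimal sketch of that comparison rather than the device's actual pipeline.

```python
# Minimal per-pixel occlusion test: a virtual object's pixel is drawn only where it is
# closer to the viewer than the real-world surface measured by the depth sensor.
real_depth = [
    [1.2, 1.2, 0.8],   # metres to the nearest real surface per pixel (hypothetical values)
    [1.2, 0.8, 0.8],
]

def composite(virtual_depth: float, color: str) -> list[list[str]]:
    """Return the see-through display contents: color where visible, '.' where occluded."""
    return [
        [color if virtual_depth < d else "." for d in row]
        for row in real_depth
    ]

# A virtual object 1.0 m away is hidden wherever a real object sits at 0.8 m.
for row in composite(1.0, "V"):
    print("".join(row))
```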
  • Publication number: 20130135180
    Abstract: Various embodiments are provided for a shared collaboration system and related methods for enabling an active user to interact with one or more additional users and with collaboration items. In one embodiment a head-mounted display device is operatively connected to a computing device that includes a collaboration engine program. The program receives observation information of a physical space from the head-mounted display device along with a collaboration item. The program visually augments an appearance of the physical space as seen through the head-mounted display device to include an active user collaboration item representation of the collaboration item. The program populates the active user collaboration item representation with additional user collaboration item input from an additional user.
    Type: Application
    Filed: November 30, 2011
    Publication date: May 30, 2013
    Inventors: Daniel McCulloch, Stephen Latta, Darren Bennett, Ryan Hastings, Jason Scott, Relja Markovic, Kevin Geisner, Jonathan Steed
  • Publication number: 20060271842
    Abstract: A standard graphics specification for use both by developers of graphics files and by developers of applications that execute in a runtime environment is disclosed. The graphics files are developed to conform to the graphics specification and therefore will be executable by applications in any runtime environment that likewise conform to the graphics specification. The specification includes program syntax standards and standards for metadata in the form of semantics and annotations that further describe the code. The specification additionally includes standards to which applications may conform to ensure that the applications will be capable of executing any graphics files that conform to the graphics specification.
    Type: Application
    Filed: May 27, 2005
    Publication date: November 30, 2006
    Applicant: Microsoft Corporation
    Inventors: David Aronson, Paul Bleisch, Daniel Horowitz, Jonathan Steed
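    The idea of the specification, as described above, is that a graphics file declares its parameters together with semantics and annotations that any conforming application can interpret. The snippet below is a loose illustration of that contract as a validation check; the effect-file structure, semantic names, and annotation keys are invented, not taken from the specification.

```python
# Hypothetical illustration only: a runtime checks that a graphics (effect) file's
# parameters carry the semantics the shared specification requires.
SUPPORTED_SEMANTICS = {"WORLDVIEWPROJECTION", "DIFFUSE", "TIME"}

effect_file = {
    "name": "glow.fx",
    "parameters": [
        {"name": "wvp",   "semantic": "WORLDVIEWPROJECTION", "annotations": {}},
        {"name": "tint",  "semantic": "DIFFUSE", "annotations": {"UIName": "Tint color"}},
        {"name": "phase", "semantic": "WOBBLE", "annotations": {}},  # not in the spec
    ],
}

def conforms(effect: dict) -> list[str]:
    """Return a list of problems; an empty list means the file conforms to the specification."""
    return [
        f"parameter '{p['name']}' uses unsupported semantic '{p['semantic']}'"
        for p in effect["parameters"]
        if p["semantic"] not in SUPPORTED_SEMANTICS
    ]

print(conforms(effect_file))  # -> ["parameter 'phase' uses unsupported semantic 'WOBBLE'"]
```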