Patents by Inventor Tony Ambrus

Tony Ambrus is named as an inventor on the patent filings listed below. The listing includes pending patent applications as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150317834
    Abstract: Embodiments are disclosed for methods and systems of distinguishing movements of features in a physical environment. For example, on a head-mounted display device, one embodiment of a method includes obtaining a representation of real-world features in two or more coordinate frames and obtaining motion data from one or more sensors external to the head-mounted display device. The method further includes distinguishing features in one coordinate frame from features in another coordinate frame based upon the motion data.
    Type: Application
    Filed: May 1, 2014
    Publication date: November 5, 2015
    Inventors: Adam G. Poulos, Arthur Tomlin, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Drew Steedly, Frederik Schaffalitzky, Georg Klein, Kathleen P. Mulcahy
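    Sketch: the abstract above describes sorting tracked features into coordinate frames by comparing their observed motion against motion reported by sensors external to the display (for example, a vehicle's IMU). A minimal Python illustration, with all function and variable names assumed rather than taken from the patent:
    ```python
    import numpy as np

    def split_features_by_frame(feature_velocities, platform_velocity, threshold=0.05):
        """Classify tracked features as platform-locked or world-locked.

        feature_velocities: dict mapping feature id -> observed 3D velocity (m/s).
        platform_velocity: 3D velocity of the moving platform (e.g., a car)
                           reported by a sensor external to the display.
        threshold: residual speed (m/s) below which a feature is assumed to
                   move with the platform rather than with the wider world.
        """
        platform_locked, world_locked = [], []
        for fid, v in feature_velocities.items():
            residual = np.linalg.norm(np.asarray(v, float) - np.asarray(platform_velocity, float))
            (platform_locked if residual < threshold else world_locked).append(fid)
        return platform_locked, world_locked

    # Two features move with the platform; one belongs to the outside world.
    flow = {"dashboard": (1.0, 0.0, 0.0), "mirror": (1.02, 0.0, 0.0), "tree": (0.0, 0.0, 0.0)}
    print(split_features_by_frame(flow, (1.0, 0.0, 0.0)))
    ```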
  • Publication number: 20150268821
    Abstract: Various embodiments relating to selection of a user interface object displayed on a graphical user interface based on eye gaze are disclosed. In one embodiment, a selection input may be received. A plurality of eye gaze samples at different times within a time window may be evaluated. The time window may be selected based on a time at which the selection input is detected. A user interface object may be selected based on the plurality of eye gaze samples.
    Type: Application
    Filed: March 20, 2014
    Publication date: September 24, 2015
    Inventors: Scott Ramsby, Tony Ambrus, Michael Scavezze, Abby Lin Lee, Brian Mount, Ian Douglas McIntyre, Aaron Mackay Burns, Russ McMackin, Katelyn Elizabeth Doran, Gerhard Schneider, Quentin Simon Charles Miller
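    Sketch: the selection idea above (evaluate gaze samples inside a time window anchored to the moment the selection input arrives) can be illustrated in a few lines of Python; the window size and data layout are assumptions, not values from the patent:
    ```python
    from collections import Counter

    def select_object(gaze_samples, selection_time, window=0.25):
        """Pick the UI object most often gazed at shortly before the selection input.

        gaze_samples: list of (timestamp_seconds, object_id or None) pairs.
        selection_time: time at which the selection input (button press, voice
                        command, etc.) was detected; the window ends here because
                        gaze often drifts away just before the input registers.
        """
        start, end = selection_time - window, selection_time
        hits = Counter(obj for t, obj in gaze_samples
                       if start <= t <= end and obj is not None)
        return hits.most_common(1)[0][0] if hits else None

    samples = [(0.00, "button_a"), (0.05, "button_a"), (0.10, "button_b"),
               (0.15, "button_a"), (0.20, None)]
    print(select_object(samples, selection_time=0.22))  # -> "button_a"
    ```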
  • Patent number: 9129430
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: September 8, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
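    Sketch: one way to read the abstract above is as an edge-of-display cue for objects outside the field of view. A hedged Python illustration (the angles, the one-dimensional treatment, and all names are assumptions):
    ```python
    def offscreen_indicator(object_bearing_deg, half_fov_deg=15.0):
        """Return a screen-edge cue for an object outside the field of view.

        object_bearing_deg: horizontal angle to the object relative to the view
                            direction (positive = to the user's right).
        half_fov_deg: half of the display's horizontal field of view.
        Returns None when the object is visible, otherwise a (side, degrees)
        hint that could drive an arrow or marker drawn at the display edge.
        """
        if abs(object_bearing_deg) <= half_fov_deg:
            return None  # already in view, no indicator needed
        side = "right" if object_bearing_deg > 0 else "left"
        turn_needed = abs(object_bearing_deg) - half_fov_deg
        return side, round(turn_needed, 1)

    print(offscreen_indicator(70.0))   # ('right', 55.0)
    print(offscreen_indicator(-5.0))   # None
    ```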
  • Publication number: 20150237336
    Abstract: A method for displaying virtual imagery on a stereoscopic display system having a display matrix. The virtual imagery presents a surface of individually renderable loci viewable to an eye of the user. The method includes, for each locus of the viewable surface, illuminating a pixel of the display matrix. The illuminated pixel is chosen based on a pupil position of the eye as determined by the stereoscopic display system. For each locus of the viewable surface, a virtual image of the illuminated pixel is formed in a plane in front of the eye. The virtual image is positioned on a straight line passing through the locus, the plane, and the pupil position. In this manner, the virtual image tracks changes in the user's pupil position.
    Type: Application
    Filed: February 19, 2014
    Publication date: August 20, 2015
    Inventors: Roger Sebastian Sylvan, Arthur Tomlin, Daniel Joseph McCulloch, Brian Mount, Tony Ambrus
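    Sketch: the geometry in the abstract above reduces to intersecting the line from the pupil through each surface locus with the plane in which the pixel's virtual image is formed. A small Python/numpy illustration under assumed coordinates (eye-aligned frame, image plane at z = image_plane_z):
    ```python
    import numpy as np

    def image_point_for_locus(pupil, locus, image_plane_z):
        """Find where the pupil->locus line crosses the virtual image plane.

        For the locus to appear in the right direction, the chosen pixel's
        virtual image must sit on the straight line through the pupil and the
        locus, at the depth of the image plane. As the pupil moves, the
        intersection (and therefore the chosen pixel) moves with it.
        """
        pupil = np.asarray(pupil, float)
        direction = np.asarray(locus, float) - pupil
        t = (image_plane_z - pupil[2]) / direction[2]
        return pupil + t * direction

    # A locus 2 m away, a pupil slightly off-axis, and an image plane at 1 m.
    print(image_point_for_locus(pupil=(0.003, 0.0, 0.0),
                                locus=(0.10, 0.05, 2.0),
                                image_plane_z=1.0))
    ```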
  • Publication number: 20150116354
    Abstract: Various embodiments relating to creating a virtual shadow of an object in an image displayed with a see-through display are provided. In one embodiment, an image of a virtual object may be displayed with the see-through display. The virtual object may appear in front of a real-world background when viewed through the see-through display. A relative brightness of the real-world background around a virtual shadow of the virtual object may be increased when viewed through the see-through display. The virtual shadow may appear to result from a spotlight that is fixed relative to a vantage point of the see-through display.
    Type: Application
    Filed: October 29, 2013
    Publication date: April 30, 2015
    Inventors: Arthur Tomlin, Tony Ambrus, Ron Amador-Leon, Nicholas Gervase Fajt, Ryan Hastings, Matthew G. Kaplan, Michael Scavezze, Daniel McCulloch
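    Sketch: because an additive see-through display cannot darken the real world, the abstract above implies a shadow by brightening the background around the shadow footprint. An illustrative Python mask (array layout and gain value are assumptions):
    ```python
    import numpy as np

    def shadow_halo_gain(shadow_mask, boost=0.3):
        """Additive brightness layer that makes an unlit region read as a shadow.

        shadow_mask: boolean array, True inside the virtual shadow footprint.
        boost: fractional brightness added to the background outside the
               footprint; the footprint itself receives no added light, so it
               appears darker by contrast.
        """
        return np.where(shadow_mask, 0.0, boost)

    mask = np.zeros((4, 6), dtype=bool)
    mask[1:3, 2:4] = True  # a small rectangular shadow under a virtual object
    print(shadow_halo_gain(mask))
    ```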
  • Publication number: 20140375683
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
  • Publication number: 20140347390
    Abstract: Embodiments are disclosed that relate to placing virtual objects in an augmented reality environment. For example, one disclosed embodiment provides a method comprising receiving sensor data comprising one or more of motion data, location data, and orientation data from one or more sensors located on a head-mounted display device, and based upon the motion data, determining a body-locking direction vector that is based upon an estimated direction in which a body of a user is facing. The method further comprises positioning a displayed virtual object based on the body-locking direction vector.
    Type: Application
    Filed: May 22, 2013
    Publication date: November 27, 2014
    Inventors: Adam G. Poulos, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Stephen Latta, Peter Tobias Kinnebrew, Nicholas Kamuda, Robert Pengelly, Jeffrey C. Fong, Aaron Woo, Udiyan I. Padmanahan, Andrew Wyman MacDonald, Olivia M. Janik
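    Sketch: one common way to realize the body-locking direction vector described above is to treat the body's facing as a heavily smoothed version of the head's yaw, then place the virtual object along that direction. The smoothing factor, distance, and names below are assumptions:
    ```python
    import math

    def update_body_yaw(body_yaw_deg, head_yaw_deg, smoothing=0.05):
        """Nudge the estimated body direction toward the current head direction.

        Quick glances barely move the estimate, while a sustained turn slowly
        drags it along; 'smoothing' is the fraction of the remaining error
        closed per update.
        """
        error = (head_yaw_deg - body_yaw_deg + 180.0) % 360.0 - 180.0  # shortest arc
        return (body_yaw_deg + smoothing * error) % 360.0

    def body_locked_position(body_yaw_deg, distance=1.5):
        """Place a virtual object at a fixed distance along the body direction."""
        rad = math.radians(body_yaw_deg)
        return (distance * math.sin(rad), 0.0, distance * math.cos(rad))

    yaw = 0.0
    for head in [0, 10, 45, 90, 90, 90]:   # the user turns and keeps looking right
        yaw = update_body_yaw(yaw, head)
    print(round(yaw, 1), body_locked_position(yaw))
    ```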
  • Patent number: 8894484
    Abstract: A system and related methods for inviting a potential player to participate in a multiplayer game via a user head-mounted display device are provided. In one example, a potential player invitation program receives user voice data and determines that the user voice data is an invitation to participate in a multiplayer game. The program receives eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data. The program associates the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data. The program matches a potential player account with the potential player. The program receives an acceptance response from the potential player, and joins the potential player account with a user account in participating in the multiplayer game.
    Type: Grant
    Filed: January 30, 2012
    Date of Patent: November 25, 2014
    Assignee: Microsoft Corporation
    Inventors: Stephen Latta, Kevin Geisner, Brian Mount, Jonathan Steed, Tony Ambrus, Arnulfo Zepeda, Aaron Krauss
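    Sketch: the invitation flow above hinges on associating a spoken invitation with a particular potential player from whichever cues are available. A rough, purely illustrative Python stand-in (the cue names and their priority order are assumptions, not the patent's method):
    ```python
    def associate_invitation(cues):
        """Pick the potential player a spoken game invitation was aimed at.

        cues: dict of optional identification signals, e.g. the player the
        inviter was looking at (eye tracking), a face-recognition match, the
        owner of a nearby head-mounted display, or whoever replied by voice.
        """
        for key in ("gaze_target", "face_match", "nearby_device_owner", "voice_reply_from"):
            player = cues.get(key)
            if player is not None:
                return player
        return None

    cues = {"gaze_target": None, "face_match": "alex", "voice_reply_from": "alex"}
    print(associate_invitation(cues))  # -> "alex", whose account would then be invited
    ```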
  • Publication number: 20140240351
    Abstract: Embodiments that relate to providing motion amplification to a virtual environment are disclosed. For example, in one disclosed embodiment a mixed reality augmentation program receives from a head-mounted display device motion data that corresponds to motion of a user in a physical environment. The program presents via the display device the virtual environment in motion in a principal direction, with the principal direction motion being amplified by a first multiplier as compared to the motion of the user in a corresponding principal direction. The program also presents the virtual environment in motion in a secondary direction, where the secondary direction motion is amplified by a second multiplier as compared to the motion of the user in a corresponding secondary direction, and the second multiplier is less than the first multiplier.
    Type: Application
    Filed: February 27, 2013
    Publication date: August 28, 2014
    Inventors: Michael Scavezze, Nicholas Gervase Fajt, Arnulfo Zepeda Navratil, Jason Scott, Adam Benjamin Smith-Kipnis, Brian Mount, John Bevis, Cameron Brown, Tony Ambrus, Phillip Charles Heckinger, Dan Kroymann, Matthew G. Kaplan, Aaron Krauss
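    Sketch: the two-multiplier amplification above is a simple vector decomposition: split the user's real displacement into a component along the principal direction and a remainder, then scale them differently. Illustrative Python (gain values are assumptions):
    ```python
    import numpy as np

    def amplify_motion(user_delta, principal_dir, principal_gain=8.0, secondary_gain=2.0):
        """Scale real movement into larger virtual movement, direction by direction.

        Motion along the principal direction gets the large multiplier, so small
        real steps cover big virtual distances; motion in the remaining
        (secondary) directions gets a smaller multiplier.
        """
        user_delta = np.asarray(user_delta, float)
        principal_dir = np.asarray(principal_dir, float)
        principal_dir = principal_dir / np.linalg.norm(principal_dir)
        along = np.dot(user_delta, principal_dir) * principal_dir
        across = user_delta - along
        return principal_gain * along + secondary_gain * across

    # One real step forward with a slight sideways drift.
    print(amplify_motion((0.02, 0.0, 0.5), principal_dir=(0.0, 0.0, 1.0)))
    ```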
  • Publication number: 20140125668
    Abstract: Embodiments related to efficiently constructing an augmented reality environment with global illumination effects are disclosed. For example, one disclosed embodiment provides a method of displaying an augmented reality image via a display device. The method includes receiving image data, the image data capturing an image of a local environment of the display device, and identifying a physical feature of the local environment via the image data. The method further includes constructing an augmented reality image of a virtual structure for display over the physical feature in spatial registration with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure segments arranged in adjacent locations to form the virtual structure, each modular virtual structure segment comprising a pre-computed global illumination effect, and outputting the augmented reality image to the display device.
    Type: Application
    Filed: November 5, 2012
    Publication date: May 8, 2014
    Inventors: Jonathan Steed, Aaron Krauss, Mike Scavezze, Wei Zhang, Arthur Tomlin, Tony Ambrus, Brian Mount, Stephen Latta, Ryan Hastings
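    Sketch: the abstract above avoids run-time global illumination by assembling the virtual structure from modular segments whose lighting was baked offline. A toy one-dimensional layout in Python (segment library, lengths, and file names are all invented for illustration):
    ```python
    def build_virtual_wall(feature_length_m, segment_library):
        """Tile an identified physical feature with pre-lit modular segments."""
        placements, covered = [], 0.0
        while covered < feature_length_m:
            # Cycle through the library so adjacent segments vary a little.
            segment = segment_library[len(placements) % len(segment_library)]
            placements.append({"segment": segment["name"],
                               "offset_m": covered,
                               "baked_lighting": segment["lightmap"]})
            covered += segment["length_m"]
        return placements

    library = [{"name": "arch", "length_m": 1.0, "lightmap": "arch_gi.png"},
               {"name": "pillar", "length_m": 0.5, "lightmap": "pillar_gi.png"}]
    for placement in build_virtual_wall(3.2, library):
        print(placement)
    ```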
  • Publication number: 20130342568
    Abstract: Embodiments related to providing low light scene augmentation are disclosed. One embodiment provides, on a computing device comprising a see-through display device, a method including recognizing, from image data received from an image sensor, a background scene of an environment viewable through the see-through display device, the environment comprising a physical object. The method further includes identifying one or more geometrical features of the physical object and displaying, on the see-through display device, an image augmenting the one or more geometrical features.
    Type: Application
    Filed: June 20, 2012
    Publication date: December 26, 2013
    Inventors: Tony Ambrus, Mike Scavezze, Stephen Latta, Daniel McCulloch, Brian Mount
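    Sketch: a straightforward reading of the abstract above is edge detection on the camera image followed by drawing bright lines along those edges. A minimal Python stand-in using a finite-difference gradient (the threshold and method are assumptions, not the patent's feature-identification step):
    ```python
    import numpy as np

    def edge_overlay(gray_image, threshold=0.2):
        """Mark pixels where an augmenting line could be drawn over a dim scene.

        Strong intensity gradients in the camera image are taken as object
        edges; drawing bright virtual lines there makes the geometry of
        physical objects easier to see in low light.
        """
        img = np.asarray(gray_image, dtype=float)
        gy, gx = np.gradient(img)
        return np.hypot(gx, gy) > threshold

    scene = np.zeros((5, 5))
    scene[:, 3:] = 1.0  # a dark room with one brighter surface on the right
    print(edge_overlay(scene).astype(int))
    ```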
  • Publication number: 20130335435
    Abstract: Embodiments related to improving a color-resolving ability of a user of a see-thru display device are disclosed. For example, one disclosed embodiment includes, on a see-thru display device, constructing and displaying virtual imagery to superpose onto real imagery sighted by the user through the see-thru display device. The virtual imagery is configured to accentuate a locus of the real imagery of a color poorly distinguishable by the user. Such virtual imagery is then displayed by superposing it onto the real imagery, in registry with the real imagery, in a field of view of the user.
    Type: Application
    Filed: June 18, 2012
    Publication date: December 19, 2013
    Inventors: Tony Ambrus, Adam Smith-Kipnis, Stephen Latta, Daniel McCulloch, Brian Mount, Kevin Geisner, Ian McIntyre
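    Sketch: the accentuation step above needs some test for "a color poorly distinguishable by the user". The crude red-green heuristic below is only a placeholder for whatever model of the user's color deficit the device would really use; every name and threshold is an assumption:
    ```python
    def relies_on_red_green_contrast(rgb, tolerance=0.1):
        """Very rough test for colors a red-green deficient viewer may confuse."""
        r, g, b = rgb
        return abs(r - g) > tolerance and b < min(r, g) + tolerance

    def regions_to_accentuate(regions):
        """Return the regions that virtual imagery should outline or hatch."""
        return [name for name, rgb in regions.items() if relies_on_red_green_contrast(rgb)]

    regions = {"ripe_berry": (0.8, 0.2, 0.2), "leaf": (0.2, 0.7, 0.2), "sky": (0.3, 0.5, 0.9)}
    print(regions_to_accentuate(regions))  # -> ['ripe_berry', 'leaf']
    ```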
  • Publication number: 20130196757
    Abstract: A system and related methods for inviting a potential player to participate in a multiplayer game via a user head-mounted display device are provided. In one example, a potential player invitation program receives user voice data and determines that the user voice data is an invitation to participate in a multiplayer game. The program receives eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data. The program associates the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data. The program matches a potential player account with the potential player. The program receives an acceptance response from the potential player, and joins the potential player account with a user account in participating in the multiplayer game.
    Type: Application
    Filed: January 30, 2012
    Publication date: August 1, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Stephen Latta, Kevin Geisner, Brian Mount, Jonathan Steed, Tony Ambrus, Arnulfo Zepeda, Aaron Krauss