Patents by Inventor Anthony J. Ambrus

Anthony J. Ambrus has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11960790
    Abstract: A computer implemented method includes detecting user interaction with mixed reality displayed content in a mixed reality system. User focus is determined as a function of the user interaction using a spatial intent model. A length of time for extending voice engagement with the mixed reality system is modified based on the determined user focus. Detecting user interaction with the displayed content may include tracking eye movements to determine objects in the displayed content at which the user is looking and determining a context of a user dialog during the voice engagement.
    Type: Grant
    Filed: May 27, 2021
    Date of Patent: April 16, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Jonathan Kyle Palmer, Anthony James Ambrus, Mathew J. Lamb, Sheng Kai Tang, Sophie Stellmach
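    The entry above describes extending a voice-engagement window based on gaze-derived focus. Below is a minimal, hypothetical Python sketch of that idea; the class names, thresholds, and the simple dwell-based focus score are assumptions for illustration, not the patented spatial intent model or any Microsoft API.

```python
# Hypothetical sketch of the idea in patent 11960790: extend the voice-engagement
# window while a spatial-intent estimate indicates the user is still focused on
# interactive mixed reality content. All names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class GazeSample:
    target_id: str | None   # object the eye tracker reports the user is looking at
    dwell_s: float          # continuous dwell time on that object, in seconds

BASE_TIMEOUT_S = 5.0        # default time the microphone stays "hot" after speech
MAX_EXTENSION_S = 10.0      # cap on how much focus can extend the engagement

def focus_score(sample: GazeSample, dialog_targets: set[str]) -> float:
    """Crude spatial-intent score: 1.0 when the user dwells on an object that the
    current dialog refers to, decaying toward 0.0 as dwell time shrinks."""
    if sample.target_id is None or sample.target_id not in dialog_targets:
        return 0.0
    return min(sample.dwell_s / 2.0, 1.0)   # 2 s of dwell counts as full focus

def engagement_timeout(sample: GazeSample, dialog_targets: set[str]) -> float:
    """Modify the voice-engagement window as a function of determined user focus."""
    return BASE_TIMEOUT_S + MAX_EXTENSION_S * focus_score(sample, dialog_targets)

if __name__ == "__main__":
    targets = {"holographic_chart"}          # objects mentioned in the current dialog
    print(engagement_timeout(GazeSample("holographic_chart", 3.0), targets))  # 15.0
    print(engagement_timeout(GazeSample("wall", 3.0), targets))               # 5.0
```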
  • Patent number: 10802278
    Abstract: Technology is described for three-dimensional (3D) space carving of a user environment based on movement through the user environment of one or more users wearing a near-eye display (NED) system. One or more sensors of the NED system provide sensor data from which a distance and direction of movement can be determined. Spatial dimensions for a navigable path can be represented based on user height data and user width data of the one or more users who have traversed the path. Space carving data identifying carved out space can be stored in a 3D space carving model of the user environment. The navigable paths can also be related to position data in another kind of 3D mapping, such as a 3D surface reconstruction mesh model of the user environment generated from depth images.
    Type: Grant
    Filed: January 9, 2019
    Date of Patent: October 13, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anthony J. Ambrus, Jea Gon Park, Adam G. Poulos, Justin Avram Clark, Michael Jason Gourlay, Brian J. Mount, Daniel J. McCulloch, Arthur C. Tomlin
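    As a rough illustration of the space-carving idea in the entry above (and in its related entries below), the following Python sketch marks voxels along a user's traversed path as navigable free space, with the carved cross-section sized from user height and width. The grid resolution, units, and coordinate conventions are assumptions for illustration, not the patented implementation.

```python
# Illustrative sketch of 3D space carving: as a tracked user walks, voxels along
# the traversed path are marked as carved-out (navigable) space.
import numpy as np

VOXEL_M = 0.25                               # voxel edge length in meters (assumed)
GRID = np.zeros((40, 12, 40), dtype=bool)    # x, y (up), z occupancy of carved space

def carve_at(position_m, user_height_m=1.8, user_width_m=0.6):
    """Mark the voxels swept by a user standing at position_m as carved-out space."""
    x, _, z = (int(c / VOXEL_M) for c in position_m)
    half_w = max(1, int(user_width_m / (2 * VOXEL_M)))
    top = int(user_height_m / VOXEL_M)
    GRID[max(0, x - half_w):x + half_w + 1,
         0:top,
         max(0, z - half_w):z + half_w + 1] = True

def carve_path(waypoints_m, **user_dims):
    """Carve along a sequence of (x, y, z) positions reported by NED sensors."""
    for p in waypoints_m:
        carve_at(p, **user_dims)

if __name__ == "__main__":
    carve_path([(1.0, 0.0, 1.0), (1.25, 0.0, 1.0), (1.5, 0.0, 1.0)])
    print("carved voxels:", int(GRID.sum()))
```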
  • Patent number: 10330931
    Abstract: Technology is described for three-dimensional (3D) space carving of a user environment based on movement through the user environment of one or more users wearing a near-eye display (NED) system. One or more sensors on the near-eye display (NED) system provide sensor data from which a distance and direction of movement can be determined. Spatial dimensions for a navigable path can be represented based on user height data and user width data of the one or more users who have traversed the path. Space carving data identifying carved out space can be stored in a 3D space carving model of the user environment. The navigable paths can also be related to position data in another kind of 3D mapping, such as a 3D surface reconstruction mesh model of the user environment generated from depth images.
    Type: Grant
    Filed: June 28, 2013
    Date of Patent: June 25, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anthony J. Ambrus, Jea Gon Park, Adam G. Poulos, Justin Avram Clark, Michael Jason Gourlay, Brian J. Mount, Daniel J. McCulloch, Arthur C. Tomlin
  • Publication number: 20190162964
    Abstract: Technology is described for three-dimensional (3D) space carving of a user environment based on movement through the user environment of one or more users wearing a near-eye display (NED) system. One or more sensors of the NED system provide sensor data from which a distance and direction of movement can be determined. Spatial dimensions for a navigable path can be represented based on user height data and user width data of the one or more users who have traversed the path. Space carving data identifying carved out space can be stored in a 3D space carving model of the user environment. The navigable paths can also be related to position data in another kind of 3D mapping, such as a 3D surface reconstruction mesh model of the user environment generated from depth images.
    Type: Application
    Filed: January 9, 2019
    Publication date: May 30, 2019
    Inventors: Anthony J. Ambrus, Jea Gon Park, Adam G. Poulos, Justin Avram Clark, Michael Jason Gourlay, Brian J. Mount, Daniel J. McCulloch, Arthur C. Tomlin
  • Patent number: 9846968
    Abstract: A system and method are disclosed for capturing views of a mixed reality environment from various perspectives which can be displayed on a monitor. The system includes one or more physical cameras at user-defined positions within the mixed reality environment. The system renders virtual objects in the mixed reality environment from the perspective of the one or more cameras. Real and virtual objects from the mixed reality environment may then be displayed from the perspective of the one or more cameras on one or more external 2D monitors for viewing by others.
    Type: Grant
    Filed: June 2, 2015
    Date of Patent: December 19, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Arthur Charles Tomlin, Evan Michael Keibler, Nicholas Gervase Fajt, Brian J. Mount, Gregory Lowell Alt, Jorge Tosar, Jonathan Michael Lyons, Anthony J. Ambrus, Cameron Quinn Egbert, Will Guyman, Jeff W. McGlynn, Jeremy Hance, Roger Sebastian-Kevin Sylvan, Alexander Georg Pfaffe, Dan Kroymann, Erik Andrew Saltwell, Chris Word
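    A hedged sketch of the third-person capture idea in the entry above: render the scene's virtual objects from the pose of a physical camera placed in the room, then composite that render over the camera's video frame for an external 2D monitor. The renderer and compositor below are placeholders standing in for a real graphics pipeline, not an actual API.

```python
# Rough sketch of mixed reality capture from a physical camera's perspective.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple[float, float, float]
    rotation: tuple[float, float, float, float]   # quaternion (x, y, z, w)

def render_virtual_layer(scene: list[str], camera_pose: Pose) -> str:
    """Placeholder: a real system would rasterize the holograms from camera_pose."""
    return f"virtual layer of {len(scene)} objects from {camera_pose.position}"

def composite(video_frame: str, virtual_layer: str) -> str:
    """Placeholder: blend the virtual layer over the physical camera's frame."""
    return f"{video_frame} + {virtual_layer}"

def third_person_frame(scene: list[str], camera_pose: Pose, video_frame: str) -> str:
    """One output frame for the external monitor, seen from the physical camera."""
    return composite(video_frame, render_virtual_layer(scene, camera_pose))

if __name__ == "__main__":
    pose = Pose((2.0, 1.5, -1.0), (0.0, 0.0, 0.0, 1.0))   # user-defined camera spot
    print(third_person_frame(["hologram_a", "hologram_b"], pose, "camera frame #42"))
```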
  • Patent number: 9563331
    Abstract: Technology is described for a web-like hierarchical menu interface which displays a menu in a web-like hierarchical menu display configuration in a near-eye display (NED). The web-like hierarchical menu display configuration links menu levels and menu items within a menu level with flexible spatial dimensions for menu elements. One or more processors executing the interface select a web-like hierarchical menu display configuration based on the available menu space and user head view direction determined from a 3D mapping of the NED field of view data and stored user head comfort rules. Activation parameters in menu item selection criteria are adjusted to be user-specific based on user head motion data tracked by one or more sensors while the user wears the NED. Menu display layout may be triggered by changes in head view direction of the user and available menu space about the user's head.
    Type: Grant
    Filed: June 28, 2013
    Date of Patent: February 7, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam G. Poulos, Anthony J. Ambrus, Cameron G. Brown, Jason Scott, Brian J. Mount, Daniel J. McCulloch, John Bevis, Wei Zhang
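    To make the menu-layout idea above concrete, here is a small Python sketch that spreads one menu level across an assumed head-comfort arc around the current view direction. The comfort angle, radius, and coordinate convention are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of a web-like hierarchical menu layout: items in one level are
# placed radially so they stay inside an assumed comfortable view cone.
import math

COMFORT_HALF_ANGLE_DEG = 30.0   # assumed comfortable head-rotation range per side

def layout_level(items: list[str], radius: float = 1.0) -> dict[str, tuple[float, float]]:
    """Spread one menu level across the comfort arc; returns item -> (x, y) offsets
    in view-plane coordinates relative to the current head view direction."""
    positions = {}
    if not items:
        return positions
    span = math.radians(2 * COMFORT_HALF_ANGLE_DEG)
    step = span / max(len(items) - 1, 1)
    start = -span / 2 if len(items) > 1 else 0.0
    for i, item in enumerate(items):
        angle = start + i * step
        positions[item] = (radius * math.sin(angle), radius * math.cos(angle))
    return positions

if __name__ == "__main__":
    for name, pos in layout_level(["Open", "Share", "Pin", "Close"]).items():
        print(f"{name:>6}: x={pos[0]:+.2f}, y={pos[1]:.2f}")
```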
  • Patent number: 9552060
    Abstract: Methods for enabling hands-free selection of objects within an augmented reality environment are described. In some embodiments, an object may be selected by an end user of a head-mounted display device (HMD) based on detecting a vestibulo-ocular reflex (VOR) with the end user's eyes while the end user is gazing at the object and performing a particular head movement for selecting the object. The object selected may comprise a real object or a virtual object. The end user may select the object by gazing at the object for a first time period and then performing a particular head movement in which the VOR is detected for one or both of the end user's eyes. In one embodiment, the particular head movement may involve the end user moving their head away from a direction of the object at a particular head speed while gazing at the object.
    Type: Grant
    Filed: January 28, 2014
    Date of Patent: January 24, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anthony J. Ambrus, Adam G. Poulos, Lewey Alec Geselowitz, Dan Kroymann, Arthur C. Tomlin, Roger Sebastian-Kevin Sylvan, Mathew J. Lamb, Brian J. Mount
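    A simplified sketch of the selection logic described in the entry above: require gaze dwell on a target, then a sufficiently fast head movement during which eye and head angular velocities oppose each other, which indicates the vestibulo-ocular reflex. The thresholds and sample format below are assumptions for illustration, not values from the patent.

```python
# Simplified VOR-based hands-free selection check.
DWELL_REQUIRED_S = 1.0
HEAD_SPEED_MIN_DPS = 20.0        # minimum head speed (deg/s) for the selection gesture

def vor_detected(head_vel_dps: float, eye_vel_dps: float) -> bool:
    """Eyes rotating opposite to the head at a comparable rate suggests VOR."""
    opposite = head_vel_dps * eye_vel_dps < 0
    comparable = abs(abs(head_vel_dps) - abs(eye_vel_dps)) < 10.0
    return opposite and comparable

def selection_made(dwell_s: float, gaze_on_target: bool,
                   head_vel_dps: float, eye_vel_dps: float) -> bool:
    """Select only after sufficient dwell, while gaze stays on the target and the
    head moves away fast enough with the VOR present."""
    return (dwell_s >= DWELL_REQUIRED_S
            and gaze_on_target
            and abs(head_vel_dps) >= HEAD_SPEED_MIN_DPS
            and vor_detected(head_vel_dps, eye_vel_dps))

if __name__ == "__main__":
    print(selection_made(1.4, True, head_vel_dps=35.0, eye_vel_dps=-33.0))  # True
    print(selection_made(1.4, True, head_vel_dps=35.0, eye_vel_dps=30.0))   # False
```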
  • Publication number: 20160210783
    Abstract: A system and method are disclosed for capturing views of a mixed reality environment from various perspectives which can be displayed on a monitor. The system includes one or more physical cameras at user-defined positions within the mixed reality environment. The system renders virtual objects in the mixed reality environment from the perspective of the one or more cameras. Real and virtual objects from the mixed reality environment may then be displayed from the perspective of the one or more cameras on one or more external 2D monitors for viewing by others.
    Type: Application
    Filed: June 2, 2015
    Publication date: July 21, 2016
    Inventors: Arthur Charles Tomlin, Evan Michael Keibler, Nicholas Gervase Fajt, Brian J. Mount, Gregory Lowell Alt, Jorge Tosar, Jonathan Michael Lyons, Anthony J. Ambrus, Cameron Quinn Egbert, Will Guyman, Jeff W. McGlynn, Jeremy Hance, Roger Sebastian-Kevin Sylvan, Alexander Georg Pfaffe, Dan Kroymann, Erik Andrew Saltwell, Chris Word
  • Publication number: 20160131902
    Abstract: A method of automatically calibrating a head mounted display for a user is disclosed. The method includes automatically calculating an inter-pupillary distance value for the user, comparing the automatically calculated inter-pupillary distance value to a previously determined inter-pupillary distance value, determining if the automatically calculated inter-pupillary distance value matches the previously determined inter-pupillary distance value, and automatically calibrating the head mounted display using calibration data associated with the matching previously determined inter-pupillary distance value.
    Type: Application
    Filed: November 12, 2014
    Publication date: May 12, 2016
    Inventors: Anthony J. Ambrus, Stephen G. Latta, Roger Sebastian-Kevin Sylvan, Adam G. Poulos
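    The calibration flow in the entry above can be illustrated with a short sketch: estimate the inter-pupillary distance, look for a stored value that matches within a tolerance, and reuse that profile's calibration data. The profile store and the matching tolerance are assumptions for illustration.

```python
# Minimal sketch of IPD-keyed HMD calibration reuse.
IPD_MATCH_TOLERANCE_MM = 1.0

def calibrate_hmd(measured_ipd_mm: float, profiles: dict[float, dict]) -> dict:
    """Return calibration data for a stored IPD that matches the measured one,
    or a freshly derived calibration if no stored value is close enough."""
    for stored_ipd, calibration in profiles.items():
        if abs(stored_ipd - measured_ipd_mm) <= IPD_MATCH_TOLERANCE_MM:
            return calibration                             # reuse previous calibration
    return {"ipd_mm": measured_ipd_mm, "source": "new"}    # fall back to a new one

if __name__ == "__main__":
    known = {63.5: {"ipd_mm": 63.5, "source": "stored"}}
    print(calibrate_hmd(63.8, known))   # matches the stored profile
    print(calibrate_hmd(58.0, known))   # no match, new calibration
```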
  • Patent number: 9292085
    Abstract: Technology is described for automatically determining placement of one or more interaction zones in an augmented reality environment in which one or more virtual features are added to a real environment. An interaction zone includes at least one virtual feature and is associated with a space within the augmented reality environment with boundaries of the space determined based on the one or more real environment features. A plurality of activation criteria may be available for an interaction zone and at least one may be selected based on at least one real environment feature. The technology also describes controlling activation of an interaction zone within the augmented reality environment. In some examples, at least some behavior of a virtual object is controlled by emergent behavior criteria, which define an action independently of the type of object in the real-world environment.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: March 22, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Darren Bennett, Brian J. Mount, Michael J. Scavezze, Daniel J. McCulloch, Anthony J. Ambrus, Jonathan T. Steed, Arthur C. Tomlin, Kevin A. Geisner
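    As an illustration of the interaction-zone idea in the entry above, the sketch below bundles virtual features with a bounding region taken from a detected real-environment feature and selects one of several activation criteria based on the feature's type. All names, the zone contents, and the criteria table are hypothetical.

```python
# Toy sketch of interaction-zone placement from a real-environment feature.
from dataclasses import dataclass

@dataclass
class InteractionZone:
    bounds: tuple[float, float, float, float]   # x, z, width, depth on the floor plan
    virtual_features: list[str]
    activation: str = "proximity"               # chosen activation criterion

def place_zone(real_feature: dict) -> InteractionZone:
    """Size the zone from the real feature and pick an activation criterion by type."""
    x, z, w, d = real_feature["bounds"]
    activation = {"tabletop": "gaze", "doorway": "walk_through"}.get(
        real_feature["type"], "proximity")
    return InteractionZone((x, z, w, d), ["virtual_board_game"], activation)

if __name__ == "__main__":
    zone = place_zone({"type": "tabletop", "bounds": (0.0, 2.0, 1.2, 0.8)})
    print(zone)
```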
  • Publication number: 20150212576
    Abstract: Methods for enabling hands-free selection of objects within an augmented reality environment are described. In some embodiments, an object may be selected by an end user of a head-mounted display device (HMD) based on detecting a vestibulo-ocular reflex (VOR) with the end user's eyes while the end user is gazing at the object and performing a particular head movement for selecting the object. The object selected may comprise a real object or a virtual object. The end user may select the object by gazing at the object for a first time period and then performing a particular head movement in which the VOR is detected for one or both of the end user's eyes. In one embodiment, the particular head movement may involve the end user moving their head away from a direction of the object at a particular head speed while gazing at the object.
    Type: Application
    Filed: January 28, 2014
    Publication date: July 30, 2015
    Inventors: Anthony J. Ambrus, Adam G. Poulos, Lewey Alec Geselowitz, Dan Kroymann, Arthur C. Tomlin, Roger Sebastian-Kevin Sylvan, Mathew J. Lamb, Brian J. Mount
  • Patent number: 8933912
    Abstract: A system and method are disclosed for providing a touch interface for electronic devices. The touch interface can be any surface. As one example, a table top can be used as a touch-sensitive interface. In one embodiment, the system determines a touch region of the surface and correlates that touch region to a display of an electronic device for which input is provided. The system may have a 3D camera that identifies the position of a user's hands relative to the touch region to allow for user input. Note that the user's hands do not occlude the display. The system may render a representation of the user's hand on the display so that the user can interact with elements on the display screen.
    Type: Grant
    Filed: April 2, 2012
    Date of Patent: January 13, 2015
    Assignee: Microsoft Corporation
    Inventors: Anthony J. Ambrus, Abdulwajid N. Mohamed, Andrew D. Wilson, Brian J. Mount, Jordan D. Andersen
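    One concrete piece of the touch-interface idea above is mapping a hand position inside the designated touch region on the surface to display coordinates. The following sketch shows that mapping under assumed region and display dimensions; a real system would obtain hand positions from the 3D camera.

```python
# Compact sketch of mapping a surface touch region to display coordinates.
def touch_to_display(hand_xy, region_origin, region_size, display_px=(1920, 1080)):
    """Map a hand position inside the surface touch region to a display pixel."""
    u = (hand_xy[0] - region_origin[0]) / region_size[0]
    v = (hand_xy[1] - region_origin[1]) / region_size[1]
    u, v = min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0)   # clamp to the region
    return int(u * (display_px[0] - 1)), int(v * (display_px[1] - 1))

if __name__ == "__main__":
    # Assumed touch region: a 40 cm x 22.5 cm rectangle whose corner sits at
    # (10 cm, 5 cm) in the 3D camera's table-plane coordinates.
    print(touch_to_display((0.30, 0.1625), region_origin=(0.10, 0.05),
                           region_size=(0.40, 0.225)))   # roughly the display center
```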
  • Publication number: 20150002507
    Abstract: Technology is described for three-dimensional (3D) space carving of a user environment based on movement through the user environment of one or more users wearing a near-eye display (NED) system. One or more sensors on the near-eye display (NED) system provide sensor data from which a distance and direction of movement can be determined. Spatial dimensions for a navigable path can be represented based on user height data and user width data of the one or more users who have traversed the path. Space carving data identifying carved out space can be stored in a 3D space carving model of the user environment. The navigable paths can also be related to position data in another kind of 3D mapping, such as a 3D surface reconstruction mesh model of the user environment generated from depth images.
    Type: Application
    Filed: June 28, 2013
    Publication date: January 1, 2015
    Inventors: Anthony J. Ambrus, Jea Gon Park, Adam G. Poulos, Justin Avram Clark, Michael Jason Gourlay, Brian J. Mount, Daniel J. McCulloch, Arthur C. Tomlin
  • Publication number: 20150007114
    Abstract: Technology is described for a web-like hierarchical menu interface which displays a menu in a web-like hierarchical menu display configuration in a near-eye display (NED). The web-like hierarchical menu display configuration links menu levels and menu items within a menu level with flexible spatial dimensions for menu elements. One or more processors executing the interface select a web-like hierarchical menu display configuration based on the available menu space and user head view direction determined from a 3D mapping of the NED field of view data and stored user head comfort rules. Activation parameters in menu item selection criteria are adjusted to be user-specific based on user head motion data tracked by one or more sensors while the user wears the NED. Menu display layout may be triggered by changes in head view direction of the user and available menu space about the user's head.
    Type: Application
    Filed: June 28, 2013
    Publication date: January 1, 2015
    Inventors: Adam G. Poulos, Anthony J. Ambrus, Cameron G. Brown, Jason Scott, Brian J. Mount, Daniel J. McCulloch, John Bevis, Wei Zhang
  • Publication number: 20140160157
    Abstract: Methods for generating and displaying people-triggered holographic reminders are described. In some embodiments, a head-mounted display device (HMD) generates and displays an augmented reality environment to an end user of the HMD in which reminders associated with a particular person may be displayed if the particular person is within a field of view of the HMD or if the particular person is within a particular distance of the HMD. The particular person may be identified individually or identified as belonging to a particular group (e.g., a member of a group with a particular job title such as programmer or administrator). In some cases, a completion of a reminder may be automatically detected by applying speech recognition techniques (e.g., to identify key words, phrases, or names) to captured audio of a conversation occurring between the end user and the particular person.
    Type: Application
    Filed: December 11, 2012
    Publication date: June 12, 2014
    Inventors: Adam G. Poulos, Holly A. Hirzel, Anthony J. Ambrus, Daniel J. McCulloch, Brian J. Mount, Jonathan T. Steed
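    A toy sketch of the people-triggered reminder logic described in the entry above: when a recognized person, or someone in a matching group, is in the HMD's field of view or within an assumed trigger distance, the reminders tied to that person or group are returned for display. The data model and threshold are assumptions for illustration.

```python
# Illustrative sketch of people-triggered holographic reminders.
from dataclasses import dataclass

TRIGGER_DISTANCE_M = 3.0

@dataclass
class Reminder:
    text: str
    person: str | None = None      # a specific person...
    group: str | None = None       # ...or anyone in this group (e.g., a job title)

def triggered(reminders, person_id, person_groups, distance_m, in_view):
    """Return reminders to display for a person who is visible or close enough."""
    if not in_view and distance_m > TRIGGER_DISTANCE_M:
        return []
    return [r for r in reminders
            if r.person == person_id or (r.group and r.group in person_groups)]

if __name__ == "__main__":
    reminders = [Reminder("Ask about the code review", person="alice"),
                 Reminder("Mention the build break", group="programmer")]
    print(triggered(reminders, "bob", {"programmer"}, distance_m=2.0, in_view=True))
```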
  • Publication number: 20140002444
    Abstract: Technology is described for automatically determining placement of one or more interaction zones in an augmented reality environment in which one or more virtual features are added to a real environment. An interaction zone includes at least one virtual feature and is associated with a space within the augmented reality environment with boundaries of the space determined based on the one or more real environment features. A plurality of activation criteria may be available for an interaction zone and at least one may be selected based on at least one real environment feature. The technology also describes controlling activation of an interaction zone within the augmented reality environment. In some examples, at least some behavior of a virtual object is controlled by emergent behavior criteria, which define an action independently of the type of object in the real-world environment.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Darren Bennett, Brian J. Mount, Michael J. Scavezze, Daniel J. McCulloch, Anthony J. Ambrus, Jonathan T. Steed, Arthur C. Tomlin, Kevin A. Geisner
  • Publication number: 20130257748
    Abstract: A system and method are disclosed for providing a touch interface for electronic devices. The touch interface can be any surface. As one example, a table top can be used as a touch-sensitive interface. In one embodiment, the system determines a touch region of the surface and correlates that touch region to a display of an electronic device for which input is provided. The system may have a 3D camera that identifies the position of a user's hands relative to the touch region to allow for user input. Note that the user's hands do not occlude the display. The system may render a representation of the user's hand on the display so that the user can interact with elements on the display screen.
    Type: Application
    Filed: April 2, 2012
    Publication date: October 3, 2013
    Inventors: Anthony J. Ambrus, Abdulwajid N. Mohamed, Andrew D. Wilson, Brian J. Mount, Jordan D. Andersen