Patents by Inventor Adam G. Poulos

Adam G. Poulos has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10223799
    Abstract: Embodiments are disclosed for methods and systems of distinguishing movements of features in a physical environment. For example, on a head-mounted display device, one embodiment of a method includes obtaining a representation of real-world features in two or more coordinate frames and obtaining motion data from one or more sensors external to the head-mounted display device. The method further includes distinguishing features in one coordinate frame from features in another coordinate frame based upon the motion data.
    Type: Grant
    Filed: March 30, 2017
    Date of Patent: March 5, 2019
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Adam G. Poulos, Arthur Tomlin, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Drew Steedly, Frederik Schaffalitzky, Georg Klein, Kathleen P. Mulcahy
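Grant 10223799 above boils down to sorting tracked features by whether their motion matches externally sensed platform motion. A minimal sketch of that idea, using simplified 1-D velocities; the function name, tolerance, and sensor model are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch: split tracked features into a "moving platform" frame
# (e.g., a car interior moving with the user) and a stationary "world" frame by
# comparing each feature's observed velocity with motion data reported by a
# sensor external to the head-mounted display.

def split_features_by_frame(feature_velocities, platform_velocity, tolerance=0.2):
    """feature_velocities: dict of feature_id -> observed velocity (m/s, world frame).
    platform_velocity: velocity reported by the external motion sensor (m/s).
    Returns (platform_frame_ids, world_frame_ids)."""
    platform_frame, world_frame = [], []
    for feature_id, velocity in feature_velocities.items():
        # Features riding on the platform move at roughly the platform's speed;
        # features fixed in the world show roughly zero velocity.
        if abs(velocity - platform_velocity) <= tolerance:
            platform_frame.append(feature_id)
        else:
            world_frame.append(feature_id)
    return platform_frame, world_frame

# Platform moving at 10 m/s: features 1 and 2 belong to it, feature 3 to the world.
print(split_features_by_frame({1: 10.1, 2: 9.9, 3: 0.05}, platform_velocity=10.0))
```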
  • Patent number: 10222981
    Abstract: Embodiments that relate to displaying holographic keyboard and hand images in a holographic environment are provided. In one embodiment depth information of an actual position of a user's hand is received from a capture device. The user's hand is spaced by an initial actual distance from the capture device, and a holographic keyboard image is displayed spatially separated by a virtual distance from a holographic hand image. The user's hand is determined to move to an updated actual distance from the capture device. In response, the holographic keyboard image is maintained spatially separated by substantially the virtual distance from the holographic hand image.
    Type: Grant
    Filed: September 6, 2017
    Date of Patent: March 5, 2019
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Rotem Bennet, Lewey Geselowitz, Wei Zhang, Adam G. Poulos, John Bevis, Kim Pascal Pimmel, Nicholas Gervase Fajt
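Grant 10222981 above keeps the keyboard a fixed virtual distance from the hand image as the real hand moves. A minimal sketch of that update step, assuming simple 1-D depths in metres; all names and values are illustrative:

```python
# Hypothetical sketch: the holographic hand image follows the hand's actual
# depth, and the keyboard is re-placed each update so its separation from the
# hand image stays (substantially) the same virtual distance.

def update_hologram_depths(hand_actual_depth_m, virtual_separation_m=0.15):
    """Returns (hand_image_depth, keyboard_depth) for the current hand depth."""
    hand_image_depth = hand_actual_depth_m          # hand image mirrors the hand
    keyboard_depth = hand_image_depth + virtual_separation_m
    return hand_image_depth, keyboard_depth

# The keyboard-to-hand-image separation is preserved as the hand moves.
for depth in (0.4, 0.5, 0.6):
    print(update_hologram_depths(depth))
```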
  • Patent number: 10007349
    Abstract: Methods for recognizing gestures using adaptive multi-sensor gesture recognition are described. In some embodiments, a gesture recognition system receives a plurality of sensor inputs from a plurality of sensor devices and a plurality of confidence thresholds associated with the plurality of sensor inputs. A confidence threshold specifies a minimum confidence value for which it is deemed that a particular gesture has occurred. Upon detection of a compensating event, such as excessive motion involving one of the plurality of sensor devices, the gesture recognition system may modify the plurality of confidence thresholds based on the compensating event. Subsequently, the gesture recognition system generates a multi-sensor confidence value based on whether at least a subset of the plurality of confidence thresholds has been satisfied. The gesture recognition system may also modify the plurality of confidence thresholds based on the plugging and unplugging of sensor inputs from the gesture recognition system.
    Type: Grant
    Filed: May 4, 2015
    Date of Patent: June 26, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Stephen G. Latta, Brian J. Mount, Adam G. Poulos, Jeffrey A. Kohler, Arthur C. Tomlin, Jonathan T. Steed
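Grant 10007349 above gates each sensor's confidence against a threshold, stiffens the thresholds when a compensating event degrades a sensor, and fuses the surviving votes. A minimal sketch of that gating, with illustrative sensor names, thresholds, and fusion rule (the patent does not specify these values):

```python
# Hypothetical sketch of confidence-threshold gating across several sensors.
# Thresholds are raised for a sensor affected by a compensating event (e.g.,
# excessive motion), and a gesture is reported only when enough sensors agree.

def detect_gesture(confidences, thresholds, compensating_events, min_agreeing=2):
    """confidences: dict sensor -> confidence that the gesture occurred (0..1).
    thresholds: dict sensor -> minimum confidence required for that sensor.
    compensating_events: set of sensors currently affected (e.g., shaking).
    Returns (gesture_detected, multi_sensor_confidence)."""
    agreeing, total = 0, 0.0
    for sensor, confidence in confidences.items():
        threshold = thresholds[sensor]
        if sensor in compensating_events:
            threshold = min(1.0, threshold + 0.2)  # be stricter for degraded sensors
        if confidence >= threshold:
            agreeing += 1
            total += confidence
    multi_sensor_confidence = total / agreeing if agreeing else 0.0
    return agreeing >= min_agreeing, multi_sensor_confidence

# The shaken IMU no longer clears its (raised) threshold, so no gesture is reported.
print(detect_gesture({"depth_camera": 0.9, "imu": 0.7},
                     {"depth_camera": 0.8, "imu": 0.6},
                     compensating_events={"imu"}))
```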
  • Patent number: 10008044
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.
    Type: Grant
    Filed: December 23, 2016
    Date of Patent: June 26, 2018
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Adam G. Poulos, Evan Michael Keibler, Arthur Tomlin, Cameron Brown, Daniel McCulloch, Brian Mount, Dan Kroymann, Gregory Lowell Alt
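Grant 10008044 above is essentially a two-state behaviour (free-floating vs. attached) driven by triggers. A minimal sketch as a small state machine; the trigger names and surface identifier are illustrative:

```python
# Hypothetical sketch of the attach/detach behaviour as a small state machine.

class VirtualObject:
    def __init__(self):
        self.attached_surface = None  # None means free-floating

    def on_attach_trigger(self, surface):
        # e.g., the object is moved within a threshold distance of a wall
        self.attached_surface = surface

    def on_detach_trigger(self):
        # e.g., a "pull away" gesture detaches it so it floats again
        self.attached_surface = None

    @property
    def is_free_floating(self):
        return self.attached_surface is None

obj = VirtualObject()
obj.on_attach_trigger("wall_01")
print(obj.is_free_floating)   # False: rendered as attached to the surface
obj.on_detach_trigger()
print(obj.is_free_floating)   # True: rendered as free-floating again
```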
  • Patent number: 9934614
    Abstract: An example wearable display system includes a controller, a left display to display a left-eye augmented reality image with a left-eye display size at left-eye display coordinates, and a right display to display a right-eye augmented reality image with a right-eye display size at right-eye display coordinates, the left-eye and right-eye augmented reality images collectively forming an augmented reality object perceivable at an apparent real world depth by a wearer of the display system. The controller sets the left-eye display coordinates relative to the right-eye display coordinates as a function of the apparent real world depth of the augmented reality object. The function maintains an aspect of the left-eye and right-eye display sizes throughout a non-scaling range of apparent real world depths of the augmented reality object, and the function scales the left-eye and right-eye display sizes with changing apparent real world depth outside the non-scaling range.
    Type: Grant
    Filed: May 20, 2015
    Date of Patent: April 3, 2018
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Scott Ramsby, Dan Osborn, Shawn Wright, Anatolie Gavriliuc, Forest Woodcroft Gouin, Megan Saunders, Jesse Rapczak, Stephen Latta, Adam G. Poulos, Daniel McCulloch, Wei Zhang
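Grant 9934614 above ties the left/right display coordinates to apparent depth (stereo disparity) while holding display size constant inside a non-scaling depth range and scaling it outside. A minimal sketch of both behaviours, with illustrative constants (the patent does not publish specific values):

```python
# Hypothetical sketch: left/right-eye offset (disparity) follows apparent depth,
# while the displayed size is held constant inside a non-scaling depth range and
# scaled only outside it. All constants are illustrative assumptions.

IPD_M = 0.064                     # assumed interpupillary distance (metres)
NON_SCALING_RANGE_M = (1.0, 3.0)  # assumed non-scaling range of apparent depths

def disparity_px(depth_m, focal_px=1000.0):
    """Horizontal offset between left- and right-eye display coordinates."""
    return focal_px * IPD_M / depth_m

def display_scale(depth_m):
    """Size multiplier: 1.0 within the non-scaling range, scaled outside it."""
    near, far = NON_SCALING_RANGE_M
    if near <= depth_m <= far:
        return 1.0
    clamped = min(max(depth_m, near), far)
    return clamped / depth_m   # shrink beyond the far edge, grow inside the near edge

for depth in (0.5, 1.5, 2.5, 5.0):
    print(f"depth {depth} m: disparity {disparity_px(depth):.1f} px, "
          f"scale {display_scale(depth):.2f}")
```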
  • Publication number: 20180004308
    Abstract: In embodiments of a camera-based input device, the input device includes an inertial measurement unit that collects motion data associated with velocity and acceleration of the input device in an environment, such as in three-dimensional (3D) space. The input device also includes at least two visible light cameras that capture images of the environment. A positioning application is implemented to receive the motion data from the inertial measurement unit, and receive the images of the environment from the at least two visible light cameras. The positioning application can then determine positions of the input device based on the motion data and the images correlated with a map of the environment, and track a motion of the input device in the environment based on the determined positions of the input device.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventors: Daniel Joseph McCulloch, Nicholas Gervase Fajt, Adam G. Poulos, Christopher Douglas Edmonds, Lev Cherkashin, Brent Charles Allen, Constantin Dulu, Muhammad Jabir Kapasi, Michael Grabner, Michael Edward Samples, Cecilia Bong, Miguel Angel Susffalich, Varun Ramesh Mani, Anthony James Ambrus, Arthur C. Tomlin, James Gerard Dack, Jeffrey Alan Kohler, Eric S. Rehmeyer, Edward D. Parker
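Application 20180004308 above fuses inertial motion data with positions recovered from camera images matched against a map of the environment. A minimal sketch using a simple complementary filter as a stand-in for the positioning application; the weights and names are illustrative assumptions:

```python
# Hypothetical sketch: dead-reckon from inertial motion data, then pull the
# estimate toward a position recovered from camera images correlated with a map.

def fuse_position(prev_position, imu_velocity, dt, camera_position, camera_weight=0.1):
    """Blend an IMU prediction with a camera-derived position fix."""
    predicted = [p + v * dt for p, v in zip(prev_position, imu_velocity)]
    return [(1.0 - camera_weight) * p + camera_weight * c
            for p, c in zip(predicted, camera_position)]

position = [0.0, 0.0, 0.0]
for _ in range(3):
    # fake IMU velocity and camera fix, purely for illustration
    position = fuse_position(position, imu_velocity=[0.1, 0.0, 0.0], dt=0.01,
                             camera_position=[0.05, 0.0, 0.0])
    print(position)
```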
  • Publication number: 20180005445
    Abstract: In embodiments of augmenting a moveable entity with a hologram, an alternate reality device includes a tracking system that can recognize an entity in an environment and track movement of the entity in the environment. The alternate reality device can also include a detection algorithm implemented to identify the entity recognized by the tracking system based on identifiable characteristics of the entity. A hologram positioning application is implemented to receive motion data from the tracking system, receive entity characteristic data from the detection algorithm, and determine a position and an orientation of the entity in the environment based on the motion data and the entity characteristic data. The hologram positioning application can then generate a hologram that appears associated with the entity as the entity moves in the environment.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Daniel Joseph McCulloch, Nicholas Gervase Fajt, Adam G. Poulos, Christopher Douglas Edmonds, Lev Cherkashin, Brent Charles Allen, Constantin Dulu, Muhammad Jabir Kapasi, Michael Grabner, Michael Edward Samples, Cecilia Bong, Miguel Angel Susffalich, Varun Ramesh Mani, Anthony James Ambrus, Arthur C. Tomlin, James Gerard Dack, Jeffrey Alan Kohler, Eric S. Rehmeyer, Edward D. Parker
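Application 20180005445 above re-poses a hologram every update so it appears attached to a tracked, moving entity. A minimal sketch in 2-D (position plus heading) with an illustrative fixed offset; a real system would work with full 3-D poses:

```python
# Hypothetical sketch: place the hologram at a fixed offset in the tracked
# entity's local frame, so it follows the entity's position and orientation.

import math

def hologram_pose(entity_position, entity_heading_rad, local_offset=(0.0, 0.3)):
    """Returns (hologram_position, hologram_heading) for the current entity pose."""
    ox, oy = local_offset
    cos_h, sin_h = math.cos(entity_heading_rad), math.sin(entity_heading_rad)
    world_offset = (cos_h * ox - sin_h * oy, sin_h * ox + cos_h * oy)
    position = (entity_position[0] + world_offset[0],
                entity_position[1] + world_offset[1])
    return position, entity_heading_rad  # hologram shares the entity's heading

print(hologram_pose((1.0, 2.0), math.radians(90)))
```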
  • Publication number: 20170364261
    Abstract: Embodiments that relate to displaying holographic keyboard and hand images in a holographic environment are provided. In one embodiment depth information of an actual position of a user's hand is received from a capture device. The user's hand is spaced by an initial actual distance from the capture device, and a holographic keyboard image is displayed spatially separated by a virtual distance from a holographic hand image. The user's hand is determined to move to an updated actual distance from the capture device. In response, the holographic keyboard image is maintained spatially separated by substantially the virtual distance from the holographic hand image.
    Type: Application
    Filed: September 6, 2017
    Publication date: December 21, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Rotem Bennet, Lewey Geselowitz, Wei Zhang, Adam G. Poulos, John Bevis, Kim Pascal Pimmel, Nicholas Gervase Fajt
  • Publication number: 20170352184
    Abstract: A mixed reality system may comprise a base station, affixed to an object, that emits an electromagnetic field (EMF) and a head-mounted display (HMD) device with a location sensor and an EMF sensor mounted a predetermined offset therefrom. The base station and EMF sensor together may form a magnetic tracking system. The HMD device may determine a relative location of the EMF sensor based on sensing the EMF and determine a location of the base station in space based on the relative location, the predetermined offset, and the location of the location sensor. An optical tracking system comprising a marker and an optical sensor may be included to augment the magnetic tracking system based on captured optical data and a location of the optical sensor or marker. The HMD device may display augmented reality images and overlay a hologram corresponding to the location of the base station over time.
    Type: Application
    Filed: June 6, 2016
    Publication date: December 7, 2017
    Inventors: Adam G. Poulos, Arthur Tomlin, Alexandru Octavian Balan, Constantin Dulu, Christopher Douglas Edmonds
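This entry and the two related ones below (20170351094 and 20170287219) share the same composition step: the magnetic tracking system only yields a location relative to the EMF component mounted on the HMD, so the HMD combines it with the predetermined mounting offset and its own sensed location to place the tracked component in space. A minimal 1-D sketch of that composition; names and distances are illustrative:

```python
# Hypothetical sketch: compose the HMD's own location, the known mounting
# offset, and the EMF-derived relative location into a location in space.

def locate_in_space(hmd_location, mounting_offset, emf_relative_location):
    """hmd_location: location reported by the HMD's location sensor.
    mounting_offset: predetermined offset between that sensor and the EMF
    component on the HMD.
    emf_relative_location: location of the tracked component relative to the
    EMF component, as reported by the magnetic tracking system."""
    return hmd_location + mounting_offset + emf_relative_location

# HMD at 1.2 m, EMF component mounted 0.1 m from the location sensor,
# magnetic tracking reports the tracked component 0.5 m from that EMF component.
print(locate_in_space(1.2, 0.1, 0.5))  # -> 1.8
```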
  • Publication number: 20170351094
    Abstract: A mixed reality system may comprise a head-mounted display (HMD) device with a location sensor and a base station, mounted a predetermined offset from the location sensor, that emits an electromagnetic field (EMF). An EMF sensor affixed to an object may sense the EMF, forming a magnetic tracking system. The HMD device may determine a relative location of the EMF sensor therefrom and determine a location of the EMF sensor in space based on the relative location, the predetermined offset, and the location of the location sensor. An optical tracking system comprising a marker and an optical sensor configured to capture optical data may be included to augment the magnetic tracking system based on the optical data and a location of the optical sensor or marker. The HMD device may display augmented reality images and overlay a hologram corresponding to the location of the EMF sensor over time.
    Type: Application
    Filed: June 6, 2016
    Publication date: December 7, 2017
    Inventors: Adam G. Poulos, Arthur Tomlin, Alexandru Octavian Balan, Constantin Dulu, Christopher Douglas Edmonds
  • Publication number: 20170287219
    Abstract: A mixed reality system may comprise a head-mounted display (HMD) device with a location sensor from which the HMD device determines a location of the location sensor in space and a base station mounted a predetermined offset from the location sensor and configured to emit an electromagnetic field (EMF). An EMF sensor affixed to an object may be configured to sense a strength of the EMF. The HMD device may determine a location of the EMF sensor relative to the base station based on the sensed strength and determine a location of the EMF sensor in space based on the relative location, the predetermined offset, and the location of the location sensor in space. In some aspects, the HMD device may comprise a see-through display configured to display augmented reality images and overlay a hologram that corresponds to the location of the EMF sensor in space over time.
    Type: Application
    Filed: March 31, 2016
    Publication date: October 5, 2017
    Inventors: Adam G. Poulos, Daniel Joseph McCulloch, Nicholas Gervase Fajt, Arthur Tomlin, Brian Mount, Lev Cherkashin, Lorenz Henric Jentz
  • Patent number: 9766806
    Abstract: Embodiments that relate to displaying holographic keyboard and hand images in a holographic environment are provided. In one embodiment depth information of an actual position of a user's hand is received. Using the depth information, a holographic hand image representing the user's hand is displayed in a virtual hand plane in the holographic environment. In response to receiving a keyboard activation input from the user and using the depth information, the holographic keyboard image is adaptively displayed in a virtual keyboard plane in the holographic environment at a virtual distance under the holographic hand image representing the user's hand.
    Type: Grant
    Filed: July 15, 2014
    Date of Patent: September 19, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Rotem Bennet, Lewey Geselowitz, Wei Zhang, Adam G. Poulos, John Bevis, Kim Pascal Pimmel, Nicholas Gervase Fajt
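Grant 9766806 above covers the initial placement: on a keyboard-activation input, the keyboard plane is positioned a virtual distance under the holographic hand image. A minimal 1-D sketch; the offset and names are illustrative:

```python
# Hypothetical sketch: place the holographic keyboard plane a fixed virtual
# distance under the hand image once a keyboard-activation input is received.

def place_keyboard(hand_plane_height_m, activation_received, offset_m=0.12):
    """Returns the keyboard plane height, or None if activation has not occurred."""
    if not activation_received:
        return None
    return hand_plane_height_m - offset_m  # keyboard sits under the hand image

print(place_keyboard(1.05, activation_received=True))   # -> 0.93
print(place_keyboard(1.05, activation_received=False))  # -> None
```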
  • Publication number: 20170206668
    Abstract: Embodiments are disclosed for methods and systems of distinguishing movements of features in a physical environment. For example, on a head-mounted display device, one embodiment of a method includes obtaining a representation of real-world features in two or more coordinate frames and obtaining motion data from one or more sensors external to the head-mounted display device. The method further includes distinguishing features in one coordinate frame from features in another coordinate frame based upon the motion data.
    Type: Application
    Filed: March 30, 2017
    Publication date: July 20, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Adam G. Poulos, Arthur Tomlin, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Drew Steedly, Frederik Schaffalitzky, Georg Klein, Kathleen P. Mulcahy
  • Patent number: 9647847
    Abstract: Various techniques are described to protect secrets held by closed computing devices. In an ecosystem where devices operate and are offered a wide range of services from a service provider, the service provider may want to prevent users from sharing services between devices. In order to guarantee that services are not shared between devices, each device can be manufactured with a different set of secrets such as per device identifiers. Unscrupulous individuals may try to gain access to the secrets and transfer secrets from one device to another. In order to prevent this type of attack, each closed computing system can be manufactured to include a protected memory location that is tied to the device.
    Type: Grant
    Filed: January 8, 2016
    Date of Patent: May 9, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sebastian Lange, Victor Tan, Adam G. Poulos
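Grant 9647847 above does not spell out its mechanism in the abstract, but the idea of per-device secrets that are useless if copied to another device can be illustrated by binding data to a device identifier. This is a hedged sketch only, using HMAC as a stand-in; it is not the patented protected-memory design:

```python
# Hypothetical sketch: derive a device-bound tag from a per-device identifier,
# so a secret copied verbatim to another device no longer verifies there.

import hashlib
import hmac

def device_bound_tag(device_id: bytes, secret: bytes) -> bytes:
    """Bind a secret to one device by mixing in its unique identifier."""
    return hmac.new(device_id, secret, hashlib.sha256).digest()

secret = b"service-entitlement-token"
tag_on_device_a = device_bound_tag(b"DEVICE-A-ID", secret)
tag_on_device_b = device_bound_tag(b"DEVICE-B-ID", secret)
print(tag_on_device_a != tag_on_device_b)  # True: the binding differs per device
```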
  • Patent number: 9626802
    Abstract: Embodiments are disclosed for methods and systems of distinguishing movements of features in a physical environment. For example, on a head-mounted display device, one embodiment of a method includes obtaining a representation of real-world features in two or more coordinate frames and obtaining motion data from one or more sensors external to the head-mounted display device. The method further includes distinguishing features in one coordinate frame from features in another coordinate frame based upon the motion data.
    Type: Grant
    Filed: May 1, 2014
    Date of Patent: April 18, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Adam G. Poulos, Arthur Tomlin, Tony Ambrus, Jeffrey Cole, Ian Douglas McIntyre, Drew Steedly, Frederik Schaffalitzky, Georg Klein, Kathleen P. Mulcahy
  • Publication number: 20170103583
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.
    Type: Application
    Filed: December 23, 2016
    Publication date: April 13, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Adam G. Poulos, Evan Michael Keibler, Arthur Tomlin, Cameron Brown, Daniel McCulloch, Brian Mount, Dan Kroymann, Gregory Lowell Alt
  • Patent number: 9563331
    Abstract: Technology is described for a web-like hierarchical menu interface that displays a menu in a web-like hierarchical menu display configuration in a near-eye display (NED). The web-like hierarchical menu display configuration links menu levels and menu items within a menu level with flexible spatial dimensions for menu elements. One or more processors executing the interface select a web-like hierarchical menu display configuration based on the available menu space and the user's head view direction, determined from a 3D mapping of NED field-of-view data and stored user head comfort rules. Activation parameters in the menu item selection criteria are adjusted to be user specific based on head motion data tracked by one or more sensors while the user wears the NED. The menu display layout may be triggered by changes in the user's head view direction and in the available menu space about the user's head.
    Type: Grant
    Filed: June 28, 2013
    Date of Patent: February 7, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Adam G. Poulos, Anthony J. Ambrus, Cameron G. Brown, Jason Scott, Brian J. Mount, Daniel J. McCulloch, John Bevis, Wei Zhang
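Grant 9563331 above makes two runtime decisions: which web-like display configuration fits the available space around the user's head, and how to tune activation parameters to that user's own head motion. A minimal sketch of both; the configuration names, thresholds, and dwell formula are illustrative assumptions:

```python
# Hypothetical sketch: pick a menu display configuration from the available
# space, and adapt an activation parameter to the user's head-motion data.

def choose_menu_configuration(available_width_m, available_height_m):
    """Pick a layout for the web-like hierarchical menu given free space."""
    if available_width_m >= 1.0 and available_height_m >= 0.8:
        return "radial_web"        # spread menu levels around the gaze point
    if available_width_m >= 0.5:
        return "horizontal_strip"  # compress levels into a strip
    return "vertical_stack"        # last resort in cramped spaces

def activation_dwell_seconds(user_head_speed_samples, base_dwell=0.8):
    """Users with faster head motion get a slightly longer dwell time before a
    menu item activates, to avoid accidental selections."""
    mean_speed = sum(user_head_speed_samples) / len(user_head_speed_samples)
    return base_dwell + 0.3 * min(mean_speed, 1.0)

print(choose_menu_configuration(1.2, 0.9))
print(round(activation_dwell_seconds([0.2, 0.4, 0.3]), 2))
```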
  • Patent number: 9552060
    Abstract: Methods for enabling hands-free selection of objects within an augmented reality environment are described. In some embodiments, an object may be selected by an end user of a head-mounted display device (HMD) based on detecting a vestibulo-ocular reflex (VOR) with the end user's eyes while the end user is gazing at the object and performing a particular head movement for selecting the object. The object selected may comprise a real object or a virtual object. The end user may select the object by gazing at the object for a first time period and then performing a particular head movement in which the VOR is detected for one or both of the end user's eyes. In one embodiment, the particular head movement may involve the end user moving their head away from a direction of the object at a particular head speed while gazing at the object.
    Type: Grant
    Filed: January 28, 2014
    Date of Patent: January 24, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Anthony J. Ambrus, Adam G. Poulos, Lewey Alec Geselowitz, Dan Kroymann, Arthur C. Tomlin, Roger Sebastian-Kevin Sylvan, Mathew J. Lamb, Brian J. Mount
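Grant 9552060 above selects an object when a gaze dwell is followed by a head movement during which the eyes counter-rotate and gaze stays on the object (the vestibulo-ocular reflex). A minimal sketch of that rule; the dwell and head-speed thresholds are illustrative:

```python
# Hypothetical sketch of the hands-free selection rule: dwell on the object,
# then move the head away fast enough while gaze remains locked on the object.

def vor_selection(gaze_on_object_s, head_speed_deg_s, gaze_still_on_object,
                  min_dwell_s=0.5, min_head_speed_deg_s=30.0):
    """Returns True when the dwell + VOR head-movement pattern indicates a
    deliberate selection of the gazed-at object."""
    dwelled = gaze_on_object_s >= min_dwell_s
    vor_detected = gaze_still_on_object and head_speed_deg_s >= min_head_speed_deg_s
    return dwelled and vor_detected

print(vor_selection(0.7, 45.0, gaze_still_on_object=True))   # True: selected
print(vor_selection(0.7, 45.0, gaze_still_on_object=False))  # False: just looking away
```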
  • Patent number: 9530252
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.
    Type: Grant
    Filed: January 25, 2016
    Date of Patent: December 27, 2016
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Adam G. Poulos, Evan Michael Keibler, Arthur Tomlin, Cameron Brown, Daniel McCulloch, Brian Mount, Dan Kroymann, Gregory Lowell Alt
  • Patent number: 9519640
    Abstract: A see-through, near-eye, mixed reality display apparatus for providing translations of real-world data for a user. The wearer's location and orientation with the apparatus are determined, and input data for translation is selected using sensors of the apparatus. Input data can be audio or visual in nature, and is selected by reference to the wearer's gaze. The input data is translated for the user based on user profile information bearing on the accuracy of a translation, and on a determination from the input data of whether a linguistic translation, knowledge-addition translation, or context translation is useful.
    Type: Grant
    Filed: May 4, 2012
    Date of Patent: December 13, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathryn Stone Perez, John Clavin, Kevin A. Geisner, Stephen G. Latta, Brian J. Mount, Arthur C. Tomlin, Adam G. Poulos
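Grant 9519640 above decides, per selected input, whether a linguistic, knowledge-addition, or context translation is useful. A minimal sketch of that dispatch only; the rules and profile fields are illustrative assumptions, not from the patent:

```python
# Hypothetical sketch: choose which kind of translation to apply to the selected
# input, given simple user-profile information.

def choose_translation(input_language, user_languages, user_expertise, topic):
    if input_language not in user_languages:
        return "linguistic"          # translate between languages
    if topic not in user_expertise:
        return "knowledge_addition"  # add background the wearer lacks
    return "context"                 # rephrase for the current situation

print(choose_translation("fr", user_languages={"en"},
                         user_expertise={"chemistry"}, topic="art_history"))
```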