Patents by Inventor Anthony James Ambrus

Anthony James Ambrus has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180004308
    Abstract: In embodiments of a camera-based input device, the input device includes an inertial measurement unit that collects motion data associated with velocity and acceleration of the input device in an environment, such as in three-dimensional (3D) space. The input device also includes at least two visual light cameras that capture images of the environment. A positioning application is implemented to receive the motion data from the inertial measurement unit, and receive the images of the environment from the at least two visual light cameras. The positioning application can then determine positions of the input device based on the motion data and the images correlated with a map of the environment, and track a motion of the input device in the environment based on the determined positions of the input device.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventors: Daniel Joseph McCulloch, Nicholas Gervase Fajt, Adam G. Poulos, Christopher Douglas Edmonds, Lev Cherkashin, Brent Charles Allen, Constantin Dulu, Muhammad Jabir Kapasi, Michael Grabner, Michael Edward Samples, Cecilia Bong, Miguel Angel Susffalich, Varun Ramesh Mani, Anthony James Ambrus, Arthur C. Tomlin, James Gerard Dack, Jeffrey Alan Kohler, Eric S. Rehmeyer, Edward D. Parker
  • Publication number: 20180005445
    Abstract: In embodiments of augmenting a moveable entity with a hologram, an alternate reality device includes a tracking system that can recognize an entity in an environment and track movement of the entity in the environment. The alternate reality device can also include a detection algorithm implemented to identify the entity recognized by the tracking system based on identifiable characteristics of the entity. A hologram positioning application is implemented to receive motion data from the tracking system, receive entity characteristic data from the detection algorithm, and determine a position and an orientation of the entity in the environment based on the motion data and the entity characteristic data. The hologram positioning application can then generate a hologram that appears associated with the entity as the entity moves in the environment.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Daniel Joseph McCulloch, Nicholas Gervase Fajt, Adam G. Poulos, Christopher Douglas Edmonds, Lev Cherkashin, Brent Charles Allen, Constantin Dulu, Muhammad Jabir Kapasi, Michael Grabner, Michael Edward Samples, Cecilia Bong, Miguel Angel Susffalich, Varun Ramesh Mani, Anthony James Ambrus, Arthur C. Tomlin, James Gerard Dack, Jeffrey Alan Kohler, Eric S. Rehmeyer, Edward D. Parker
  • Publication number: 20170371432
    Abstract: In various embodiments, methods and systems for implementing integrated free space and surface inputs are provided. An integrated free space and surface input system includes a mixed-input pointing device for interacting with and controlling interface objects using free space inputs and surface inputs, trigger buttons, pressure sensors, and haptic feedback associated with the mixed-input pointing device. Free space movement data and surface movement data are tracked and determined for the mixed-input pointing device. An interface input is detected for the mixed-input pointing device transitioning from a first input to a second input, such as from a free space input to a surface input or from the surface input to the free space input. The interface input is processed based on accessing the free space movement data and the surface movement data. An output for the interface input is communicated from the mixed-input pointing device to interact with and control an interface.
    Type: Application
    Filed: June 24, 2016
    Publication date: December 28, 2017
    Inventors: Anatolie Gavriliuc, Shawn Crispin Wright, Jeffrey Alan Kohler, Quentin Simon Charles Miller, Scott Francis Fullam, Sergio Paolantonio, Michael Edward Samples, Anthony James Ambrus
  • Publication number: 20170358138
    Abstract: In various embodiments, methods and systems for rendering augmented reality objects based on user heights are provided. Height data of a user of an augmented reality device can be determined. The height data relates to a viewing perspective from an eye level of the user. Placement data for an augmented reality object is generated based on a constraint configuration that is associated with the augmented reality object for user-height-based rendering. The constraint configuration includes rules that support generating placement data for rendering augmented reality objects based on the user height data. The augmented reality object is rendered based on the placement data. Augmented reality objects are rendered in a real world scene such that the augmented reality object is personalized for each user during an augmented reality experience. In shared experiences, with multiple users viewing a single augmented reality object, the object can be rendered based on a particular user's height.
    Type: Application
    Filed: June 14, 2016
    Publication date: December 14, 2017
    Inventors: James Gerard Dack, Jeffrey Alan Kohler, Shawn Crispin Wright, Anthony James Ambrus
  • Publication number: 20170132839
    Abstract: In various embodiments, computerized methods and systems for identifying object paths to navigate objects in scene-aware device environments are provided. An object path identification mechanism supports identifying object paths. In operation, a guide path for navigating an object from the start point to the end point in a scene-aware device environment is identified. A guide path can be predefined or recorded in real time. A visibility check, such as a look-ahead operation, is performed based on the guide path. Based on performing the visibility check, a path segment to advance the object from the start point towards the end point is determined. The path segment can be optionally modified or refined based on several factors. The object is caused to advance along the path segment. Iteratively performing visibility checks and traverse actions moves the object from the start point to the end point. The path segments define the object path.
    Type: Application
    Filed: November 9, 2015
    Publication date: May 11, 2017
    Inventors: Anthony James Ambrus, Jeffrey Kohler
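The positioning flow described in publication 20180004308 (integrating IMU motion data with camera-derived positions) can be sketched roughly as follows. The function names, the constant blend weight, and the simple dead-reckoning step are illustrative assumptions, not taken from the patent, which does not disclose a specific fusion scheme in its abstract.

```python
import numpy as np

def integrate_imu(position, velocity, acceleration, dt):
    """Advance a 3D position estimate from IMU velocity and acceleration
    over a time step dt (basic dead reckoning)."""
    position = np.asarray(position, dtype=float)
    velocity = np.asarray(velocity, dtype=float)
    acceleration = np.asarray(acceleration, dtype=float)
    new_velocity = velocity + acceleration * dt
    new_position = position + velocity * dt + 0.5 * acceleration * dt ** 2
    return new_position, new_velocity

def fuse_position(imu_position, camera_position, camera_weight=0.8):
    """Blend the dead-reckoned IMU estimate with a camera-derived position
    (e.g., from matching images against a map of the environment).
    The camera fix is weighted more heavily because IMU integration drifts."""
    imu_position = np.asarray(imu_position, dtype=float)
    camera_position = np.asarray(camera_position, dtype=float)
    return camera_weight * camera_position + (1.0 - camera_weight) * imu_position
```

In practice such fusion is usually done with a Kalman or complementary filter rather than a fixed blend weight; the constant weight here just makes the data flow concrete.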
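The input transition described in publication 20170371432 (a mixed-input pointing device moving between free space and surface input) could be modeled minimally as a two-state classifier; the surface-contact signal and all names below are hypothetical stand-ins for whatever sensors the device actually uses.

```python
FREE_SPACE = "free_space"
SURFACE = "surface"

def detect_transition(prev_state: str, surface_contact: bool):
    """Classify the current input mode from a (hypothetical) surface-contact
    signal and report whether a transition from the previous mode occurred."""
    new_state = SURFACE if surface_contact else FREE_SPACE
    return new_state, new_state != prev_state

def process_input(prev_state, surface_contact, free_space_delta, surface_delta):
    """Select which movement data to apply based on the current mode, as the
    abstract describes processing the input by accessing both data streams."""
    state, transitioned = detect_transition(prev_state, surface_contact)
    delta = surface_delta if state == SURFACE else free_space_delta
    return state, transitioned, delta
```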
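The constraint-based placement in publication 20170358138 (generating placement data for an augmented reality object from a user's eye level) can be illustrated with a toy constraint configuration. The specific rules (a vertical offset from eye level and a minimum floor clearance) are invented for illustration; the patent abstract does not enumerate its rules.

```python
from dataclasses import dataclass

@dataclass
class ConstraintConfig:
    """Hypothetical user-height-based rendering rules, in meters."""
    vertical_offset: float = -0.2   # render slightly below eye level
    min_height: float = 0.5         # never render below this floor clearance
    forward_distance: float = 1.5   # distance in front of the user

def placement_height(eye_level: float, cfg: ConstraintConfig) -> float:
    """Apply the constraint rules to the user's eye level and return the
    height at which to render the object for this particular user."""
    return max(eye_level + cfg.vertical_offset, cfg.min_height)
```

In a shared experience, calling `placement_height` with each viewer's own eye level yields the per-user personalization the abstract describes.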
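The iterative look-ahead described in publication 20170132839 (advancing an object along a guide path by repeatedly finding the farthest visible waypoint) can be sketched in 2D. The circular-obstacle visibility test stands in for the patent's scene-aware check, and the greedy look-ahead is one plausible reading of the abstract, not the claimed method.

```python
def visible(a, b, obstacles):
    """True if segment a-b misses every circular obstacle ((cx, cy), radius)."""
    ax, ay = a
    bx, by = b
    for (cx, cy), r in obstacles:
        dx, dy = bx - ax, by - ay
        seg_len_sq = dx * dx + dy * dy
        # Parameter t of the closest point on segment a-b to the center.
        if seg_len_sq == 0:
            t = 0.0
        else:
            t = max(0.0, min(1.0, ((cx - ax) * dx + (cy - ay) * dy) / seg_len_sq))
        px, py = ax + t * dx, ay + t * dy
        if (px - cx) ** 2 + (py - cy) ** 2 < r * r:
            return False
    return True

def advance_along_guide(guide_path, obstacles):
    """From each waypoint, look ahead to the farthest guide-path waypoint
    still visible and jump there; repeat until the end point is reached.
    The resulting jumps are the object path's segments."""
    path = [guide_path[0]]
    i = 0
    while i < len(guide_path) - 1:
        j = i + 1  # fall back to the next waypoint if nothing farther is visible
        for k in range(len(guide_path) - 1, i, -1):
            if visible(guide_path[i], guide_path[k], obstacles):
                j = k
                break
        path.append(guide_path[j])
        i = j
    return path
```

With no obstacles the object jumps straight from start to end; with an obstacle blocking the direct line, intermediate guide waypoints are kept, matching the abstract's iterated visibility-check-then-traverse loop.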