Patents by Inventor Anthony James Ambrus

Anthony James Ambrus has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11960790
    Abstract: A computer implemented method includes detecting user interaction with mixed reality displayed content in a mixed reality system. User focus is determined as a function of the user interaction using a spatial intent model. A length of time for extending voice engagement with the mixed reality system is modified based on the determined user focus. Detecting user interaction with the displayed content may include tracking eye movements to determine objects in the displayed content at which the user is looking and determining a context of a user dialog during the voice engagement.
    Type: Grant
    Filed: May 27, 2021
    Date of Patent: April 16, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Jonathan Kyle Palmer, Anthony James Ambrus, Mathew J. Lamb, Sheng Kai Tang, Sophie Stellmach
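As an illustrative aside, the idea in the abstract above (lengthening a voice-engagement window when gaze indicates focus) can be sketched in Python. All names, thresholds, and the region-based focus test here are hypothetical, not the patented implementation:

```python
BASE_TIMEOUT_S = 5.0       # default listening window after the last utterance
FOCUS_EXTENSION_S = 10.0   # extended window while the user focuses on content

def gaze_focus_score(gaze_samples, target_region):
    """Fraction of recent gaze samples falling inside the target region,
    given as (x_min, y_min, x_max, y_max)."""
    if not gaze_samples:
        return 0.0
    hits = sum(1 for x, y in gaze_samples
               if target_region[0] <= x <= target_region[2]
               and target_region[1] <= y <= target_region[3])
    return hits / len(gaze_samples)

def voice_engagement_timeout(gaze_samples, target_region, focus_threshold=0.6):
    """Lengthen the voice-engagement window when gaze indicates focus."""
    if gaze_focus_score(gaze_samples, target_region) >= focus_threshold:
        return FOCUS_EXTENSION_S
    return BASE_TIMEOUT_S
```

A real spatial intent model would weigh dialog context as well; this sketch reduces "focus" to a gaze-hit ratio for brevity.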
  • Patent number: 11824821
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Grant
    Filed: November 1, 2022
    Date of Patent: November 21, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
  • Patent number: 11656679
    Abstract: Examples are disclosed that relate to image reprojection. One example provides a method, comprising receiving a first rendered image comprising content associated with a viewer reference frame, receiving a second rendered image comprising content associated with a manipulator reference frame, and reprojecting the first rendered image based on a head pose of a user to thereby produce a first reprojected image. The method further comprises reprojecting the second rendered image based on the head pose of the user and a pose of the manipulator to thereby produce a second reprojected image, and outputting the first reprojected image and the second reprojected image for display as a composited image.
    Type: Grant
    Filed: August 27, 2020
    Date of Patent: May 23, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Benjamin Markus Thaut, Anthony James Ambrus
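The two-layer reprojection described above can be illustrated with a toy 2-D model. Poses are reduced to translation deltas and pixels to (x, y, color) tuples; these simplifications and all function names are assumptions for illustration only:

```python
def apply_pose(point, pose):
    """Apply a 2-D pose, reduced here to a (dx, dy) translation, to a point."""
    return (point[0] + pose[0], point[1] + pose[1])

def reproject(pixels, pose):
    """Shift every (x, y, color) pixel of a rendered layer by a pose delta."""
    return [(*apply_pose((x, y), pose), c) for x, y, c in pixels]

def composite(viewer_layer, manipulator_layer, head_delta, manipulator_delta):
    """Reproject the viewer layer by the head-pose delta, the manipulator
    layer by head and manipulator deltas combined, then overlay the second
    layer on the first to form the composited image."""
    a = reproject(viewer_layer, head_delta)
    combined = (head_delta[0] + manipulator_delta[0],
                head_delta[1] + manipulator_delta[1])
    b = reproject(manipulator_layer, combined)
    frame = {(x, y): c for x, y, c in a}
    frame.update({(x, y): c for x, y, c in b})   # manipulator layer on top
    return frame
```

The point of reprojecting the layers separately is that manipulator-attached content tracks both the head and the manipulator, while viewer-attached content tracks the head alone.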
  • Patent number: 11630509
    Abstract: This disclosure relates to displaying a user interface for a computing device based upon a user intent determined via a spatial intent model. One example provides a computing device comprising a see-through display, a logic subsystem, and a storage subsystem. The storage subsystem comprises instructions executable by the logic subsystem to receive, via an eye-tracking sensor, eye tracking samples each corresponding to a gaze direction of a user, based at least on the eye tracking samples, determine a time-dependent attention value for a location in a field of view of the see-through display, based at least on the time-dependent attention value for the location, determine an intent of the user to interact with a user interface associated with the location that is at least partially hidden from a current view, and in response to determining the intent, display via the see-through display the user interface.
    Type: Grant
    Filed: December 11, 2020
    Date of Patent: April 18, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Anthony James Ambrus, Sheng Kai Tang, Keiichi Matsuda, Aleksandar Josic
  • Publication number: 20230111597
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Application
    Filed: November 1, 2022
    Publication date: April 13, 2023
    Inventors: Austin Seungmin LEE, Amy Mun HONG, Keiichi MATSUDA, Anthony James AMBRUS, Mathew Julian LAMB, Kenneth Mitchell JAKUBZAK
  • Publication number: 20220382510
    Abstract: A computer implemented method includes detecting user interaction with mixed reality displayed content in a mixed reality system. User focus is determined as a function of the user interaction using a spatial intent model. A length of time for extending voice engagement with the mixed reality system is modified based on the determined user focus. Detecting user interaction with the displayed content may include tracking eye movements to determine objects in the displayed content at which the user is looking and determining a context of a user dialog during the voice engagement.
    Type: Application
    Filed: May 27, 2021
    Publication date: December 1, 2022
    Inventors: Austin S. LEE, Jonathan Kyle PALMER, Anthony James AMBRUS, Mathew J. LAMB, Sheng Kai TANG, Sophie STELLMACH
  • Patent number: 11509612
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Grant
    Filed: December 15, 2020
    Date of Patent: November 22, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin Seungmin Lee, Amy Mun Hong, Keiichi Matsuda, Anthony James Ambrus, Mathew Julian Lamb, Kenneth Mitchell Jakubzak
  • Patent number: 11487467
    Abstract: Rapid writing to and reading from even very large amounts of data is enabled, especially where the data evolves slowly over time. For each of a sequence of commits of the data, the data is represented by identifying pages of the data that have changed since the prior commit in the sequence of commits. A sparse file is formulated for the commit, and contains each of the identified pages and, for each identified page, a mapping of the identified page to a page address in the address range. The sparse file is then stored as associated with the corresponding commit. Thus, an ordered sequence of sparse files can be created and layered on top of a base file that represents the entire page address range. The sparse files may be quite small, as there may be relatively few pages (or perhaps even no pages) that changed since the prior commit in the sequence of commits. Reads occur by creating a sparse in-memory object and checking for each page at each sparse file layer.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: November 1, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anthony James Ambrus, Logan James Buesching
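The layered sparse-commit scheme described above can be sketched with in-memory dictionaries standing in for the base file and the per-commit sparse files. The class and method names are hypothetical, not the patented on-disk format:

```python
class CommitLog:
    """Layered sparse commits over a base page array (illustrative sketch)."""

    def __init__(self, base_pages):
        self.base = dict(enumerate(base_pages))  # full page-address range
        self.layers = []                         # one sparse mapping per commit

    def commit(self, changed_pages):
        """Record only the pages (address -> contents) changed since the
        previous commit; unchanged pages are not duplicated."""
        self.layers.append(dict(changed_pages))

    def read(self, page_addr):
        """Check sparse layers newest-first, falling back to the base file."""
        for layer in reversed(self.layers):
            if page_addr in layer:
                return layer[page_addr]
        return self.base[page_addr]
```

Because each commit stores only its delta, a long history of slowly evolving data stays compact, and any historical version can be reconstructed by stopping the layer scan at the desired commit.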
  • Patent number: 11429186
    Abstract: One example provides a computing device comprising instructions executable to receive information regarding one or more entities in the scene, to receive a plurality of eye tracking samples, each eye tracking sample corresponding to a gaze direction of a user and, based at least on the eye tracking samples, determine a time-dependent attention value for each entity of the one or more entities at different locations in a use environment, the time-dependent attention value determined using a leaky integrator. The instructions are further executable to receive a user input indicating an intent to perform a location-dependent action, associate the user input with a selected entity based at least upon the time-dependent attention value for each entity, and perform the location-dependent action based at least upon a location of the selected entity.
    Type: Grant
    Filed: November 18, 2020
    Date of Patent: August 30, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Mathew J. Lamb, Anthony James Ambrus, Amy Mun Hong, Jonathan Palmer, Sophie Stellmach
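A leaky integrator, as named in the abstract above, accumulates a signal while letting it decay exponentially over time, giving attention values some inertia. A minimal Python sketch (time constant, sample rate, and function names are all assumptions, not values from the patent):

```python
import math

def update_attention(prev_value, gazed_at, dt, tau=0.5):
    """Leaky-integrator update: attention decays exponentially with time
    constant tau, and is driven toward 1.0 while the entity is gazed at."""
    decay = math.exp(-dt / tau)
    target = 1.0 if gazed_at else 0.0
    return target + (prev_value - target) * decay

def attended_entity(samples, entities, dt=0.02, tau=0.5):
    """Return the entity with the highest time-dependent attention value
    after integrating a stream of per-sample gaze targets."""
    attention = {e: 0.0 for e in entities}
    for gazed in samples:                 # each sample names the gazed entity
        for e in entities:
            attention[e] = update_attention(attention[e], e == gazed, dt, tau)
    return max(attention, key=attention.get)
```

The inertia is the useful property: a brief glance at another entity does not immediately overturn the accumulated attention on the entity the user has been looking at, which makes associating a voice command with a target more robust.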
  • Publication number: 20220187907
    Abstract: This disclosure relates to displaying a user interface for a computing device based upon a user intent determined via a spatial intent model. One example provides a computing device comprising a see-through display, a logic subsystem, and a storage subsystem. The storage subsystem comprises instructions executable by the logic subsystem to receive, via an eye-tracking sensor, eye tracking samples each corresponding to a gaze direction of a user, based at least on the eye tracking samples, determine a time-dependent attention value for a location in a field of view of the see-through display, based at least on the time-dependent attention value for the location, determine an intent of the user to interact with a user interface associated with the location that is at least partially hidden from a current view, and in response to determining the intent, display via the see-through display the user interface.
    Type: Application
    Filed: December 11, 2020
    Publication date: June 16, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Austin S. LEE, Anthony James AMBRUS, Sheng Kai TANG, Keiichi MATSUDA, Aleksandar JOSIC
  • Publication number: 20220191157
    Abstract: Systems and methods are provided for facilitating the presentation of expressive intent and other status information with messaging and other communication applications. The expressive intent is based on expressive effect data associated with the message recipients and/or message senders. The expressive intent can be conveyed through avatars and modified message content. The avatars convey gestures, emotions and other status information and the presentation of the avatars can be reactive to detected state information of the message recipient(s), message sender(s) and/or corresponding messaging device(s).
    Type: Application
    Filed: December 15, 2020
    Publication date: June 16, 2022
    Inventors: Austin Seungmin LEE, Amy Mun HONG, Keiichi MATSUDA, Anthony James AMBRUS, Mathew Julian LAMB, Kenneth Mitchell JAKUBZAK
  • Publication number: 20220155857
    Abstract: One example provides a computing device comprising instructions executable to receive information regarding one or more entities in the scene, to receive a plurality of eye tracking samples, each eye tracking sample corresponding to a gaze direction of a user and, based at least on the eye tracking samples, determine a time-dependent attention value for each entity of the one or more entities at different locations in a use environment, the time-dependent attention value determined using a leaky integrator. The instructions are further executable to receive a user input indicating an intent to perform a location-dependent action, associate the user input with a selected entity based at least upon the time-dependent attention value for each entity, and perform the location-dependent action based at least upon a location of the selected entity.
    Type: Application
    Filed: November 18, 2020
    Publication date: May 19, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Austin S. LEE, Mathew J. LAMB, Anthony James AMBRUS, Amy Mun HONG, Jonathan PALMER, Sophie STELLMACH
  • Patent number: 11270672
    Abstract: Examples are disclosed herein relating to displaying a virtual assistant. One example provides an augmented reality display device comprising a see-through display, a logic subsystem, and a storage subsystem storing instructions executable by the logic subsystem to display via the see-through display a virtual assistant associated with a location in a real-world environment, detect a change in a field of view of the see-through display, and when the virtual assistant is out of the field of view of the see-through display after the change in the field of view, display the virtual assistant in a virtual window on the see-through display.
    Type: Grant
    Filed: November 2, 2020
    Date of Patent: March 8, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Anthony James Ambrus, Mathew Julian Lamb, Sophie Stellmach, Keiichi Matsuda
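The fallback behavior described above (world-anchored assistant while in view, screen-fixed virtual window otherwise) reduces to a field-of-view containment test. A toy sketch with axis-aligned bounds; the function name and return convention are hypothetical:

```python
def assistant_display(assistant_pos, fov_min, fov_max):
    """Show the assistant at its world anchor while it is inside the display's
    field of view; otherwise fall back to a screen-fixed virtual window."""
    in_view = all(lo <= p <= hi
                  for p, lo, hi in zip(assistant_pos, fov_min, fov_max))
    if in_view:
        return ("world", assistant_pos)
    return ("window", None)  # render in a virtual window on the display
```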
  • Publication number: 20220066546
    Abstract: Examples are disclosed that relate to image reprojection. One example provides a method, comprising receiving a first rendered image comprising content associated with a viewer reference frame, receiving a second rendered image comprising content associated with a manipulator reference frame, and reprojecting the first rendered image based on a head pose of a user to thereby produce a first reprojected image. The method further comprises reprojecting the second rendered image based on the head pose of the user and a pose of the manipulator to thereby produce a second reprojected image, and outputting the first reprojected image and the second reprojected image for display as a composited image.
    Type: Application
    Filed: August 27, 2020
    Publication date: March 3, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Benjamin Markus THAUT, Anthony James AMBRUS
  • Patent number: 10769856
    Abstract: In various embodiments, methods and systems for rendering augmented reality objects based on user heights are provided. Height data of a user of an augmented reality device can be determined. The height data relates to a viewing perspective from an eye level of the user. Placement data for a first augmented reality object is generated based on the user height data. The first augmented reality object is rendered based on the user height data, and a second augmented reality object is excluded from rendering based on the user height data.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: September 8, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James Gerard Dack, Jeffrey Alan Kohler, Shawn Crispin Wright, Anthony James Ambrus
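The render-or-exclude decision described above can be sketched as a filter over candidate objects, each carrying a viewable height band. The tuple layout and function name are illustrative assumptions, not the patented placement rules:

```python
def plan_render(objects, user_eye_height_m):
    """Place each object at the user's eye level, and exclude any object
    whose configured height band does not contain that eye level.

    objects: iterable of (name, min_height_m, max_height_m) tuples.
    Returns (placed, excluded): placed objects with their render heights,
    and the names of objects excluded from rendering.
    """
    placed, excluded = [], []
    for name, min_h, max_h in objects:
        if min_h <= user_eye_height_m <= max_h:
            placed.append((name, user_eye_height_m))  # anchor at eye level
        else:
            excluded.append(name)
    return placed, excluded
```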
  • Patent number: 10620717
    Abstract: In embodiments of a camera-based input device, the input device includes an inertial measurement unit that collects motion data associated with velocity and acceleration of the input device in an environment, such as in three-dimensional (3D) space. The input device also includes at least two visual light cameras that capture images of the environment. A positioning application is implemented to receive the motion data from the inertial measurement unit, and receive the images of the environment from the at least two visual light cameras. The positioning application can then determine positions of the input device based on the motion data and the images correlated with a map of the environment, and track a motion of the input device in the environment based on the determined positions of the input device.
    Type: Grant
    Filed: June 30, 2016
    Date of Patent: April 14, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Daniel Joseph McCulloch, Nicholas Gervase Fajt, Adam G. Poulos, Christopher Douglas Edmonds, Lev Cherkashin, Brent Charles Allen, Constantin Dulu, Muhammad Jabir Kapasi, Michael Grabner, Michael Edward Samples, Cecilia Bong, Miguel Angel Susffalich, Varun Ramesh Mani, Anthony James Ambrus, Arthur C. Tomlin, James Gerard Dack, Jeffrey Alan Kohler, Eric S. Rehmeyer, Edward D. Parker
  • Publication number: 20190088029
    Abstract: In various embodiments, methods and systems for rendering augmented reality objects based on user heights are provided. Height data of a user of an augmented reality device can be determined. The height data relates to a viewing perspective from an eye level of the user. Placement data for a first augmented reality object is generated based on the user height data. The first augmented reality object is rendered based on the user height data, and a second augmented reality object is excluded from rendering based on the user height data.
    Type: Application
    Filed: November 19, 2018
    Publication date: March 21, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: James Gerard Dack, Jeffrey Alan Kohler, Shawn Crispin Wright, Anthony James Ambrus
  • Patent number: 10203781
    Abstract: In various embodiments, methods and systems for implementing integrated free space and surface inputs are provided. An integrated free space and surface input system includes a mixed-input pointing device for interacting and controlling interface objects using free space inputs and surface inputs, trigger buttons, pressure sensors, and haptic feedback associated with the mixed-input pointing device. Free space movement data and surface movement data are tracked and determined for the mixed-input pointing device. An interface input is detected for the mixed-input pointing device transitioning from a first input to a second input, such as from a free space input to a surface input or from a surface input to a free space input. The interface input is processed based on accessing the free space movement data and the surface movement data. An output for the interface input is communicated from the mixed-input pointing device to interact and control an interface.
    Type: Grant
    Filed: June 24, 2016
    Date of Patent: February 12, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anatolie Gavriliuc, Shawn Crispin Wright, Jeffrey Alan Kohler, Quentin Simon Charles Miller, Scott Francis Fullam, Sergio Paolantonio, Michael Edward Samples, Anthony James Ambrus
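Detecting the free-space/surface transition described above can be reduced to watching a mode classifier change state over a sensor stream. Here the classifier is a simple pressure threshold; the threshold value and all names are hypothetical:

```python
def classify_input(pressure, surface_threshold=0.05):
    """Classify a mixed-input pointer sample: surface contact when the tip
    pressure sensor reads above threshold, free-space otherwise."""
    return "surface" if pressure > surface_threshold else "free_space"

def detect_transitions(pressure_stream):
    """Emit (sample_index, from_mode, to_mode) each time the pointer
    transitions between free-space and surface input."""
    transitions = []
    prev = None
    for i, p in enumerate(pressure_stream):
        mode = classify_input(p)
        if prev is not None and mode != prev:
            transitions.append((i, prev, mode))
        prev = mode
    return transitions
```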
  • Patent number: 10134190
    Abstract: In various embodiments, methods and systems for rendering augmented reality objects based on user heights are provided. Height data of a user of an augmented reality device can be determined. The height data relates to a viewing perspective from an eye level of the user. Placement data for an augmented reality object is generated based on a constraint configuration that is associated with the augmented reality object for user-height-based rendering. The constraint configuration includes rules that support generating placement data for rendering augmented reality objects based on the user height data. The augmented reality object is rendered based on the placement data. Augmented reality objects are rendered in a real-world scene such that the augmented reality object is personalized for each user during an augmented reality experience. In shared experiences, with multiple users viewing a single augmented reality object, the object can be rendered based on a particular user's height.
    Type: Grant
    Filed: June 14, 2016
    Date of Patent: November 20, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James Gerard Dack, Jeffrey Alan Kohler, Shawn Crispin Wright, Anthony James Ambrus
  • Patent number: 9928648
    Abstract: In various embodiments, computerized methods and systems for identifying object paths to navigate objects in scene-aware device environments are provided. An object path identification mechanism supports identifying object paths. In operation, a guide path for navigating an object from the start point to the end point in a scene-aware device environment is identified. A guide path can be predefined or recorded in real time. A visibility check, such as a look-ahead operation, is performed based on the guide path. Based on performing the visibility check, a path segment to advance the object from the start point towards the end point is determined. The path segment can be optionally modified or refined based on several factors. The object is caused to advance along the path segment. Iteratively performing visibility checks and traversal actions moves the object from the start point to the end point. The path segments define the object path.
    Type: Grant
    Filed: November 9, 2015
    Date of Patent: March 27, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anthony James Ambrus, Jeffrey Kohler
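The iterate-look-ahead-advance loop described in that last abstract can be sketched on a grid. The visibility check here is deliberately crude (axis-aligned sight lines only), and all names are illustrative assumptions rather than the patented mechanism:

```python
def visible(a, b, blocked):
    """Grid visibility check: b is visible from a if no blocked cell lies on
    the straight segment between them (axis-aligned segments only in this
    sketch; any diagonal is conservatively treated as not visible)."""
    if a[0] == b[0]:
        step = 1 if b[1] > a[1] else -1
        cells = [(a[0], y) for y in range(a[1] + step, b[1] + step, step)]
    elif a[1] == b[1]:
        step = 1 if b[0] > a[0] else -1
        cells = [(x, a[1]) for x in range(a[0] + step, b[0] + step, step)]
    else:
        return False
    return not any(c in blocked for c in cells)

def follow_guide_path(start, guide_path, blocked):
    """Advance to the farthest guide-path waypoint visible from the current
    position (the look-ahead), repeating until the end point is reached.
    Returns the resulting path segments' endpoints."""
    pos, segments = start, []
    remaining = list(guide_path)
    while remaining:
        idx = max((i for i, w in enumerate(remaining) if visible(pos, w, blocked)),
                  default=0)   # fall back to the next waypoint if none is visible
        pos = remaining[idx]
        segments.append(pos)
        remaining = remaining[idx + 1:]
    return segments
```

The look-ahead is what shortens the path: when several consecutive waypoints are all visible, the object skips straight to the farthest one instead of visiting each in turn.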