Patents by Inventor Aaron Mackay Burns

Aaron Mackay Burns has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10133345
    Abstract: A field of view of a virtual environment is visually presented via a near-eye display from a first perspective. A begin-adjustment command is received. A peripheral region of the field of view that surrounds a wearer's gaze target is dimmed, via the near-eye display, based on the begin-adjustment command. A navigation command is received. The field of view of the virtual environment is adjusted from the first perspective to a second perspective based on the navigation command. An end-adjustment command is received. The peripheral region of the field of view is brightened, via the near-eye display, based on the end-adjustment command. The field of view of the virtual environment is visually presented, via the near-eye display, from the second perspective.
    Type: Grant
    Filed: March 22, 2016
    Date of Patent: November 20, 2018
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Aaron Mackay Burns, Robert Ferrese, Brent J. Elmer, Timothy D. Kviz, Benjamin John Sugden
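The dim-navigate-brighten flow in patent 10133345's abstract maps naturally onto a small state machine. Below is a minimal Python sketch of one plausible implementation; the `gaze_radius` and `dim_factor` values, the yaw-only navigation, and the normalized pixel coordinates are illustrative assumptions, not details from the patent.

```python
import math

class ComfortNavigator:
    """Dim-navigate-brighten flow for comfortable VR perspective changes."""

    def __init__(self, gaze_radius=0.25, dim_factor=0.3):
        self.gaze_radius = gaze_radius  # normalized radius of the gaze region (assumed)
        self.dim_factor = dim_factor    # peripheral brightness while adjusting (assumed)
        self.adjusting = False
        self.yaw = 0.0                  # current perspective (first -> second)

    def begin_adjustment(self):
        """Begin-adjustment command: dim the periphery around the gaze target."""
        self.adjusting = True

    def navigate(self, delta_yaw):
        """Navigation command: adjust the field of view toward the second perspective."""
        if self.adjusting:
            self.yaw += delta_yaw

    def end_adjustment(self):
        """End-adjustment command: brighten the periphery again."""
        self.adjusting = False

    def pixel_brightness(self, px, py, gaze_x, gaze_y):
        """Brightness multiplier for a pixel at normalized coordinates (px, py)."""
        if self.adjusting and math.hypot(px - gaze_x, py - gaze_y) > self.gaze_radius:
            return self.dim_factor  # peripheral region is dimmed
        return 1.0                  # gaze region stays at full brightness

nav = ComfortNavigator()
nav.begin_adjustment()
nav.navigate(delta_yaw=15.0)
print(nav.pixel_brightness(0.9, 0.9, gaze_x=0.5, gaze_y=0.5))  # 0.3 (dimmed)
nav.end_adjustment()
print(nav.pixel_brightness(0.9, 0.9, gaze_x=0.5, gaze_y=0.5))  # 1.0 (restored)
```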
  • Patent number: 9972119
    Abstract: A method to display a virtual object includes receiving virtual-object data representing a virtual object, a first rendering of the virtual object being displayed on a display screen during a first interval. The method also includes receiving display-screen data indicating a physical size, position, and orientation of the display screen, and receiving display-device data indicating a position and orientation of a near-eye display device. The method includes composing a second, 3D rendering of the virtual object based on the virtual-object data and on the relative position and orientation of the display screen versus the near-eye display device, as indicated by the display-screen and display-device data, and displaying the second rendering binocularly on the near-eye display device during a subsequent second interval, such that the second rendering coincides with the first rendering in an augmented field of view of the user.
    Type: Grant
    Filed: August 11, 2016
    Date of Patent: May 15, 2018
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Donna Long, Matthew Johnson, Aaron Mackay Burns, Benjamin John Sugden, Bryant Hawthorne
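The composition step in patent 9972119's abstract is, at its core, a change of coordinate frames: the object's pose relative to the display screen is re-expressed relative to the near-eye display so the two renderings coincide. A minimal Python/NumPy sketch of that relative-pose computation follows; the yaw-only `pose_matrix` helper and the specific poses are hypothetical simplifications (a real tracker would supply full 6-DoF poses).

```python
import numpy as np

def pose_matrix(position, yaw_deg):
    """4x4 rigid transform from a position and a yaw angle (illustrative only;
    a real system would use full 3-DoF orientation from tracking data)."""
    t = np.radians(yaw_deg)
    m = np.eye(4)
    m[:3, :3] = [[np.cos(t), 0, np.sin(t)],
                 [0, 1, 0],
                 [-np.sin(t), 0, np.cos(t)]]
    m[:3, 3] = position
    return m

# Display-screen data: position and orientation of the display screen (assumed values).
screen_pose = pose_matrix(position=[0.0, 1.2, -2.0], yaw_deg=0.0)

# Display-device data: position and orientation of the near-eye display (assumed values).
device_pose = pose_matrix(position=[0.3, 1.6, 0.0], yaw_deg=-10.0)

# Virtual-object data: the object's pose relative to the screen during the
# first interval (here, sitting at the screen's center).
object_in_screen = np.eye(4)

# Compose the second, 3D rendering: express the object's pose relative to the
# near-eye display so the binocular rendering coincides with the on-screen one.
object_in_device = np.linalg.inv(device_pose) @ screen_pose @ object_in_screen
print(object_in_device[:3, 3])  # object position in headset coordinates
```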
  • Publication number: 20180101986
    Abstract: In various implementations, methods and systems for drawing in a three-dimensional (3D) virtual reality environment are provided. An intersection between a user input and an object associated with a three-dimensional (3D) virtual reality environment is identified. An anchor position is determined for a drawing surface based on the identified intersection. A gaze direction of a user in the 3D virtual reality environment is identified. A drawing surface configuration for the drawing surface with respect to the 3D virtual reality environment is determined based on the gaze direction, where the drawing surface configuration indicates how the drawing surface is defined in the 3D virtual reality environment. The drawing surface is defined in the 3D virtual reality environment at the determined anchor position with the determined drawing surface configuration. A drawing is generated on the drawing surface based on drawing input.
    Type: Application
    Filed: October 10, 2016
    Publication date: April 12, 2018
    Inventors: Aaron Mackay Burns, Donna Katherine Long, Matthew Steven Johnson, Benjamin J. Sugden, Bryant Daniel Hawthorne
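Publication 20180101986's abstract outlines a pipeline: intersect input with an object, anchor a drawing surface there, orient the surface from gaze, then map drawing input onto it. The Python sketch below shows one plausible reading in which the surface is a plane whose normal opposes the gaze direction; that orientation rule and the helper names are assumptions, since the publication leaves the exact surface configuration open.

```python
import numpy as np

def define_drawing_surface(intersection_point, gaze_direction):
    """Anchor a drawing plane at the input/object intersection and orient it
    from the user's gaze (one plausible configuration rule, assumed here)."""
    normal = -np.asarray(gaze_direction, dtype=float)
    normal /= np.linalg.norm(normal)
    return np.asarray(intersection_point, dtype=float), normal

def project_onto_surface(anchor, normal, stroke_point):
    """Project a raw drawing-input point onto the drawing surface."""
    p = np.asarray(stroke_point, dtype=float)
    return p - np.dot(p - anchor, normal) * normal

anchor, normal = define_drawing_surface(
    intersection_point=[0.0, 1.5, -1.0],  # where the user input hit an object
    gaze_direction=[0.0, 0.0, -1.0])      # user looking down -z
print(project_onto_surface(anchor, normal, [0.2, 1.4, -1.3]))
```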
  • Publication number: 20180089131
    Abstract: In various embodiments, methods and systems are provided for detecting a physical configuration of a device based on sensor data from one or more configuration sensors. The physical configuration includes a position of a first display region of the device with respect to a second display region of the device, where the position is physically adjustable. A configuration profile is selected from a plurality of configuration profiles based on the detected physical configuration of the device. Each configuration profile is a representation of at least one respective physical configuration of the device. An interaction mode corresponding to the selected configuration profile is activated, where the interaction mode includes a set of mode input/output (I/O) features available while the interaction mode is active. Device interfaces of the device are managed using at least some mode I/O features in the set of mode I/O features based on the activating of the interaction mode.
    Type: Application
    Filed: September 23, 2016
    Publication date: March 29, 2018
    Inventors: Aaron Mackay Burns, Riccardo Giraldi, Christian Klein, Roger Sebastian Kevin Sylvan, John Benjamin George Hesketh, Scott G. Wade
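Publication 20180089131's abstract describes selecting a configuration profile from the sensed physical configuration and activating the matching interaction mode. Below is a minimal Python sketch, assuming the sensed configuration reduces to a single hinge angle between the two display regions; the profile names, angle ranges, and I/O feature sets are illustrative, not taken from the publication.

```python
from dataclasses import dataclass

@dataclass
class ConfigurationProfile:
    """Represents one physical configuration and its interaction mode."""
    name: str
    min_angle: float     # hinge-angle range this profile covers, in degrees (assumed)
    max_angle: float
    io_features: tuple   # mode I/O features available while the mode is active

# Hypothetical profiles for a dual-display device with an adjustable hinge.
PROFILES = [
    ConfigurationProfile("closed",  0,   10,  ()),
    ConfigurationProfile("laptop",  10,  180, ("keyboard", "touch", "trackpad")),
    ConfigurationProfile("tablet",  180, 360, ("touch", "stylus")),
]

def select_profile(hinge_angle):
    """Select a configuration profile from hinge-sensor data."""
    for profile in PROFILES:
        if profile.min_angle <= hinge_angle < profile.max_angle:
            return profile
    raise ValueError("no profile for angle %r" % hinge_angle)

active = select_profile(hinge_angle=120.0)  # e.g. from a configuration sensor
print(active.name, active.io_features)      # laptop ('keyboard', 'touch', 'trackpad')
```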
  • Publication number: 20180046352
    Abstract: A method for moving a virtual cursor on a virtual reality computing device including a display comprises presenting a virtual cursor at a first screen-space position that occludes a world-space position of a first object, the virtual cursor having a first world-space position based on the first screen-space position and the world-space position of the first object. Based on receiving an input, the method includes moving the virtual cursor from the first screen-space position to a second screen-space position that occludes a world-space position of a second object, the virtual cursor having a second world-space position based on the second screen-space position and the world-space position of the second object. While the virtual cursor is presented at an intermediate screen-space position, the method includes assigning an intermediate world-space position based on the intermediate screen-space position and simulated attractive forces for each of the first and second objects.
    Type: Application
    Filed: August 9, 2016
    Publication date: February 15, 2018
    Inventors: Matthew Johnson, Aaron Mackay Burns, Donna Long, Benjamin John Sugden, Bryant Hawthorne
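The final step of publication 20180046352's abstract, assigning an intermediate world-space position from "simulated attractive forces," can be read as a weighted blend of the two objects' depths, with weights that grow as the cursor nears each object in screen space. The Python sketch below uses an inverse-square force model; that formula, and reducing world-space position to a depth value `world_z`, are assumptions for illustration.

```python
def attract_weight(cursor_xy, object_xy, strength=1.0):
    """Inverse-square 'attractive force' of an object on the cursor in screen
    space (one plausible force model; the application does not prescribe one)."""
    dx = cursor_xy[0] - object_xy[0]
    dy = cursor_xy[1] - object_xy[1]
    return strength / (dx * dx + dy * dy + 1e-6)  # epsilon avoids division by zero

def intermediate_world_position(cursor_xy, first, second):
    """Blend the two objects' world-space depths by their pull on the cursor,
    so the cursor's world position eases from one object to the other."""
    w1 = attract_weight(cursor_xy, first["screen_xy"])
    w2 = attract_weight(cursor_xy, second["screen_xy"])
    z = (w1 * first["world_z"] + w2 * second["world_z"]) / (w1 + w2)
    return cursor_xy[0], cursor_xy[1], z

first = {"screen_xy": (0.2, 0.5), "world_z": -1.0}   # near object
second = {"screen_xy": (0.8, 0.5), "world_z": -5.0}  # far object
# Cursor midway between the two objects in screen space:
print(intermediate_world_position((0.5, 0.5), first, second))  # depth blends to -3.0
```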
  • Publication number: 20180047202
    Abstract: A method to display a virtual object includes receiving virtual-object data representing a virtual object, a first rendering of the virtual object being displayed on a display screen during a first interval. The method also includes receiving display-screen data indicating a physical size, position, and orientation of the display screen, and receiving display-device data indicating a position and orientation of a near-eye display device. The method includes composing a second, 3D rendering of the virtual object based on the virtual-object data and on the relative position and orientation of the display screen versus the near-eye display device, as indicated by the display-screen and display-device data, and displaying the second rendering binocularly on the near-eye display device during a subsequent second interval, such that the second rendering coincides with the first rendering in an augmented field of view of the user.
    Type: Application
    Filed: August 11, 2016
    Publication date: February 15, 2018
    Inventors: Donna Long, Matthew Johnson, Aaron Mackay Burns, Benjamin John Sugden, Bryant Hawthorne
  • Patent number: 9804753
    Abstract: Various embodiments relating to selection of a user interface object displayed on a graphical user interface based on eye gaze are disclosed. In one embodiment, a selection input may be received. A plurality of eye gaze samples at different times within a time window may be evaluated. The time window may be selected based on a time at which the selection input is detected. A user interface object may be selected based on the plurality of eye gaze samples.
    Type: Grant
    Filed: March 20, 2014
    Date of Patent: October 31, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Scott Ramsby, Tony Ambrus, Michael Scavezze, Abby Lin Lee, Brian Mount, Ian Douglas McIntyre, Aaron Mackay Burns, Russ McMackin, Katelyn Elizabeth Doran, Gerhard Schneider, Quentin Simon Charles Miller
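Patent 9804753's abstract suggests a straightforward voting scheme: upon a selection input, look back over a window of gaze samples and pick the object gazed at most often. A minimal Python sketch follows; the 300 ms window and the majority-vote rule are illustrative choices, not claimed specifics.

```python
from collections import Counter

def select_object(gaze_samples, selection_time, window=0.3):
    """Evaluate gaze samples inside a time window ending at the selection
    input and pick the most-hit UI object (window length is assumed)."""
    start = selection_time - window
    hits = Counter(obj for t, obj in gaze_samples
                   if start <= t <= selection_time and obj is not None)
    return hits.most_common(1)[0][0] if hits else None

# (timestamp, object under gaze) pairs, e.g. from an eye tracker at ~60 Hz
samples = [(0.90, "buttonA"), (0.95, "buttonA"), (1.00, "buttonB"),
           (1.05, "buttonA"), (1.10, None), (1.15, "buttonA")]
print(select_object(samples, selection_time=1.15))  # -> buttonA
```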
  • Publication number: 20170277256
    Abstract: A field of view of a virtual environment is visually presented via a near-eye display from a first perspective. A begin-adjustment command is received. A peripheral region of the field of view that surrounds a wearer's gaze target is dimmed, via the near-eye display, based on the begin-adjustment command. A navigation command is received. The field of view of the virtual environment is adjusted from the first perspective to a second perspective based on the navigation command. An end-adjustment command is received. The peripheral region of the field of view is brightened, via the near-eye display, based on the end-adjustment command. The field of view of the virtual environment is visually presented, via the near-eye display, from the second perspective.
    Type: Application
    Filed: March 22, 2016
    Publication date: September 28, 2017
    Inventors: Aaron Mackay Burns, Robert Ferrese, Brent J. Elmer, Timothy D. Kviz, Benjamin John Sugden
  • Patent number: 9563270
    Abstract: A gaze vector of a human subject is translated to a targeting vector that defines focus within a graphical user interface. Sensor data is received from a sensor system indicating pitch angle of a head of the human subject defining the gaze vector. The pitch angle is translated to a scaled pitch angle according to a pitch scaling function that increases amplification of the pitch angle in one or more directions as the pitch angle exceeds a start angle threshold in each of the one or more directions. The scaled pitch angle is output as a component of the targeting vector.
    Type: Grant
    Filed: December 26, 2014
    Date of Patent: February 7, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: James Tichenor, Aaron Mackay Burns, Jamie Marconi, Bharat Ahluwalia, Bede Jordan
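The pitch scaling function in patent 9563270's abstract passes small head pitches through unchanged and amplifies them once a start-angle threshold is exceeded. The Python sketch below implements one plausible piecewise form; the 10-degree threshold and 2x gain beyond it are assumed values (the abstract requires only that amplification increase as pitch exceeds the threshold, which this form satisfies since the overall scaling ratio grows with pitch).

```python
def scale_pitch(pitch, start_angle=10.0, gain=2.0):
    """Pitch scaling function: pass head pitch through unchanged below the
    start-angle threshold, then amplify the excess so a small extra head
    movement sweeps more of the interface (threshold and gain are assumed)."""
    if abs(pitch) <= start_angle:
        return pitch
    excess = abs(pitch) - start_angle
    sign = 1.0 if pitch > 0 else -1.0
    return sign * (start_angle + gain * excess)

for p in (5.0, 10.0, 20.0, -30.0):
    print(p, "->", scale_pitch(p))
# 5.0 -> 5.0, 10.0 -> 10.0, 20.0 -> 30.0, -30.0 -> -50.0
```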
  • Publication number: 20160187971
    Abstract: A gaze vector of a human subject is translated to a targeting vector that defines focus within a graphical user interface. Sensor data is received from a sensor system indicating pitch angle of a head of the human subject defining the gaze vector. The pitch angle is translated to a scaled pitch angle according to a pitch scaling function that increases amplification of the pitch angle in one or more directions as the pitch angle exceeds a start angle threshold in each of the one or more directions. The scaled pitch angle is output as a component of the targeting vector.
    Type: Application
    Filed: December 26, 2014
    Publication date: June 30, 2016
    Inventors: James Tichenor, Aaron Mackay Burns, Jamie Marconi, Bharat Ahluwalia, Bede Jordan
  • Publication number: 20150268821
    Abstract: Various embodiments relating to selection of a user interface object displayed on a graphical user interface based on eye gaze are disclosed. In one embodiment, a selection input may be received. A plurality of eye gaze samples at different times within a time window may be evaluated. The time window may be selected based on a time at which the selection input is detected. A user interface object may be selected based on the plurality of eye gaze samples.
    Type: Application
    Filed: March 20, 2014
    Publication date: September 24, 2015
    Inventors: Scott Ramsby, Tony Ambrus, Michael Scavezze, Abby Lin Lee, Brian Mount, Ian Douglas McIntyre, Aaron Mackay Burns, Russ McMackin, Katelyn Elizabeth Doran, Gerhard Schneider, Quentin Simon Charles Miller