Patents by Inventor Benjamin J. Sugden

Benjamin J. Sugden has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180101986
    Abstract: In various implementations, methods and systems for drawing in a three-dimensional (3D) virtual reality environment are provided. An intersection between a user input and an object associated with the 3D virtual reality environment is identified. An anchor position is determined for a drawing surface based on the identified intersection. A gaze direction of a user in the 3D virtual reality environment is identified. A drawing surface configuration for the drawing surface with respect to the 3D virtual reality environment is determined based on the gaze direction, where the drawing surface configuration indicates how the drawing surface is defined in the 3D virtual reality environment. The drawing surface is defined in the 3D virtual reality environment at the determined anchor position with the determined drawing surface configuration. A drawing is generated on the drawing surface based on drawing input. (A short illustrative sketch follows this entry.)
    Type: Application
    Filed: October 10, 2016
    Publication date: April 12, 2018
    Inventors: Aaron Mackay Burns, Donna Katherine Long, Matthew Steven Johnson, Benjamin J. Sugden, Bryant Daniel Hawthorne
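
The abstract above (publication 20180101986) describes anchoring a drawing surface where user input intersects an object and orienting that surface from the user's gaze. The sketch below only illustrates that general idea; the ray-plane intersection standing in for "an object," the NumPy helpers, and every coordinate are assumptions for illustration, not the patented implementation.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where a ray meets a plane, or None if it misses."""
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-6:
        return None                      # ray is parallel to the plane
    t = np.dot(plane_point - origin, plane_normal) / denom
    return origin + t * direction if t >= 0 else None

def make_drawing_surface(anchor, gaze_direction):
    """Anchor a drawing plane at the intersection, facing back along the gaze."""
    normal = -gaze_direction / np.linalg.norm(gaze_direction)
    return {"anchor": anchor, "normal": normal}

def project_stroke(surface, stroke_points):
    """Flatten raw 3D drawing input onto the drawing surface."""
    a, n = surface["anchor"], surface["normal"]
    return [p - np.dot(p - a, n) * n for p in stroke_points]

# Hypothetical scene: a controller ray from eye height hits an object face at z = 2 m.
anchor = ray_plane_intersection(
    origin=np.array([0.0, 1.6, 0.0]),
    direction=np.array([0.0, 0.0, 1.0]),
    plane_point=np.array([0.0, 0.0, 2.0]),
    plane_normal=np.array([0.0, 0.0, -1.0]),
)
surface = make_drawing_surface(anchor, gaze_direction=np.array([0.1, -0.1, 1.0]))
stroke = project_stroke(surface, [np.array([0.2, 1.5, 2.3]), np.array([0.3, 1.4, 2.4])])
print(anchor, stroke)
```
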
  • Patent number: 9940720
    Abstract: Camera and sensor augmented reality techniques are described. In one or more implementations, sensor data is obtained from a sensor of a hardware device that is located in an environment, such as three-dimensional (3D) space. Images of the environment are captured with at least one camera of the hardware device. A position of the hardware device in the environment can then be determined based on at least one of the sensor data and the images of the environment. Further, an orientation of the hardware device in the environment can be determined based on at least one of the sensor data and the images of the environment. (A short illustrative sketch follows this entry.)
    Type: Grant
    Filed: May 18, 2016
    Date of Patent: April 10, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Benjamin J. Sugden
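
Patent 9940720 above describes determining a device's position and orientation from sensor data and camera images. As one possible illustration, the sketch below blends the two estimates with a simple complementary-style weighting; the Pose fields, the single yaw angle, and the fixed weight are all assumptions, not the fusion the patent claims.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple      # (x, y, z) in metres
    yaw_deg: float       # heading reduced to one angle for brevity

def fuse(sensor: Pose, camera: Pose, camera_weight: float = 0.3) -> Pose:
    """Blend a high-rate sensor estimate with a lower-rate camera estimate:
    the camera corrects slow drift, the sensor supplies smooth motion."""
    w = camera_weight
    position = tuple((1 - w) * s + w * c
                     for s, c in zip(sensor.position, camera.position))
    return Pose(position, (1 - w) * sensor.yaw_deg + w * camera.yaw_deg)

# The inertial estimate has drifted slightly; the camera estimate pulls it back.
sensor_estimate = Pose((1.02, 0.00, 2.05), yaw_deg=31.0)
camera_estimate = Pose((1.00, 0.00, 2.00), yaw_deg=30.0)
print(fuse(sensor_estimate, camera_estimate))
```
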
  • Patent number: 9613463
    Abstract: Augmented reality extrapolation techniques are described. In one or more implementations, a frame of an augmented-reality display is rendered based at least in part on an optical basis that describes a current orientation or position of at least a part of a computing device. While the frame is rendered, an extrapolation based on a previous basis and a sensor basis generates an updated optical basis that describes a likely orientation or position of the part of the computing device, and the extrapolation is effective to account for a lag time duration between rendering the frame and displaying the frame of the augmented-reality display. The rendered frame of the augmented-reality display is updated before the rendered frame is displayed based at least in part on the updated optical basis that describes the likely orientation or position of the part of the computing device. (A short illustrative sketch follows this entry.)
    Type: Grant
    Filed: January 8, 2016
    Date of Patent: April 4, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Benjamin J. Sugden
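
Patent 9613463 above describes extrapolating an updated optical basis while a frame renders, so the finished frame can be corrected for render-to-display lag before it is shown. The sketch below illustrates that idea with a single yaw angle and a constant-rate prediction; the function names, the pixels-per-degree figure, and the timing values are illustrative assumptions, not the patented algorithm.

```python
def extrapolate_yaw(previous_yaw_deg, sensor_yaw_deg, dt_s, lag_s):
    """Predict the yaw at display time from the recent rate of rotation."""
    yaw_rate = (sensor_yaw_deg - previous_yaw_deg) / dt_s        # degrees per second
    return sensor_yaw_deg + yaw_rate * lag_s

def late_correction_px(render_yaw_deg, predicted_yaw_deg, pixels_per_degree=20.0):
    """Horizontal shift that reconciles the already-rendered frame with the prediction."""
    return (predicted_yaw_deg - render_yaw_deg) * pixels_per_degree

# The frame was rendered at yaw 30 degrees, the head keeps turning, and the
# display lights up roughly 16 ms after rendering finished.
render_yaw = 30.0
predicted = extrapolate_yaw(previous_yaw_deg=29.5, sensor_yaw_deg=30.0,
                            dt_s=0.010, lag_s=0.016)
print(predicted, late_correction_px(render_yaw, predicted))    # 30.8, 16-pixel shift
```
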
  • Publication number: 20160267662
    Abstract: Camera and sensor augmented reality techniques are described. In one or more implementations, sensor data is obtained from a sensor of a hardware device that is located in an environment, such as three-dimensional (3D) space. Images of the environment are captured with at least one camera of the hardware device. A position of the hardware device in the environment can then be determined based on at least one of the sensor data and the images of the environment. Further, an orientation of the hardware device in the environment can be determined based on at least one of the sensor data and the images of the environment.
    Type: Application
    Filed: May 18, 2016
    Publication date: September 15, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventor: Benjamin J. Sugden
  • Patent number: 9355452
    Abstract: Camera and sensor augmented reality techniques are described. In one or more implementations, an optical basis is obtained that was generated from data obtained by a camera of a computing device and a sensor basis is obtained that was generated from data obtained from one or more sensors that are not a camera. The optical basis and the sensor basis describe a likely orientation or position of the camera and the one or more sensors, respectively, in a physical environment. The optical basis and the sensor basis are compared to verify the orientation or the position of the computing device in the physical environment. (A short illustrative sketch follows this entry.)
    Type: Grant
    Filed: December 19, 2014
    Date of Patent: May 31, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Benjamin J. Sugden
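
Patent 9355452 above describes comparing an optical basis against a sensor basis to verify the device's pose. The sketch below shows one way such a cross-check could look; the tolerance values and the (position, yaw) representation are assumptions, not the patent's verification logic.

```python
import math

def bases_agree(optical, sensor, max_position_m=0.05, max_yaw_deg=2.0):
    """optical and sensor are ((x, y, z), yaw_deg) tuples describing the same pose."""
    (opt_pos, opt_yaw), (sen_pos, sen_yaw) = optical, sensor
    return (math.dist(opt_pos, sen_pos) <= max_position_m
            and abs(opt_yaw - sen_yaw) <= max_yaw_deg)

optical_basis = ((1.00, 0.00, 2.00), 30.0)
sensor_basis = ((1.03, 0.00, 2.01), 30.8)
print("pose verified" if bases_agree(optical_basis, sensor_basis)
      else "mismatch: re-check tracking")
```
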
  • Publication number: 20160125658
    Abstract: Augmented reality extrapolation techniques are described. In one or more implementations, a frame of an augmented-reality display is rendered based at least in part on an optical basis that describes a current orientation or position of at least a part of a computing device. While the frame is rendered, an extrapolation based on a previous basis and a sensor basis generates an updated optical basis that describes a likely orientation or position of the part of the computing device, and the extrapolation is effective to account for a lag time duration between rendering the frame and displaying the frame of the augmented-reality display. The rendered frame of the augmented-reality display is updated before the rendered frame is displayed based at least in part on the updated optical basis that describes the likely orientation or position of the part of the computing device.
    Type: Application
    Filed: January 8, 2016
    Publication date: May 5, 2016
    Inventor: Benjamin J. Sugden
  • Patent number: 9262950
    Abstract: Augmented reality extrapolation techniques are described. In one or more implementations, an augmented-reality display is rendered based at least in part on a first basis that describes a likely orientation or position of at least a part of a computing device. The rendered augmented-reality display is updated based at least in part on data that describes a likely orientation or position of the part of the computing device that was assumed during the rendering of the augmented-reality display. (A short illustrative sketch follows this entry.)
    Type: Grant
    Filed: April 20, 2011
    Date of Patent: February 16, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventor: Benjamin J. Sugden
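
Patent 9262950 above describes rendering from one basis and then updating the rendered display using the basis that was assumed during rendering. The sketch below illustrates that pairing of a frame with its render-time pose; the RenderedFrame structure and the pixel-shift correction are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class RenderedFrame:
    image_id: int
    render_yaw_deg: float     # the pose that was assumed while this frame was rendered

def update_before_display(frame: RenderedFrame, latest_yaw_deg: float,
                          pixels_per_degree: float = 20.0) -> float:
    """Shift (in pixels) that reconciles the stored render pose with the newest estimate."""
    return (latest_yaw_deg - frame.render_yaw_deg) * pixels_per_degree

frame = RenderedFrame(image_id=42, render_yaw_deg=30.0)
print(update_before_display(frame, latest_yaw_deg=30.6))      # 12-pixel correction
```
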
  • Publication number: 20150103098
    Abstract: Camera and sensor augmented reality techniques are described. In one or more implementations, an optical basis is obtained that was generated from data obtained by a camera of a computing device and a sensor basis is obtained that was generated from data obtained from one or more sensors that are not a camera. The optical basis and the sensor basis describe a likely orientation or position of the camera and the one or more sensors, respectively, in a physical environment. The optical basis and the sensor basis are compared to verify the orientation or the position of the computing device in the physical environment.
    Type: Application
    Filed: December 19, 2014
    Publication date: April 16, 2015
    Inventor: Benjamin J. Sugden
  • Patent number: 8937663
    Abstract: Camera and sensor augmented reality techniques are described. In one or more implementations, an optical basis is obtained that was generated from data obtained by a camera of a computing device and a sensor basis is obtained that was generated from data obtained from one or more sensors that are not a camera. The optical basis and the sensor basis describe a likely orientation or position of the camera and the one or more sensors, respectively, in a physical environment. The optical basis and the sensor basis are compared to verify the orientation or the position of the computing device in the physical environment.
    Type: Grant
    Filed: April 1, 2011
    Date of Patent: January 20, 2015
    Assignee: Microsoft Corporation
    Inventor: Benjamin J. Sugden
  • Patent number: 8817046
    Abstract: Color channel optical marker techniques are described. In one or more implementations, a plurality of color channels obtained from a camera are examined, each of the color channels depicting an optical marker having a different scale than another optical marker depicted in another one of the color channels. At least one optical marker is identified in a respective one of the plurality of color channels and an optical basis is computed using the identified optical marker usable to describe at least a position or orientation of a part of a computing device. (A short illustrative sketch follows this entry.)
    Type: Grant
    Filed: April 21, 2011
    Date of Patent: August 26, 2014
    Assignee: Microsoft Corporation
    Inventors: Benjamin J. Sugden, Thomas G. Salter
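
Patent 8817046 above describes optical markers carried at different scales in different color channels of the same camera image. The toy sketch below splits an RGB image into channels, runs a placeholder brightness-based detector on each, and reports the scale associated with the channel where a marker is found; the detector, the scale table, and the synthetic image are all assumptions rather than the patented technique.

```python
import numpy as np

CHANNEL_SCALES = {"red": 1.0, "green": 0.5, "blue": 0.25}    # assumed marker scales

def find_marker(channel, threshold=200):
    """Toy detector: centroid of bright pixels in one channel, or None if absent."""
    ys, xs = np.nonzero(channel > threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def detect_in_color_channels(rgb):
    """Check each color channel; report the first marker found plus its assumed scale."""
    for index, name in enumerate(("red", "green", "blue")):
        hit = find_marker(rgb[:, :, index])
        if hit is not None:
            return name, CHANNEL_SCALES[name], hit
    return None

# Synthetic image whose only bright patch sits in the green channel.
image = np.zeros((64, 64, 3), dtype=np.uint8)
image[30:34, 40:44, 1] = 255
print(detect_in_color_channels(image))    # ('green', 0.5, (41.5, 31.5))
```
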
  • Publication number: 20120268491
    Abstract: Color channel optical marker techniques are described. In one or more implementations, a plurality of color channels obtained from a camera are examined, each of the color channels depicting an optical marker having a different scale than another optical marker depicted in another one of the color channels. At least one optical marker is identified in a respective one of the plurality of color channels and an optical basis is computed using the identified optical marker usable to describe at least a position or orientation of a part of a computing device.
    Type: Application
    Filed: April 21, 2011
    Publication date: October 25, 2012
    Applicant: Microsoft Corporation
    Inventors: Benjamin J. Sugden, Thomas G. Salter
  • Publication number: 20120268490
    Abstract: Augmented reality extrapolation techniques are described. In one or more implementations, an augmented-reality display is rendered based at least in part on a first basis that describes a likely orientation or position of at least a part of a computing device. The rendered augmented-reality display is updated based at least in part on data that describes a likely orientation or position of the part of the computing device that was assumed during the rendering of the augmented-reality display.
    Type: Application
    Filed: April 20, 2011
    Publication date: October 25, 2012
    Applicant: Microsoft Corporation
    Inventor: Benjamin J. Sugden
  • Publication number: 20120249807
    Abstract: Camera and sensor augmented reality techniques are described. In one or more implementations, an optical basis is obtained that was generated from data obtained by a camera of a computing device and a sensor basis is obtained that was generated from data obtained from one or more sensors that are not a camera. The optical basis and the sensor basis describe a likely orientation or position of the camera and the one or more sensors, respectively, in a physical environment. The optical basis and the sensor basis are compared to verify the orientation or the position of the computing device in the physical environment.
    Type: Application
    Filed: April 1, 2011
    Publication date: October 4, 2012
    Applicant: Microsoft Corporation
    Inventor: Benjamin J. Sugden