Patents by Inventor Cameron Graeme Brown

Cameron Graeme Brown has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Short, illustrative code sketches of the techniques described in these abstracts follow the listing.

  • Patent number: 10134192
    Abstract: Methods and systems for displaying a computer-generated image corresponding to the pose of a real-world object in a mixed reality system. The system may include a head-mounted display (HMD) device, a magnetic tracking system, and an optical tracking system. Pose data detected by the two tracking systems can be synchronized by a timestamp embedded in an electromagnetic field transmitted by the magnetic tracking system. A processor may also be configured to calculate a future pose of the real-world object based on data from the two tracking systems and on a time offset corresponding to the time the HMD needs to calculate, buffer, and generate display output, such that the relative location of the computer-generated image (CGI) corresponds with the actual location of the real-world object relative to the real-world environment at the time the CGI actually appears in the display.
    Type: Grant
    Filed: October 17, 2016
    Date of Patent: November 20, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Arthur Tomlin, Adam Gabriel Poulos, Cameron Graeme Brown
  • Publication number: 20180315250
    Abstract: Motion and/or rotation of an input mechanism can be tracked and/or analyzed to determine limits on a user's range of motion and/or a user's range of rotation in three-dimensional space. The user's range of motion and/or the user's range of rotation in three-dimensional space may be limited by a personal restriction for the user (e.g., a broken arm). The user's range of motion and/or the user's range of rotation in three-dimensional space may additionally or alternatively be limited by an environmental restriction (e.g., a physical object in a room). Accordingly, the techniques described herein can take steps to accommodate the personal restriction and/or the environmental restriction, thereby optimizing user interactions involving the input mechanism and a virtual object.
    Type: Application
    Filed: July 5, 2018
    Publication date: November 1, 2018
    Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Andrew Austin Jackson, Cheyne Rory Quin Mathey-Owens, Michael Robert Thomas, Arthur Tomlin
  • Patent number: 10068376
    Abstract: Examples are disclosed herein that relate to the display of mixed reality imagery. One example provides a mixed reality computing device comprising an image sensor, a display device, a storage device comprising instructions, and a processor. The instructions are executable to receive an image of a physical environment, store the image, render a three-dimensional virtual model, form a mixed reality thumbnail image by compositing a view of the three-dimensional virtual model and the image, and display the mixed reality thumbnail image. The instructions are further executable to receive a user input updating the three-dimensional virtual model, render the updated three-dimensional virtual model, update the mixed reality thumbnail image by compositing a view of the updated three-dimensional virtual model and the image of the physical environment, and display the mixed reality thumbnail image including the updated three-dimensional virtual model composited with the image of the physical environment.
    Type: Grant
    Filed: January 11, 2016
    Date of Patent: September 4, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jeff Smith, Cameron Graeme Brown, Marcus Ghaly, Andrei A. Borodin, Jonathan Gill, Cheyne Rory Quin Mathey-Owens, Andrew Jackson
  • Patent number: 10037626
    Abstract: Motion and/or rotation of an input mechanism can be tracked and/or analyzed to determine limits on a user's range of motion and/or a user's range of rotation in three-dimensional space. The user's range of motion and/or the user's range of rotation in three-dimensional space may be limited by a personal restriction for the user (e.g., a broken arm). The user's range of motion and/or the user's range of rotation in three-dimensional space may additionally or alternatively be limited by an environmental restriction (e.g., a physical object in a room). Accordingly, the techniques described herein can take steps to accommodate the personal restriction and/or the environmental restriction, thereby optimizing user interactions involving the input mechanism and a virtual object.
    Type: Grant
    Filed: June 30, 2016
    Date of Patent: July 31, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Andrew Austin Jackson, Cheyne Rory Quin Mathey-Owens, Michael Robert Thomas, Arthur Tomlin
  • Patent number: 9952656
    Abstract: Disclosed are a method and corresponding apparatus to enable a user of a display system to manipulate holographic objects. Multiple holographic user interface objects capable of being independently manipulated by a user are displayed to the user, overlaid on a real-world view of a 3D physical space in which the user is located. In response to a first user action, the holographic user interface objects are made to appear to be combined into a holographic container object that appears at a first location in the 3D physical space. In response to the first user action or a second user action, the holographic container object is made to appear to relocate to a second location in the 3D physical space. The holographic user interface objects are then made to appear to deploy from the holographic container object when the holographic container object appears to be located at the second location.
    Type: Grant
    Filed: August 21, 2015
    Date of Patent: April 24, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Aaron Daniel Krauss, Marcus Ghaly, Michael Thomas, Jonathan Paulovich, Daniel Joseph McCulloch
  • Publication number: 20180108179
    Abstract: Methods and systems for displaying a computer-generated image corresponding to the pose of a real-world object in a mixed reality system. The system may include a head-mounted display (HMD) device, a magnetic tracking system, and an optical tracking system. Pose data detected by the two tracking systems can be synchronized by a timestamp embedded in an electromagnetic field transmitted by the magnetic tracking system. A processor may also be configured to calculate a future pose of the real-world object based on data from the two tracking systems and on a time offset corresponding to the time the HMD needs to calculate, buffer, and generate display output, such that the relative location of the computer-generated image (CGI) corresponds with the actual location of the real-world object relative to the real-world environment at the time the CGI actually appears in the display.
    Type: Application
    Filed: October 17, 2016
    Publication date: April 19, 2018
    Inventors: Arthur Tomlin, Adam Gabriel Poulos, Cameron Graeme Brown
  • Publication number: 20180005443
    Abstract: Motion and/or rotation of an input mechanism can be tracked and/or analyzed to determine limits on a user's range of motion and/or a user's range of rotation in three-dimensional space. The user's range of motion and/or the user's range of rotation in three-dimensional space may be limited by a personal restriction for the user (e.g., a broken arm). The user's range of motion and/or the user's range of rotation in three-dimensional space may additionally or alternatively be limited by an environmental restriction (e.g., a physical object in a room). Accordingly, the techniques described herein can take steps to accommodate the personal restriction and/or the environmental restriction, thereby optimizing user interactions involving the input mechanism and a virtual object.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Andrew Austin Jackson, Cheyne Rory Quin Mathey-Owens, Michael Robert Thomas, Arthur Tomlin
  • Publication number: 20170200312
    Abstract: Examples are disclosed herein that relate to the display of mixed reality imagery. One example provides a mixed reality computing device comprising an image sensor, a display device, a storage device comprising instructions, and a processor. The instructions are executable to receive an image of a physical environment, store the image, render a three-dimensional virtual model, form a mixed reality thumbnail image by compositing a view of the three-dimensional virtual model and the image, and display the mixed reality thumbnail image. The instructions are further executable to receive a user input updating the three-dimensional virtual model, render the updated three-dimensional virtual model, update the mixed reality thumbnail image by compositing a view of the updated three-dimensional virtual model and the image of the physical environment, and display the mixed reality thumbnail image including the updated three-dimensional virtual model composited with the image of the physical environment.
    Type: Application
    Filed: January 11, 2016
    Publication date: July 13, 2017
    Inventors: Jeff Smith, Cameron Graeme Brown, Marcus Ghaly, Andrei A. Borodin, Jonathan Gill, Cheyne Rory Quin Mathey-Owens, Andrew Jackson
  • Publication number: 20170052507
    Abstract: Disclosed are a method and corresponding apparatus to enable a user of a display system to manipulate holographic objects. Multiple holographic user interface objects capable of being independently manipulated by a user are displayed to the user, overlaid on a real-world view of a 3D physical space in which the user is located. In response to a first user action, the holographic user interface objects are made to appear to be combined into a holographic container object that appears at a first location in the 3D physical space. In response to the first user action or a second user action, the holographic container object is made to appear to relocate to a second location in the 3D physical space. The holographic user interface objects are then made to appear to deploy from the holographic container object when the holographic container object appears to be located at the second location.
    Type: Application
    Filed: August 21, 2015
    Publication date: February 23, 2017
    Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Aaron Daniel Krauss, Marcus Ghaly, Michael Thomas, Jonathan Paulovich, Daniel Joseph McCulloch
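
The abstract for patent number 10134192 (and publication 20180108179) describes synchronizing two timestamped pose streams and extrapolating a future pose by the HMD's render latency. Below is a minimal sketch of that idea, assuming a shared clock and a constant-velocity model; the class, function names, and the averaging fusion rule are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class PoseSample:
    timestamp: float   # seconds on the shared clock (embedded in the EM field)
    position: tuple    # (x, y, z) position of the tracked real-world object, metres
    yaw: float         # simplified single-axis orientation, degrees

def fuse(magnetic: PoseSample, optical: PoseSample) -> PoseSample:
    """Average two samples that the shared timestamp says were taken together."""
    return PoseSample(
        timestamp=(magnetic.timestamp + optical.timestamp) / 2,
        position=tuple((m + o) / 2 for m, o in zip(magnetic.position, optical.position)),
        yaw=(magnetic.yaw + optical.yaw) / 2,
    )

def predict(previous: PoseSample, current: PoseSample, render_latency: float) -> PoseSample:
    """Extrapolate the pose render_latency seconds ahead (constant-velocity assumption)."""
    dt = current.timestamp - previous.timestamp
    if dt <= 0:
        return current
    k = render_latency / dt
    position = tuple(c + (c - p) * k for p, c in zip(previous.position, current.position))
    yaw = current.yaw + (current.yaw - previous.yaw) * k
    return PoseSample(current.timestamp + render_latency, position, yaw)

# Where to draw the CGI roughly two frames from now, so it lines up with the
# real-world object when the frame actually reaches the display.
prev     = PoseSample(0.000, (0.000, 0.0, 0.0), 0.0)
magnetic = PoseSample(0.016, (0.010, 0.0, 0.0), 1.0)
optical  = PoseSample(0.016, (0.012, 0.0, 0.0), 1.2)
future = predict(prev, fuse(magnetic, optical), render_latency=0.032)
```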
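
Patent number 10037626 and publications 20180315250 and 20180005443 describe learning the limits of a user's range of motion and keeping virtual-object interactions inside them. A minimal sketch of one way to do that, assuming a stream of 3-D controller positions, follows; the class and the simple bounding-box model are assumptions for illustration only.

```python
class RangeOfMotion:
    """Observed reachable region for a tracked input mechanism."""

    def __init__(self):
        self.low = [float("inf")] * 3
        self.high = [float("-inf")] * 3

    def observe(self, point):
        """Grow the reachable bounding box from a tracked controller position."""
        for i, v in enumerate(point):
            self.low[i] = min(self.low[i], v)
            self.high[i] = max(self.high[i], v)

    def clamp(self, target):
        """Pull a desired interaction point back inside the observed range."""
        return tuple(min(max(v, self.low[i]), self.high[i]) for i, v in enumerate(target))

rom = RangeOfMotion()
for sample in [(0.1, 1.2, 0.4), (0.3, 1.5, 0.6), (-0.2, 1.0, 0.5)]:
    rom.observe(sample)                     # calibration or passive observation
print(rom.clamp((1.0, 2.0, 0.5)))           # -> (0.3, 1.5, 0.5): kept within reach
```

The same clamping accommodates either a personal restriction (a reduced reachable box) or an environmental one (a box trimmed where a physical object blocks motion).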
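
Patent number 10068376 and publication 20170200312 describe storing a captured image of the physical environment and compositing a rendering of the (possibly edited) three-dimensional virtual model over it to form and refresh a mixed reality thumbnail. A dependency-free sketch of the compositing step is below, assuming images are lists of pixel rows (RGB background, RGBA foreground); the representation and names are illustrative, not the patented implementation.

```python
def composite(background, foreground):
    """Alpha-blend a foreground RGBA image over a background RGB image of the same size."""
    blended = []
    for bg_row, fg_row in zip(background, foreground):
        row = []
        for (br, bg, bb), (fr, fgc, fb, fa) in zip(bg_row, fg_row):
            a = fa / 255.0
            row.append((round(fr * a + br * (1 - a)),
                        round(fgc * a + bg * (1 - a)),
                        round(fb * a + bb * (1 - a))))
        blended.append(row)
    return blended

photo = [[(120, 120, 120)] * 2] * 2               # stored capture of the room (reused as-is)
model = [[(255, 0, 0, 128), (0, 0, 0, 0)]] * 2    # rendered view of the virtual model
thumbnail = composite(photo, model)               # recomputed whenever the model is edited
```

Only the rendered model view changes between updates; the stored photo of the environment stays fixed, which is why the thumbnail can be refreshed without recapturing the scene.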
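
Patent number 9952656 and publication 20170052507 describe collapsing holographic user interface objects into a container, relocating the container in the 3D physical space, and then redeploying the objects at the new location. The sketch below models that gather/relocate/deploy sequence with plain positions; the class, the row layout rule, and all names are assumptions made for illustration.

```python
class HolographicContainer:
    def __init__(self, objects):
        self.objects = objects          # {name: (x, y, z) world position}
        self.position = None

    def gather(self, at):
        """First user action: objects appear to collapse into the container."""
        self.position = at
        for name in self.objects:
            self.objects[name] = at

    def relocate(self, to):
        """Container appears to move to a new spot in the physical space."""
        self.position = to
        for name in self.objects:
            self.objects[name] = to

    def deploy(self, spacing=0.3):
        """Objects appear to fan out in a row around the container's new location."""
        x, y, z = self.position
        count = len(self.objects)
        for i, name in enumerate(self.objects):
            self.objects[name] = (x + (i - (count - 1) / 2) * spacing, y, z)
        return self.objects

ui = HolographicContainer({"menu": (0, 1.5, 2), "palette": (1, 1.5, 2), "inspector": (2, 1.5, 2)})
ui.gather(at=(1, 1.5, 2))
ui.relocate(to=(-1, 1.2, 1))
print(ui.deploy())   # holograms spread out around (-1, 1.2, 1)
```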