Patents by Inventor Marcus Ghaly

Marcus Ghaly is a named inventor on the patent filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10482663
    Abstract: A method includes determining a current pose of an augmented reality device in a physical space, and visually presenting, via a display of the augmented reality device, an augmented-reality view of the physical space including a predetermined pose cue indicating a predetermined pose in the physical space and a current pose cue indicating the current pose in the physical space.
    Type: Grant
    Filed: October 27, 2016
    Date of Patent: November 19, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Marcus Ghaly, Andrew Jackson, Jeff Smith, Michael Scavezze, Ronald Amador-Leon, Cameron Brown, Charlene Jeune
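As a rough illustration of the pose-cue idea in this abstract (not the patented implementation; all names are hypothetical and the geometry is simplified to a 2D position plus heading), the device can compare the current pose against the predetermined pose and turn the difference into a guidance cue:

```python
import math

def pose_error(current, target):
    """Distance and heading difference between a current pose and a
    predetermined target pose, each given as (x, y, heading_radians)."""
    dx, dy = target[0] - current[0], target[1] - current[1]
    distance = math.hypot(dx, dy)
    # Wrap the heading difference into [-pi, pi].
    dtheta = (target[2] - current[2] + math.pi) % (2 * math.pi) - math.pi
    return distance, dtheta

def pose_cue_text(current, target, near=0.1, aligned=0.1):
    """Return a simple textual cue guiding the user toward the target pose."""
    distance, dtheta = pose_error(current, target)
    if distance < near and abs(dtheta) < aligned:
        return "aligned"
    turn = "left" if dtheta > 0 else "right"
    return f"move {distance:.2f} m, turn {turn} {abs(dtheta):.2f} rad"
```

In an actual AR device the cue would be rendered as overlaid graphics in the augmented-reality view rather than text, but the underlying comparison of current pose to predetermined pose is the same.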
  • Patent number: 10268266
    Abstract: A user may select or interact with objects in a scene using gaze tracking and movement tracking. In some examples, the scene may comprise a virtual reality scene or a mixed reality scene. A user may move an input object in an environment while facing in the direction of the input object's movement. A computing device may use sensors to obtain movement data corresponding to the movement of the input object, and gaze tracking data indicating a location of the user's eyes. One or more modules of the computing device may use the movement data and gaze tracking data to determine a three-dimensional selection space in the scene. In some examples, objects included in the three-dimensional selection space may be selected or otherwise interacted with.
    Type: Grant
    Filed: June 29, 2016
    Date of Patent: April 23, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Cheyne Rory Quin Mathey-Owens, Arthur Tomlin, Marcus Ghaly
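A minimal sketch of the selection idea in this abstract, under the assumption that the "three-dimensional selection space" is modeled as a cone whose axis blends the gaze direction with the input object's movement direction (the names and the blending weight are illustrative, not from the patent):

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def selection_direction(gaze_dir, move_dir, gaze_weight=0.5):
    """Blend gaze and input-object movement directions into one axis."""
    blended = tuple(gaze_weight * g + (1 - gaze_weight) * m
                    for g, m in zip(normalize(gaze_dir), normalize(move_dir)))
    return normalize(blended)

def in_selection_cone(obj_pos, origin, axis, half_angle_rad):
    """True if obj_pos lies inside the cone from origin along axis."""
    to_obj = normalize(tuple(o - s for o, s in zip(obj_pos, origin)))
    cos_angle = sum(a * b for a, b in zip(axis, to_obj))
    return cos_angle >= math.cos(half_angle_rad)

def select_objects(objects, origin, gaze_dir, move_dir, half_angle_rad=0.3):
    """Return the names of scene objects inside the selection space."""
    axis = selection_direction(gaze_dir, move_dir)
    return [name for name, pos in objects.items()
            if in_selection_cone(pos, origin, axis, half_angle_rad)]
```

For example, with the user at the origin gazing and moving an input object along +z, an object at (0, 0, 5) falls inside the cone while one at (5, 0, 0) does not.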
  • Patent number: 10235807
    Abstract: A system and method are disclosed for building virtual content from within a virtual environment using virtual tools to build and modify the virtual content.
    Type: Grant
    Filed: January 20, 2015
    Date of Patent: March 19, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Thomas, Jonathan Paulovich, Adam G. Poulos, Omer Bilal Orhan, Marcus Ghaly, Cameron G. Brown, Nicholas Gervase Fajt, Matthew Kaplan
  • Patent number: 10068376
    Abstract: Examples are disclosed herein that relate to the display of mixed reality imagery. One example provides a mixed reality computing device comprising an image sensor, a display device, a storage device comprising instructions, and a processor. The instructions are executable to receive an image of a physical environment, store the image, render a three-dimensional virtual model, form a mixed reality thumbnail image by compositing a view of the three-dimensional virtual model and the image, and display the mixed reality thumbnail image. The instructions are further executable to receive a user input updating the three-dimensional virtual model, render the updated three-dimensional virtual model, update the mixed reality thumbnail image by compositing a view of the updated three-dimensional virtual model and the image of the physical environment, and display the mixed reality thumbnail image including the updated three-dimensional virtual model composited with the image of the physical environment.
    Type: Grant
    Filed: January 11, 2016
    Date of Patent: September 4, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jeff Smith, Cameron Graeme Brown, Marcus Ghaly, Andrei A. Borodin, Jonathan Gill, Cheyne Rory Quin Mathey-Owens, Andrew Jackson
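The compositing step this abstract describes, forming a mixed reality thumbnail by layering a rendered view of the virtual model over the captured image of the physical environment, can be sketched as straightforward alpha blending (a simplified illustration using nested lists of RGBA pixels; a real device would do this on the GPU):

```python
def composite_thumbnail(background, foreground):
    """Alpha-composite a rendered model view over a captured image.
    Images are nested lists of (r, g, b, a) tuples with channels in 0..255.
    Where the foreground render is transparent, the captured image shows
    through; where it is opaque, the virtual model covers it."""
    out = []
    for bg_row, fg_row in zip(background, foreground):
        row = []
        for bg, fg in zip(bg_row, fg_row):
            a = fg[3] / 255.0
            blended = tuple(round(fg[c] * a + bg[c] * (1 - a)) for c in range(3))
            row.append(blended + (255,))
        out.append(row)
    return out
```

Updating the thumbnail after a user edit then amounts to re-rendering the virtual model and compositing the new render over the same stored photograph.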
  • Patent number: 9952656
    Abstract: Disclosed are a method and corresponding apparatus to enable a user of a display system to manipulate holographic objects. Multiple holographic user interface objects capable of being independently manipulated by a user are displayed to the user, overlaid on a real-world view of a 3D physical space in which the user is located. In response to a first user action, the holographic user interface objects are made to appear to be combined into a holographic container object that appears at a first location in the 3D physical space. In response to the first user action or a second user action, the holographic container object is made to appear to relocate to a second location in the 3D physical space. The holographic user interface objects are then made to appear to deploy from the holographic container object when the holographic container object appears to be located at the second location.
    Type: Grant
    Filed: August 21, 2015
    Date of Patent: April 24, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Aaron Daniel Krauss, Marcus Ghaly, Michael Thomas, Jonathan Paulovich, Daniel Joseph McCulloch
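The gather/relocate/deploy flow in this abstract can be sketched as a small state holder that records each UI object's offset from the container when gathered, then restores those offsets at the new location on deployment (class and method names are hypothetical, not from the patent):

```python
class HolographicContainer:
    """Minimal sketch: UI objects collapse into a container, the
    container relocates, and the objects redeploy at positions whose
    offsets relative to the container are preserved."""

    def __init__(self, objects):
        # objects: {name: (x, y, z)} world positions of UI objects.
        self.objects = dict(objects)
        self.position = None
        self.offsets = {}

    def gather(self, container_pos):
        """Combine the objects into a container at container_pos."""
        self.position = container_pos
        self.offsets = {
            name: tuple(p - c for p, c in zip(pos, container_pos))
            for name, pos in self.objects.items()
        }

    def relocate(self, new_pos):
        """Move the container to a second location."""
        self.position = new_pos

    def deploy(self):
        """Redeploy the objects around the container's new location."""
        self.objects = {
            name: tuple(c + o for c, o in zip(self.position, off))
            for name, off in self.offsets.items()
        }
        return self.objects
```

Preserving relative offsets is one simple way to make the deployed layout at the second location mirror the original arrangement.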
  • Publication number: 20180004283
    Abstract: A user may select or interact with objects in a scene using gaze tracking and movement tracking. In some examples, the scene may comprise a virtual reality scene or a mixed reality scene. A user may move an input object in an environment while facing in the direction of the input object's movement. A computing device may use sensors to obtain movement data corresponding to the movement of the input object, and gaze tracking data indicating a location of the user's eyes. One or more modules of the computing device may use the movement data and gaze tracking data to determine a three-dimensional selection space in the scene. In some examples, objects included in the three-dimensional selection space may be selected or otherwise interacted with.
    Type: Application
    Filed: June 29, 2016
    Publication date: January 4, 2018
    Inventors: Cheyne Rory Quin Mathey-Owens, Arthur Tomlin, Marcus Ghaly
  • Publication number: 20170287221
    Abstract: A method includes determining a current pose of an augmented reality device in a physical space, and visually presenting, via a display of the augmented reality device, an augmented-reality view of the physical space including a predetermined pose cue indicating a predetermined pose in the physical space and a current pose cue indicating the current pose in the physical space.
    Type: Application
    Filed: October 27, 2016
    Publication date: October 5, 2017
    Inventors: Marcus Ghaly, Andrew Jackson, Jeff Smith, Michael Scavezze, Ronald Amador-Leon, Cameron Brown, Charlene Jeune
  • Patent number: 9778814
    Abstract: Disclosed is a method, implemented in a visualization device, to assist a user in placing 3D objects. In certain embodiments the method includes displaying, on a display area of the visualization device, various virtual 3D objects overlaid on a real-world view of a 3D physical space. The method can further include a holding function, in which a first object, of the various virtual 3D objects, is displayed on the display area so that it appears to move through the 3D physical space in response to input from the user, which may be merely a change in the user's gaze direction. A second object is then identified as a target object for a snap function, based on the detected gaze of the user, the snap function being an operation that causes the first object to move to a location on a surface of the target object.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: October 3, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Anthony Ambrus, Marcus Ghaly, Adam Poulos, Michael Thomas, Jon Paulovich
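The gaze-driven snap described in this abstract can be sketched in two steps: a gaze ray test to identify the target object, then a clamp that moves the held object onto the target's surface. This assumes targets are axis-aligned boxes for simplicity; all function names are illustrative, not from the patent:

```python
def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does the gaze ray intersect an axis-aligned box?"""
    t_near, t_far = -float("inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:
            # Ray parallel to this slab: must already lie within it.
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far and t_far >= 0

def snap_to_target(obj_pos, box_min, box_max):
    """Move the held object to the nearest point on the target box;
    for a position outside the box this lands on its surface."""
    return tuple(min(max(p, lo), hi)
                 for p, lo, hi in zip(obj_pos, box_min, box_max))
```

A host application would run `ray_hits_aabb` from the user's eye position along the gaze direction to pick the target, then call `snap_to_target` on the held object's position.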
  • Publication number: 20170200312
    Abstract: Examples are disclosed herein that relate to the display of mixed reality imagery. One example provides a mixed reality computing device comprising an image sensor, a display device, a storage device comprising instructions, and a processor. The instructions are executable to receive an image of a physical environment, store the image, render a three-dimensional virtual model, form a mixed reality thumbnail image by compositing a view of the three-dimensional virtual model and the image, and display the mixed reality thumbnail image. The instructions are further executable to receive a user input updating the three-dimensional virtual model, render the updated three-dimensional virtual model, update the mixed reality thumbnail image by compositing a view of the updated three-dimensional virtual model and the image of the physical environment, and display the mixed reality thumbnail image including the updated three-dimensional virtual model composited with the image of the physical environment.
    Type: Application
    Filed: January 11, 2016
    Publication date: July 13, 2017
    Inventors: Jeff Smith, Cameron Graeme Brown, Marcus Ghaly, Andrei A. Borodin, Jonathan Gill, Cheyne Rory Quin Mathey-Owens, Andrew Jackson
  • Publication number: 20170052507
    Abstract: Disclosed are a method and corresponding apparatus to enable a user of a display system to manipulate holographic objects. Multiple holographic user interface objects capable of being independently manipulated by a user are displayed to the user, overlaid on a real-world view of a 3D physical space in which the user is located. In response to a first user action, the holographic user interface objects are made to appear to be combined into a holographic container object that appears at a first location in the 3D physical space. In response to the first user action or a second user action, the holographic container object is made to appear to relocate to a second location in the 3D physical space. The holographic user interface objects are then made to appear to deploy from the holographic container object when the holographic container object appears to be located at the second location.
    Type: Application
    Filed: August 21, 2015
    Publication date: February 23, 2017
    Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Aaron Daniel Krauss, Marcus Ghaly, Michael Thomas, Jonathan Paulovich, Daniel Joseph McCulloch
  • Publication number: 20160210781
    Abstract: A system and method are disclosed for building virtual content from within a virtual environment using virtual tools to build and modify the virtual content.
    Type: Application
    Filed: January 20, 2015
    Publication date: July 21, 2016
    Inventors: Michael Thomas, Jonathan Paulovich, Adam G. Poulos, Omer Bilal Orhan, Marcus Ghaly, Cameron G. Brown, Nicholas Gervase Fajt, Matthew Kaplan
  • Publication number: 20160179336
    Abstract: Disclosed is a method, implemented in a visualization device, to assist a user in placing 3D objects. In certain embodiments the method includes displaying, on a display area of the visualization device, various virtual 3D objects overlaid on a real-world view of a 3D physical space. The method can further include a holding function, in which a first object, of the various virtual 3D objects, is displayed on the display area so that it appears to move through the 3D physical space in response to input from the user, which may be merely a change in the user's gaze direction. A second object is then identified as a target object for a snap function, based on the detected gaze of the user, the snap function being an operation that causes the first object to move to a location on a surface of the target object.
    Type: Application
    Filed: January 30, 2015
    Publication date: June 23, 2016
    Inventors: Anthony Ambrus, Marcus Ghaly, Adam Poulos, Michael Thomas, Jon Paulovich