Patents by Inventor Marcus Ghaly
Marcus Ghaly has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10482663
Abstract: A method includes determining a current pose of an augmented reality device in a physical space, and visually presenting, via a display of the augmented reality device, an augmented-reality view of the physical space including a predetermined pose cue indicating a predetermined pose in the physical space and a current pose cue indicating the current pose in the physical space.
Type: Grant
Filed: October 27, 2016
Date of Patent: November 19, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Marcus Ghaly, Andrew Jackson, Jeff Smith, Michael Scavezze, Ronald Amador-Leon, Cameron Brown, Charlene Jeune
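The pose-cue idea in this abstract can be illustrated with a minimal sketch: compare the device's current pose against the predetermined pose and decide which cue state to present. All function and parameter names here are hypothetical, not from the patent.

```python
import math

def pose_distance(current, target):
    """Euclidean distance between two (x, y, z) positions."""
    return math.sqrt(sum((c - t) ** 2 for c, t in zip(current, target)))

def pose_cue_state(current_pos, target_pos, tolerance=0.05):
    """Return 'aligned' when the current pose is within tolerance (meters)
    of the predetermined pose; otherwise 'guide', meaning both the current
    pose cue and the predetermined pose cue should stay visible."""
    return "aligned" if pose_distance(current_pos, target_pos) <= tolerance else "guide"

print(pose_cue_state((0.0, 1.0, 0.0), (0.0, 1.02, 0.0)))  # within 5 cm -> aligned
```

A real device would compare full 6-DoF poses (position plus orientation), but the same threshold test applies to each component.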
-
Patent number: 10268266
Abstract: A user may select or interact with objects in a scene using gaze tracking and movement tracking. In some examples, the scene may comprise a virtual reality scene or a mixed reality scene. A user may move an input object in an environment while facing in the direction of the movement of the input object. A computing device may use sensors to obtain movement data corresponding to the movement of the input object, and gaze tracking data corresponding to a location of the eyes of the user. One or more modules of the computing device may use the movement data and gaze tracking data to determine a three-dimensional selection space in the scene. In some examples, objects included in the three-dimensional selection space may be selected or otherwise interacted with.
Type: Grant
Filed: June 29, 2016
Date of Patent: April 23, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Cheyne Rory Quin Mathey-Owens, Arthur Tomlin, Marcus Ghaly
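One way to picture a gaze-derived selection space is as a sphere centered where the gaze ray reaches the depth of the moved input object; objects inside the sphere are selected. This is a hedged sketch of the general idea only, with all names and the sphere shape chosen for illustration.

```python
def normalize(v):
    """Scale a 3-vector to unit length."""
    n = sum(x * x for x in v) ** 0.5
    return tuple(x / n for x in v)

def selection_center(gaze_origin, gaze_dir, depth):
    """Project the gaze ray out to the depth of the moved input object."""
    d = normalize(gaze_dir)
    return tuple(o + depth * x for o, x in zip(gaze_origin, d))

def select_objects(objects, center, radius):
    """Return names of objects whose position lies inside the selection sphere."""
    def inside(pos):
        return sum((p - c) ** 2 for p, c in zip(pos, center)) <= radius ** 2
    return [name for name, pos in objects.items() if inside(pos)]

center = selection_center((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), depth=2.0)
scene = {"cube": (0.1, 0.0, 2.0), "sphere": (3.0, 0.0, 2.0)}
print(select_objects(scene, center, radius=0.5))  # ['cube']
```

The patent's selection space is three-dimensional and shaped by both movement and gaze data; a cone or frustum along the gaze ray would work the same way with a different `inside` test.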
-
Patent number: 10235807
Abstract: A system and method are disclosed for building virtual content from within a virtual environment using virtual tools to build and modify the virtual content.
Type: Grant
Filed: January 20, 2015
Date of Patent: March 19, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michael Thomas, Jonathan Paulovich, Adam G. Poulos, Omer Bilal Orhan, Marcus Ghaly, Cameron G. Brown, Nicholas Gervase Fajt, Matthew Kaplan
-
Patent number: 10068376
Abstract: Examples are disclosed herein that relate to the display of mixed reality imagery. One example provides a mixed reality computing device comprising an image sensor, a display device, a storage device comprising instructions, and a processor. The instructions are executable to receive an image of a physical environment, store the image, render a three-dimensional virtual model, form a mixed reality thumbnail image by compositing a view of the three-dimensional virtual model and the image, and display the mixed reality thumbnail image. The instructions are further executable to receive a user input updating the three-dimensional virtual model, render the updated three-dimensional virtual model, update the mixed reality thumbnail image by compositing a view of the updated three-dimensional virtual model and the image of the physical environment, and display the mixed reality thumbnail image including the updated three-dimensional virtual model composited with the image of the physical environment.
Type: Grant
Filed: January 11, 2016
Date of Patent: September 4, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Jeff Smith, Cameron Graeme Brown, Marcus Ghaly, Andrei A. Borodin, Jonathan Gill, Cheyne Rory Quin Mathey-Owens, Andrew Jackson
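The compositing step this abstract describes can be sketched as a standard per-pixel "over" blend of the rendered model view (which carries alpha) onto the captured photo. The flat pixel-list layout and function name below are illustrative assumptions, not the patent's implementation.

```python
def composite_thumbnail(photo, render):
    """Alpha-blend each rendered (r, g, b, a) pixel over the matching
    (r, g, b) photo pixel: out = render * a + photo * (1 - a)."""
    out = []
    for (pr, pg, pb), (rr, rg, rb, a) in zip(photo, render):
        out.append((
            round(rr * a + pr * (1 - a)),
            round(rg * a + pg * (1 - a)),
            round(rb * a + pb * (1 - a)),
        ))
    return out

photo = [(100, 100, 100), (100, 100, 100)]
render = [(200, 0, 0, 1.0), (0, 0, 0, 0.0)]  # opaque model pixel, empty pixel
print(composite_thumbnail(photo, render))  # [(200, 0, 0), (100, 100, 100)]
```

Re-running the blend after the model is edited and re-rendered, against the same stored photo, gives the updated thumbnail the abstract describes.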
-
Patent number: 9952656
Abstract: Disclosed are a method and corresponding apparatus to enable a user of a display system to manipulate holographic objects. Multiple holographic user interface objects capable of being independently manipulated by a user are displayed to the user, overlaid on a real-world view of a 3D physical space in which the user is located. In response to a first user action, the holographic user interface objects are made to appear to be combined into a holographic container object that appears at a first location in the 3D physical space. In response to the first user action or a second user action, the holographic container object is made to appear to relocate to a second location in the 3D physical space. The holographic user interface objects are then made to appear to deploy from the holographic container object when the holographic container object appears to be located at the second location.
Type: Grant
Filed: August 21, 2015
Date of Patent: April 24, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Aaron Daniel Krauss, Marcus Ghaly, Michael Thomas, Jonathan Paulovich, Daniel Joseph McCulloch
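The gather/relocate/deploy flow in this abstract can be sketched as simple offset bookkeeping: when UI objects collapse into the container, record each object's offset from the container; when the container redeploys at a new location, re-expand the objects at those offsets. Names and data shapes here are hypothetical.

```python
def gather(objects, container_pos):
    """Collapse objects into a container, recording each object's offset
    from the container's position."""
    return {name: tuple(p - c for p, c in zip(pos, container_pos))
            for name, pos in objects.items()}

def deploy(offsets, container_pos):
    """Re-expand objects around the container's new location, preserving
    their original spatial arrangement."""
    return {name: tuple(c + o for c, o in zip(container_pos, off))
            for name, off in offsets.items()}

offsets = gather({"menu": (1.0, 2.0, 0.0)}, container_pos=(0.0, 2.0, 0.0))
print(deploy(offsets, container_pos=(5.0, 1.0, 0.0)))  # {'menu': (6.0, 1.0, 0.0)}
```

This preserves the layout of the holographic UI while letting the user carry it, as a single container object, to a new spot in the room.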
-
Publication number: 20180004283
Abstract: A user may select or interact with objects in a scene using gaze tracking and movement tracking. In some examples, the scene may comprise a virtual reality scene or a mixed reality scene. A user may move an input object in an environment while facing in the direction of the movement of the input object. A computing device may use sensors to obtain movement data corresponding to the movement of the input object, and gaze tracking data corresponding to a location of the eyes of the user. One or more modules of the computing device may use the movement data and gaze tracking data to determine a three-dimensional selection space in the scene. In some examples, objects included in the three-dimensional selection space may be selected or otherwise interacted with.
Type: Application
Filed: June 29, 2016
Publication date: January 4, 2018
Inventors: Cheyne Rory Quin Mathey-Owens, Arthur Tomlin, Marcus Ghaly
-
Publication number: 20170287221
Abstract: A method includes determining a current pose of an augmented reality device in a physical space, and visually presenting, via a display of the augmented reality device, an augmented-reality view of the physical space including a predetermined pose cue indicating a predetermined pose in the physical space and a current pose cue indicating the current pose in the physical space.
Type: Application
Filed: October 27, 2016
Publication date: October 5, 2017
Inventors: Marcus Ghaly, Andrew Jackson, Jeff Smith, Michael Scavezze, Ronald Amador-Leon, Cameron Brown, Charlene Jeune
-
Patent number: 9778814
Abstract: Disclosed is a method, implemented in a visualization device, to assist a user in placing 3D objects. In certain embodiments the method includes displaying, on a display area of the visualization device, to a user, various virtual 3D objects overlaid on a real-world view of a 3D physical space. The method can further include a holding function, in which a first object, of the various virtual 3D objects, is displayed on the display area so that it appears to move through the 3D physical space in response to input from the user, which may be merely a change in the user's gaze direction. A second object is then identified as a target object for a snap function, based on the detected gaze of the user, the snap function being an operation that causes the first object to move to a location on a surface of the target object.
Type: Grant
Filed: January 30, 2015
Date of Patent: October 3, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Anthony Ambrus, Marcus Ghaly, Adam Poulos, Michael Thomas, Jon Paulovich
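A snap function like the one described can be approximated by moving the held object to the nearest point on the gaze-targeted object's axis-aligned bounding box, i.e. clamping its position to the box. This is one plausible geometric realization, not the patent's method; all names are illustrative.

```python
def snap_to_box(point, box_min, box_max):
    """Clamp the held object's position to the target's axis-aligned
    bounding box; for a point outside the box, this is the nearest
    point on the box's surface."""
    return tuple(min(max(p, lo), hi) for p, lo, hi in zip(point, box_min, box_max))

held = (2.0, 5.0, 0.0)
target_min, target_max = (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)
print(snap_to_box(held, target_min, target_max))  # (1.0, 1.0, 0.0)
```

In practice the target surface would come from the scene mesh rather than a bounding box, but the clamp captures the "move to a location on a surface of the target object" behavior.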
-
Publication number: 20170200312
Abstract: Examples are disclosed herein that relate to the display of mixed reality imagery. One example provides a mixed reality computing device comprising an image sensor, a display device, a storage device comprising instructions, and a processor. The instructions are executable to receive an image of a physical environment, store the image, render a three-dimensional virtual model, form a mixed reality thumbnail image by compositing a view of the three-dimensional virtual model and the image, and display the mixed reality thumbnail image. The instructions are further executable to receive a user input updating the three-dimensional virtual model, render the updated three-dimensional virtual model, update the mixed reality thumbnail image by compositing a view of the updated three-dimensional virtual model and the image of the physical environment, and display the mixed reality thumbnail image including the updated three-dimensional virtual model composited with the image of the physical environment.
Type: Application
Filed: January 11, 2016
Publication date: July 13, 2017
Inventors: Jeff Smith, Cameron Graeme Brown, Marcus Ghaly, Andrei A. Borodin, Jonathan Gill, Cheyne Rory Quin Mathey-Owens, Andrew Jackson
-
Publication number: 20170052507
Abstract: Disclosed are a method and corresponding apparatus to enable a user of a display system to manipulate holographic objects. Multiple holographic user interface objects capable of being independently manipulated by a user are displayed to the user, overlaid on a real-world view of a 3D physical space in which the user is located. In response to a first user action, the holographic user interface objects are made to appear to be combined into a holographic container object that appears at a first location in the 3D physical space. In response to the first user action or a second user action, the holographic container object is made to appear to relocate to a second location in the 3D physical space. The holographic user interface objects are then made to appear to deploy from the holographic container object when the holographic container object appears to be located at the second location.
Type: Application
Filed: August 21, 2015
Publication date: February 23, 2017
Inventors: Adam Gabriel Poulos, Cameron Graeme Brown, Aaron Daniel Krauss, Marcus Ghaly, Michael Thomas, Jonathan Paulovich, Daniel Joseph McCulloch
-
Publication number: 20160210781
Abstract: A system and method are disclosed for building virtual content from within a virtual environment using virtual tools to build and modify the virtual content.
Type: Application
Filed: January 20, 2015
Publication date: July 21, 2016
Inventors: Michael Thomas, Jonathan Paulovich, Adam G. Poulos, Omer Bilal Orhan, Marcus Ghaly, Cameron G. Brown, Nicholas Gervase Fajt, Matthew Kaplan
-
Publication number: 20160179336
Abstract: Disclosed is a method, implemented in a visualization device, to assist a user in placing 3D objects. In certain embodiments the method includes displaying, on a display area of the visualization device, to a user, various virtual 3D objects overlaid on a real-world view of a 3D physical space. The method can further include a holding function, in which a first object, of the various virtual 3D objects, is displayed on the display area so that it appears to move through the 3D physical space in response to input from the user, which may be merely a change in the user's gaze direction. A second object is then identified as a target object for a snap function, based on the detected gaze of the user, the snap function being an operation that causes the first object to move to a location on a surface of the target object.
Type: Application
Filed: January 30, 2015
Publication date: June 23, 2016
Inventors: Anthony Ambrus, Marcus Ghaly, Adam Poulos, Michael Thomas, Jon Paulovich