Patents by Inventor Enrico William Guld
Enrico William Guld has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12182944
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Grant
Filed: July 23, 2021
Date of Patent: December 31, 2024
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
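The abstract above describes choreographing the behaviors of 3D assets into scenes and beats that a presenter steps through while every device renders the same state. A minimal sketch of how such a presentation structure might be modeled is shown below; the type and member names (Presentation, Scene, Beat, AssetBehavior, PresentationController, broadcast) are illustrative assumptions, not the patented implementation.

```typescript
// Illustrative sketch only: a possible data model for a 3D presentation
// organized into scenes and beats, as described in the abstract above.
// All names are assumptions; the patent does not disclose this code.

interface AssetBehavior {
  assetId: string;                               // which 3D asset this behavior applies to
  action: "show" | "hide" | "move" | "animate";  // assumed behavior kinds
  params?: Record<string, unknown>;              // e.g. target position, animation clip
}

interface Beat {
  id: string;
  behaviors: AssetBehavior[];  // behaviors triggered when this beat is reached
}

interface Scene {
  id: string;
  beats: Beat[];               // ordered beats within the scene
}

interface Presentation {
  title: string;
  scenes: Scene[];             // ordered scenes of the 3D presentation
}

// A presenter device could step through beats and broadcast the current
// position so audience devices render the same state in a coordinated way.
class PresentationController {
  private sceneIndex = 0;
  private beatIndex = -1;

  constructor(
    private presentation: Presentation,
    private broadcast: (sceneId: string, beatId: string) => void,
  ) {}

  nextBeat(): void {
    const scene = this.presentation.scenes[this.sceneIndex];
    if (this.beatIndex + 1 < scene.beats.length) {
      this.beatIndex++;
    } else if (this.sceneIndex + 1 < this.presentation.scenes.length) {
      this.sceneIndex++;
      this.beatIndex = 0;
    } else {
      return; // end of presentation
    }
    const current = this.presentation.scenes[this.sceneIndex];
    this.broadcast(current.id, current.beats[this.beatIndex].id);
  }
}
```

In this hypothetical layout, the presenter's `nextBeat` call is the only thing that needs to be synchronized; each receiving device can apply the behaviors of the broadcast beat locally.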
-
Publication number: 20240296635
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Application
Filed: May 10, 2024
Publication date: September 5, 2024
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
-
Patent number: 12002164
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Grant
Filed: July 23, 2021
Date of Patent: June 4, 2024
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
-
Publication number: 20220157026
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Application
Filed: July 23, 2021
Publication date: May 19, 2022
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
-
Patent number: 11087548
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Grant
Filed: September 26, 2019
Date of Patent: August 10, 2021
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
-
Patent number: 11014242
Abstract: Various methods and systems are provided for puppeteering in augmented reality. Generally, an augmented or virtual reality device for each user generates a virtual 3D environment comprising a virtual representation of a physical room and a 3D asset. An author can record a 3D path for a puppeteering animation of the 3D asset using a 3D interface. At the same time, a coordinated rendering of a corresponding 3D image moving along the 3D path is updated among author devices substantially in real-time. Distinct states of the 3D asset can be assigned to different portions of the 3D path, and authors can set behavior parameters to assign template behaviors such as obstacle avoidance, particle effects, path visualizations and physical effects. The behavior parameters are distributed among presenter and audience devices, and a coordinated rendering of an animation of the 3D image corresponding to the puppeteering animation is triggered.
Type: Grant
Filed: January 26, 2018
Date of Patent: May 25, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Jonathon Burnham Cobb, Dean Alan Wadsworth, Weihua Huang
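The abstract above describes recording a timed 3D path for a puppeteering animation and assigning distinct asset states to portions of that path. Below is a minimal, hypothetical sketch of how a recorded path and its per-segment states might be represented and replayed; none of the names (PuppeteerRecording, PathSample, StateRange, samplePath) come from the patent.

```typescript
// Hypothetical sketch of a recorded puppeteering path: timestamped 3D samples
// plus asset states assigned to portions of the path, per the abstract above.

interface Vec3 { x: number; y: number; z: number; }

interface PathSample {
  time: number;      // seconds since recording started
  position: Vec3;    // sampled position of the 3D asset
}

interface StateRange {
  startTime: number; // portion of the path this state covers
  endTime: number;
  state: string;     // e.g. "walking", "flying" -- assumed state names
}

interface PuppeteerRecording {
  assetId: string;
  samples: PathSample[];               // ordered by time
  states: StateRange[];
  behaviors: Record<string, unknown>;  // e.g. obstacle avoidance, particle effects
}

// Linear interpolation of the recorded path at playback time t, so presenter
// and audience devices can render the same animation frame from shared data.
function samplePath(rec: PuppeteerRecording, t: number): Vec3 | null {
  const s = rec.samples;
  if (s.length === 0) return null;
  if (t <= s[0].time) return s[0].position;
  for (let i = 1; i < s.length; i++) {
    if (t <= s[i].time) {
      const a = s[i - 1], b = s[i];
      const u = (t - a.time) / (b.time - a.time);
      return {
        x: a.position.x + u * (b.position.x - a.position.x),
        y: a.position.y + u * (b.position.y - a.position.y),
        z: a.position.z + u * (b.position.z - a.position.z),
      };
    }
  }
  return s[s.length - 1].position;
}
```

Under these assumptions, only the recording and its behavior parameters need to be distributed; each device interpolates locally at its own render rate.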
-
Patent number: 10824294
Abstract: Various methods and systems for implementing three-dimensional resource integration are provided. 3D resource integration includes integration of 3D resources into different types of functionality, such as operating system, file explorer, application, and augmented reality functionality. In operation, an indication to perform an operation with a 3D object is received. One or more 3D resource controls, associated with the operation, are accessed. The 3D resource control is a defined set of instructions on how to integrate 3D resources with 3D objects for generating 3D-based graphical interfaces associated with application features and operating system features. An input based on one or more control elements of the one or more 3D resource controls is received. The input includes the one or more control elements that operate to generate a 3D-based graphical interface for the operation. Based on receiving the input, the operation is executed with the 3D object and the 3D-based graphical interface.
Type: Grant
Filed: May 16, 2017
Date of Patent: November 3, 2020
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Enrico William Guld, Jason A. Carter, Heather Joanne Alekson, Andrew Jackson Klein, David J. W. Seymour, Kathleen P. Mulcahy, Charla M. Pereira, Evan Lewis Jones, William Axel Olsen, Adam Roy Mitchell, Daniel Lee Osborn, Zachary D. Wiesnoski, Struan Andrew Robertson, Michael Edward Harnisch, William Robert Schnurr, Helen Joan Hem Lam, Darren Alexander Bennett, Kin Hang Chu
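The abstract above describes a "3D resource control" as a defined set of instructions for combining 3D resources with a 3D object to generate a 3D-based graphical interface for an operation. A rough sketch of how such a control and its control elements might be declared and dispatched follows; every identifier here (ResourceControl, ControlElement, executeOperation) is an assumption for illustration, not the patented design.

```typescript
// Rough, assumed sketch of a "3D resource control": a declarative description of
// how 3D resources are combined with a 3D object to build a 3D-based interface
// for an operation (e.g. in an operating system, file explorer, or application).

interface ControlElement {
  name: string;                          // e.g. "rotateHandle", "previewPane" -- assumed
  resource: string;                      // identifier of the 3D resource to load
  onInput?: (objectId: string) => void;  // handler invoked when the element receives input
}

interface ResourceControl {
  operation: string;           // the operation this control supports, e.g. "insert3DObject"
  elements: ControlElement[];  // control elements that make up the 3D-based interface
}

// Given an indication to perform an operation with a 3D object, look up the
// matching controls, assemble the interface from their elements, and execute.
function executeOperation(
  objectId: string,
  operation: string,
  controls: ResourceControl[],
): void {
  const matching = controls.filter(c => c.operation === operation);
  for (const control of matching) {
    for (const element of control.elements) {
      // In a real system this would instantiate the 3D resource in the scene;
      // here we only log, to keep the sketch self-contained and runnable.
      console.log(`attach ${element.resource} as ${element.name} to ${objectId}`);
      element.onInput?.(objectId);
    }
  }
}
```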
-
Publication number: 20200160604
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Application
Filed: September 26, 2019
Publication date: May 21, 2020
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
-
Patent number: 10438414
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Grant
Filed: January 26, 2018
Date of Patent: October 8, 2019
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
-
Publication number: 20190232500
Abstract: Various methods and systems are provided for puppeteering in augmented reality. Generally, an augmented or virtual reality device for each user generates a virtual 3D environment comprising a virtual representation of a physical room and a 3D asset. An author can record a 3D path for a puppeteering animation of the 3D asset using a 3D interface. At the same time, a coordinated rendering of a corresponding 3D image moving along the 3D path is updated among author devices substantially in real-time. Distinct states of the 3D asset can be assigned to different portions of the 3D path, and authors can set behavior parameters to assign template behaviors such as obstacle avoidance, particle effects, path visualizations and physical effects. The behavior parameters are distributed among presenter and audience devices, and a coordinated rendering of an animation of the 3D image corresponding to the puppeteering animation is triggered.
Type: Application
Filed: January 26, 2018
Publication date: August 1, 2019
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Jonathon Burnham COBB, Dean Alan WADSWORTH, Weihua HUANG
-
Publication number: 20190236842
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Application
Filed: January 26, 2018
Publication date: August 1, 2019
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
-
Patent number: 10264320
Abstract: Embodiments described herein enable user interaction with a video segment. A hit-zone file, which includes hit-zone data, is produced and stored for a video segment, wherein the hit-zone data corresponds to spatial regions that define hit-zones for hidden objects included in the video segment. The hit-zone file is provided to a computing system so that, when the computing system displays the video segment, the hit-zone file adds hit-zones for the hidden objects included in the video segment. The hit-zone file is produced separately from the video segment. Each of the hit-zones is defined by a different portion of the hit-zone data and corresponds to a different one of the hidden objects included in the video segment. The spatial regions that define the hit-zones for hidden objects are not visible to a user of the computing system that views the video segment with the hit-zones added.
Type: Grant
Filed: January 21, 2015
Date of Patent: April 16, 2019
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Kirk Christopher Gibbons, David Seymour, Preetinderpal Singh Mangat, William Michael Mozell, William Ben Hanke, Dashan Yue, Jeremy Bruce Kersey, Henry Stuart Denison Watson, Enrico William Guld
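The abstract above describes a hit-zone file, produced separately from the video, whose data defines invisible regions for hidden objects in a video segment. A simple sketch of what such data and a hit test might look like is given below; the field names and file layout (HitZoneFile, HitZone, hitTest) are assumptions for illustration, not the patented format.

```typescript
// Assumed sketch of hit-zone data kept separately from the video segment:
// each hidden object gets time-bounded rectangular regions, and a hit test
// checks whether a user's selection at playback time t lands inside one.

interface HitZone {
  objectId: string;   // the hidden object this zone belongs to
  startTime: number;  // seconds into the video segment where the zone is active
  endTime: number;
  x: number;          // top-left corner of the region, in normalized [0, 1] coordinates
  y: number;
  width: number;
  height: number;
}

interface HitZoneFile {
  videoId: string;    // which video segment this file augments
  zones: HitZone[];
}

// Return the hidden object hit at (px, py) and playback time t, if any.
// The zones are never rendered, so the viewer cannot see where they are.
function hitTest(file: HitZoneFile, t: number, px: number, py: number): string | null {
  for (const zone of file.zones) {
    const inTime = t >= zone.startTime && t <= zone.endTime;
    const inRegion =
      px >= zone.x && px <= zone.x + zone.width &&
      py >= zone.y && py <= zone.y + zone.height;
    if (inTime && inRegion) return zone.objectId;
  }
  return null;
}
```

Keeping the zones in their own file, as the abstract emphasizes, would let the same video segment ship unchanged while the interactive overlay is authored, revised, or distributed independently.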
-
Publication number: 20180113597
Abstract: Various methods and systems for implementing three-dimensional resource integration are provided. 3D resource integration includes integration of 3D resources into different types of functionality, such as operating system, file explorer, application, and augmented reality functionality. In operation, an indication to perform an operation with a 3D object is received. One or more 3D resource controls, associated with the operation, are accessed. The 3D resource control is a defined set of instructions on how to integrate 3D resources with 3D objects for generating 3D-based graphical interfaces associated with application features and operating system features. An input based on one or more control elements of the one or more 3D resource controls is received. The input includes the one or more control elements that operate to generate a 3D-based graphical interface for the operation. Based on receiving the input, the operation is executed with the 3D object and the 3D-based graphical interface.
Type: Application
Filed: May 16, 2017
Publication date: April 26, 2018
Inventors: Enrico William GULD, Jason A. CARTER, Heather Joanne ALEKSON, Andrew Jackson KLEIN, David J.W. SEYMOUR, Kathleen P. MULCAHY, Charla M. PEREIRA, Evan Lewis JONES, William Axel OLSEN, Adam Roy MITCHELL, Daniel Lee OSBORN, Zachary D. WIESNOSKI, Struan Andrew ROBERTSON, Michael Edward HARNISCH, William Robert SCHNURR, Helen Joan Hem LAM, Darren Alexander BENNETT, Kin Hang CHU
-
Publication number: 20150355826
Abstract: Embodiments described herein enable user interaction with a video segment. A hit-zone file, which includes hit-zone data, is produced and stored for a video segment, wherein the hit-zone data corresponds to spatial regions that define hit-zones for hidden objects included in the video segment. The hit-zone file is provided to a computing system so that, when the computing system displays the video segment, the hit-zone file adds hit-zones for the hidden objects included in the video segment. The hit-zone file is produced separately from the video segment. Each of the hit-zones is defined by a different portion of the hit-zone data and corresponds to a different one of the hidden objects included in the video segment. The spatial regions that define the hit-zones for hidden objects are not visible to a user of the computing system that views the video segment with the hit-zones added.
Type: Application
Filed: January 21, 2015
Publication date: December 10, 2015
Applicant: MICROSOFT CORPORATION
Inventors: Kirk Christopher Gibbons, David Seymour, Preetinderpal Singh Mangat, William Michael Mozell, William Ben Hanke, Dashan Yue, Jeremy Bruce Kersey, Henry Stuart Denison Watson, Enrico William Guld