Patents by Inventor Alexander Bennett
Alexander Bennett has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230313045
Abstract: A system for separating bitumen from shingle powder includes: a blender; a solvent tank in fluid communication with the blender; a water tank in fluid communication with the blender; a decanting tank in fluid communication with the blender; a distillation tank in fluid communication with the decanting tank; and a plurality of pumps for pumping fluids between various components of the system or out of the system. When the system is in use, the blender receives and mixes solvent from the solvent tank with shingle powder containing bitumen to produce a bitumen-infused solvent. Water is supplied to the blender from the water tank to wash an interior of the blender. Bitumen-infused solvent received in the decanting tank from the blender is separated from excess water. Bitumen within the bitumen-infused solvent is isolated within the distillation tank.
Type: Application
Filed: March 29, 2023
Publication date: October 5, 2023
Inventors: William Lovell Lawrence, Edward Merrel Brownlee, Alexander Bennett Hoekstra, Michael Thomas Earl, Jonathon Daniel Horton
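The abstract describes physical plant equipment, not software, but the staged flow (blend, wash, decant, distill) can be illustrated with a minimal mass-balance sketch. Every function name, constant, and the assumed 30% bitumen fraction below are hypothetical, introduced only to make the sequence of stages concrete.

```python
# Illustrative mass-balance sketch of the staged flow in the abstract:
# blend -> water wash -> decant -> distill. All numbers are assumptions.

def separate_bitumen(shingle_powder_kg: float, solvent_kg: float) -> dict:
    """Model the stages as simple mass transfers (hypothetical fractions)."""
    # Blender: solvent dissolves the bitumen out of the shingle powder.
    bitumen_kg = 0.30 * shingle_powder_kg      # assumed bitumen fraction
    infused_solvent_kg = solvent_kg + bitumen_kg

    # Water wash of the blender interior; the wash water is later
    # separated from the infused solvent in the decanting tank.
    wash_water_kg = 0.10 * solvent_kg          # assumed wash volume

    # Decanting tank: excess water is drawn off the infused solvent.
    decanted_kg = infused_solvent_kg

    # Distillation tank: solvent is driven off, isolating the bitumen.
    recovered_solvent_kg = decanted_kg - bitumen_kg
    return {
        "bitumen_kg": bitumen_kg,
        "recovered_solvent_kg": recovered_solvent_kg,
        "waste_water_kg": wash_water_kg,
    }
```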
-
Publication number: 20220369444
Abstract: A method for intelligently controlling a lighting accessory coupled to a host device includes determining a posture of the lighting accessory, the posture being one of multiple user-selectable physical configurations; and selectively configuring a setting of an application executing on the host device based at least in part on the determined posture.
Type: Application
Filed: May 13, 2021
Publication date: November 17, 2022
Inventors: David E. WASHINGTON, Danielle TENE, Whitney J. GIAIMO, Alexander BENNETT, Ann MCINROY, Natalia URBANOWICZ, Simon DEARSLEY
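The core idea — mapping a detected accessory posture to application settings — can be sketched in a few lines. The posture names and setting values below are assumptions for illustration; the patent does not enumerate them.

```python
# Hypothetical sketch of posture-driven configuration: the accessory's
# physical configuration selects the host application's settings.
from enum import Enum, auto

class Posture(Enum):
    FLAT = auto()      # accessory folded flat against the host
    RAISED = auto()    # accessory angled toward the user
    DETACHED = auto()  # accessory removed from the host

def configure_app(posture: Posture) -> dict:
    """Select application settings based on the determined posture."""
    settings = {
        Posture.FLAT: {"brightness": 0.2, "mode": "ambient"},
        Posture.RAISED: {"brightness": 0.8, "mode": "task"},
        Posture.DETACHED: {"brightness": 0.5, "mode": "portable"},
    }
    return settings[posture]
```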
-
Patent number: 11497103
Abstract: A method for intelligently controlling a lighting accessory coupled to a host device includes determining a posture of the lighting accessory, the posture being one of multiple user-selectable physical configurations; and selectively configuring a setting of an application executing on the host device based at least in part on the determined posture.
Type: Grant
Filed: May 13, 2021
Date of Patent: November 8, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: David E. Washington, Danielle Tene, Whitney J. Giaimo, Alexander Bennett, Ann McInroy, Natalia Urbanowicz, Simon Dearsley
-
Patent number: 11467709
Abstract: The disclosed technology is generally directed to mixed-reality devices. In one example of the technology, a first mixed-reality guide is provided to mixed-reality devices, enabling the mixed-reality devices to operate the first mixed-reality guide while providing a mixed-reality view, such that: while the first mixed-reality guide is navigated to a step of the set of steps of the first mixed-reality guide, the mixed-reality view includes a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. From each mixed-reality device, mixed-reality data is received based on use of at least the first mixed-reality guide on the mixed-reality device. The mixed-reality data includes spatial telemetry data collected for at least one step of the first mixed-reality guide. A presentation that is based on the mixed-reality data is provided. The first mixed-reality guide is enabled to be altered based on the mixed-reality data.
Type: Grant
Filed: February 7, 2020
Date of Patent: October 11, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Alexandre Pierre Michel Godin, Andrew Jackson Klein, Arni Mar Thrastarson, Charla Marie Pereira, Cydney Brooke Nielsen, Darren Alexander Bennett, Jason Drew Vantomme, Joel Jamon Rendon, Kjartan Olafsson, Mahesh Keshav Kamat, Maya Alethea Miller-Vedam, Ryan Martin Nadel, Robert István Butterworth
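A sketch of the aggregation side of this abstract — collecting per-step spatial telemetry from many devices and summarizing it for a presentation — might look as follows. The record fields (time on step, gaze distance) and the summary statistics are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch: aggregate per-step spatial telemetry reported by
# mixed-reality devices into a per-step summary a presentation could chart.
from collections import defaultdict
from statistics import mean

def summarize_telemetry(records):
    """records: iterable of (step_index, seconds_on_step, gaze_dist_m)."""
    by_step = defaultdict(list)
    for step, seconds, gaze_dist in records:
        by_step[step].append((seconds, gaze_dist))
    # A guide author might treat unusually slow steps as candidates for
    # revision, which is one way the guide could be "altered" from data.
    return {
        step: {
            "mean_seconds": mean(s for s, _ in samples),
            "mean_gaze_dist_m": mean(g for _, g in samples),
            "visits": len(samples),
        }
        for step, samples in by_step.items()
    }
```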
-
Publication number: 20220157026
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Application
Filed: July 23, 2021
Publication date: May 19, 2022
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
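The scene/beat choreography the abstract describes suggests a simple navigable data model. The class names, fields, and rollover behavior below are assumptions sketched for illustration only.

```python
# Minimal data model for scenes and beats: a presenter advances beat by
# beat, rolling over into the next scene. Names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Beat:
    """One navigable moment in a scene, triggering asset behaviors."""
    name: str
    behaviors: List[str] = field(default_factory=list)  # e.g. "fade_in:globe"

@dataclass
class Scene:
    name: str
    beats: List[Beat] = field(default_factory=list)

@dataclass
class Presentation:
    scenes: List[Scene]
    scene_idx: int = 0
    beat_idx: int = 0

    def advance(self) -> Beat:
        """Step to the next beat, rolling over into the next scene."""
        scene = self.scenes[self.scene_idx]
        if self.beat_idx + 1 < len(scene.beats):
            self.beat_idx += 1
        elif self.scene_idx + 1 < len(self.scenes):
            self.scene_idx += 1
            self.beat_idx = 0
        return self.scenes[self.scene_idx].beats[self.beat_idx]
```

In a coordinated setup, the presenter's device would broadcast the new `(scene_idx, beat_idx)` so every audience device renders the same beat.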
-
Patent number: 11087548
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Grant
Filed: September 26, 2019
Date of Patent: August 10, 2021
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
-
Patent number: 11014242
Abstract: Various methods and systems are provided for puppeteering in augmented reality. Generally, an augmented or virtual reality device for each user generates a virtual 3D environment comprising a virtual representation of a physical room and a 3D asset. An author can record a 3D path for a puppeteering animation of the 3D asset using a 3D interface. At the same time, a coordinated rendering of a corresponding 3D image moving along the 3D path is updated among author devices substantially in real-time. Distinct states of the 3D asset can be assigned to different portions of the 3D path, and authors can set behavior parameters to assign template behaviors such as obstacle avoidance, particle effects, path visualizations and physical effects. The behavior parameters are distributed among presenter and audience devices, and a coordinated rendering of an animation of the 3D image corresponding to the puppeteering animation is triggered.
Type: Grant
Filed: January 26, 2018
Date of Patent: May 25, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Jonathon Burnham Cobb, Dean Alan Wadsworth, Weihua Huang
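Recording a puppeteering path and assigning distinct states to portions of it, as the abstract outlines, can be sketched as follows. The API shown is hypothetical; the patent does not specify this interface.

```python
# Sketch: record sampled 3D positions while the author drags an asset,
# and tag portions of the resulting path with asset states.
from typing import List, Tuple

Point = Tuple[float, float, float]

class PuppetPath:
    def __init__(self):
        self.points: List[Point] = []
        self.states: List[Tuple[int, str]] = []  # (start index, state name)

    def record(self, point: Point) -> None:
        """Append a sampled 3D position along the puppeteering path."""
        self.points.append(point)

    def set_state(self, state: str) -> None:
        """Assign a state (e.g. 'walking') from the current point onward."""
        self.states.append((len(self.points), state))

    def state_at(self, index: int) -> str:
        """Return the state that applies at a given point on the path."""
        current = "idle"  # assumed default state
        for start, state in self.states:
            if start <= index:
                current = state
        return current
```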
-
Patent number: 10982970
Abstract: The present disclosure relates to a method (300) for displaying a perspective view of the surroundings of an aircraft (100) in the aircraft. The method comprises accessing (310) surrounding information from a database. The surrounding information is photo-based and three-dimensional. The method further comprises processing (320) the accessed surrounding information so that a perspective view of the surroundings of the aircraft is provided. The perspective view of the surroundings correlates to the position of the aircraft and is photo-based with spatially correlated photo-based textures. The method further comprises transmitting (360) the provided perspective view of the surroundings of the aircraft to a displaying unit so that it can be displayed in the aircraft. The present disclosure also relates to a system, an aircraft, a use, a computer program, and a computer program product.
Type: Grant
Filed: July 7, 2016
Date of Patent: April 20, 2021
Assignee: SAAB AB
Inventors: Nigel Pippard, Robert Alexander Bennett, Jonas Dehlin, Adam Nilsson
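The access-and-process pipeline in the abstract can be sketched minimally: fetch photo-based tiles near the aircraft position, then bundle them with the pose for a display unit. The tile-keyed database, radius threshold, and output dictionary are all assumptions for illustration.

```python
# Hedged sketch of the pipeline: access surrounding data near the aircraft
# position, then compose a pose-correlated view for a displaying unit.
import math

def tiles_in_view(lat: float, lon: float, radius_deg: float, db: dict) -> dict:
    """Access step: fetch stored photo-based tiles near the position."""
    return {
        key: tex for key, tex in db.items()
        if math.hypot(key[0] - lat, key[1] - lon) <= radius_deg
    }

def render_perspective(lat: float, lon: float, heading_deg: float, db: dict) -> dict:
    """Processing step: bundle nearby textures with the aircraft pose."""
    return {
        "pose": {"lat": lat, "lon": lon, "heading_deg": heading_deg},
        "textures": tiles_in_view(lat, lon, 1.0, db),  # assumed 1-degree radius
    }
```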
-
Patent number: 10936146
Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator, wherein the mixed-reality view includes both a real-world environment of the operator and holographic aspects. The operator is enabled to navigate among a plurality of steps of a task, such that for at least one step of the plurality of steps of the task, while the operator is navigated to the step of the task: the mixed-reality view is caused to include at least one instruction associated with the step. The mixed-reality view is caused to include a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. The mixed-reality view is caused to continually include a visual tether from the instruction to the real-world location.
Type: Grant
Filed: May 30, 2019
Date of Patent: March 2, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Darren Alexander Bennett, Charla Marie Pereira, Andrew Jackson Klein, Robert István Butterworth, Sean Robert Puller, Tsz Fung Wan, Kevin Thomas Mather, Dean Alan Wadsworth
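The per-step view the abstract describes — an instruction, a hologram anchored at a real-world location, and a tether between them — maps naturally onto a small composition function. The names and the tether representation below are illustrative assumptions.

```python
# Illustrative model: each task step carries an instruction and a
# real-world anchor; the view keeps a visual tether linking the two.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Step:
    instruction: str
    hologram_anchor: Tuple[float, float, float]  # real-world location

def view_for(steps: List[Step], index: int, instruction_panel_pos):
    """Compose what the headset shows while the operator is on a step."""
    step = steps[index]
    return {
        "instruction": step.instruction,
        "hologram_at": step.hologram_anchor,
        # Tether: continually drawn from the instruction panel to the
        # location where the step's work is to be performed.
        "tether": (instruction_panel_pos, step.hologram_anchor),
    }
```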
-
Patent number: 10824294
Abstract: Various methods and systems for implementing three-dimensional resource integration are provided. 3D resource integration includes integration of 3D resources into different types of functionality, such as operating system, file explorer, application, and augmented reality functionality. In operation, an indication to perform an operation with a 3D object is received. One or more 3D resource controls, associated with the operation, are accessed. The 3D resource control is a defined set of instructions on how to integrate 3D resources with 3D objects for generating 3D-based graphical interfaces associated with application features and operating system features. An input based on one or more control elements of the one or more 3D resource controls is received. The input includes the one or more control elements that operate to generate a 3D-based graphical interface for the operation. Based on receiving the input, the operation is executed with the 3D object and the 3D-based graphical interface.
Type: Grant
Filed: May 16, 2017
Date of Patent: November 3, 2020
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Enrico William Guld, Jason A. Carter, Heather Joanne Alekson, Andrew Jackson Klein, David J. W. Seymour, Kathleen P. Mulcahy, Charla M. Pereira, Evan Lewis Jones, William Axel Olsen, Adam Roy Mitchell, Daniel Lee Osborn, Zachary D. Wiesnoski, Struan Andrew Robertson, Michael Edward Harnisch, William Robert Schnurr, Helen Joan Hem Lam, Darren Alexander Bennett, Kin Hang Chu
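One way to read the abstract's "3D resource control" — a defined set of instructions associated with an operation that generates an interface from control-element input — is as a registry of handlers keyed by operation. The registry pattern, operation names, and return shape below are assumptions for illustration only.

```python
# Minimal sketch: a registry of "resource controls", each a handler that
# turns an operation plus control-element input into a generated interface.
RESOURCE_CONTROLS = {}

def resource_control(operation: str):
    """Register a control for an operation (e.g. 'insert', 'rotate')."""
    def register(fn):
        RESOURCE_CONTROLS[operation] = fn
        return fn
    return register

@resource_control("insert")
def insert_control(obj_name, elements):
    # The control elements drive how the 3D-based interface is generated.
    return {"ui": f"insert:{obj_name}", "elements": elements}

def execute(operation: str, obj_name: str, elements):
    """Look up the control for the operation and run it with the input."""
    return RESOURCE_CONTROLS[operation](obj_name, elements)
```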
-
Publication number: 20200273255
Abstract: The disclosed technology is generally directed to mixed-reality devices. In one example of the technology, a first mixed-reality guide is provided to mixed-reality devices, enabling the mixed-reality devices to operate the first mixed-reality guide while providing a mixed-reality view, such that: while the first mixed-reality guide is navigated to a step of the set of steps of the first mixed-reality guide, the mixed-reality view includes a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. From each mixed-reality device, mixed-reality data is received based on use of at least the first mixed-reality guide on the mixed-reality device. The mixed-reality data includes spatial telemetry data collected for at least one step of the first mixed-reality guide. A presentation that is based on the mixed-reality data is provided. The first mixed-reality guide is enabled to be altered based on the mixed-reality data.
Type: Application
Filed: February 7, 2020
Publication date: August 27, 2020
Inventors: Alexandre Pierre Michel GODIN, Andrew Jackson KLEIN, Arni Mar THRASTARSON, Charla Marie PEREIRA, Cydney Brooke NIELSEN, Darren Alexander BENNETT, Jason Drew VANTOMME, Joel Jamon RENDON, Kjartan OLAFSSON, Mahesh Keshav KAMAT, Maya Alethea MILLER-VEDAM, Ryan Martin NADEL, Robert István BUTTERWORTH
-
Publication number: 20200273254
Abstract: The disclosed technology is generally directed to mixed reality devices. In one example of the technology, a mixed-reality view is caused to be provided to an operator, wherein the mixed-reality view includes both a real-world environment of the operator and holographic aspects. The operator is enabled to navigate among a plurality of steps of a task, such that for at least one step of the plurality of steps of the task, while the operator is navigated to the step of the task: the mixed-reality view is caused to include at least one instruction associated with the step. The mixed-reality view is caused to include a hologram at a real-world location in the real-world environment at which work associated with the step is to be performed. The mixed-reality view is caused to continually include a visual tether from the instruction to the real-world location.
Type: Application
Filed: May 30, 2019
Publication date: August 27, 2020
Inventors: Darren Alexander BENNETT, Charla Marie PEREIRA, Andrew Jackson KLEIN, Robert István BUTTERWORTH, Sean Robert PULLER, Tsz Fung WAN, Kevin Thomas MATHER, Dean Alan WADSWORTH
-
Publication number: 20200160604
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Application
Filed: September 26, 2019
Publication date: May 21, 2020
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
-
Publication number: 20190310105
Abstract: The present disclosure relates to a method (300) for displaying a perspective view of the surroundings of an aircraft (100) in the aircraft. The method comprises accessing (310) surrounding information from a database. The surrounding information is photo-based and three-dimensional. The method further comprises processing (320) the accessed surrounding information so that a perspective view of the surroundings of the aircraft is provided. The perspective view of the surroundings correlates to the position of the aircraft and is photo-based with spatially correlated photo-based textures. The method further comprises transmitting (360) the provided perspective view of the surroundings of the aircraft to a displaying unit so that it can be displayed in the aircraft. The present disclosure also relates to a system, an aircraft, a use, a computer program, and a computer program product.
Type: Application
Filed: July 7, 2016
Publication date: October 10, 2019
Inventors: Nigel Pippard, Robert Alexander Bennett, Jonas Dehlin, Adam Nilsson
-
Patent number: 10438414
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Grant
Filed: January 26, 2018
Date of Patent: October 8, 2019
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Darren Alexander Bennett, David J. W. Seymour, Charla M. Pereira, Enrico William Guld, Kin Hang Chu, Julia Faye Taylor-Hell, Jonathon Burnham Cobb, Helen Joan Hem Lam, You-Da Yang, Dean Alan Wadsworth, Andrew Jackson Klein
-
Publication number: 20190236842
Abstract: Various methods and systems are provided for authoring and presenting 3D presentations. Generally, an augmented or virtual reality device for each author, presenter and audience member includes 3D presentation software. During authoring mode, one or more authors can use 3D and/or 2D interfaces to generate a 3D presentation that choreographs behaviors of 3D assets into scenes and beats. During presentation mode, the 3D presentation is loaded in each user device, and 3D images of the 3D assets and corresponding asset behaviors are rendered among the user devices in a coordinated manner. As such, one or more presenters can navigate the scenes and beats of the 3D presentation to deliver the 3D presentation to one or more audience members wearing augmented reality headsets.
Type: Application
Filed: January 26, 2018
Publication date: August 1, 2019
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Julia Faye TAYLOR-HELL, Jonathon Burnham COBB, Helen Joan Hem LAM, You-Da YANG, Dean Alan WADSWORTH, Andrew Jackson KLEIN
-
Publication number: 20190232500
Abstract: Various methods and systems are provided for puppeteering in augmented reality. Generally, an augmented or virtual reality device for each user generates a virtual 3D environment comprising a virtual representation of a physical room and a 3D asset. An author can record a 3D path for a puppeteering animation of the 3D asset using a 3D interface. At the same time, a coordinated rendering of a corresponding 3D image moving along the 3D path is updated among author devices substantially in real-time. Distinct states of the 3D asset can be assigned to different portions of the 3D path, and authors can set behavior parameters to assign template behaviors such as obstacle avoidance, particle effects, path visualizations and physical effects. The behavior parameters are distributed among presenter and audience devices, and a coordinated rendering of an animation of the 3D image corresponding to the puppeteering animation is triggered.
Type: Application
Filed: January 26, 2018
Publication date: August 1, 2019
Inventors: Darren Alexander BENNETT, David J.W. SEYMOUR, Charla M. PEREIRA, Enrico William GULD, Kin Hang CHU, Jonathon Burnham COBB, Dean Alan WADSWORTH, Weihua HUANG
-
Publication number: 20180113597
Abstract: Various methods and systems for implementing three-dimensional resource integration are provided. 3D resource integration includes integration of 3D resources into different types of functionality, such as operating system, file explorer, application, and augmented reality functionality. In operation, an indication to perform an operation with a 3D object is received. One or more 3D resource controls, associated with the operation, are accessed. The 3D resource control is a defined set of instructions on how to integrate 3D resources with 3D objects for generating 3D-based graphical interfaces associated with application features and operating system features. An input based on one or more control elements of the one or more 3D resource controls is received. The input includes the one or more control elements that operate to generate a 3D-based graphical interface for the operation. Based on receiving the input, the operation is executed with the 3D object and the 3D-based graphical interface.
Type: Application
Filed: May 16, 2017
Publication date: April 26, 2018
Inventors: Enrico William GULD, Jason A. CARTER, Heather Joanne ALEKSON, Andrew Jackson KLEIN, David J.W. SEYMOUR, Kathleen P. MULCAHY, Charla M. PEREIRA, Evan Lewis JONES, William Axel OLSEN, Adam Roy MITCHELL, Daniel Lee OSBORN, Zachary D. WIESNOSKI, Struan Andrew ROBERTSON, Michael Edward HARNISCH, William Robert SCHNURR, Helen Joan Hem Lam, Darren Alexander BENNETT, Kin Hang CHU
-
Patent number: D999177
Type: Grant
Filed: March 16, 2022
Date of Patent: September 19, 2023
Assignee: Microsoft Corporation
Inventors: Alexander Bennett, Huaian Cheng, Jinwon Yook, Simon Cameron Dearsley
-
Patent number: D1021774
Type: Grant
Filed: March 16, 2022
Date of Patent: April 9, 2024
Assignee: Microsoft Corporation
Inventors: Alexander Bennett, Huaian Cheng, Jinwon Yook, Simon Cameron Dearsley