Patents by Inventor Blade A. Olson
Blade A. Olson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250246291
Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with the recognized objects, and c) the robot system data.
Type: Application
Filed: January 27, 2025
Publication date: July 31, 2025
Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
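The storage scheme this abstract describes, keeping full point-cloud geometry only for unrecognized objects while recognized objects are reduced to a pose, can be sketched as follows. This is an illustrative simplification, not the patented method; the names (`DigitizedFrame`, `digitize_frame`, the detection dictionary keys) are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class DigitizedFrame:
    """One captured frame of a digitized surgical exercise."""
    raw_points: list                                 # 3D points of unrecognized objects
    recognized: dict = field(default_factory=dict)   # name -> (position, orientation)
    robot_log: dict = field(default_factory=dict)    # logged robot system data

def digitize_frame(points, detections, robot_log):
    """Store full geometry only for unrecognized regions of the point
    cloud; recognized objects (equipment, people) keep just their pose."""
    recognized = {d["name"]: (d["position"], d["orientation"]) for d in detections}
    claimed = set()
    for d in detections:
        claimed.update(d["point_indices"])           # points explained by a recognized object
    raw = [p for i, p in enumerate(points) if i not in claimed]
    return DigitizedFrame(raw_points=raw, recognized=recognized, robot_log=robot_log)
```

The payoff of this split is compression: a recognized operating table collapses from thousands of points to a single pose, while unexpected geometry is preserved verbatim.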
-
Publication number: 20250191309
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated, with three-dimensional point cloud data, to a virtual reality headset. A virtual reality environment is rendered to a display of the virtual reality headset, which is operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Application
Filed: November 18, 2024
Publication date: June 12, 2025
Inventors: Blade A. Olson, Bernhard A. Fuerst
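The two-way data flow in this abstract, OR state out to the remote headset, remote user's pose back for the local AR avatar, can be sketched as a pair of messages. The class and field names here are illustrative assumptions, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class ORUpdate:
    """Streamed from the portable device in the OR to the remote headset."""
    point_cloud: list   # 3D points sensed by the depth cameras
    robot_info: dict    # e.g. arm poses reported by the robot system

@dataclass
class RemoteUpdate:
    """Streamed back from the headset to the portable device."""
    head_pose: tuple    # where to draw the remote user's avatar in AR

def sync_step(points, robot_info, headset_pose):
    """One round trip of the telepresence loop: package the OR state for
    VR rendering and the remote user's pose for local AR rendering."""
    return ORUpdate(points, robot_info), RemoteUpdate(headset_pose)
```

The key design point the abstract implies is asymmetry: the OR side ships heavy geometry one way, while only a lightweight pose travels the other way.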
-
Patent number: 12237072
Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with the recognized objects, and c) the robot system data.
Type: Grant
Filed: August 14, 2023
Date of Patent: February 25, 2025
Assignee: Verb Surgical Inc.
Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
-
Patent number: 12165268
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated, with three-dimensional point cloud data, to a virtual reality headset. A virtual reality environment is rendered to a display of the virtual reality headset, which is operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Grant
Filed: August 24, 2023
Date of Patent: December 10, 2024
Assignee: Verb Surgical Inc.
Inventors: Blade A. Olson, Bernhard A. Fuerst
-
Publication number: 20240046589
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated, with three-dimensional point cloud data, to a virtual reality headset. A virtual reality environment is rendered to a display of the virtual reality headset, which is operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Application
Filed: August 24, 2023
Publication date: February 8, 2024
Inventors: Blade A. Olson, Bernhard A. Fuerst
-
Publication number: 20240038367
Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with the recognized objects, and c) the robot system data.
Type: Application
Filed: August 14, 2023
Publication date: February 1, 2024
Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
-
Patent number: 10257482
Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus relative to the landmark, and using light generated by the projection apparatus to project the modified version of the image on a surface of the object.
Type: Grant
Filed: September 12, 2017
Date of Patent: April 9, 2019
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
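The "modified version of an image to compensate for an orientation" step boils down to pre-warping the image so the oblique projection lands undistorted. A deliberately simplified one-axis keystone correction is sketched below; real systems would use a full planar homography, and the function names here are invented for illustration.

```python
import math

def keystone_scale(angle_deg):
    """Stretch factor an image suffers when projected at `angle_deg` off
    the surface normal (1/cos foreshortening along the tilt axis)."""
    return 1.0 / math.cos(math.radians(angle_deg))

def compensate_point(x, y, angle_deg):
    """Pre-shrink one image coordinate along the tilt axis so that the
    projector's keystone stretch cancels out on the surface."""
    return (x * math.cos(math.radians(angle_deg)), y)
```

At zero tilt the correction is the identity; at 60 degrees the horizontal coordinate is pre-shrunk to half, exactly undoing the 2x keystone stretch.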
-
Patent number: 10092827
Abstract: Systems, methods and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect that a first interactive game is being played within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment, and user speech is monitored using one or more microphone devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction to at least one of the electronic devices within the physical environment, instructing it to perform the determined gameplay action.
Type: Grant
Filed: June 16, 2016
Date of Patent: October 9, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Blade A. Olson
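The core matching step, triggering a gameplay action only when both the movement sequence and the speech pattern line up, can be sketched as below. This is a minimal illustration assuming pre-classified movement tokens and transcribed speech; the function and dictionary layout are invented, not from the patent.

```python
def match_gameplay_action(observed_moves, observed_speech, actions):
    """Return the gameplay action whose predefined move sequence ends the
    observed movement stream AND whose speech pattern appears in the
    observed utterance; return None if neither condition co-occurs."""
    for action, (moves, phrase) in actions.items():
        if observed_moves[-len(moves):] == moves and phrase in observed_speech.lower():
            return action
    return None
```

Requiring both modalities to agree is the point of the claim: a wave alone, or the phrase alone, should not fire the device.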
-
Patent number: 10065124
Abstract: Systems, methods and articles of manufacture to perform an operation comprising receiving speech data via a network, modifying the speech data based on a profile associated with a toy device, wherein the modified speech data represents a voice of the toy device, and outputting the modified speech data via a speaker.
Type: Grant
Filed: January 15, 2016
Date of Patent: September 4, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Blade A. Olson
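The "modifying the speech data based on a profile" step can be illustrated with a toy-grade transform: naive resampling for pitch and scaling for gain. The profile keys (`pitch`, `gain`) and function name are assumptions for the sketch; a production system would use a proper vocoder or phase-preserving pitch shifter.

```python
def apply_voice_profile(samples, profile):
    """Resample `samples` by the profile's pitch factor (>1 raises pitch
    and shortens the clip) and scale amplitude by its gain factor."""
    step = profile.get("pitch", 1.0)
    gain = profile.get("gain", 1.0)
    return [samples[int(i * step)] * gain
            for i in range(int(len(samples) / step))]
```

Each toy's profile would then be a stored dictionary, so the same incoming network speech comes out sounding like that particular character.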
-
Patent number: 9933851
Abstract: There is provided a system having a feedback device including a sensory feedback element, a non-transitory memory storing an executable code and a virtual object having a virtual surface, and a hardware processor. The hardware processor is configured to execute the executable code to determine a position of a hand of a first user, determine a location of the virtual surface of the virtual object, and transmit a first activation signal to the feedback device to cause a sensory feedback to be provided to the first user using the sensory feedback element of the feedback device, based on the position of the hand of the first user relative to the location of the virtual surface of the virtual object.
Type: Grant
Filed: February 22, 2016
Date of Patent: April 3, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Eric C. Haseltine, Blade A. Olson
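The activation decision, comparing the tracked hand position against the virtual surface's location, reduces to a proximity test. The sketch below assumes a spherical surface for simplicity; the function name and the 2 cm default threshold are invented for illustration.

```python
import math

def should_activate(hand_pos, surface_center, surface_radius, threshold=0.02):
    """True when the hand comes within `threshold` meters of a spherical
    virtual surface -- the cue to send the activation signal to the
    feedback device."""
    gap = math.dist(hand_pos, surface_center) - surface_radius
    return gap <= threshold
```

A real system would evaluate this every tracking frame and could scale feedback intensity with `gap` rather than using a hard on/off threshold.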
-
Publication number: 20180027221
Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus.
Type: Application
Filed: September 12, 2017
Publication date: January 25, 2018
Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
-
Publication number: 20170361213
Abstract: Systems, methods and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect that a first interactive game is being played within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment, and user speech is monitored using one or more microphone devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction to at least one of the electronic devices within the physical environment, instructing it to perform the determined gameplay action.
Type: Application
Filed: June 16, 2016
Publication date: December 21, 2017
Inventors: Michael P. Goslin, Blade A. Olson
-
Patent number: 9800851
Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus relative to the landmark, and using light generated by the projection apparatus to project the modified version of the image on a surface of the object.
Type: Grant
Filed: January 26, 2016
Date of Patent: October 24, 2017
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
-
Publication number: 20170242483
Abstract: There is provided a system having a feedback device including a sensory feedback element, a non-transitory memory storing an executable code and a virtual object having a virtual surface, and a hardware processor. The hardware processor is configured to execute the executable code to determine a position of a hand of a first user, determine a location of the virtual surface of the virtual object, and transmit a first activation signal to the feedback device to cause a sensory feedback to be provided to the first user using the sensory feedback element of the feedback device, based on the position of the hand of the first user relative to the location of the virtual surface of the virtual object.
Type: Application
Filed: February 22, 2016
Publication date: August 24, 2017
Inventors: Michael P. Goslin, Eric C. Haseltine, Blade A. Olson
-
Publication number: 20170214896
Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus relative to the landmark, and using light generated by the projection apparatus to project the modified version of the image on a surface of the object.
Type: Application
Filed: January 26, 2016
Publication date: July 27, 2017
Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
-
Publication number: 20170203221
Abstract: Systems, methods and articles of manufacture to perform an operation comprising receiving speech data via a network, modifying the speech data based on a profile associated with a toy device, wherein the modified speech data represents a voice of the toy device, and outputting the modified speech data via a speaker.
Type: Application
Filed: January 15, 2016
Publication date: July 20, 2017
Inventors: Michael P. Goslin, Blade A. Olson