Patents by Inventor Blade A. OLSON

Blade A. Olson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250246291
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras, to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing the 3D point cloud data of unrecognized objects, a position and orientation associated with the recognized objects, and the robot system data.
    Type: Application
    Filed: January 27, 2025
    Publication date: July 31, 2025
    Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
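The digitization scheme in the abstract above stores raw point-cloud data only for unrecognized objects, while recognized objects (equipment, people) are reduced to compact pose records alongside the logged robot data. A minimal sketch of that idea follows; all names (`ExerciseRecord`, `digitize_frame`, the detection dictionary fields) are hypothetical illustrations, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class RecognizedObject:
    label: str        # e.g. "surgical table", "staff member" (illustrative labels)
    position: tuple   # (x, y, z) in OR coordinates
    orientation: tuple  # quaternion (w, x, y, z)

@dataclass
class ExerciseRecord:
    residual_points: list = field(default_factory=list)  # raw 3D points of unrecognized objects
    recognized: list = field(default_factory=list)       # compact pose records for recognized objects
    robot_log: list = field(default_factory=list)        # logged robot system data

def digitize_frame(record, points, detections, robot_state):
    """Store pose-only records for recognized objects; keep raw points only
    for whatever the object recognizer did not claim."""
    claimed = set()
    for det in detections:
        record.recognized.append(
            RecognizedObject(det["label"], det["position"], det["orientation"])
        )
        claimed.update(det["point_indices"])
    record.residual_points.extend(
        p for i, p in enumerate(points) if i not in claimed
    )
    record.robot_log.append(robot_state)
```

Storing poses instead of full point clouds for recognized objects is what makes the recording compact enough to replay a whole surgical exercise.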
  • Publication number: 20250191309
    Abstract: A virtual representation of an operating room is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device, operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset, with three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
    Type: Application
    Filed: November 18, 2024
    Publication date: June 12, 2025
    Inventors: Blade A. Olson, Bernhard A. Fuerst
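The telepresence flow described above is bidirectional: the local device streams the OR's virtual representation to the remote VR headset, and the headset streams the remote user's pose back so an avatar can be rendered in AR locally. A minimal message-routing sketch, assuming a simple dictionary-based protocol (all function and field names here are hypothetical):

```python
def make_or_update(point_cloud, robot_info):
    """Package the OR's virtual representation (point cloud + robot info) for the VR headset."""
    return {"type": "or_update", "points": point_cloud, "robot": robot_info}

def make_avatar_update(headset_pose):
    """Package the remote user's pose for AR rendering on the local portable device."""
    return {"type": "avatar_update", "pose": headset_pose}

def route(message, vr_renderer, ar_renderer):
    """Dispatch each message to the display that consumes it: OR updates go to
    the remote VR renderer, avatar updates to the local AR renderer."""
    if message["type"] == "or_update":
        vr_renderer(message["points"], message["robot"])
    elif message["type"] == "avatar_update":
        ar_renderer(message["pose"])
```

In practice the renderers would be callbacks into the headset's and device's rendering loops; plain lists stand in for them here.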
  • Patent number: 12237072
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras, to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing the 3D point cloud data of unrecognized objects, a position and orientation associated with the recognized objects, and the robot system data.
    Type: Grant
    Filed: August 14, 2023
    Date of Patent: February 25, 2025
    Assignee: Verb Surgical Inc.
    Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
  • Patent number: 12165268
    Abstract: A virtual representation of an operating room is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device, operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset, with three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
    Type: Grant
    Filed: August 24, 2023
    Date of Patent: December 10, 2024
    Assignee: Verb Surgical Inc.
    Inventors: Blade A. Olson, Bernhard A. Fuerst
  • Publication number: 20240046589
    Abstract: A virtual representation of an operating room is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device, operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset, with three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
    Type: Application
    Filed: August 24, 2023
    Publication date: February 8, 2024
    Inventors: Blade A. Olson, Bernhard A. Fuerst
  • Publication number: 20240038367
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras, to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing the 3D point cloud data of unrecognized objects, a position and orientation associated with the recognized objects, and the robot system data.
    Type: Application
    Filed: August 14, 2023
    Publication date: February 1, 2024
    Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
  • Patent number: 10257482
    Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus to the landmark, and implementing a light generated by the projection apparatus to project the modified version of the image on a surface of the object.
    Type: Grant
    Filed: September 12, 2017
    Date of Patent: April 9, 2019
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
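The projection technique above compensates the image for the projector's orientation relative to a detected landmark before projecting it onto the object. A minimal 2D sketch of that compensation, assuming a simple in-plane rotation model (the patent does not specify the transform; the function names here are hypothetical):

```python
import math

def detect_landmark_angle(landmark_pts):
    """Estimate the landmark's in-plane rotation from two detected reference points."""
    (x0, y0), (x1, y1) = landmark_pts
    return math.atan2(y1 - y0, x1 - x0)

def compensate(image_pts, angle):
    """Pre-rotate image points by -angle so the projected image lands upright
    on the object's surface regardless of the projector-to-landmark orientation."""
    c, s = math.cos(-angle), math.sin(-angle)
    return [(c * x - s * y, s * x + c * y) for x, y in image_pts]
```

A full implementation would use a homography (perspective warp) rather than a rotation alone, since the projector may also be tilted out of plane.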
  • Patent number: 10092827
    Abstract: Systems, methods and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect a first interactive game is playing within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment and user speech is monitored using one or more microphone sensor devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction, to at least one of the one or more electronic devices within the physical environment, instructing the electronic device to perform the determined gameplay action.
    Type: Grant
    Filed: June 16, 2016
    Date of Patent: October 9, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Blade A. Olson
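The key condition in the abstract above is a conjunction: a gameplay action fires only when the user's movement series and their speech both match a predefined pair. A minimal lookup sketch, with an entirely hypothetical action table (none of these gestures or phrases are from the patent):

```python
# Hypothetical mapping from (movement series, speech pattern) to a gameplay action.
ACTIONS = {
    ("wave-wave-point", "abracadabra"): "cast_spell",
}

def match_gameplay_action(movements, speech):
    """Return the gameplay action only when BOTH the movement series and the
    speech pattern match a predefined pair; otherwise return None."""
    key = ("-".join(movements), speech.lower().strip())
    return ACTIONS.get(key)
```

The returned action name would then be sent as an instruction to an electronic device in the room, per the abstract.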
  • Patent number: 10065124
    Abstract: Systems, methods and articles of manufacture to perform an operation comprising receiving speech data via a network, modifying the speech data based on a profile associated with a toy device, wherein the modified speech data represents a voice of the toy device, and outputting the modified speech data via a speaker.
    Type: Grant
    Filed: January 15, 2016
    Date of Patent: September 4, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Blade A. Olson
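The abstract above describes modifying received speech according to a per-toy profile so the output sounds like the toy's voice. One simple way such a profile could work is a pitch shift; the naive resampling sketch below is purely illustrative (the patent does not specify the signal processing, and the `pitch_rate` field is a hypothetical profile parameter):

```python
def apply_voice_profile(samples, profile):
    """Naive pitch shift by resampling: rate > 1 raises pitch (and shortens
    the audio); rate == 1 leaves the samples unchanged."""
    rate = profile.get("pitch_rate", 1.0)
    n = int(len(samples) / rate)
    return [samples[min(int(i * rate), len(samples) - 1)] for i in range(n)]
```

A production system would use a time-scale-preserving pitch shifter (e.g. phase vocoder) rather than plain resampling, which changes duration along with pitch.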
  • Patent number: 9933851
    Abstract: There is provided a system having a feedback device including a sensory feedback element, a non-transitory memory storing an executable code and a virtual object having a virtual surface, and a hardware processor. The hardware processor is configured to execute the executable code to determine a position of a hand of a first user, determine a location of the virtual surface of the virtual object, and transmit a first activation signal to the feedback device to cause a sensory feedback to be provided to the first user using the sensory feedback element of the feedback device based on the position of the hand of the first user relative to the location of the virtual surface of the virtual object.
    Type: Grant
    Filed: February 22, 2016
    Date of Patent: April 3, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Eric C. Haseltine, Blade A. Olson
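The system above triggers sensory feedback based on the hand's position relative to a virtual surface. A minimal sketch of that proximity check, assuming a spherical virtual surface and a linear intensity ramp (both assumptions are mine; the patent does not specify the surface shape or the feedback curve):

```python
import math

def distance_to_surface(hand_pos, surface_center, surface_radius):
    """Signed distance from the hand to a spherical virtual surface:
    positive outside, negative once the hand penetrates it."""
    return math.dist(hand_pos, surface_center) - surface_radius

def feedback_intensity(hand_pos, surface_center, surface_radius, threshold=0.1):
    """Ramp haptic intensity from 0 (farther than `threshold`) up to 1
    (at or inside the surface) as the hand approaches."""
    sd = distance_to_surface(hand_pos, surface_center, surface_radius)
    if sd >= threshold:
        return 0.0
    return min(1.0, (threshold - sd) / threshold)
```

The intensity value would drive the feedback device's activation signal, scaling the sensation as the hand nears the virtual surface.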
  • Publication number: 20180027221
    Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus.
    Type: Application
    Filed: September 12, 2017
    Publication date: January 25, 2018
    Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
  • Publication number: 20170361213
    Abstract: Systems, methods and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect a first interactive game is playing within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment and user speech is monitored using one or more microphone sensor devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction, to at least one of the one or more electronic devices within the physical environment, instructing the electronic device to perform the determined gameplay action.
    Type: Application
    Filed: June 16, 2016
    Publication date: December 21, 2017
    Inventors: Michael P. Goslin, Blade A. Olson
  • Patent number: 9800851
    Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus to the landmark, and implementing a light generated by the projection apparatus to project the modified version of the image on a surface of the object.
    Type: Grant
    Filed: January 26, 2016
    Date of Patent: October 24, 2017
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
  • Publication number: 20170242483
    Abstract: There is provided a system having a feedback device including a sensory feedback element, a non-transitory memory storing an executable code and a virtual object having a virtual surface, and a hardware processor. The hardware processor is configured to execute the executable code to determine a position of a hand of a first user, determine a location of the virtual surface of the virtual object, and transmit a first activation signal to the feedback device to cause a sensory feedback to be provided to the first user using the sensory feedback element of the feedback device based on the position of the hand of the first user relative to the location of the virtual surface of the virtual object.
    Type: Application
    Filed: February 22, 2016
    Publication date: August 24, 2017
    Inventors: Michael P. Goslin, Eric C. Haseltine, Blade A. Olson
  • Publication number: 20170214896
    Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus to the landmark, and implementing a light generated by the projection apparatus to project the modified version of the image on a surface of the object.
    Type: Application
    Filed: January 26, 2016
    Publication date: July 27, 2017
    Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
  • Publication number: 20170203221
    Abstract: Systems, methods and articles of manufacture to perform an operation comprising receiving speech data via a network, modifying the speech data based on a profile associated with a toy device, wherein the modified speech data represents a voice of the toy device, and outputting the modified speech data via a speaker.
    Type: Application
    Filed: January 15, 2016
    Publication date: July 20, 2017
    Inventors: Michael P. Goslin, Blade A. Olson