Patents by Inventor Blade Olson

Blade Olson is a named inventor on the patent filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250120787
    Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information about a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the system prior to, during, or after surgery. The spatial context information may be rendered as spatially-fixed, 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position and orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information on its components, and may use that information to display context-sensitive user interface content such as tips, suggestions, and visual or audio cues for maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
    Type: Application
    Filed: December 23, 2024
    Publication date: April 17, 2025
    Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
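The entry above describes two mechanisms: anchoring virtual robot models in the headset's own coordinate frame, and choosing guidance cues from real-time system state. The sketch below illustrates both in a minimal form; it is not the patent's implementation, and every name in it (ARGuide, headset_from_robot, the state keys) is hypothetical.

```python
# Illustrative sketch (not from the patent): how an AR client might pose
# spatially-fixed virtual models of a surgical robot in the headset frame
# and pick a context-sensitive hint from live state. Names are hypothetical.
import numpy as np

class ARGuide:
    def __init__(self, headset_from_robot: np.ndarray):
        # 4x4 homogeneous transform registering the robot base
        # into the AR headset's coordinate frame.
        self.headset_from_robot = headset_from_robot

    def pose_virtual_model(self, robot_from_link: np.ndarray) -> np.ndarray:
        # Chain transforms so the virtual model overlays the real link.
        return self.headset_from_robot @ robot_from_link

    def ui_hint(self, state: dict) -> str:
        # Context-sensitive guidance derived from real-time system state.
        if state.get("fault"):
            return f"Troubleshoot: {state['fault']}"
        if state.get("arm_docked") is False:
            return "Move the arm toward the highlighted docking pose."
        return "System ready."

if __name__ == "__main__":
    reg = np.eye(4)
    reg[:3, 3] = [0.5, 0.0, 1.2]   # assumed registration: base 0.5 m right, 1.2 m up
    guide = ARGuide(reg)
    link = np.eye(4)
    link[:3, 3] = [0.0, 0.3, 0.0]
    print(guide.pose_virtual_model(link)[:3, 3])   # where to draw the model
    print(guide.ui_hint({"arm_docked": False}))
```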
  • Patent number: 12186138
    Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information about a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the system prior to, during, or after surgery. The spatial context information may be rendered as spatially-fixed, 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position and orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information on its components, and may use that information to display context-sensitive user interface content such as tips, suggestions, and visual or audio cues for maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
    Type: Grant
    Filed: September 30, 2020
    Date of Patent: January 7, 2025
    Assignee: Verb Surgical Inc.
    Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
  • Patent number: 11769302
    Abstract: A virtual representation of an operating room (OR) is generated based on robot information and on sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset, operated by a remote user, as three-dimensional point cloud data, and a virtual reality environment is rendered to the headset's display. A virtual representation of the remote user is in turn rendered in augmented reality on a display of the portable electronic device.
    Type: Grant
    Filed: June 5, 2020
    Date of Patent: September 26, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Blade Olson, Bernhard A. Fuerst
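The abstract above turns depth-camera sensing into a shareable 3D representation. A standard way to produce such point cloud data is to back-project each depth pixel through the camera intrinsics; the sketch below shows only that step, with hypothetical intrinsics, and is not drawn from the patent itself.

```python
# Illustrative sketch (not from the patent): back-projecting a depth image
# into a 3D point cloud that could be streamed to a remote VR viewer.
# The intrinsics (fx, fy, cx, cy) are assumed values for illustration.
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project an (H, W) depth image in meters to an (N, 3) point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx                            # pinhole back-projection
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                        # drop pixels with no depth

if __name__ == "__main__":
    depth = np.full((480, 640), 2.0)                 # flat surface 2 m away
    cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
    print(cloud.shape)                               # (307200, 3), ready to stream
```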
  • Patent number: 11756672
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with each recognized object, and c) the robot system data.
    Type: Grant
    Filed: June 5, 2020
    Date of Patent: September 12, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
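The digitization step above stores three kinds of data per frame: raw points for unrecognized geometry, compact poses for recognized objects, and the logged robot data. A minimal sketch of such a record follows; the function and field names are hypothetical, not from the patent.

```python
# Illustrative sketch (not from the patent): a per-frame record that keeps
# recognized objects as compact poses and unrecognized geometry as raw points.
import json
import time

def digitize_frame(detections, residual_points, robot_log):
    """detections: [{'label', 'pose'}]; residual_points: list of [x, y, z]."""
    return {
        "timestamp": time.time(),
        "unrecognized_points": residual_points,   # a) raw 3D point cloud data
        "recognized": [                           # b) pose per recognized object
            {"label": d["label"], "pose": d["pose"]} for d in detections
        ],
        "robot_state": robot_log,                 # c) logged robot system data
    }

if __name__ == "__main__":
    frame = digitize_frame(
        detections=[{"label": "operating_table", "pose": [0, 0, 0, 0, 0, 0]}],
        residual_points=[[0.1, 0.2, 1.5]],
        robot_log={"joint_angles": [0.0] * 7, "tool": "grasper"},
    )
    print(json.dumps(frame, indent=2))            # storable/replayable record
```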
  • Patent number: 11712315
    Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Grant
    Filed: August 16, 2022
    Date of Patent: August 1, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
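The glove entry above couples three steps: detecting a grasp gesture from tracking data, rendering a force that makes the virtual UID feel solid, and engaging the arm only while the UID is held. The sketch below is one hypothetical shape for that control loop, with stub objects standing in for the glove and arm hardware.

```python
# Illustrative sketch (not from the patent): detect a grasp, render haptic
# resistance matching the virtual UID, and engage arm teleoperation only
# while the UID is held. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class VirtualUID:
    grip_force_n: float = 2.5     # resistance rendered by the force feedback

def is_grasp(finger_flexion, threshold=0.8):
    # Gesture check from tracked finger flexion (0 = open, 1 = closed).
    return all(f >= threshold for f in finger_flexion)

def control_step(flexion, hand_pose, uid, arm, glove):
    if is_grasp(flexion):
        glove.apply_force(uid.grip_force_n)   # render the UID's "body"
        arm.follow(hand_pose)                 # engage teleoperation
    else:
        glove.apply_force(0.0)                # released: disengage the arm

class _Stub:                                   # minimal stand-in for hardware
    def apply_force(self, n): print(f"glove force: {n} N")
    def follow(self, pose): print(f"arm tracking pose: {pose}")

if __name__ == "__main__":
    control_step([0.9, 0.95, 0.85], (0.1, 0.2, 0.3),
                 VirtualUID(), arm=_Stub(), glove=_Stub())
```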
  • Patent number: 11663783
    Abstract: This disclosure relates to systems and methods for using augmented reality with the Internet of Things. An augmented reality experience may be provided based on the operation of an object: operation status information of a detected object may be obtained, and a visual effect may be determined based on that status information. An object may also be controlled using augmented reality: a control option may be determined from the operation status information, a visual effect may be determined based on the control option, and a user input regarding the control option may be obtained. Control information configured to effectuate a change in the operation of the object may then be transmitted to the object.
    Type: Grant
    Filed: February 10, 2017
    Date of Patent: May 30, 2023
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Eric Haseltine, Joseph Olson, Timothy Panec, Katherine M. Bassett, Blade Olson
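The abstract above describes a pipeline: read an object's operation status, derive control options, render them as visual effects, and transmit control information back to the object. A minimal sketch of that flow follows; the status fields, option names, and message format are all assumptions for illustration.

```python
# Illustrative sketch (not from the patent): map a detected object's
# operation status to AR control options and build a control message.
def control_options(status):
    # Derive what the user may do from the object's reported state.
    opts = ["power_off"] if status.get("powered") else ["power_on"]
    if status.get("powered") and "brightness" in status:
        opts.append("set_brightness")
    return opts

def visual_effect(option):
    # A renderable overlay describing the control option to the user.
    return {"icon": option, "highlight": True}

def control_message(object_id, option, value=None):
    # Control information meant to effectuate a change in the object.
    return {"target": object_id, "command": option, "value": value}

if __name__ == "__main__":
    status = {"powered": True, "brightness": 40}   # assumed status payload
    opts = control_options(status)
    print(opts, visual_effect(opts[-1]))
    print(control_message("lamp-7", "set_brightness", 80))
```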
  • Publication number: 20220387122
    Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Application
    Filed: August 16, 2022
    Publication date: December 8, 2022
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
  • Patent number: 11457986
    Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Grant
    Filed: September 30, 2020
    Date of Patent: October 4, 2022
    Assignee: Verb Surgical Inc.
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
  • Publication number: 20220096187
    Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
  • Publication number: 20220096197
    Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information about a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the system prior to, during, or after surgery. The spatial context information may be rendered as spatially-fixed, 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position and orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information on its components, and may use that information to display context-sensitive user interface content such as tips, suggestions, and visual or audio cues for maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
  • Publication number: 20210378768
    Abstract: A virtual representation of an operating room (OR) is generated based on robot information and on sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset, operated by a remote user, as three-dimensional point cloud data, and a virtual reality environment is rendered to the headset's display. A virtual representation of the remote user is in turn rendered in augmented reality on a display of the portable electronic device.
    Type: Application
    Filed: June 5, 2020
    Publication date: December 9, 2021
    Inventors: Blade Olson, Bernhard A. Fuerst
  • Publication number: 20210378754
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with each recognized object, and c) the robot system data.
    Type: Application
    Filed: June 5, 2020
    Publication date: December 9, 2021
    Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
  • Patent number: 10555153
    Abstract: This disclosure relates to systems and methods for simulating an internet of things capability in an object. Storage media may store reactions executable by a wireless communication device. The wireless communication device may include one or more sensors and one or more feedback capabilities. Individual reactions may be characterized by reaction criteria detectable by the one or more sensors and reaction effects executable through the one or more feedback capabilities. One or more processors may be configured by machine readable instructions to receive input indicating an object association between the wireless communication device and the object, and, responsive to reception of the input, activate a set of reactions for the wireless communication device.
    Type: Grant
    Filed: March 1, 2017
    Date of Patent: February 4, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Blade Olson, Timothy Panec, Katherine M. Bassett, Thomas McWilliams
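The reaction model above pairs sensor-detectable criteria with feedback effects, and activates a reaction set when the device is associated with an object. The sketch below shows one hypothetical encoding of that table; the object names, sensor fields, and effect names are all invented for illustration.

```python
# Illustrative sketch (not from the patent): reactions pairing a sensor
# criterion with a feedback effect; associating the device with an object
# activates that object's reaction set.
REACTIONS = {
    "toy_sword": [
        {"criterion": lambda s: s["accel_g"] > 2.0,   # swing detected
         "effect": "play_clash_sound"},
    ],
    "coffee_mug": [
        {"criterion": lambda s: s["tilt_deg"] > 60,   # mug tipped over
         "effect": "flash_led_red"},
    ],
}

class SimulatedSmartObject:
    def __init__(self):
        self.active = []

    def associate(self, object_name):
        # Input indicating an object association activates a reaction set.
        self.active = REACTIONS.get(object_name, [])

    def on_sensor_reading(self, reading):
        # Return the feedback effects whose criteria the reading satisfies.
        return [r["effect"] for r in self.active if r["criterion"](reading)]

if __name__ == "__main__":
    dev = SimulatedSmartObject()
    dev.associate("toy_sword")
    print(dev.on_sensor_reading({"accel_g": 3.1, "tilt_deg": 10}))
```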
  • Patent number: 10039975
    Abstract: Disclosed are techniques for generating an imaginary enemy within an immersive play environment. A tracker device in the play environment determines the location of a user. A controller device generates an invisible interactive object (an “imaginary enemy”) to engage the user during gameplay, and produces tracking information indicating the enemy's location. The controller device determines the location of the enemy relative to the location of the user and transmits that information to the tracker device, which can communicate the imaginary enemy's location to the user for engagement.
    Type: Grant
    Filed: January 13, 2015
    Date of Patent: August 7, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Joseph Olson, Blade Olson, Michael P. Goslin, Charles Moneypenny, Ivone Alexandre
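The play-environment entry above splits the work between a controller that places and tracks the invisible enemy and a tracker that relays its position to the player. A minimal sketch of the underlying geometry follows; the cue format and function names are hypothetical, not from the patent.

```python
# Illustrative sketch (not from the patent): place an invisible "enemy" at
# a random range and bearing from the player, then compute the relative cue
# a tracker device could relay (e.g., by sound or vibration).
import math
import random

def spawn_enemy(player_xy, min_r=1.0, max_r=5.0):
    # Random placement within an annulus around the player.
    r = random.uniform(min_r, max_r)
    theta = random.uniform(0, 2 * math.pi)
    return (player_xy[0] + r * math.cos(theta),
            player_xy[1] + r * math.sin(theta))

def relative_cue(player_xy, enemy_xy):
    # Distance and bearing the tracker can communicate to the user.
    dx = enemy_xy[0] - player_xy[0]
    dy = enemy_xy[1] - player_xy[1]
    return {"distance_m": math.hypot(dx, dy),
            "bearing_deg": math.degrees(math.atan2(dy, dx)) % 360}

if __name__ == "__main__":
    player = (0.0, 0.0)
    enemy = spawn_enemy(player)
    print(relative_cue(player, enemy))   # what the tracker would relay
```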
  • Publication number: 20170257270
    Abstract: This disclosure relates to systems and methods for simulating an internet of things capability in an object. Storage media may store reactions executable by a wireless communication device. The wireless communication device may include one or more sensors and one or more feedback capabilities. Individual reactions may be characterized by reaction criteria detectable by the one or more sensors and reaction effects executable through the one or more feedback capabilities. One or more processors may be configured by machine readable instructions to receive input indicating an object association between the wireless communication device and the object, and, responsive to reception of the input, activate a set of reactions for the wireless communication device.
    Type: Application
    Filed: March 1, 2017
    Publication date: September 7, 2017
    Inventors: Michael P. Goslin, Blade Olson, Timothy Panec, Katherine M. Bassett, Thomas McWilliams
  • Publication number: 20170228936
    Abstract: This disclosure relates to systems and methods for using augmented reality with the Internet of Things. An augmented reality experience may be provided based on the operation of an object: operation status information of a detected object may be obtained, and a visual effect may be determined based on that status information. An object may also be controlled using augmented reality: a control option may be determined from the operation status information, a visual effect may be determined based on the control option, and a user input regarding the control option may be obtained. Control information configured to effectuate a change in the operation of the object may then be transmitted to the object.
    Type: Application
    Filed: February 10, 2017
    Publication date: August 10, 2017
    Inventors: Michael P. Goslin, Eric Haseltine, Joseph Olson, Timothy Panec, Katherine M. Bassett, Blade Olson
  • Publication number: 20160199730
    Abstract: Disclosed are techniques for generating an imaginary enemy within an immersive play environment. A tracker device in the play environment determines the location of a user. A controller device generates an invisible interactive object (an “imaginary enemy”) to engage the user during gameplay, and produces tracking information indicating the enemy's location. The controller device determines the location of the enemy relative to the location of the user and transmits that information to the tracker device, which can communicate the imaginary enemy's location to the user for engagement.
    Type: Application
    Filed: January 13, 2015
    Publication date: July 14, 2016
    Inventors: Joseph Olson, Blade Olson, Michael P. Goslin, Charles Moneypenny, Ivone Alexandre