Patents by Inventor Blade A. OLSON

Blade A. OLSON has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240046589
    Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated, as three-dimensional point cloud data, to a virtual reality headset operated by a remote user. A virtual reality environment is rendered to a display of the virtual reality headset, and a virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
    Type: Application
    Filed: August 24, 2023
    Publication date: February 8, 2024
    Inventors: Blade A. Olson, Bernhard A. Fuerst
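    The entry above describes communicating an OR representation to a remote headset as point cloud data. Below is a minimal Python sketch of one way such a frame might be packaged with robot information for streaming; the class, fields, and wire format are illustrative assumptions, not the filing's implementation.

      import json
      import struct
      from dataclasses import dataclass

      @dataclass
      class ORSnapshot:
          """One frame of the virtual operating-room representation."""
          points: list              # flat [x, y, z, ...] floats from a depth camera
          robot_joint_angles: list  # logged robot information (assumed format)
          camera_id: str            # e.g. the tablet-integrated depth camera

          def serialize(self) -> bytes:
              """Length-prefixed JSON header plus packed floats, for a TCP stream."""
              header = json.dumps({
                  "camera_id": self.camera_id,
                  "robot_joint_angles": self.robot_joint_angles,
                  "num_floats": len(self.points),
              }).encode()
              body = struct.pack(f"<{len(self.points)}f", *self.points)
              return struct.pack("<I", len(header)) + header + body

      # Usage: the local user's tablet contributes one snapshot per frame,
      # which a server would forward to the remote VR headset.
      snap = ORSnapshot(points=[0.1, 0.2, 0.3],
                        robot_joint_angles=[0.0, 1.57],
                        camera_id="tablet_cam")
      wire_bytes = snap.serialize()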
  • Publication number: 20240038367
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with each recognized object, and c) the robot system data.
    Type: Application
    Filed: August 14, 2023
    Publication date: February 1, 2024
    Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
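    The digitization described in the entry above stores raw points only for unrecognized geometry, compact poses for recognized objects, and the robot log. Below is a minimal Python sketch of that per-frame bookkeeping, with an assumed detector output format; it is an illustration, not the filing's implementation.

      from dataclasses import dataclass

      @dataclass
      class Detection:
          label: str           # e.g. "surgical_table", "person"
          pose: tuple          # (x, y, z, qx, qy, qz, qw) in OR coordinates
          point_indices: list  # cloud points belonging to this object

      def digitize_frame(cloud_points, detections, robot_log_entry):
          """Compress one sensed frame into the stored exercise record."""
          claimed = set()
          for det in detections:
              claimed.update(det.point_indices)
          return {
              # a) raw points kept only for geometry no detector explained
              "unrecognized_points": [p for i, p in enumerate(cloud_points)
                                      if i not in claimed],
              # b) label + pose stands in for each recognized object's geometry
              "recognized_objects": [(d.label, d.pose) for d in detections],
              # c) synchronized robot system data
              "robot_system_data": robot_log_entry,
          }

      frame = digitize_frame(
          cloud_points=[(0.0, 0.0, 1.0), (0.5, 0.2, 1.1)],
          detections=[Detection("surgical_table", (0, 0, 0, 0, 0, 0, 1), [0])],
          robot_log_entry={"tool": "grasper", "jaw_angle": 0.4},
      )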
  • Patent number: 11769302
    Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated, as three-dimensional point cloud data, to a virtual reality headset operated by a remote user. A virtual reality environment is rendered to a display of the virtual reality headset, and a virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
    Type: Grant
    Filed: June 5, 2020
    Date of Patent: September 26, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Blade Olson, Bernhard A. Fuerst
  • Patent number: 11756672
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with each recognized object, and c) the robot system data.
    Type: Grant
    Filed: June 5, 2020
    Date of Patent: September 12, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
  • Patent number: 11712315
    Abstract: A surgical robotic system includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Grant
    Filed: August 16, 2022
    Date of Patent: August 1, 2023
    Assignee: Verb Surgical Inc.
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
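    The grant above couples gesture detection, force feedback, and arm engagement. Below is a minimal Python sketch of one tick of such a supervisory loop; the thresholds and state dictionaries are assumptions for illustration only.

      GRASP_CLOSURE_THRESHOLD = 0.8  # assumed fraction of full finger closure
      UID_RADIUS_M = 0.05            # assumed reach distance to the virtual UID

      def distance(a, b):
          return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

      def update(glove, arm, uid_position):
          """One tick of an assumed glove/arm supervisory loop."""
          grasping = (glove["finger_closure"] >= GRASP_CLOSURE_THRESHOLD
                      and distance(glove["position"], uid_position) <= UID_RADIUS_M)
          if grasping:
              glove["feedback_force_n"] = 2.0  # force feedback: the UID feels solid
              arm["controlled_by_uid"] = True  # arm now follows the virtual UID
          else:
              glove["feedback_force_n"] = 0.0
              arm["controlled_by_uid"] = False

      glove_state = {"position": (0.0, 0.0, 0.0), "finger_closure": 0.9,
                     "feedback_force_n": 0.0}
      arm_state = {"controlled_by_uid": False}
      update(glove_state, arm_state, uid_position=(0.0, 0.0, 0.02))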
  • Patent number: 11663783
    Abstract: This disclosure relates to systems and methods for using augmented reality with the internet of things. An augmented reality experience may be provided based on an operation of an object. Operation status information of a detected object may be obtained and a visual effect may be determined based on the operation status information. An object may be controlled using augmented reality. Operation status information of a detected object may be obtained and a control option may be determined based on the operation status information. A visual effect may be determined based on the control option and a user input regarding the control option may be obtained. Control information configured to effectuate a change in the operation of the object may be transmitted to the object.
    Type: Grant
    Filed: February 10, 2017
    Date of Patent: May 30, 2023
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Eric Haseltine, Joseph Olson, Timothy Panec, Katherine M. Bassett, Blade Olson
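    The patent above pairs an object's operation status with AR visual effects and control options. Below is a minimal Python sketch of that status-to-effect-to-command flow; the status schema, overlay names, and command format are assumptions, not the patent's protocol.

      def visual_effect_for(status):
          """Map operation status to an AR overlay (illustrative rules)."""
          if status.get("error"):
              return {"overlay": "red_halo", "label": status["error"]}
          if status.get("powered_on"):
              return {"overlay": "green_glow", "label": "running"}
          return {"overlay": "dim", "label": "off"}

      def control_options_for(status):
          """Offer only the controls that make sense for the current state."""
          return ["turn_off"] if status.get("powered_on") else ["turn_on"]

      def handle_user_input(choice, send):
          """Transmit control information to effectuate the change on the object."""
          send({"command": choice})

      status = {"powered_on": True, "error": None}
      print(visual_effect_for(status), control_options_for(status))
      handle_user_input("turn_off", send=lambda msg: print("to device:", msg))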
  • Publication number: 20220387122
    Abstract: A surgical robotic system includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Application
    Filed: August 16, 2022
    Publication date: December 8, 2022
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
  • Patent number: 11457986
    Abstract: A surgical robotic system includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Grant
    Filed: September 30, 2020
    Date of Patent: October 4, 2022
    Assignee: VERB SURGICAL INC.
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
  • Publication number: 20220096197
    Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information of a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the surgical robotic system prior to, during, or after surgery. The spatial context information may be rendered to display spatially-fixed 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position or orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information of the components of the surgical robotic system. The AR headset may use the real-time state information to display context-sensitive user interface information such as tips, suggestions, and visual or audio cues for maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
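    The application above anchors virtual robot models at their real poses and derives context-sensitive guidance from live state. Below is a minimal Python sketch under a simplifying translation-only registration assumption; the state fields and guidance rules are illustrative, not the filing's design.

      def place_virtual_models(robot_state, headset_to_or_offset):
          """Render poses in the headset frame for each tracked component."""
          tx, ty, tz = headset_to_or_offset
          return {name: (p[0] + tx, p[1] + ty, p[2] + tz)  # translation only, for brevity
                  for name, p in robot_state["component_poses"].items()}

      def contextual_tip(robot_state):
          """Pick a guidance cue from real-time state (illustrative rules)."""
          for name, fault in robot_state.get("faults", {}).items():
              return f"Check {name}: {fault}"
          if not robot_state.get("arms_docked", True):
              return "Move arm 1 toward the table rail to dock"
          return "System ready"

      state = {"component_poses": {"arm1": (0.4, 0.1, 1.0), "table": (0.0, 0.0, 0.8)},
               "faults": {},
               "arms_docked": False}
      print(place_virtual_models(state, headset_to_or_offset=(0.0, 0.0, -0.5)))
      print(contextual_tip(state))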
  • Publication number: 20220096187
    Abstract: A surgical robotic system includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Yiming Xu, Berk Gonenc, Blade Olson
  • Publication number: 20210378754
    Abstract: A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with each recognized object, and c) the robot system data.
    Type: Application
    Filed: June 5, 2020
    Publication date: December 9, 2021
    Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
  • Publication number: 20210378768
    Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated, as three-dimensional point cloud data, to a virtual reality headset operated by a remote user. A virtual reality environment is rendered to a display of the virtual reality headset, and a virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
    Type: Application
    Filed: June 5, 2020
    Publication date: December 9, 2021
    Inventors: Blade Olson, Bernhard A. Fuerst
  • Patent number: 10555153
    Abstract: This disclosure relates to systems and methods for simulating an internet of things capability in an object. Storage media may store reactions executable by a wireless communication device. The wireless communication device may include one or more sensors and one or more feedback capabilities. Individual reactions may be characterized by reaction criteria detectable by the one or more sensors and reaction effects executable through the one or more feedback capabilities. One or more processors may be configured by machine readable instructions to receive input indicating an object association between the wireless communication device and the object, and, responsive to reception of the input, activate a set of reactions for the wireless communication device.
    Type: Grant
    Filed: March 1, 2017
    Date of Patent: February 4, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Blade Olson, Timothy Panec, Katherine M. Bassett, Thomas McWilliams
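    The patent above pairs sensor-detectable criteria with feedback effects that activate when the device is associated with an object. Below is a minimal Python sketch of that reaction model; the object name, criteria, and effect are illustrative assumptions.

      from dataclasses import dataclass
      from typing import Callable

      @dataclass
      class Reaction:
          criteria: Callable[[dict], bool]  # detectable by the device's sensors
          effect: Callable[[], None]        # executable via its feedback hardware

      REACTION_SETS = {
          # associating the device with a "teapot" activates these reactions
          "teapot": [Reaction(criteria=lambda s: s["shake"] > 0.5,
                              effect=lambda: print("play: sloshing sound"))],
      }

      def on_object_association(object_name):
          """Activate the reaction set for the newly associated object."""
          return REACTION_SETS.get(object_name, [])

      def sensor_tick(active_reactions, sensor_sample):
          for r in active_reactions:
              if r.criteria(sensor_sample):
                  r.effect()

      active = on_object_association("teapot")
      sensor_tick(active, {"shake": 0.9})  # -> play: sloshing sound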
  • Patent number: 10257482
    Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus to the landmark, and implementing a light generated by the projection apparatus to project the modified version of the image on a surface of the object.
    Type: Grant
    Filed: September 12, 2017
    Date of Patent: April 9, 2019
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
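    The patent above warps an image to compensate for the projection apparatus's orientation relative to a detected landmark. Below is a minimal Python sketch using a homography via OpenCV, an assumed library choice since the patent names no implementation; the corner coordinates are illustrative.

      import numpy as np
      import cv2

      def compensate_and_project(image, landmark_corners_px, target_corners_px):
          """Warp `image` so it lands undistorted on the object's surface.

          landmark_corners_px: where the landmark appears from the projector's view.
          target_corners_px:   where those corners should fall in projector output.
          """
          homography, _ = cv2.findHomography(
              np.asarray(landmark_corners_px, dtype=np.float32),
              np.asarray(target_corners_px, dtype=np.float32))
          h, w = image.shape[:2]
          return cv2.warpPerspective(image, homography, (w, h))

      img = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for the image to project
      warped = compensate_and_project(
          img,
          landmark_corners_px=[(100, 100), (540, 120), (520, 400), (90, 380)],
          target_corners_px=[(0, 0), (640, 0), (640, 480), (0, 480)])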
  • Patent number: 10092827
    Abstract: Systems, methods and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect that a first interactive game is being played within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment and user speech is monitored using one or more microphone sensor devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction, to at least one of the one or more electronic devices within the physical environment, instructing the electronic device to perform the determined gameplay action.
    Type: Grant
    Filed: June 16, 2016
    Date of Patent: October 9, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Blade A. Olson
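    The patent above fires a gameplay action only when a movement sequence and a speech pattern both match. Below is a minimal Python sketch of that dual matching; the action names, speech pattern, and device command are illustrative assumptions.

      SPELLS = [
          {
              "actions": ["raise_arm", "circle_wrist"],  # predefined series of user actions
              "speech": "illuminate",                    # corresponding speech pattern
              "gameplay_action": {"device": "lamp_01", "command": "glow"},
          },
      ]

      def match_spell(observed_actions, observed_speech):
          """Return a gameplay action only if movement and speech both match."""
          for spell in SPELLS:
              n = len(spell["actions"])
              seq_ok = any(observed_actions[i:i + n] == spell["actions"]
                           for i in range(len(observed_actions) - n + 1))
              if seq_ok and spell["speech"] in observed_speech.lower():
                  return spell["gameplay_action"]
          return None

      action = match_spell(["wave", "raise_arm", "circle_wrist"], "Illuminate!")
      if action:
          print("instructing", action["device"], "->", action["command"])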
  • Patent number: 10065124
    Abstract: Systems, methods and articles of manufacture to perform an operation comprising receiving speech data via a network, modifying the speech data based on a profile associated with a toy device, wherein the modified speech data represents a voice of the toy device, and outputting the modified speech data via a speaker.
    Type: Grant
    Filed: January 15, 2016
    Date of Patent: September 4, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Blade A. Olson
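    The patent above modifies received speech according to a per-toy profile before playback. Below is a minimal Python sketch that uses naive resampling as a stand-in for a real pitch shift; the profile name and factor are assumptions.

      import numpy as np

      TOY_PROFILES = {"robot_bear": {"pitch_factor": 1.3}}  # assumed profile values

      def modify_speech(samples, toy_id):
          """Crude pitch shift by resampling, representing the toy's voice."""
          factor = TOY_PROFILES[toy_id]["pitch_factor"]
          idx = np.arange(0, len(samples) - 1, factor)
          return np.interp(idx, np.arange(len(samples)), samples)

      received = np.sin(np.linspace(0, 200 * np.pi, 16000))  # stand-in for speech data
      to_speaker = modify_speech(received, "robot_bear")     # output via the speaker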
  • Patent number: 10039975
    Abstract: Techniques for generating an imaginary enemy within an immersive play environment are disclosed. A tracker device is provided in the play environment for determining a location of a user. A controller device is provided to generate an invisible interactive object (an “imaginary enemy”) to engage the user during gameplay. The controller device generates tracking information indicating a location of the imaginary enemy. The controller device determines the location of the enemy relative to the location of the user and transmits that information to the tracker device. The tracker device can communicate the location of the imaginary enemy to the user for engagement.
    Type: Grant
    Filed: January 13, 2015
    Date of Patent: August 7, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Joseph Olson, Blade Olson, Michael P. Goslin, Charles Moneypenny, Ivone Alexandre
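    The patent above has a controller compute an invisible enemy's location relative to the tracked user and relay it to the tracker device. Below is a minimal Python sketch of that relative-position cue; the coordinates and cue format are illustrative assumptions.

      import math

      def enemy_relative_to_user(enemy_xy, user_xy, user_heading_rad):
          """Return (distance, bearing) of the enemy from the user's viewpoint."""
          dx, dy = enemy_xy[0] - user_xy[0], enemy_xy[1] - user_xy[1]
          dist = math.hypot(dx, dy)
          bearing = math.atan2(dy, dx) - user_heading_rad
          return dist, bearing

      def tracker_cue(dist, bearing):
          """Translate the relative location into a cue the tracker can play."""
          side = "left" if math.sin(bearing) > 0 else "right"
          return f"enemy {dist:.1f} m to your {side}"

      d, b = enemy_relative_to_user(enemy_xy=(2.0, 1.0), user_xy=(0.0, 0.0),
                                    user_heading_rad=0.0)
      print(tracker_cue(d, b))  # transmitted to the tracker device worn by the user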
  • Patent number: 9933851
    Abstract: There is provided a system having a feedback device including a sensory feedback element, a non-transitory memory storing an executable code and a virtual object having a virtual surface, and a hardware processor. The hardware processor is configured to execute the executable code to determine a position of a hand of a first user, determine a location of the virtual surface of the virtual object, and transmit a first activation signal to the feedback device to cause a sensory feedback to be provided to the first user using the sensory feedback element of the feedback device based on the position of the hand of the first user relative to the location of the virtual surface of the virtual object.
    Type: Grant
    Filed: February 22, 2016
    Date of Patent: April 3, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael P. Goslin, Eric C. Haseltine, Blade A. Olson
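    The patent above activates sensory feedback based on the hand's position relative to a virtual surface. Below is a minimal Python sketch of that activation rule; the contact threshold and signal payload are assumptions.

      CONTACT_THRESHOLD_M = 0.01  # assumed: how close counts as touching

      def maybe_activate_feedback(hand_pos, surface_point, send_signal):
          """Compare hand position to the virtual surface and signal the device."""
          gap = sum((h - s) ** 2 for h, s in zip(hand_pos, surface_point)) ** 0.5
          if gap <= CONTACT_THRESHOLD_M:
              # first activation signal: drive the sensory feedback element
              send_signal({"element": "vibration", "intensity": 1.0})

      maybe_activate_feedback(
          hand_pos=(0.30, 0.10, 0.50),
          surface_point=(0.30, 0.10, 0.505),
          send_signal=lambda sig: print("to feedback device:", sig),
      )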
  • Publication number: 20180027221
    Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for an orientation of the projection apparatus to the landmark, and implementing a light generated by the projection apparatus to project the modified version of the image on a surface of the object.
    Type: Application
    Filed: September 12, 2017
    Publication date: January 25, 2018
    Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
  • Publication number: 20170361213
    Abstract: Systems, methods and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect that a first interactive game is being played within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment and user speech is monitored using one or more microphone sensor devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction, to at least one of the one or more electronic devices within the physical environment, instructing the electronic device to perform the determined gameplay action.
    Type: Application
    Filed: June 16, 2016
    Publication date: December 21, 2017
    Inventors: Michael P. Goslin, Blade A. Olson