Patents by Inventor Blade Olson
Blade Olson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250120787
Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information of a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the surgical robotic system prior to, during, or after surgery. The spatial context information may be rendered to display spatially-fixed 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position or orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information of the components of the surgical robotic system. The AR headset may use the real-time state information to display context-sensitive user interface information such as tips, suggestions, and visual or audio cues on maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
Type: Application
Filed: December 23, 2024
Publication date: April 17, 2025
Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
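The core of the spatial-context idea above is expressing the robot's pose in the headset's coordinate frame so virtual models render over the real hardware. A minimal sketch of that frame chaining with homogeneous transforms, assuming planar (yaw-only) rotation for brevity; all function names here are illustrative and do not come from the patent:

```python
import math

def pose_to_matrix(x, y, z, yaw):
    """4x4 homogeneous transform: translation (x, y, z) plus rotation by `yaw` about Z."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0, x],
            [s,  c, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Apply a homogeneous transform to a 3D point, returning a 3-tuple."""
    px, py, pz = p
    return tuple(T[i][0] * px + T[i][1] * py + T[i][2] * pz + T[i][3] for i in range(3))

def robot_point_in_headset_frame(T_headset_world, T_world_robot, p_robot):
    """Chain robot->world and world->headset so a virtual arm model can be
    rendered spatially fixed over the physical robot in the headset's frame."""
    return apply(T_headset_world, apply(T_world_robot, p_robot))
```

With the two transforms updated each frame from tracking and the robot's reported state, any point on the robot model can be mapped into render coordinates this way.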
-
Patent number: 12186138
Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information of a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the surgical robotic system prior to, during, or after surgery. The spatial context information may be rendered to display spatially-fixed 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position or orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information of the components of the surgical robotic system. The AR headset may use the real-time state information to display context-sensitive user interface information such as tips, suggestions, and visual or audio cues on maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
Type: Grant
Filed: September 30, 2020
Date of Patent: January 7, 2025
Assignee: Verb Surgical Inc.
Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
-
Patent number: 11769302
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset as three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Grant
Filed: June 5, 2020
Date of Patent: September 26, 2023
Assignee: Verb Surgical Inc.
Inventors: Blade Olson, Bernhard A. Fuerst
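Streaming a raw depth-camera point cloud of an OR to a remote headset is bandwidth-heavy, so a common preprocessing step for this kind of pipeline is voxel-grid downsampling. A hypothetical sketch (the patent does not specify this step; the function and parameter names are illustrative):

```python
def voxel_downsample(points, voxel=0.05):
    """Thin a 3D point cloud before streaming it to a remote VR headset:
    bucket points into a cubic voxel grid and keep one centroid per occupied voxel."""
    buckets = {}
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        buckets.setdefault(key, []).append((x, y, z))
    # Centroid of each voxel's points becomes the representative point.
    return [tuple(sum(c) / len(c) for c in zip(*pts)) for pts in buckets.values()]
```

The voxel size trades visual fidelity in the remote VR scene against network bandwidth.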
-
Patent number: 11756672
Abstract: A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with the recognized objects, and c) the robot system data.
Type: Grant
Filed: June 5, 2020
Date of Patent: September 12, 2023
Assignee: Verb Surgical Inc.
Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
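The storage split described above — compact (label, pose) records for recognized objects, raw points only for unrecognized regions, plus the robot log — can be sketched as a per-frame digitization step. This is an illustrative reconstruction; the record layout and field names are assumptions, not taken from the patent:

```python
def digitize_frame(detections, robot_log):
    """Digitize one time step of a surgical exercise.
    Recognized objects (equipment, people) are stored as compact (label, pose) records;
    unrecognized regions keep their raw 3D points; robot system data is logged as-is."""
    record = {"poses": [], "points": [], "robot": robot_log}
    for det in detections:
        if det.get("label"):                 # recognized by the object detector
            record["poses"].append((det["label"], det["pose"]))
        else:                                # unknown: fall back to the raw point cloud
            record["points"].extend(det["points"])
    return record
```

Replacing recognized geometry with poses is what makes the digitized exercise far smaller than the raw point cloud stream while staying replayable.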
-
Patent number: 11712315
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) based on the tracking device, and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Grant
Filed: August 16, 2022
Date of Patent: August 1, 2023
Assignee: Verb Surgical Inc.
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
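The grasp-to-engage flow in this abstract is essentially a small state machine: a tracked grasp gesture triggers both the haptic response and the handover of arm control. A minimal sketch, with the class name, gesture strings, and stiffness value all hypothetical:

```python
class VirtualUID:
    """Sketch of the grasp-to-engage flow: when tracking reports a grasp gesture,
    apply a haptic force shaped like the virtual UID and hand arm control to it."""

    def __init__(self):
        self.engaged = False      # is the robotic arm following the virtual UID?
        self.glove_force = 0.0    # force currently applied by the glove's mechanism

    def update(self, gesture, uid_stiffness=5.0):
        if gesture == "grasp" and not self.engaged:
            self.glove_force = uid_stiffness  # feedback mimics holding a physical UID
            self.engaged = True               # arm now controlled by the virtual UID
        elif gesture == "release":
            self.glove_force = 0.0
            self.engaged = False
        return self.engaged
```

Tying engagement to the same event that produces the haptic force means the user only ever controls the arm while "holding" something.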
-
Patent number: 11663783
Abstract: This disclosure relates to systems and methods for using augmented reality with the Internet of Things. An augmented reality experience may be provided based on an operation of an object. Operation status information of a detected object may be obtained and a visual effect may be determined based on the operation status information. An object may be controlled using augmented reality: operation status information of a detected object may be obtained and a control option may be determined based on the operation status information. A visual effect may be determined based on the control option, and a user input regarding the control option may be obtained. Control information configured to effectuate a change in the operation of the object may be transmitted to the object.
Type: Grant
Filed: February 10, 2017
Date of Patent: May 30, 2023
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Eric Haseltine, Joseph Olson, Timothy Panec, Katherine M. Bassett, Blade Olson
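The status-to-effect-and-controls mapping described above can be sketched as a simple lookup from a device's reported state to an AR overlay plus the control options offered to the user. The status keys, effect names, and control strings below are invented for illustration:

```python
def ar_overlay_for(status):
    """Choose an AR visual effect and the control options to present,
    based on a detected object's operation status information."""
    if status.get("power") == "off":
        return {"effect": "dim_outline", "controls": ["turn_on"]}
    if status.get("fault"):
        return {"effect": "red_highlight", "controls": ["reset", "turn_off"]}
    return {"effect": "green_glow", "controls": ["turn_off", "adjust"]}
```

Selecting one of the returned controls would then be translated into control information sent back to the object to change its operation.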
-
Publication number: 20220387122
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) based on the tracking device, and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Application
Filed: August 16, 2022
Publication date: December 8, 2022
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
-
Patent number: 11457986
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) based on the tracking device, and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Grant
Filed: September 30, 2020
Date of Patent: October 4, 2022
Assignee: Verb Surgical Inc.
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
-
Publication number: 20220096187
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID) based on the tracking device, and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Application
Filed: September 30, 2020
Publication date: March 31, 2022
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
-
Publication number: 20220096197
Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information of a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the surgical robotic system prior to, during, or after surgery. The spatial context information may be rendered to display spatially-fixed 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position or orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information of the components of the surgical robotic system. The AR headset may use the real-time state information to display context-sensitive user interface information such as tips, suggestions, and visual or audio cues on maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
Type: Application
Filed: September 30, 2020
Publication date: March 31, 2022
Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
-
Publication number: 20210378768
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset as three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Application
Filed: June 5, 2020
Publication date: December 9, 2021
Inventors: Blade Olson, Bernhard A. Fuerst
-
Publication number: 20210378754
Abstract: A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with the recognized objects, and c) the robot system data.
Type: Application
Filed: June 5, 2020
Publication date: December 9, 2021
Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
-
Patent number: 10555153
Abstract: This disclosure relates to systems and methods for simulating an Internet of Things capability in an object. Storage media may store reactions executable by a wireless communication device. The wireless communication device may include one or more sensors and one or more feedback capabilities. Individual reactions may be characterized by reaction criteria detectable by the one or more sensors and reaction effects executable through the one or more feedback capabilities. One or more processors may be configured by machine-readable instructions to receive input indicating an object association between the wireless communication device and the object, and, responsive to reception of the input, activate a set of reactions for the wireless communication device.
Type: Grant
Filed: March 1, 2017
Date of Patent: February 4, 2020
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Blade Olson, Timothy Panec, Katherine M. Bassett, Thomas McWilliams
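The reaction model above — criteria detectable by sensors, effects executable through feedback capabilities, activated by an object association — maps naturally onto predicate/effect pairs selected per object. A hypothetical sketch; the object names, sensor keys, and effect strings are all invented:

```python
class Reaction:
    def __init__(self, criterion, effect):
        self.criterion = criterion   # predicate over sensor readings
        self.effect = effect         # feedback effect to execute when it fires

# Each object type gets its own reaction set (illustrative examples).
REACTION_SETS = {
    "door": [Reaction(lambda s: s.get("accel", 0.0) > 1.0, "play_creak")],
    "lamp": [Reaction(lambda s: s.get("tap", False), "blink_led")],
}

class DeviceSim:
    """Simulated IoT capability: associating the device with an object
    activates that object's reaction set; sensor readings then trigger effects."""

    def __init__(self):
        self.active = []

    def associate(self, obj):
        self.active = REACTION_SETS.get(obj, [])

    def sense(self, readings):
        return [r.effect for r in self.active if r.criterion(readings)]
```

Re-associating the same device with a different object swaps the whole behavior set, which is what lets one generic tag "become" many objects.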
-
Patent number: 10039975
Abstract: An immersive play environment providing techniques for generating an imaginary enemy within the environment is disclosed. A tracker device is provided in the play environment for determining a location of a user. A controller device is provided to generate an invisible interactive object (an “imaginary enemy”) to engage the user during gameplay. The controller device generates tracking information indicating a location of the imaginary enemy. The controller device determines the location of the enemy relative to the location of the user and transmits that information to the tracker device. The tracker device can communicate the location of the imaginary enemy to the user for engagement.
Type: Grant
Filed: January 13, 2015
Date of Patent: August 7, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Joseph Olson, Blade Olson, Michael P. Goslin, Charles Moneypenny, Ivone Alexandre
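The relative-location step above — the controller computing where the imaginary enemy is with respect to the user before the tracker relays it — amounts to a distance-and-bearing calculation. A minimal 2D sketch, with the function name and units chosen for illustration:

```python
import math

def enemy_cue(user_pos, enemy_pos):
    """Turn the imaginary enemy's location into a cue the tracker device can
    relay to the player: distance plus a bearing (degrees, counterclockwise
    from the +x axis) relative to the user's position."""
    dx = enemy_pos[0] - user_pos[0]
    dy = enemy_pos[1] - user_pos[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    return distance, bearing
```

The tracker could render this cue as audio panning or haptic pulses that strengthen as the distance shrinks.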
-
Publication number: 20170257270
Abstract: This disclosure relates to systems and methods for simulating an Internet of Things capability in an object. Storage media may store reactions executable by a wireless communication device. The wireless communication device may include one or more sensors and one or more feedback capabilities. Individual reactions may be characterized by reaction criteria detectable by the one or more sensors and reaction effects executable through the one or more feedback capabilities. One or more processors may be configured by machine-readable instructions to receive input indicating an object association between the wireless communication device and the object, and, responsive to reception of the input, activate a set of reactions for the wireless communication device.
Type: Application
Filed: March 1, 2017
Publication date: September 7, 2017
Inventors: Michael P. Goslin, Blade Olson, Timothy Panec, Katherine M. Bassett, Thomas McWilliams
-
Publication number: 20170228936
Abstract: This disclosure relates to systems and methods for using augmented reality with the Internet of Things. An augmented reality experience may be provided based on an operation of an object. Operation status information of a detected object may be obtained and a visual effect may be determined based on the operation status information. An object may be controlled using augmented reality: operation status information of a detected object may be obtained and a control option may be determined based on the operation status information. A visual effect may be determined based on the control option, and a user input regarding the control option may be obtained. Control information configured to effectuate a change in the operation of the object may be transmitted to the object.
Type: Application
Filed: February 10, 2017
Publication date: August 10, 2017
Inventors: Michael P. Goslin, Eric Haseltine, Joseph Olson, Timothy Panec, Katherine M. Bassett, Blade Olson
-
Publication number: 20160199730
Abstract: An immersive play environment providing techniques for generating an imaginary enemy within the environment is disclosed. A tracker device is provided in the play environment for determining a location of a user. A controller device is provided to generate an invisible interactive object (an “imaginary enemy”) to engage the user during gameplay. The controller device generates tracking information indicating a location of the imaginary enemy. The controller device determines the location of the enemy relative to the location of the user and transmits that information to the tracker device. The tracker device can communicate the location of the imaginary enemy to the user for engagement.
Type: Application
Filed: January 13, 2015
Publication date: July 14, 2016
Inventors: Joseph Olson, Blade Olson, Michael P. Goslin, Charles Moneypenny, Ivone Alexandre