Patents by Inventor Blade A. OLSON
Blade A. OLSON has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240046589
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset as three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, which is operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Application
Filed: August 24, 2023
Publication date: February 8, 2024
Inventors: Blade A. Olson, Bernhard A. Fuerst
-
Publication number: 20240038367
Abstract: A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing the 3D point cloud data of unrecognized objects, a position and orientation associated with each recognized object, and the robot system data.
Type: Application
Filed: August 14, 2023
Publication date: February 1, 2024
Inventors: Blade A. Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
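The digitization scheme described in the abstract above can be sketched in a few lines. This is an illustrative Python sketch only; the data types, field names, and one-pose-per-label rule are assumptions, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

# Hypothetical data types for illustration; the patent does not
# specify any particular representation.
@dataclass
class RecognizedObject:
    label: str          # e.g. "surgical_table", "person"
    position: tuple     # (x, y, z) in OR coordinates
    orientation: tuple  # quaternion (w, x, y, z)

@dataclass
class FrameRecord:
    timestamp: float
    recognized: list = field(default_factory=list)       # poses only
    residual_points: list = field(default_factory=list)  # raw 3D points
    robot_state: dict = field(default_factory=dict)      # logged robot data

def digitize_frame(timestamp, points, detections, robot_state):
    """Store poses for recognized objects and raw points for the rest.

    `detections` maps a point index to a RecognizedObject; indices not
    in the map are treated as unrecognized and kept as raw point cloud
    data, which is what makes the stored record compact.
    """
    record = FrameRecord(timestamp=timestamp, robot_state=robot_state)
    seen = set()
    for idx, obj in detections.items():
        if obj.label not in seen:  # keep one pose per recognized object
            record.recognized.append(obj)
            seen.add(obj.label)
    record.residual_points = [p for i, p in enumerate(points)
                              if i not in detections]
    return record
```

The design point the abstract implies is storage efficiency: recognized objects collapse to a pose plus a label, and only unrecognized geometry is kept as dense point cloud data.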
-
Patent number: 11769302
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset as three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, which is operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Grant
Filed: June 5, 2020
Date of Patent: September 26, 2023
Assignee: Verb Surgical Inc.
Inventors: Blade Olson, Bernhard A. Fuerst
-
Patent number: 11756672
Abstract: A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing the 3D point cloud data of unrecognized objects, a position and orientation associated with each recognized object, and the robot system data.
Type: Grant
Filed: June 5, 2020
Date of Patent: September 12, 2023
Assignee: Verb Surgical Inc.
Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
-
Patent number: 11712315
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID), and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Grant
Filed: August 16, 2022
Date of Patent: August 1, 2023
Assignee: Verb Surgical Inc.
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
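The grasp-then-engage sequence described above can be illustrated with a minimal sketch. The distance threshold, the flat fingertip/position representation, and the `apply_force`/`engage` method names are assumptions for illustration, not the product's API.

```python
import math

# Threshold is an assumed value: fingertips within 5 cm of the
# virtual UID count as a grasp.
GRASP_DISTANCE_M = 0.05

def is_grasping(fingertip_positions, virtual_uid_center):
    """True when all tracked fingertips close around the virtual UID."""
    return all(math.dist(tip, virtual_uid_center) < GRASP_DISTANCE_M
               for tip in fingertip_positions)

def update(glove, arm, fingertips, uid_center):
    """On grasp: apply haptic force to the glove, then engage the arm.

    The force feedback gives the virtual UID a physical presence in the
    user's hand; only after the grasp is detected does the virtual UID
    begin controlling the robotic arm.
    """
    if is_grasping(fingertips, uid_center):
        glove.apply_force(shape="uid_surface")   # force-feedback mechanism
        arm.engage(controller="virtual_uid")     # arm now follows the UID
        return True
    return False
```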
-
Patent number: 11663783
Abstract: This disclosure relates to systems and methods for using augmented reality with the Internet of Things. An augmented reality experience may be provided based on an operation of an object. Operation status information of a detected object may be obtained, and a visual effect may be determined based on the operation status information. An object may be controlled using augmented reality: operation status information of a detected object may be obtained, and a control option may be determined based on the operation status information. A visual effect may be determined based on the control option, and user input regarding the control option may be obtained. Control information configured to effectuate a change in the operation of the object may be transmitted to the object.
Type: Grant
Filed: February 10, 2017
Date of Patent: May 30, 2023
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Eric Haseltine, Joseph Olson, Timothy Panec, Katherine M. Bassett, Blade Olson
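The status-to-control loop the abstract describes can be sketched as a few steps: read an object's operation status, derive a control option, show it as a visual effect, and transmit control information back. All names here (`get_status`, `show_effect`, `send_control`, the on/off option) are illustrative assumptions, not the patented implementation.

```python
def control_via_ar(device, ui):
    """One pass of the AR control loop sketched in the abstract.

    `device` stands in for a detected IoT object; `ui` stands in for
    the AR display and input surface.
    """
    status = device.get_status()                        # operation status info
    option = "turn_off" if status["on"] else "turn_on"  # derived control option
    ui.show_effect(option)                              # visual effect in AR
    if ui.user_confirmed(option):                       # user input on option
        device.send_control({"action": option})         # effectuate the change
        return option
    return None
```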
-
Publication number: 20220387122
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID), and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Application
Filed: August 16, 2022
Publication date: December 8, 2022
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
-
Patent number: 11457986
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID), and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Grant
Filed: September 30, 2020
Date of Patent: October 4, 2022
Assignee: Verb Surgical Inc.
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
-
Publication number: 20220096197
Abstract: Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information about a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the system before, during, or after surgery. The spatial context information may be rendered as spatially fixed, 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position and orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information for its components. The AR headset may use the real-time state information to display context-sensitive user interface information such as tips, suggestions, and visual or audio cues for maneuvering the robotic arms and table to their target positions and orientations, or for troubleshooting purposes.
Type: Application
Filed: September 30, 2020
Publication date: March 31, 2022
Inventors: Tianyu Song, Blade Olson, Bernhard A. Fuerst, Danyal Fer
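The key spatial step in the abstract above is registering the robot's component poses into the headset's coordinate frame so the virtual models overlay the real hardware. A minimal sketch, assuming a rigid 4x4 homogeneous transform between frames and a simple name-to-position state message (both assumptions, not the product's actual data format):

```python
def apply_transform(T, point):
    """Apply a 4x4 homogeneous transform T to a 3D point."""
    x, y, z = point
    v = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

def place_virtual_models(robot_state, T_headset_from_robot):
    """Map each component's position (robot frame) into the headset
    frame, so rendered virtual models match the real system's pose."""
    return {name: apply_transform(T_headset_from_robot, pos)
            for name, pos in robot_state.items()}
```

In practice the transform would come from a calibration or localization step (e.g. detecting the robot in the headset's camera view), and orientations would be transformed alongside positions; this sketch shows only the position mapping.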
-
Publication number: 20220096187
Abstract: A surgical robotic system that includes a robotic arm, a glove configured to be worn on a hand of a user and including a force-feedback mechanism, a tracking device, a processor, and memory. The memory includes instructions which, when executed by the processor, cause the system to determine, based on the tracking device, that the user is performing a hand gesture with the glove to grasp a virtual user input device (UID), and, in response to the user grasping the virtual UID, apply, via the force-feedback mechanism, a force upon the glove that corresponds to a physical representation of the virtual UID, and engage the robotic arm to be controlled by the virtual UID.
Type: Application
Filed: September 30, 2020
Publication date: March 31, 2022
Inventors: Yiming Xu, Berk Gonenc, Blade Olson
-
Publication number: 20210378754
Abstract: A surgical exercise performed with a surgical robotic system is sensed by one or more depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing the 3D point cloud data of unrecognized objects, a position and orientation associated with each recognized object, and the robot system data.
Type: Application
Filed: June 5, 2020
Publication date: December 9, 2021
Inventors: Blade Olson, Bernhard A. Fuerst, Alexander Barthel, Yiming Xu, Giacomo Taylor, Tianyu Song
-
Publication number: 20210378768
Abstract: A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR is communicated to a virtual reality headset as three-dimensional point cloud data. A virtual reality environment is rendered to a display of the virtual reality headset, which is operated by a remote user. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.
Type: Application
Filed: June 5, 2020
Publication date: December 9, 2021
Inventors: Blade Olson, Bernhard A. Fuerst
-
Patent number: 10555153
Abstract: This disclosure relates to systems and methods for simulating an Internet of Things capability in an object. Storage media may store reactions executable by a wireless communication device. The wireless communication device may include one or more sensors and one or more feedback capabilities. Individual reactions may be characterized by reaction criteria detectable by the one or more sensors and reaction effects executable through the one or more feedback capabilities. One or more processors may be configured by machine-readable instructions to receive input indicating an object association between the wireless communication device and the object and, responsive to receiving the input, activate a set of reactions for the wireless communication device.
Type: Grant
Filed: March 1, 2017
Date of Patent: February 4, 2020
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Blade Olson, Timothy Panec, Katherine M. Bassett, Thomas McWilliams
-
Patent number: 10257482
Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus. An object may be augmented with a projected image by detecting a landmark associated with the object, determining a modified version of an image to compensate for the orientation of the projection apparatus relative to the landmark, and using light generated by the projection apparatus to project the modified version of the image onto a surface of the object.
Type: Grant
Filed: September 12, 2017
Date of Patent: April 9, 2019
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
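The compensation step in the abstract above, modifying the image so it appears correct despite the projector's orientation to the landmark, can be illustrated with the simplest case: counter-rotating the image's corner points by the estimated projector angle. Real systems would use a full homography from the detected landmark; the single-angle model here is an assumption for the sketch.

```python
import math

def compensate_corners(corners, projector_angle_rad):
    """Counter-rotate 2D image corners about the origin by the negative
    of the projector-to-landmark angle, so the projected image lands
    upright on the object's surface."""
    c = math.cos(-projector_angle_rad)
    s = math.sin(-projector_angle_rad)
    return [(x * c - y * s, x * s + y * c) for x, y in corners]
```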
-
Patent number: 10092827
Abstract: Systems, methods, and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect that a first interactive game is being played within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment, and user speech is monitored using one or more microphone sensor devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction, to at least one of the one or more electronic devices within the physical environment, instructing the electronic device to perform the determined gameplay action.
Type: Grant
Filed: June 16, 2016
Date of Patent: October 9, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Blade A. Olson
-
Patent number: 10065124
Abstract: Systems, methods, and articles of manufacture to perform an operation comprising receiving speech data via a network, modifying the speech data based on a profile associated with a toy device, wherein the modified speech data represents a voice of the toy device, and outputting the modified speech data via a speaker.
Type: Grant
Filed: January 15, 2016
Date of Patent: September 4, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Blade A. Olson
-
Patent number: 10039975
Abstract: An immersive play environment providing techniques for generating an imaginary enemy within the environment is disclosed. A tracker device is provided in the play environment for determining a location of a user. A controller device is provided to generate an invisible interactive object (an "imaginary enemy") to engage the user during gameplay. The controller device generates tracking information indicating a location of the imaginary enemy, determines the location of the enemy relative to the location of the user, and transmits that information to the tracker device. The tracker device can communicate the location of the imaginary enemy to the user for engagement.
Type: Grant
Filed: January 13, 2015
Date of Patent: August 7, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Joseph Olson, Blade Olson, Michael P. Goslin, Charles Moneypenny, Ivone Alexandre
-
Patent number: 9933851
Abstract: There is provided a system having a feedback device including a sensory feedback element, a non-transitory memory storing executable code and a virtual object having a virtual surface, and a hardware processor. The hardware processor is configured to execute the executable code to determine a position of a hand of a first user, determine a location of the virtual surface of the virtual object, and transmit a first activation signal to the feedback device to cause sensory feedback to be provided to the first user, using the sensory feedback element of the feedback device, based on the position of the hand of the first user relative to the location of the virtual surface of the virtual object.
Type: Grant
Filed: February 22, 2016
Date of Patent: April 3, 2018
Assignee: Disney Enterprises, Inc.
Inventors: Michael P. Goslin, Eric C. Haseltine, Blade A. Olson
-
Publication number: 20180027221
Abstract: This disclosure relates to apparatus and methods for bringing an object to life using a projection apparatus.
Type: Application
Filed: September 12, 2017
Publication date: January 25, 2018
Inventors: Michael P. Goslin, Eric Haseltine, Blade A. Olson
-
Publication number: 20170361213
Abstract: Systems, methods, and articles of manufacture for controlling electronic devices in an interactive gaming environment. Embodiments detect that a first interactive game is being played within the physical environment using one or more electronic devices. User movement is monitored using at least one sensor device within the physical environment, and user speech is monitored using one or more microphone sensor devices within the physical environment. Upon determining that the user movement matches a predefined series of user actions and that the user speech matches a corresponding predefined speech pattern, embodiments determine a gameplay action corresponding to both the predefined series of user actions and the predefined speech pattern and transmit an instruction, to at least one of the one or more electronic devices within the physical environment, instructing the electronic device to perform the determined gameplay action.
Type: Application
Filed: June 16, 2016
Publication date: December 21, 2017
Inventors: Michael P. Goslin, Blade A. Olson