Patents by Inventor Denise Ann Miller
Denise Ann Miller has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11960645
Abstract: A method for disengaging a surgical instrument of a surgical robotic system comprising receiving a gaze input from an eye tracker; determining, by one or more processors, whether the gaze input indicates the gaze of the user is outside or inside of the display; in response to determining the gaze input indicates the gaze of the user is outside of the display, determining an amount of time the gaze of the user is outside of the display; in response to determining the gaze of the user is outside of the display for less than a maximum amount of time, pause the surgical robotic system from a teleoperation mode; and in response to determining the gaze of the user is outside of the display for more than the maximum amount of time, disengage the surgical robotic system from the teleoperation mode.
Type: Grant
Filed: November 16, 2021
Date of Patent: April 16, 2024
Assignee: Verb Surgical Inc.
Inventors: Anette Lia Freiin von Kapri, Denise Ann Miller, Paolo Invernizzi, Joan Savall, John Magnasco
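The pause-versus-disengage timing logic this abstract describes can be sketched roughly as below. The state names and the threshold value are illustrative assumptions; the patent does not disclose concrete values or identifiers.

```python
from enum import Enum

class TeleopState(Enum):
    ENGAGED = "engaged"
    PAUSED = "paused"
    DISENGAGED = "disengaged"

# Hypothetical threshold; actual systems would use a validated value.
MAX_GAZE_AWAY_SECONDS = 3.0

def update_teleop_state(gaze_on_display: bool, seconds_away: float) -> TeleopState:
    """Map a gaze reading to a teleoperation state.

    Gaze off the display for less than the maximum time pauses
    teleoperation (it can resume when the gaze returns); beyond the
    maximum time, the system disengages entirely.
    """
    if gaze_on_display:
        return TeleopState.ENGAGED
    if seconds_away < MAX_GAZE_AWAY_SECONDS:
        return TeleopState.PAUSED
    return TeleopState.DISENGAGED
```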
-
Patent number: 11944402
Abstract: Surgical robotic systems, and methods of verifying functionality of a user interface device of such systems, are described. During a surgical procedure, the user interface device controls motion of a surgical tool. Proximity sensors of the user interface device generate proximity measures throughout the surgical procedure. The proximity measures are used to detect whether the user interface device is dropped and to responsively halt motion of the surgical tool. To verify an accuracy of the proximity sensors that provide the drop detection, a test is performed when the user interface device is placed in a dock. The test compares the generated proximity measures to expected proximity data. When the proximity measures match the expected proximity data, the system determines that the proximity sensors are functioning accurately and verifies that the user interface device is functioning safely. Other embodiments are described and claimed.
Type: Grant
Filed: September 18, 2020
Date of Patent: April 2, 2024
Assignee: Verb Surgical Inc.
Inventors: Karan Handa, Denise Ann Miller, Joan Savall, Mufaddal Jhaveri
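The docked verification step amounts to comparing live proximity readings against expected values. A minimal sketch, assuming per-sensor scalar readings and a tolerance value that are purely illustrative:

```python
def verify_proximity_sensors(measured, expected, tolerance=0.05):
    """Compare docked proximity readings against expected values.

    Returns True when every sensor's reading is within `tolerance` of
    its expected value, i.e. the drop-detection sensors appear to be
    functioning accurately. The tolerance here is a made-up number.
    """
    return all(abs(m - e) <= tolerance for m, e in zip(measured, expected))
```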
-
Publication number: 20240099555
Abstract: Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Application
Filed: October 10, 2023
Publication date: March 28, 2024
Inventors: Jagadish Venkataraman, Denise Ann Miller
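The annotation stage of the process above (segments in, labeled frames out) can be sketched as follows. The function and parameter names are hypothetical; the abstract does not specify an API, and the actual labeling of force levels would be far more involved:

```python
def build_visual_haptic_dataset(segments, force_labeler):
    """Annotate every frame of every tool-tissue interaction segment
    with a force level, producing (frame, level) training pairs.

    `segments` is an iterable of frame sequences; `force_labeler` maps
    a frame to one of the predefined force levels. Both are stand-ins
    for whatever the real pipeline uses.
    """
    dataset = []
    for segment in segments:
        for frame in segment:
            dataset.append((frame, force_labeler(frame)))
    return dataset
```

A model would then be trained on the resulting pairs in the usual supervised fashion.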
-
Publication number: 20240081929
Abstract: A method for engaging and disengaging a surgical instrument of a surgical robotic system comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate each of the following interlock requirements are satisfied: (1) a user is looking toward a display, (2) at least one or more user interface devices of the surgical robotic system are configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining each of the interlock requirements are satisfied, transition the surgical robotic system into a teleoperation mode; and in response to determining less than all of the interlock requirements are satisfied, transition the surgical robotic system out of a teleoperation mode.
Type: Application
Filed: September 19, 2023
Publication date: March 14, 2024
Inventors: Joan Savall, Denise Ann Miller, Anette Lia Freiin von Kapri, Paolo Invernizzi, John Magnasco
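The three-condition interlock decision reduces to a conjunction: all requirements must hold to enter teleoperation, and losing any one transitions the system out. A minimal sketch, with names that are assumptions rather than the patent's terminology:

```python
def teleop_transition(user_looking_at_display: bool,
                      uids_usable: bool,
                      workspace_usable: bool) -> str:
    """Return the target mode given the three interlock requirements.

    All three must be satisfied to transition into teleoperation;
    otherwise the system transitions out of it.
    """
    if user_looking_at_display and uids_usable and workspace_usable:
        return "teleoperation"
    return "out-of-teleoperation"
```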
-
Publication number: 20240033023
Abstract: A method performed by a surgical robotic system that includes a seat that is arranged for a user to sit and a display column that includes at least one display for displaying a three-dimensional (3D) surgical presentation. The method includes receiving an indication that the user has manually adjusted the seat and in response, determining, while the user is sitting on the seat, a position of the user's eyes, determining a configuration for the display column based on the determined position of the user's eyes, and adjusting the display column by actuating one or more actuators of the display column according to the determined configuration.
Type: Application
Filed: October 16, 2023
Publication date: February 1, 2024
Inventors: Anette Lia Freiin von Kapri, Joan Savall, Denise Ann Miller
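One plausible way to derive a display-column configuration from a measured eye position is simple geometry: raise the display to eye height and tilt it toward the eyes. This sketch is an assumption about what "determining a configuration" could mean; the names, units, and geometry are all illustrative.

```python
import math

def display_column_targets(eye_height_m: float,
                           display_pivot_height_m: float,
                           viewing_distance_m: float):
    """Return hypothetical actuator targets (height, tilt) so the
    display faces the user's measured eye position."""
    vertical_offset = eye_height_m - display_pivot_height_m
    tilt_rad = math.atan2(vertical_offset, viewing_distance_m)
    return eye_height_m, tilt_rad
```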
-
Patent number: 11826115
Abstract: A method performed by a surgical robotic system that includes a seat that is arranged for a user to sit and a display column that includes at least one display for displaying a three-dimensional (3D) surgical presentation. The method includes receiving an indication that the user has manually adjusted the seat and in response, determining, while the user is sitting on the seat, a position of the user's eyes, determining a configuration for the display column based on the determined position of the user's eyes, and adjusting the display column by actuating one or more actuators of the display column according to the determined configuration.
Type: Grant
Filed: September 14, 2020
Date of Patent: November 28, 2023
Assignee: Verb Surgical Inc.
Inventors: Anette Lia Freiin von Kapri, Joan Savall, Denise Ann Miller
-
Patent number: 11819188
Abstract: Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Grant
Filed: February 8, 2023
Date of Patent: November 21, 2023
Assignee: Verb Surgical Inc.
Inventors: Jagadish Venkataraman, Denise Ann Miller
-
Patent number: 11806104
Abstract: A method for engaging and disengaging a surgical instrument of a surgical robotic system comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate each of the following interlock requirements are satisfied: (1) a user is looking toward a display, (2) at least one or more user interface devices of the surgical robotic system are configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining each of the interlock requirements are satisfied, transition the surgical robotic system into a teleoperation mode; and in response to determining less than all of the interlock requirements are satisfied, transition the surgical robotic system out of a teleoperation mode.
Type: Grant
Filed: April 20, 2022
Date of Patent: November 7, 2023
Assignee: Verb Surgical Inc.
Inventors: Joan Savall, Denise Ann Miller, Anette Lia Freiin von Kapri, Paolo Invernizzi, John Magnasco
-
Publication number: 20230346198
Abstract: Communication apparatus and devices for surgical robotic systems are described. The communication apparatus can include a user console in communication with a communication device having a surgical tool. The communication device can include a microphone to convert a sound input into an acoustic input signal. The communication device can transmit the acoustic input signal to the user console for reproduction as a sound output for a remote operator. The surgical tool can include an endoscope having several microphones mounted on a housing. The surgical tool can be a sterile barrier having a microphone and a drape. The microphone(s) of the surgical tools can face a surrounding environment such that a tableside staff is a source of the sound input that causes the sound output, and a surgeon and the tableside staff can communicate in a noisy environment. Other embodiments are also described and claimed.
Type: Application
Filed: June 30, 2023
Publication date: November 2, 2023
Inventors: Denise Ann Miller, Joan Savall, Geoffrey Robert Russell
-
Patent number: 11786315
Abstract: Surgical robotic systems, and methods of controlling such systems based on a user's grip on a user interface device, are described. During a surgical procedure, a surgical robotic system generates control commands to actuate a component of the surgical robotic system. The component can be a surgical tool that is actuated to move based on tracking data from the user interface device. The component can be the user interface device, which is actuated to render haptic feedback to the user's hand based on load data from a surgical robotic arm holding the surgical tool. In either case, the actuation is based on a center of rotation of the user interface device that corresponds to the user's grip on the user interface device. Other embodiments are described and claimed.
Type: Grant
Filed: September 18, 2020
Date of Patent: October 17, 2023
Assignee: Verb Surgical Inc.
Inventors: Joan Savall, Denise Ann Miller, Karan Handa
-
Publication number: 20230293251
Abstract: Surgical systems including a user console for controlling a surgical robotic tool are described. A witness sensor and a reference sensor can be mounted on the user console to measure an electromagnetic field distortion near a location, and to measure deformation of the location, respectively. Distortion in the electromagnetic field can be detected based on the measurements from the witness sensor and the reference sensor. An alert can be generated, or teleoperation of the surgical tool can be adjusted or paused, when a user interface device used to control the surgical tool is within a range of the distortion. The distortion can be from a known source, such as from actuation of a haptic motor of the user interface device, and the user console can adjust the actuation to reduce the likelihood that the distortion will disrupt surgical tool control. Other embodiments are described and claimed.
Type: Application
Filed: April 21, 2023
Publication date: September 21, 2023
Inventors: Joan Savall, Denise Ann Miller, Hamid Reza Sani
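The witness/reference pairing described above can be read as a consistency check: a field change at the witness sensor that the deformation measured by the reference sensor cannot explain suggests genuine electromagnetic distortion. A sketch of that comparison, with a made-up threshold and units:

```python
def distortion_suspected(witness_field_delta: float,
                         reference_deform_delta: float,
                         threshold: float = 0.2) -> bool:
    """Flag EM distortion when the witness sensor's field change is
    not accounted for by physical deformation of the mounting
    location (the reference sensor's reading)."""
    unexplained = abs(witness_field_delta) - abs(reference_deform_delta)
    return unexplained > threshold
```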
-
Publication number: 20230263365
Abstract: Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Application
Filed: February 8, 2023
Publication date: August 24, 2023
Inventors: Jagadish Venkataraman, Denise Ann Miller
-
Patent number: 11723515
Abstract: Communication apparatus and devices for surgical robotic systems are described. The communication apparatus can include a user console in communication with a communication device having a surgical tool. The communication device can include a microphone to convert a sound input into an acoustic input signal. The communication device can transmit the acoustic input signal to the user console for reproduction as a sound output for a remote operator. The surgical tool can include an endoscope having several microphones mounted on a housing. The surgical tool can be a sterile barrier having a microphone and a drape. The microphone(s) of the surgical tools can face a surrounding environment such that a tableside staff is a source of the sound input that causes the sound output, and a surgeon and the tableside staff can communicate in a noisy environment. Other embodiments are also described and claimed.
Type: Grant
Filed: March 30, 2021
Date of Patent: August 15, 2023
Assignee: Verb Surgical Inc.
Inventors: Denise Ann Miller, Joan Savall, Geoffrey Robert Russell
-
Patent number: 11696807
Abstract: Disclosed herein are methods to detect a free-falling or other non-surgical motions of the user interface device (UID) of a surgical robotic system so that the surgical robotic system may pause the robotic arm controlled by the UID to prevent the robotic arm from mimicking the unintentional movement of the UID. Contact sensors embedded in the UID may be used to detect conditions indicating that a user does not possess full control of the UID. After determining that the user does not have full control of the UID, the UID may detect if the UID is experiencing non-surgical motions using motion sensors such as inertial sensors. By conditioning analysis of the data from the motion sensors by the initial determination that the UID is not being held based on the contact sensors, the method increases the robustness of the detection of non-surgical motions and reduces the probability of false positives.
Type: Grant
Filed: March 17, 2020
Date of Patent: July 11, 2023
Assignee: Verb Surgical Inc.
Inventors: Denise Ann Miller, Joan Savall, Randall Blake Hellman
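The gating idea in this abstract (only analyze inertial data once contact sensors show the UID is not held) can be sketched as below. The sensor scales and thresholds are illustrative assumptions; during free fall an accelerometer reads near zero g, which is the signature checked here.

```python
def uid_drop_suspected(contact_readings, accel_magnitude_g,
                       contact_threshold=0.1, freefall_threshold_g=0.3):
    """Grip-gated free-fall detection for a user interface device.

    Motion analysis runs only when no contact sensor indicates the
    UID is held; this conditioning is what reduces false positives.
    A near-zero measured acceleration then suggests free fall.
    """
    held = any(c > contact_threshold for c in contact_readings)
    if held:
        return False  # user in control; skip motion analysis entirely
    return accel_magnitude_g < freefall_threshold_g
```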
-
Patent number: 11653988
Abstract: Surgical systems including a user console for controlling a surgical robotic tool are described. A witness sensor and a reference sensor can be mounted on the user console to measure an electromagnetic field distortion near a location, and to measure deformation of the location, respectively. Distortion in the electromagnetic field can be detected based on the measurements from the witness sensor and the reference sensor. An alert can be generated, or teleoperation of the surgical tool can be adjusted or paused, when a user interface device used to control the surgical tool is within a range of the distortion. The distortion can be from a known source, such as from actuation of a haptic motor of the user interface device, and the user console can adjust the actuation to reduce the likelihood that the distortion will disrupt surgical tool control. Other embodiments are described and claimed.
Type: Grant
Filed: July 29, 2019
Date of Patent: May 23, 2023
Assignee: Verb Surgical Inc.
Inventors: Joan Savall, Denise Ann Miller, Hamid Reza Sani
-
Patent number: 11576743
Abstract: Embodiments described herein provide various examples of a machine-learning-based visual-haptic system for constructing visual-haptic models for various interactions between surgical tools and tissues. In one aspect, a process for constructing a visual-haptic model is disclosed. This process can begin by receiving a set of training videos. The process then processes each training video in the set of training videos to extract one or more video segments that depict a target tool-tissue interaction from the training video, wherein the target tool-tissue interaction involves exerting a force by one or more surgical tools on a tissue. Next, for each video segment in the set of video segments, the process annotates each video image in the video segment with a set of force levels predefined for the target tool-tissue interaction. The process subsequently trains a machine-learning model using the annotated video images to obtain a trained machine-learning model for the target tool-tissue interaction.
Type: Grant
Filed: June 29, 2021
Date of Patent: February 14, 2023
Assignee: Verb Surgical Inc.
Inventors: Jagadish Venkataraman, Denise Ann Miller
-
Patent number: 11504200
Abstract: Wearable user interface devices are described. A wearable user interface device can include a wearable base connected to a trackable device component by a linkage. The linkage can connect to a pivoted support that the trackable device is mounted on, and which maintains poses when the user interface device is not manipulated by a user's hand. The pivoted support has several orthogonal axes intersecting at a center of rotation located inside a device body of the trackable device. Other embodiments are also described and claimed.
Type: Grant
Filed: January 24, 2019
Date of Patent: November 22, 2022
Assignee: Verb Surgical Inc.
Inventors: Joan Savall, Richard Edward DeMartini, Randall Blake Hellman, Denise Ann Miller, Anette Lia Freiin von Kapri, Pablo E. Garcia Kilroy
-
Publication number: 20220323168
Abstract: A method for engaging and disengaging a surgical instrument of a surgical robotic system comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate each of the following interlock requirements are satisfied: (1) a user is looking toward a display, (2) at least one or more user interface devices of the surgical robotic system are configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining each of the interlock requirements are satisfied, transition the surgical robotic system into a teleoperation mode; and in response to determining less than all of the interlock requirements are satisfied, transition the surgical robotic system out of a teleoperation mode.
Type: Application
Filed: April 20, 2022
Publication date: October 13, 2022
Inventors: Joan Savall, Denise Ann Miller, Anette Lia Freiin von Kapri, Paolo Invernizzi, John Magnasco
-
Publication number: 20220179483
Abstract: A method for disengaging a surgical instrument of a surgical robotic system comprising receiving a gaze input from an eye tracker; determining, by one or more processors, whether the gaze input indicates the gaze of the user is outside or inside of the display; in response to determining the gaze input indicates the gaze of the user is outside of the display, determining an amount of time the gaze of the user is outside of the display; in response to determining the gaze of the user is outside of the display for less than a maximum amount of time, pause the surgical robotic system from a teleoperation mode; and in response to determining the gaze of the user is outside of the display for more than the maximum amount of time, disengage the surgical robotic system from the teleoperation mode.
Type: Application
Filed: November 16, 2021
Publication date: June 9, 2022
Inventors: Anette Lia Freiin von Kapri, Denise Ann Miller, Paolo Invernizzi, Joan Savall, John Magnasco
-
Patent number: 11337767
Abstract: A method for engaging and disengaging a surgical instrument of a surgical robotic system comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate each of the following interlock requirements are satisfied: (1) a user is looking toward a display, (2) at least one or more user interface devices of the surgical robotic system are configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining each of the interlock requirements are satisfied, transition the surgical robotic system into a teleoperation mode; and in response to determining less than all of the interlock requirements are satisfied, transition the surgical robotic system out of a teleoperation mode.
Type: Grant
Filed: May 17, 2019
Date of Patent: May 24, 2022
Assignee: Verb Surgical Inc.
Inventors: Joan Savall, Denise Ann Miller, Anette Lia Freiin von Kapri, Paolo Invernizzi, John Magnasco