Patents by Inventor Guanghua G. Zhang

Guanghua G. Zhang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11672606
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Grant
    Filed: April 26, 2019
    Date of Patent: June 13, 2023
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Patent number: 11478308
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Grant
    Filed: June 24, 2020
    Date of Patent: October 25, 2022
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Publication number: 20220071721
    Abstract: A medical system comprises a control system and a display device coupled to the control system. The control system comprises a processor and a memory comprising machine readable instructions that, when executed by the processor, cause the control system to generate a primary image of an anatomic structure and manipulate a therapeutic instrument to deliver a therapy to the anatomic structure. The control system also determines an effect of the therapy on the anatomic structure and displays a representation of the effect of the therapy in an auxiliary image registered with the primary image of the anatomic structure.
    Type: Application
    Filed: November 18, 2021
    Publication date: March 10, 2022
    Inventors: Brian D. Hoffman, JR., Rajesh Kumar, David Q. Larkin, Nitish Swarup, Guanghua G. Zhang
  • Patent number: 11197731
    Abstract: A medical system may comprise a stereo display and an input device. The medical system may also comprise a processor configured to generate a three-dimensional image of an anatomical object and cause the three-dimensional image of the anatomical object and a two-dimensional window to be displayed. The processor may also be configured to cause a position and an orientation of the two-dimensional window relative to the three-dimensional image of the anatomical object to be changed on the stereo display by manipulation of the input device. The processor may also be configured to define a cut-plane to indicate a two-dimensional slice of the three-dimensional image of the anatomical object. The processor may also be configured to cause the two-dimensional slice of the three-dimensional image of the anatomical object to be displayed. An orientation of the displayed two-dimensional slice may be different than an orientation of the cut-plane with the three-dimensional image.
    Type: Grant
    Filed: September 9, 2019
    Date of Patent: December 14, 2021
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Brian David Hoffman, Rajesh Kumar, David Q. Larkin, Nitish Swarup, Guanghua G. Zhang
  • Patent number: 11116578
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Grant
    Filed: December 6, 2016
    Date of Patent: September 14, 2021
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Publication number: 20210000546
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Application
    Filed: June 24, 2020
    Publication date: January 7, 2021
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Patent number: 10842571
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Grant
    Filed: July 14, 2017
    Date of Patent: November 24, 2020
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Patent number: 10792107
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Grant
    Filed: December 6, 2016
    Date of Patent: October 6, 2020
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Patent number: 10555775
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Grant
    Filed: May 16, 2005
    Date of Patent: February 11, 2020
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Brian David Hoffman, David Q. Larkin, Giuseppe Prisco, Guanghua G Zhang, Rajesh Kumar
  • Publication number: 20190388169
    Abstract: To assist a surgeon performing a medical procedure, auxiliary images generally indicating internal details of an anatomic structure being treated are displayed and manipulated by the surgeon on a computer display screen to supplement primary images generally of an external view of the anatomic structure. A master input device controlling a robotic arm in a first mode may be switched by the surgeon to a second mode in order to function instead as a mouse-like pointing device to facilitate the surgeon performing such auxiliary information display and manipulation.
    Type: Application
    Filed: September 9, 2019
    Publication date: December 26, 2019
    Inventors: Brian David Hoffman, Rajesh Kumar, David Q. Larkin, Nitish Swarup, Guanghua G. Zhang
  • Publication number: 20190247131
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Application
    Filed: April 26, 2019
    Publication date: August 15, 2019
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Publication number: 20170312036
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Application
    Filed: July 14, 2017
    Publication date: November 2, 2017
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Maria Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Publication number: 20170079725
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Application
    Filed: December 6, 2016
    Publication date: March 23, 2017
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Publication number: 20170079726
    Abstract: Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
    Type: Application
    Filed: December 6, 2016
    Publication date: March 23, 2017
    Inventors: Brian D. Hoffman, David Q. Larkin, Giuseppe Prisco, Guanghua G. Zhang, Rajesh Kumar
  • Publication number: 20160235496
    Abstract: To assist a surgeon performing a medical procedure, auxiliary images generally indicating internal details of an anatomic structure being treated are displayed and manipulated by the surgeon on a computer display screen to supplement primary images generally of an external view of the anatomic structure. A master input device controlling a robotic arm in a first mode may be switched by the surgeon to a second mode in order to function instead as a mouse-like pointing device to facilitate the surgeon performing such auxiliary information display and manipulation.
    Type: Application
    Filed: April 27, 2016
    Publication date: August 18, 2016
    Applicant: Intuitive Surgical Operations, Inc.
    Inventors: Brian David Hoffman, Rajesh Kumar, David Q. Larkin, Giuseppe Prisco, Nitish Swarup, Guanghua G. Zhang
  • Patent number: 8542219
    Abstract: An apparatus and method for processing pose data derived from a pose of an elongate object such as, for example, a jotting implement, a pointer, a robotic arm or a cane. The elongate object has a tip contacting a plane surface with one or more invariant features. The pose of the elongate object is measured optically from on-board by an optical measuring system with the aid of the invariant feature. The pose is used for preparing corresponding pose data, and a subset of the pose data is identified and transmitted to an application such as a user application, where the subset can serve as command data or input data. Since the elongate object moves while its tip is contacting the surface, the pose is measured periodically at sufficiently frequent measurement times ti to describe the motion at a desired temporal resolution. The subset can include all or a portion of the orientation data that describe the orientation of the elongate object in space and/or position data of the tip on the surface.
    Type: Grant
    Filed: January 30, 2004
    Date of Patent: September 24, 2013
    Assignee: Electronic Scripting Products, Inc.
    Inventors: Stewart R. Carl, Michael J. Mandella, Guanghua G. Zhang, Hector H. Gonzalez-Banos
  • Patent number: 7088440
    Abstract: A method and apparatus for determining a pose of an elongate object and an absolute position of its tip while the tip is in contact with a plane surface having invariant features. The surface and features are illuminated with a probe radiation, and a scattered portion, e.g., the back-scattered portion, of the probe radiation returning from the plane surface and the feature to the elongate object at an angle with respect to an axis of the object is detected. The pose is derived from a response of the scattered portion to the surface and the features, and the absolute position of the tip on the surface is obtained from the pose and knowledge about the feature. The probe radiation can be directed from the object to the surface at an angle to the axis of the object in the form of a scan beam. The scan beam can be made to follow a scan pattern with the aid of a scanning arrangement with one or more arms and one or more uniaxial or biaxial scanners. The scan angle can also be varied.
    Type: Grant
    Filed: December 22, 2003
    Date of Patent: August 8, 2006
    Assignee: Electronic Scripting Products, Inc.
    Inventors: Dale H. Buermann, Michael J. Mandella, Stewart R. Carl, Guanghua G. Zhang, Hector H. Gonzalez-Banos
  • Patent number: 7023536
    Abstract: An apparatus and method employing principles of stereo vision for determining one or more orientation parameters, and especially the second and third Euler angles, of an elongate object whose tip is contacting a surface at a contact point. The apparatus has a projector mounted on the elongate object for illuminating the surface with a probe radiation in a known pattern from a first point of view and a detector mounted on the elongate object for detecting a scattered portion of the probe radiation returning from the surface to the elongate object from a second point of view. The orientation parameters are determined from a difference between the projected and detected probe radiation, such as the difference between the shape of the feature produced by the projected probe radiation and the shape of the feature detected by the detector.
    Type: Grant
    Filed: March 8, 2004
    Date of Patent: April 4, 2006
    Assignee: Electronic Scripting Products, Inc.
    Inventors: Guanghua G. Zhang, Dale H. Buermann, Michael J. Mandella, Hector H. Gonzalez-Banos, Stewart R. Carl
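A recurring idea in the tool-tracking entries above is combining a non-endoscopically derived tool state estimate (e.g., from manipulator kinematics) with an endoscopically derived one (from image data) using a Bayesian filter. The sketch below is not the patented method; it is a minimal, hypothetical illustration of the underlying Bayesian-fusion step for a single scalar position, where two independent Gaussian estimates are merged by adding their precisions. All names and numbers are invented for illustration.

```python
def fuse_estimates(mean_a, var_a, mean_b, var_b):
    """Combine two independent Gaussian estimates of the same scalar state.

    Returns the posterior mean and variance. The result is pulled toward
    whichever source reports the smaller variance (higher confidence),
    which is why a precise vision measurement can correct a noisier
    kinematics-based estimate.
    """
    if var_a <= 0 or var_b <= 0:
        raise ValueError("variances must be positive")
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)      # precisions add
    mean = var * (mean_a / var_a + mean_b / var_b)
    return mean, var


# Hypothetical example: kinematics places the tool tip at 10.0 mm
# (variance 4.0); the endoscopic image places it at 12.0 mm (variance 1.0).
mean, var = fuse_estimates(10.0, 4.0, 12.0, 1.0)
print(mean, var)  # fused estimate lies closer to the vision reading
```

In a full tracker this update would run inside a recursive filter (e.g., a Kalman filter) alternating prediction from the manipulator's motion model with correction from whichever measurements are available at each time step.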