Patents by Inventor Guo-Qing Wei

Guo-Qing Wei has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250082420
    Abstract: Method, system, medium, and implementation for robot-guided instrument insertion. A preplanned path from a three-dimensional (3D) entry pose on the skin of a patient to the 3D pose of a target inside the patient is generated for a surgery. The preplanned path is used by a robot as guidance to insert a surgical instrument from the 3D entry pose to reach the 3D pose of the target. During the surgery, a next pose is determined based on the current pose of the surgical instrument, the preplanned path, and the spatial relationship to surrounding anatomical structures, and the surgical instrument is moved thereto. An updated current pose of the instrument is then obtained via tracking while the robot advances the surgical instrument to the next pose. The process repeats until the instrument reaches the target pose.
    Type: Application
    Filed: September 12, 2023
    Publication date: March 13, 2025
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
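The advance-track-repeat loop in the abstract above can be sketched as follows. This is a minimal illustration only: `get_tracked_pose` and `move_toward` are hypothetical callbacks standing in for the tracking and robot interfaces, and the obstacle-aware adjustment against surrounding anatomy is omitted.

```python
import numpy as np

def insert_along_path(preplanned_path, target_pose, get_tracked_pose,
                      move_toward, tol=1e-3, max_steps=1000):
    """Iteratively advance an instrument along a preplanned path.

    preplanned_path: (N, 3) array of waypoints from entry pose to target.
    get_tracked_pose: callable returning the instrument's tracked position.
    move_toward: callable(next_pose) commanding the robot one step.
    """
    for _ in range(max_steps):
        current = get_tracked_pose()
        if np.linalg.norm(current - target_pose) < tol:
            return current  # target reached
        # next pose: the path waypoint just ahead of the tracked position
        dists = np.linalg.norm(preplanned_path - current, axis=1)
        idx = min(int(np.argmin(dists)) + 1, len(preplanned_path) - 1)
        move_toward(preplanned_path[idx])
    return get_tracked_pose()
```

Here the next pose comes only from the tracked pose and the path; the patented method additionally weighs spatial relationships to surrounding anatomical structures when choosing it.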
  • Patent number: 12245823
    Abstract: The present teaching relates to automated trocar/robot base location determination. An input related to a surgical operation includes a three-dimensional (3D) model for an organ with cut points on the organ forming a surgical trajectory. A surgical instrument is controlled by a robot to reach the cut points to carry out the operation. An insertion location for inserting the surgical instrument is identified. Based on the identified insertion location, a robot base location is determined with respect to the identified insertion location to deploy the robot for controlling the surgical instrument to reach the cut points. Signals for configuring the surgical setting are then generated based on the insertion location for the surgical instrument and the base location for the robot.
    Type: Grant
    Filed: February 2, 2023
    Date of Patent: March 11, 2025
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Yash Evalekar, Yuanfeng Mao, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20250046010
    Abstract: Method and system for estimating 3D camera pose based on 2D features. 3D virtual camera poses are generated, each of which is used to determine a perspective from which to project a 3D model of a target organ to create a 2D image of the target organ. 2D features are extracted from each 2D image and paired with the corresponding 3D virtual camera pose to represent a mapping. A 2D feature-camera pose mapping model is obtained based on the pairs. Input 2D features extracted from a real-time 2D image of the target organ are mapped, via the 2D feature-camera pose mapping model, to a 3D pose estimate of a laparoscopic camera, which is then refined, via differential rendering of the 3D model with respect to the 3D pose estimate, to derive an estimated 3D camera pose of the laparoscopic camera.
    Type: Application
    Filed: July 31, 2024
    Publication date: February 6, 2025
    Inventors: Xiaotao Guo, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
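The offline pairing of virtual camera poses with rendered 2D features, followed by an online lookup, can be sketched as below. A nearest-neighbor table stands in for the mapping model, and `render_features` is a hypothetical stand-in for rendering the 3D organ model from a pose and extracting 2D features; the differential-rendering refinement step is not shown.

```python
import numpy as np

def build_mapping(poses, render_features):
    """Pair each virtual camera pose with the 2D features of its rendering."""
    feats = np.stack([render_features(p) for p in poses])
    return feats, np.stack(poses)

def estimate_pose(feats, poses, query_feat):
    """Map observed 2D features to the closest stored pose (k=1 lookup)."""
    i = int(np.argmin(np.linalg.norm(feats - query_feat, axis=1)))
    return poses[i]
```

In practice the mapping model could be any regressor trained on the pairs; the table lookup above is just the simplest instance of a 2D-feature-to-pose mapping.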
  • Publication number: 20240277416
    Abstract: The present teaching relates to automated generation of a surgical tool-based visual guide. Two-dimensional (2D) images capturing anatomical structures and a surgical instrument therein are provided during a surgery. The type and pose of a tool attached to the surgical instrument are detected based on the 2D images. Focused information is determined based on the type and pose of the detected tool and is used to generate a visual guide to assist a user to perform a surgical task using the tool.
    Type: Application
    Filed: February 22, 2023
    Publication date: August 22, 2024
    Inventors: Xiaonan Zang, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20240277411
    Abstract: The present teaching relates to automatic surgical tool-based model fusion. Two-dimensional (2D) images capturing a surgical instrument are received. The 2D location of a surgical tool attached to the instrument is detected. A specific type of focused information is determined based on the surgical tool for assisting a surgical task using the surgical tool. Corresponding 2D/3D focused regions are identified via model fusion. A visual guide is created based on the type of focused information from the 3D focused region of a 3D model for an organ and surrounding anatomical structures by projecting it onto the 2D focused region.
    Type: Application
    Filed: February 22, 2023
    Publication date: August 22, 2024
    Inventors: Xiaonan Zang, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20240261038
    Abstract: The present teaching relates to automated trocar/robot base location determination. An input related to a surgical operation includes a 3D model for an organ with cut points thereon forming a surgical trajectory. A surgical instrument is controlled by a robot to reach the cut points to carry out the surgical operation. Candidate combinations of an insertion location for inserting the surgical instrument and a base location for deploying the robot are generated. One of the candidate combinations is identified based on evaluation vectors associated therewith. Each evaluation vector provides assessment information on characteristics and spatial relationships of the respective candidate locations. The selected combination of locations is used for the surgical operation.
    Type: Application
    Filed: February 2, 2023
    Publication date: August 8, 2024
    Inventors: Yash Evalekar, Yuanfeng Mao, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20240261028
    Abstract: The present teaching relates to automated trocar/robot base location determination. An input related to a surgical operation includes a three-dimensional (3D) model for an organ with cut points on the organ forming a surgical trajectory. A surgical instrument is controlled by a robot to reach the cut points to carry out the operation. An insertion location for inserting the surgical instrument is identified. Based on the identified insertion location, a robot base location is determined with respect to the identified insertion location to deploy the robot for controlling the surgical instrument to reach the cut points. Signals for configuring the surgical setting are then generated based on the insertion location for the surgical instrument and the base location for the robot.
    Type: Application
    Filed: February 2, 2023
    Publication date: August 8, 2024
    Inventors: Yash Evalekar, Yuanfeng Mao, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20240261045
    Abstract: The present teaching relates to automated robot base location determination. An instrument access map is obtained with respect to a surgical instrument for performing a surgical operation on an organ and defines a surface area on the organ with cut points that form a surgery trajectory. Information about a robot and a surgical environment is received and used to generate a robot access map with multiple robot base locations. The robot is for controlling the surgical instrument to perform the surgical operation along the surgery trajectory. A robot base location is selected based on evaluation parameters derived for each of the robot base locations. Control signals are generated for configuring the robot at the selected robot base location to facilitate the surgical operation.
    Type: Application
    Filed: February 2, 2023
    Publication date: August 8, 2024
    Inventors: Yash Evalekar, Yuanfeng Mao, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20240261034
    Abstract: The present teaching relates to surgical position marking. A 3D model for an organ includes cut points forming a surgical trajectory. Each cut point has a 3D coordinate and a surface normal in the model space. When projected into a workspace, a mapped cut point is created with a mapped 3D coordinate and a mapped surface normal in the workspace. Using a surgical instrument with a tip and an attached force sensor, some mapped cut points are marked along a direction determined based on the reaction force sensed by the force sensor when the tip touches the cut point.
    Type: Application
    Filed: February 2, 2023
    Publication date: August 8, 2024
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20240249417
    Abstract: The present teaching relates to method, system, medium, and implementations for region boundary identification and overlay display. An input associated with a 3D object is received. A 3D volumetric model is obtained for the 3D object with multiple regions, each of which includes multiple labeled voxels. A mesh representation is generated for the surface of the 3D object. Region boundaries on the surface of the 3D object are identified based on the 3D volumetric model and the mesh representation. The 3D object is rendered, and the region boundaries are overlaid on the rendered 3D object.
    Type: Application
    Filed: January 23, 2023
    Publication date: July 25, 2024
    Inventors: Cheng-Chung Liang, Guo-Qing Wei, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Patent number: 12016632
    Abstract: A method, system, medium, and implementations for computer-aided preoperative surgical planning are described. Input data acquired with respect to a part of a patient is received by the system. The part corresponds to an organ, e.g., lung, of the patient to be operated on and includes one or more lesions to be removed during an operation. Then, an anatomic 3D model of the part of the patient is generated. Based on the generated anatomic 3D model, a preoperative plan for linear-cutting stapler resection of the one or more lesions from the organ to be carried out during the operation is obtained. The stapler cartridge size and the staple length are estimated based on the preoperative plan. Further, the resection based on the preoperative plan is visualized.
    Type: Grant
    Filed: October 23, 2020
    Date of Patent: June 25, 2024
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Guo-Qing Wei, Cheng-Chung Liang, Xiaolan Zeng, Li Fan, Jianzhong Qian
  • Patent number: 11989833
    Abstract: The present teaching relates to method, system, medium, and implementations for fusing a 3D virtual model with a 2D image associated with an organ of a patient. A key-pose is determined as an approximate position and orientation of a medical instrument with respect to the patient's organ. Based on the key-pose, an overlay is generated on a 2D image of the patient's organ, acquired by the medical instrument, by projecting the 3D virtual model onto the 2D image. A pair of feature points includes a 2D feature point from the 2D image and a corresponding 3D feature point from the 3D virtual model. The 3D coordinate of the 3D feature point is determined based on the 2D coordinate of the 2D feature point. The 3D coordinate lies on the line of sight of the 2D feature point, and its depth is determined so that the projection of the 3D virtual model from that depth creates an overlay approximately matching the organ observed in the 2D image.
    Type: Grant
    Filed: May 16, 2022
    Date of Patent: May 21, 2024
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Xiaonan Zang, Guo-Qing Wei, Cheng-Chung Liang, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Patent number: 11919170
    Abstract: The present teaching relates to a method and system for path planning. Information of a current pose of a robotic arm having a plurality of operable segments is obtained. The information includes a plurality of values, each of which corresponds to an angle formed between consecutive operable segments of the robotic arm. A desired pose that the robotic arm needs to reach is also obtained. An angle step value is computed for the current pose of the robotic arm based on a function of a distance between the current pose and the desired pose, wherein the angle step value is to be used to determine a plurality of candidate next poses of the plurality of operable segments. One or more candidate next poses are selected based on at least one criterion, and a trajectory is determined from the current pose to the desired pose based on the selected next poses.
    Type: Grant
    Filed: December 13, 2019
    Date of Patent: March 5, 2024
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Firdous Saleheen, Li Fan, Xiaolan Zeng, Jianzhong Qian
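A toy version of the distance-dependent angle-step search might look like this. All names are illustrative: `fk` is a user-supplied forward-kinematics function, the step is a clipped linear function of the remaining distance, and a greedy best-of-candidates rule stands in for the selection criterion.

```python
import numpy as np
from itertools import product

def plan_trajectory(theta0, fk, goal, gain=0.5, min_step=1e-4,
                    max_step=0.2, iters=200):
    """Greedy angle-step search: the step shrinks as the arm nears the goal."""
    theta = np.array(theta0, dtype=float)
    traj = [theta.copy()]
    for _ in range(iters):
        d = np.linalg.norm(fk(theta) - goal)
        if d < 1e-3:
            break
        step = np.clip(gain * d, min_step, max_step)  # distance-dependent step
        # candidate next poses: +/- one step (or none) per joint angle
        cands = [theta + step * np.array(delta)
                 for delta in product((-1, 0, 1), repeat=len(theta))]
        theta = min(cands, key=lambda t: np.linalg.norm(fk(t) - goal))
        traj.append(theta.copy())
    return traj
```

Shrinking the step with the remaining distance lets the search move coarsely when far from the goal and refine as it approaches, which is the intuition behind computing the angle step from the pose distance.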
  • Patent number: 11900541
    Abstract: The present teaching relates to method, system, medium, and implementations for estimating the 3D coordinate of a 3D virtual model. Two pairs of feature points are obtained. Each of the pairs includes a respective 2D feature point on an organ observed in a 2D image, acquired during a medical procedure, and a respective corresponding 3D feature point from a 3D virtual model, constructed for the organ prior to the procedure based on a plurality of images of the organ. The depths of the first and the second 3D feature points are substantially the same. A first 3D coordinate of the first 3D feature point and a second 3D coordinate of the second 3D feature point are automatically determined based on the pairs of feature points so that a first distance between the determined first 3D coordinate and the determined second 3D coordinate equals a second distance between a first actual 3D coordinate of the first 3D feature point and a second actual 3D coordinate of the second 3D feature point in the 3D virtual model.
    Type: Grant
    Filed: May 16, 2022
    Date of Patent: February 13, 2024
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Xiaonan Zang, Guo-Qing Wei, Cheng-Chung Liang, Li Fan, Xiaolan Zeng, Jianzhong Qian
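For the equal-depth case, the shared depth has a closed form: back-projecting each 2D point through the camera intrinsics gives a ray direction with unit z-component, and because both 3D points sit at the same depth z, the distance between them scales linearly with z. A small sketch under that pinhole-camera assumption (function and variable names are illustrative):

```python
import numpy as np

def common_depth(u1, u2, K, model_distance):
    """Solve for the shared depth z such that the two back-projected
    2D points end up model_distance apart (both points at depth z)."""
    Kinv = np.linalg.inv(K)
    r1 = Kinv @ np.array([u1[0], u1[1], 1.0])  # ray, unit z-component
    r2 = Kinv @ np.array([u2[0], u2[1], 1.0])
    # ||z*r1 - z*r2|| = z * ||r1 - r2||  =>  z = D / ||r1 - r2||
    z = model_distance / np.linalg.norm(r1 - r2)
    return z, z * r1, z * r2
```

Matching the inter-point distance to the distance measured in the 3D virtual model pins down the depth, and with it both 3D coordinates.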
  • Patent number: 11822340
    Abstract: The present teaching relates to method, system, medium, and implementations for robot path planning. Depth data of obstacles, acquired by depth sensors deployed in a 3D robot workspace and represented with respect to a sensor coordinate system, is transformed into depth data with respect to a robot coordinate system. The 3D robot workspace is discretized to generate 3D grid points representing a discretized 3D robot workspace. Based on the depth data with respect to the robot coordinate system, binarized values are assigned to at least some of the 3D grid points to generate a binarized representation for the obstacles present in the 3D robot workspace. With respect to one or more sensing points associated with a part of a robot, it is determined whether the part is to collide with any obstacle. Based on the determination, a path is planned for the robot to move along while avoiding any obstacle.
    Type: Grant
    Filed: March 5, 2021
    Date of Patent: November 21, 2023
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Guo-Qing Wei, Yuanfeng Mao, Li Fan, Xiaolan Zeng, Jianzhong Qian
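A minimal sketch of the binarized-workspace idea: depth points (assumed already transformed into the robot frame) are dropped into a coarse 3D occupancy grid, and a robot part is flagged as colliding when any of its sensing points falls in an occupied cell. Names and grid parameters are illustrative; the path planner that consumes this check is not shown.

```python
import numpy as np

def build_occupancy(points_robot_frame, origin, resolution, shape):
    """Binarize depth points (in the robot frame) onto a 3D grid."""
    grid = np.zeros(shape, dtype=bool)
    idx = np.floor((points_robot_frame - origin) / resolution).astype(int)
    ok = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    grid[tuple(idx[ok].T)] = True  # mark occupied cells
    return grid

def in_collision(grid, origin, resolution, sensing_points):
    """A part collides if any sensing point lands in an occupied cell."""
    idx = np.floor((np.atleast_2d(sensing_points) - origin) / resolution).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid.shape)), axis=1)
    return bool(np.any(grid[tuple(idx[inside].T)]))
```

Discretizing once up front makes each collision query a constant-time grid lookup per sensing point, which is what makes the check cheap enough to run inside a planner loop.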
  • Publication number: 20230077638
    Abstract: The present teaching relates to a method and system for path planning. A target is tracked via one or more sensors. Information of a desired pose of an end-effector with respect to the target and a current pose of the end-effector is obtained. Also obtained is the minimum distance permitted between the arm including the end-effector and each of at least one obstacle identified between the current pose of the end-effector and the target. A previously learned weighting factor is retrieved, and a cost is computed based on a cost function in accordance with the smallest distance between the arm including the end-effector and the at least one obstacle, weighted by the weighting factor. A trajectory is computed from the current pose to the desired pose by minimizing the cost function.
    Type: Application
    Filed: November 18, 2022
    Publication date: March 16, 2023
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Firdous Saleheen, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Publication number: 20220378507
    Abstract: A method, system, medium, and implementations for computer-aided preoperative surgical planning are described. Input data acquired with respect to a part of a patient is received by the system. The part corresponds to an organ, e.g., lung, of the patient to be operated on and includes one or more lesions to be removed during an operation. Then, an anatomic 3D model of the part of the patient is generated. Based on the generated anatomic 3D model, a preoperative plan for linear-cutting stapler resection of the one or more lesions from the organ to be carried out during the operation is obtained. The stapler cartridge size and the staple length are estimated based on the preoperative plan. Further, the resection based on the preoperative plan is visualized.
    Type: Application
    Filed: October 23, 2020
    Publication date: December 1, 2022
    Inventors: Guo-Qing Wei, Cheng-Chung Liang, Xiaolan Zeng, Li Fan, Jianzhong Qian
  • Publication number: 20220375173
    Abstract: The present teaching relates to method, system, medium, and implementations for estimating the 3D coordinate of a 3D virtual model. Two pairs of feature points are obtained. Each of the pairs includes a respective 2D feature point on an organ observed in a 2D image, acquired during a medical procedure, and a respective corresponding 3D feature point from a 3D virtual model, constructed for the organ prior to the procedure based on a plurality of images of the organ. The depths of the first and the second 3D feature points are substantially the same. A first 3D coordinate of the first 3D feature point and a second 3D coordinate of the second 3D feature point are automatically determined based on the pairs of feature points so that a first distance between the determined first 3D coordinate and the determined second 3D coordinate equals a second distance between a first actual 3D coordinate of the first 3D feature point and a second actual 3D coordinate of the second 3D feature point in the 3D virtual model.
    Type: Application
    Filed: May 16, 2022
    Publication date: November 24, 2022
    Inventors: Xiaonan Zang, Guo-Qing Wei, Cheng-Chung Liang, Li Fan, Xiaolan Zeng, Jianzhong Qian
  • Patent number: 11504849
    Abstract: The present teaching relates to a method and system for path planning. A target is tracked via one or more sensors. Information of a desired pose of an end-effector with respect to the target and a current pose of the end-effector is obtained. Also obtained is the minimum distance permitted between the arm including the end-effector and each of at least one obstacle identified between the current pose of the end-effector and the target. A previously learned weighting factor is retrieved, and a cost is computed based on a cost function in accordance with the smallest distance between the arm including the end-effector and the at least one obstacle, weighted by the weighting factor. A trajectory is computed from the current pose to the desired pose by minimizing the cost function.
    Type: Grant
    Filed: November 22, 2019
    Date of Patent: November 22, 2022
    Assignee: EDDA TECHNOLOGY, INC.
    Inventors: Yuanfeng Mao, Guo-Qing Wei, Firdous Saleheen, Li Fan, Xiaolan Zeng, Jianzhong Qian
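The cost structure described above can be illustrated with a toy objective that trades path length against a clearance penalty weighted by a factor w (hand-picked here; learned in the patent). For brevity, clearance is evaluated only at the waypoints and obstacles are points; all names are illustrative.

```python
import numpy as np

def clearance(p, obstacles):
    """Smallest distance from a point to any obstacle."""
    return min(np.linalg.norm(p - o) for o in obstacles)

def cost(waypoints, obstacles, w=1.0, d_min=0.2):
    """Path length plus a weighted penalty on the smallest clearance."""
    pts = np.asarray(waypoints)
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    smallest = min(clearance(p, obstacles) for p in pts)
    # penalize only when the smallest clearance drops below d_min
    return length + w * max(0.0, d_min - smallest) ** 2
```

Minimizing this over candidate trajectories pushes the path just far enough around obstacles: a larger w buys more clearance at the price of a longer path, which is the trade-off the learned weighting factor tunes.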
  • Publication number: 20220366649
    Abstract: The present teaching relates to method, system, medium, and implementations for estimating 3D coordinate of a 3D virtual model. Two pairs of feature points are obtained. Each of the pairs includes a respective 2D feature point on an organ observed in a 2D image, acquired during a medical procedure, and a respective corresponding 3D feature point from a 3D virtual model, constructed for the organ prior to the procedure based on a plurality of images of the organ. The first and the second 3D feature points have different depths. A 3D coordinate of a 3D feature point is determined based on the pairs of feature points so that a projection of the 3D virtual model from the 3D coordinate substantially matches the organ observed in the 2D image.
    Type: Application
    Filed: May 16, 2022
    Publication date: November 17, 2022
    Inventors: Xiaonan Zang, Guo-Qing Wei, Cheng-Chung Liang, Li Fan, Xiaolan Zeng, Jianzhong Qian