Abstract: An endoscopic navigation method includes the steps of: receiving an image from an endoscopic navigation system; performing image classification to determine whether the image is usable; performing a first image process on the image to filter out dark areas of the image to produce a first processed image; performing a first determination procedure to identify the dark areas of the first processed image; producing a first result image for indicating lumen direction; performing a second image process to filter out fold curves of the image to produce a second processed image when there is no dark area in the image; performing a second determination procedure to identify the fold curves of the second processed image; producing a second result image for indicating lumen direction according to the fold curves; and outputting the first or the second result image to a display device to assist users in operating the photographic device.
Type:
Application
Filed:
June 22, 2010
Publication date:
June 30, 2011
Applicants:
NATIONAL YUNLIN UNIVERSITY OF SCIENCE AND TECHNOLOGY, HSIU-PO WANG
Inventors:
Tsung-Chun Lee, Syu-Jyun Peng, Hsuan-Ting Chang, Hsiu-Po Wang
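The abstract above walks through a dark-region branch for inferring lumen direction from an endoscopic frame. Below is a minimal, hypothetical sketch of that idea only, not the patented procedure: it thresholds the frame for dark pixels and returns a direction from the image center toward the largest dark region. The use of OpenCV/NumPy, the threshold value, and the function name are assumptions.

```python
# Minimal sketch (not the patented procedure): estimate lumen direction by
# locating the darkest connected region in an endoscopic frame.
# Assumptions: OpenCV/NumPy are available; the dark-area threshold (60) is arbitrary.
import cv2
import numpy as np

def lumen_direction(frame_bgr, dark_threshold=60):
    """Return a unit vector from the image center toward the largest dark region,
    or None when no dark area is found (the fold-curve branch would apply instead)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (9, 9), 0)               # suppress specular noise
    _, dark = cv2.threshold(gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                                        # no dark area in the image
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]      # centroid of the dark region
    h, w = gray.shape
    vec = np.array([cx - w / 2.0, cy - h / 2.0])
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 1e-6 else np.zeros(2)
```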
Abstract: Systems and methods for navigation and identification for endoscopic kidney surgery may include generating a map of an internal space of a patient's collecting system, including segmenting preoperative CT scans, using localization and three-dimensional reconstruction techniques on endoscopic video to create a point cloud, and registering the point cloud to the segmented CT scans. The systems and methods may include tracking a tip of the endoscope during the endoscopic kidney surgery using localization and three-dimensional reconstruction techniques. The systems and methods may include identifying and tracking kidney stones during the endoscopic kidney surgery using computational models.
Type:
Application
Filed:
March 29, 2024
Publication date:
October 3, 2024
Applicant:
Vanderbilt University
Inventors:
Nicholas L. Kavoussi, Ipek Oguz, Zachary Stoebner, Ayberk Acar, Jie Ying Wu, Daiwei Lu
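The abstract above describes registering an endoscopy-derived point cloud to segmented CT. As a rough illustration of the registration step only, the following sketch runs a few ICP iterations with a Kabsch pose update; the rigid-transform assumption, the SciPy nearest-neighbor matching, and the function names are assumptions, not details from the publication.

```python
# Minimal sketch of rigid point-cloud-to-CT registration via ICP.
# Assumptions: a rigid alignment is adequate, and both clouds are NumPy arrays of shape (N, 3).
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Kabsch: least-squares rotation R and translation t mapping src onto dst."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp(endoscope_points, ct_surface_points, iterations=30):
    """Iteratively match each endoscopic point to its nearest CT surface point
    and re-estimate the rigid pose."""
    tree = cKDTree(ct_surface_points)
    pts = endoscope_points.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(pts)
        R, t = best_rigid_transform(pts, ct_surface_points[idx])
        pts = pts @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total   # maps the original endoscope points into the CT frame
```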
Abstract: A medical system comprises an elongate instrument including a camera configured to capture at least one real-time image of anatomy within a patient anatomy. The system further comprises a processor configured to display, on one or more display screens, a three-dimensional patient computer model of the patient anatomy. The processor is further configured to display, over the three-dimensional patient computer model, a representation of a view angle of the elongate instrument. The representation of the view angle is displayed so as to appear to project from a synthetic representation of a distal tip of the elongate instrument. The processor is further configured to display the at least one captured real-time image.
Abstract: A medical system comprises an elongate instrument including a camera configured to capture at least one real-time image of anatomy within a patient anatomy. The medical system further comprises a processor configured to display, on one or more display screens: a three-dimensional patient computer model of the patient anatomy; a synthetic representation of the elongate instrument registered to the three-dimensional patient computer model; over the patient computer model, a representation of a view angle of the elongate instrument, the representation of the view angle being displayed so as to appear to project from a distal tip of the synthetic representation of the elongate instrument; and in a position based on the registration of the synthetic representation of the elongate instrument to the patient computer model, the at least one captured real-time image so as to appear to project from the distal tip of the synthetic representation of the elongate instrument.
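Both abstracts above describe drawing the instrument's view angle so it appears to project from the distal tip of a synthetic scope model registered to the patient model. Below is a minimal geometric sketch of that overlay, under the assumption that the tip pose is already expressed in the patient-model frame; the field-of-view values and function name are made up for illustration.

```python
# Minimal geometric sketch (not the claimed implementation): compute the corner rays
# of the camera's view frustum at the distal tip, in patient-model coordinates, so a
# renderer could draw the "view angle" projecting from the synthetic tip.
import numpy as np

def view_frustum_rays(tip_position, tip_rotation, fov_h_deg=70.0, fov_v_deg=55.0, length=30.0):
    """tip_rotation: 3x3 matrix whose columns are the camera right/up/forward axes
    in the patient-model frame. Returns the four frustum corner points; connecting
    the tip to each corner visualizes the view angle."""
    right, up, forward = tip_rotation[:, 0], tip_rotation[:, 1], tip_rotation[:, 2]
    th = np.tan(np.radians(fov_h_deg) / 2.0)
    tv = np.tan(np.radians(fov_v_deg) / 2.0)
    corners = []
    for sx, sy in [(-1, -1), (1, -1), (1, 1), (-1, 1)]:
        direction = forward + sx * th * right + sy * tv * up
        direction /= np.linalg.norm(direction)
        corners.append(tip_position + length * direction)
    return np.array(corners)
```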
Abstract: Navigation guidance is provided to an operator of an endoscope by determining a current position and shape of the endoscope relative to a reference frame, generating an endoscope computer model according to the determined position and shape, and displaying the endoscope computer model along with a patient computer model referenced to the reference frame so as to be viewable by the operator while steering the endoscope within the patient.
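The sketch below illustrates only the endoscope-model-generation step described above, under the assumption that the determined shape is available as sampled 3D points expressed in the reference frame; it resamples those points into an evenly spaced polyline that a viewer could render alongside the patient model. The function name and sample count are assumptions.

```python
# Minimal sketch: build a displayable scope model from shape-sensor samples.
import numpy as np

def endoscope_polyline(shape_samples, n_points=50):
    """shape_samples: (N, 3) points along the scope body, base to tip, in the
    reference frame. Returns (n_points, 3) evenly spaced points for display."""
    seg = np.diff(shape_samples, axis=0)
    arc = np.concatenate([[0.0], np.cumsum(np.linalg.norm(seg, axis=1))])   # arc length
    targets = np.linspace(0.0, arc[-1], n_points)
    return np.column_stack([
        np.interp(targets, arc, shape_samples[:, k]) for k in range(3)
    ])
```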
Abstract: A surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the cavity of the patient. The surgical instrument navigation system includes: a surgical instrument; an imaging device which is operable to capture scan data representative of an internal region of interest within a given patient; a tracking subsystem that employs electro-magnetic sensing to capture in real-time position data indicative of the position of the surgical instrument; a data processor which is operable to render a volumetric, perspective image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric perspective image of the patient.
Type:
Application
Filed:
August 22, 2011
Publication date:
March 22, 2012
Inventors:
Mark Hunter, Marc Wennogle, Troy Holsing
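The system above renders a volumetric perspective image from the point of view of an electromagnetically tracked instrument. Below is a minimal sketch of the pose bookkeeping such rendering needs; the transform names and the fixed tip offset are assumptions and do not come from the filing.

```python
# Minimal sketch (assumed transform names, not the product's API): convert an
# EM-tracked instrument pose into a virtual-camera pose in CT coordinates, which a
# volume renderer could use to draw the scene from the instrument's point of view.
import numpy as np

def instrument_camera_pose(T_em_to_ct, T_sensor_to_em, T_tip_offset):
    """Each argument is a 4x4 homogeneous transform:
      T_em_to_ct     -- registration of the EM field-generator frame to the CT volume
      T_sensor_to_em -- real-time pose of the sensor coil in the EM frame
      T_tip_offset   -- fixed offset from the sensor coil to the instrument tip/lens
    Returns the 4x4 camera-to-CT pose for perspective rendering."""
    return T_em_to_ct @ T_sensor_to_em @ T_tip_offset

def world_to_camera(camera_to_world):
    """Invert the camera pose to get the view matrix a renderer typically expects."""
    R, t = camera_to_world[:3, :3], camera_to_world[:3, 3]
    view = np.eye(4)
    view[:3, :3] = R.T
    view[:3, 3] = -R.T @ t
    return view
```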
Abstract: A surgical instrument navigation system is provided that visually simulates a virtual volumetric scene of a body cavity of a patient from a point of view of a surgical instrument residing in the cavity of the patient. The surgical instrument navigation system includes: a surgical instrument; an imaging device which is operable to capture scan data representative of an internal region of interest within a given patient; a tracking subsystem that employs electro-magnetic sensing to capture in real-time position data indicative of the position of the surgical instrument; a data processor which is operable to render a volumetric, perspective image of the internal region of interest from a point of view of the surgical instrument; and a display which is operable to display the volumetric perspective image of the patient.
Type:
Application
Filed:
January 6, 2016
Publication date:
December 8, 2016
Inventors:
Mark Hunter, Marc Wennogle, Troy Holsing
Abstract: A method and system for scene parsing and model fusion in laparoscopic and endoscopic 2D/2.5D image data is disclosed. A current frame of an intra-operative image stream including a 2D image channel and a 2.5D depth channel is received. A 3D pre-operative model of a target organ segmented in pre-operative 3D medical image data is fused to the current frame of the intra-operative image stream. Semantic label information is propagated from the pre-operative 3D medical image data to each of a plurality of pixels in the current frame of the intra-operative image stream based on the fused pre-operative 3D model of the target organ, resulting in a rendered label map for the current frame of the intra-operative image stream. A semantic classifier is trained based on the rendered label map for the current frame of the intra-operative image stream.
Type:
Application
Filed:
June 5, 2015
Publication date:
June 21, 2018
Inventors:
Stefan Kluckner, Ali Kamen, Terrence Chen
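This abstract describes propagating semantic labels from the fused pre-operative organ model to pixels of the current 2D/2.5D frame and training a classifier on the resulting rendered label map. Below is a simplified sketch of that idea, with assumed pinhole intrinsics and an off-the-shelf random forest standing in for whatever classifier the method actually uses; none of the names come from the publication.

```python
# Minimal sketch of label propagation from a fused 3D model to 2D/2.5D pixels,
# followed by training a per-pixel semantic classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def render_label_map(model_points_cam, model_labels, K, image_shape):
    """model_points_cam: (N, 3) labeled model vertices already fused into the camera frame.
    K: 3x3 pinhole intrinsics. Returns an (H, W) label map; 0 marks unlabeled pixels."""
    h, w = image_shape
    label_map = np.zeros((h, w), dtype=np.int32)
    uvw = (K @ model_points_cam.T).T                     # pinhole projection
    valid = uvw[:, 2] > 1e-6
    u = (uvw[valid, 0] / uvw[valid, 2]).astype(int)
    v = (uvw[valid, 1] / uvw[valid, 2]).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    label_map[v[inside], u[inside]] = model_labels[valid][inside]
    return label_map

def train_pixel_classifier(rgb, depth, label_map):
    """Fit a per-pixel semantic classifier on the labeled pixels of the current frame,
    using color and depth as features."""
    features = np.concatenate([rgb.reshape(-1, 3), depth.reshape(-1, 1)], axis=1)
    labels = label_map.reshape(-1)
    mask = labels > 0
    clf = RandomForestClassifier(n_estimators=50)
    return clf.fit(features[mask], labels[mask])
```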