MEDICAL SYSTEM AND MEDICAL SYSTEM OPERATING METHOD

- Olympus

A medical system and an operating method of a medical system are disclosed. The medical system includes: an endoscope; a controller that can operate the endoscope and generate a display image from a captured image; and a display for displaying the display image. The controller can record a model of an organ of concern generated in a preoperative plan, extract feature points in the model, extract a distance between each feature point and a reference position set in the model before surgery, extract corresponding points in the display image, and associate a first coordinate system of the model with a second coordinate system of a display space, based on the feature points and the corresponding points, to set a path for the endoscope to move such that a region of interest in the organ of concern corresponding to the reference position is displayed in the display image.

Description
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation application based on PCT Patent Application No. PCT/JP2019/006601, filed on Feb. 21, 2019, which claims priority to U.S. Provisional Patent Application No. 62/633,190, filed on Feb. 21, 2018. The contents of both the PCT Patent Application and the United States Patent Application are incorporated herein by reference.

BACKGROUND

Technical Field

The present disclosure relates to a medical system for performing a treatment through a hole formed in an abdominal wall or the like and a method for operating the medical system.

Background Art

Conventionally, in laparoscopic surgery, a method of performing a treatment by inserting a treatment tool, an endoscope, or the like through separate holes (openings) formed in an abdominal wall has been used. A scopist operating an endoscope needs to promptly provide an operator with a field of view of the endoscope optimal for treatment, for example, a field of view including a region of interest such as an organ of concern.

In order to provide the operator with the optimal endoscope field of view quickly, it is necessary for the scopist to quickly move the distal end of the insertion portion of the endoscope to the vicinity of the region of interest in the organ of concern so that the image captured by the endoscope includes the region of interest of the operator.

SUMMARY

The present disclosure provides a medical system and a method of operating a medical system that support the operation of inserting the distal end of an insertion portion of an endoscope from an arbitrary initial position to a target position using a model generated by a preoperative plan.

According to an aspect of the present disclosure, a medical system includes: an endoscope that has an imaging section and can be operated by being electrically driven; a controller that can operate the endoscope and generate a display image from an image captured by the imaging section; and a display that can display the display image. The controller can record a model of an organ of concern generated in a preoperative plan, extract a plurality of feature points in the model, and extract a distance between each of the feature points and a reference position set in the model before surgery. The controller can also extract a plurality of corresponding points corresponding to the plurality of feature points in the display image, and associate a first coordinate system of the model with a second coordinate system of a display space displayed by the display image, based on the plurality of feature points and the plurality of corresponding points, to set a path for the endoscope to move such that a region of interest in the organ of concern that corresponds to the reference position in the model is displayed in the display image.

The present disclosure also provides an operating method of the medical system that includes recording a model of an organ of concern generated in a preoperative plan, extracting a plurality of feature points in the model, extracting a distance between each of the feature points and a reference position set in the model before surgery, extracting a plurality of corresponding points corresponding to the plurality of feature points in the display image, and associating a first coordinate system of the model with a second coordinate system of a display space displayed by the display image, based on the plurality of feature points and the plurality of corresponding points, to set a path for the endoscope to move such that a region of interest in the organ of concern that corresponds to the reference position in the model is displayed in the display image.

According to the medical system and the medical system operating method disclosed herein, the operation of directing the field of view of an endoscope toward a region of interest in an organ of concern can be supported.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram showing an overall configuration of a medical system according to an exemplary embodiment.

FIG. 2 is a diagram showing a hardware configuration of the medical system.

FIG. 3 is a diagram showing an example of the overall configuration of a control section of the medical system.

FIG. 4 is a diagram showing an example of the overall configuration of a control section of the medical system.

FIG. 5 shows a model of an organ of concern generated in a preoperative plan.

FIG. 6 is a perspective view when a model is imaged from the virtual imaging section shown in FIG. 5.

FIG. 7 shows a plurality of feature points extracted from the model.

FIG. 8 is a diagram showing an insertion portion of an endoscope inserted into an abdominal cavity and an organ of concern.

FIG. 9 is a display image generated by the control device from the captured image captured by the imaging section.

FIG. 10 is a flowchart of a control in the automatic mode of the control section.

FIG. 11 is a diagram showing the insertion portion of the endoscope and the organ of concern after the automatic mode is completed.

FIG. 12 is a display image after the automatic mode is completed.

FIG. 13 shows an overall configuration example of an exemplary endoscope.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

An embodiment of the present disclosure will be described with reference to FIGS. 1 to 12. Dimensions and the like of each component are appropriately adjusted to make the drawings easy to see.

FIG. 1 is a diagram showing an overall configuration of a medical system 100 according to the present embodiment.

As shown in FIG. 1, the medical system 100 includes a treatment tool 1, an endoscope 2, a control device 3, a display device 4, and an input device 5. The medical system 100 is a system that supports a treatment performed by inserting the treatment tool 1 and the endoscope 2 through separate holes (openings) formed in the abdominal wall in laparoscopic surgery.

As shown in FIG. 1, the treatment tool 1 has a long insertion portion 10 that can be inserted into a patient's abdominal cavity, and an operation portion 11 provided at a proximal end of the insertion portion 10. The operator passes the insertion portion 10 through a trocar punctured in the abdomen of the patient, and introduces the insertion portion 10 into the abdominal cavity. Depending on the type of treatment and the condition of the affected part, the operator may introduce a plurality of treatment tools 1 into the abdominal cavity.

As shown in FIG. 1, the insertion portion 10 has a treatment portion 12 at the distal end that treats an affected part of a patient. In the present embodiment, the treatment portion 12 is a gripping mechanism including a pair of gripping members 12a.

The operation portion 11 is a member that operates the pair of gripping members 12a. The operation portion 11 has a handle. The pair of gripping members 12a of the treatment portion 12 are opened and closed by moving the handle relatively to the other parts of the operation portion 11. The operator can operate the treatment portion 12 while holding the operation portion 11 with one hand.

FIG. 2 is a diagram showing a hardware configuration of the medical system 100 excluding the treatment tool 1.

As shown in FIGS. 1 and 2, the endoscope 2 has a long insertion portion 20 that can be inserted into a patient's abdominal cavity, and an arm 21. The operator passes the insertion portion 20 through a trocar punctured in the abdomen of the patient, and introduces the insertion portion 20 into the abdominal cavity.

The insertion portion 20 is provided at the distal end thereof with an imaging section 22 having a lens and an imaging element for photographing the inside of the patient's abdomen. The insertion portion 20 introduced into the abdominal cavity is arranged at a position where the imaging section 22 can photograph the affected part of the abdomen to be treated. The imaging section 22 may have a function of an optical zoom or an electronic zoom.

The insertion portion 20 may further have an active bending portion which bends actively. By bending the active bending portion provided in a part of the insertion portion 20, the direction of the lens and the imaging element of the imaging section 22 can be changed.

The arm 21 is an electrically driven robot arm having at least one joint 23, as shown in FIG. 1. The distal end of the arm 21 is connected to the proximal end of the insertion portion 20 of the endoscope 2, and the arm 21 can move the insertion portion 20.

The joint 23 is a part that bends around a rotation axis, and may be one that actively bends by a motor or the like, or one that passively bends by advancing or retracting a connected wire or the like. Inside the arm 21, a control signal line, a wire, and the like for controlling the bending operation of the joint 23 are wired. Also wired inside the arm 21 are a control signal line that controls the imaging section 22 and a transmission signal line that transfers the captured image captured by the imaging section 22.

The control device 3 has a drive section 31, an image processing section 32, and a control section 33, as shown in FIG. 2. The control device 3 controls the arm 21 and the like based on an input from the input device 5. The control device 3 generates a display image from the captured image captured by the imaging section 22 of the endoscope 2 and transfers the display image to the display device 4.

The drive section 31 drives the joint 23 of the arm 21 and the insertion portion 20. In a case where the joint 23 actively bends, the control section 33 generates a control signal to the drive section 31 to operate the joint 23. As a result, the joint 23 can be bent by the drive section 31. As another aspect, in a case where the insertion portion 20 has an active bending portion, the control section 33 generates a control signal for controlling the active bending portion. The drive section 31 generates power for operating the active bending portion according to the generated control signal. As a result, the active bending portion can be bent by the power transmitted through the arm 21.

That is, the drive section 31 can change the field of view of the endoscope 2 by driving at least one of the arm 21 and the insertion portion 20.

The image processing section 32 is connected to the transmission signal line of the captured image captured by the imaging section 22, and acquires the captured image via the transmission signal line. Further, the image processing section 32 generates a display image for display from the captured image. The image processing section 32 may perform image processing such as image format conversion and contrast adjustment on the captured image as needed. The generated display image is transferred to the display device 4 at a predetermined transfer timing.
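For illustration only, a linear contrast adjustment of the kind the image processing section 32 might apply to a captured frame could be sketched as follows; the function name and the gain/bias parameters are hypothetical assumptions, not part of the disclosed system:

```python
import numpy as np

def adjust_contrast(image, gain, bias):
    """Linear contrast adjustment, out = gain * in + bias, clipped to
    the 8-bit range. One example of per-frame image processing a
    display-image pipeline might perform."""
    out = gain * image.astype(np.float32) + bias
    return np.clip(out, 0, 255).astype(np.uint8)

# e.g. a brighter, higher-contrast version of a captured frame:
# display_frame = adjust_contrast(captured_frame, gain=1.5, bias=20)
```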

The image processing section 32 can generate a display image by superimposing an image such as a figure or a character generated by the control section 33 on the captured image, or by replacing part of the captured image with such an image. For example, the image processing section 32 can generate a display image by superimposing, on the captured image, a character image related to a warning to the operator or to operation support. The image such as the figure or the character may be generated not by the control section 33 but by the image processing section 32 based on an instruction from the control section 33.

The control section 33 receives an operation of the input device 5 and an image acquired by the image processing section 32 as inputs, and controls the drive section 31 and the image processing section 32 based on the inputs.

In the present embodiment, the control section 33 has two types of operation modes, a manual mode and an automatic mode. The control section 33 controls the drive section 31 and the image processing section 32 based on one operation mode selected from the two operation modes.

The manual mode is an operation mode in which the scopist operates the input device 5 to directly operate the joint 23 of the arm 21 of the endoscope 2 and the like.

The automatic mode is an operation mode in which the joints 23 and the like of the arm 21 of the endoscope 2 are automatically operated by the control section 33 based on the image acquired by the image processing section 32, and the visual field of the endoscope 2 is automatically adjusted.

FIGS. 3 and 4 are diagrams showing an example of the overall configuration of the control section 33.

As shown in FIG. 3, the control section 33 is a program-executable device (computer) including a CPU (Central Processing Unit) 34, a memory 35 capable of reading a program, a storage section 36, and an input/output control section 37.

The function of the control section 33 is realized by the CPU 34 executing a program provided to the control section 33. At least a part of the function of the control section 33 may be configured by a dedicated logic circuit or the like.

The storage section 36 is a non-volatile recording medium that stores the above-described programs and necessary data. The storage section 36 includes, for example, a ROM, a hard disk, and the like. The program recorded in the storage section 36 is read into the memory 35 and executed by the CPU 34.

The input/output control section 37 receives input data from the input device 5 and the image processing section 32, and transfers the data to the CPU 34 and the like. In addition, when the CPU 34 controls the drive section 31 and the image processing section 32, the input/output control section 37 generates a control signal and the like for the drive section 31 and the image processing section 32 based on an instruction from the CPU 34.

Here, the control section 33 is not limited to a device provided in one piece of hardware. For example, the control section 33 may be configured by separating the CPU 34, the memory 35, the storage section 36, and the input/output control section 37 as separate hardware, and connecting the hardware via a communication line. Alternatively, the control section 33 may be realized as a cloud system by separating the storage section 36 and connecting the storage section 36 via a communication line.

Here, when the image processing section 32 performs the processing of the captured image, the memory 35 of the control section 33 may be used to store temporary data being processed. Further, a part or all of the processing of the captured image performed by the image processing section 32 may be performed by the CPU 34 of the control section 33 by executing a program.

The control section 33 may further include components necessary for the operation of the control device 3, other than the CPU 34, the memory 35, the storage section 36, and the input/output control section 37 shown in FIG. 3. For example, as shown in FIG. 4, the control section 33 may further include an image calculation section 38 that performs a part or all of the specific image processing and the image recognition processing. By further including the image calculation section 38, the control section 33 can execute specific image processing and image recognition processing at high speed.

The display device 4 is a device that displays a display image generated by the image processing section 32. As the display device 4, a known display device such as an LCD display can be used. The display device 4 may be a head-mounted display or a projector.

The input device 5 includes an operation input section 51, and a mode selection section 52, as shown in FIG. 2. The input device 5 is a device for inputting information necessary for the operation of the medical system 100.

The operation input section 51 is a device for inputting an operation of the joint 23 of the arm 21 of the endoscope 2. When the imaging section 22 has a zoom function, the operation input section 51 can also operate the zoom function. Also, the scopist can operate the operation input section 51 to operate the joint 23 of the arm 21 and the like.

The operation input section 51 may be configured by a joystick or by a touch panel, as shown in FIG. 1. It may also be an operation input device having an arm shape similar to that of the arm 21. The LCD display of the display device 4 and the touch panel of the operation input section 51 may be configured integrally.

When the operation input section 51 is operated, the contents of the operation are transferred to the control section 33. The control section 33 calculates an operation amount of the joint 23 of the arm 21 corresponding to the operation contents, and controls the drive section 31 so that the joint 23 operates with the calculated operation amount.

When the operation mode of the control section 33 is the manual mode, the joint 23 and the like of the arm 21 of the endoscope 2 are directly operated by the operation of the operation input section 51.

On the other hand, when the operation mode of the control section 33 is the automatic mode, the operation of the operation input section 51 is invalidated by the control section 33, and the joint 23 and the like of the arm 21 of the endoscope 2 cannot be operated. The joint 23 and the like of the arm 21 of the endoscope 2 are operated by the control section 33.

The mode selection section 52 is a device that selects which of the two operation modes of the control section 33 the control section 33 operates in. The mode selection section 52 may be configured by a switch or may be configured by a touch panel. Further, the mode selection section 52 may be configured integrally with the operation input section 51. The operation mode selection of the control section 33 by the mode selection section 52 can be performed at any time.

(Operation of Medical System 100)

The operation and the operating method of the medical system 100 will be described with reference to FIGS. 5 to 12 by taking laparoscopic surgery as an example. In the present embodiment, the organ of concern is the gallbladder G.

Prior to laparoscopic surgery, the operator or the like performs a preoperative plan for generating a model (shape data or image) of the organ of concern using a known method. The operator or the like generates three-dimensional shape data of the organ of concern from a plurality of CT images, for example. The three-dimensional coordinate system of the three-dimensional shape data generated in the preoperative plan is referred to as “model coordinate system (first coordinate system) C1”.

FIG. 5 shows a model M of the organ of concern (gallbladder G) and peripheral organs (liver L, pancreas P, stomach S) generated in the preoperative plan. The model M is associated with three-dimensional coordinates (X1, Y1, and Z1 coordinates) in the model coordinate system C1, and the position of each part of the model M can be specified by the three-dimensional coordinates in the model coordinate system C1.

The imaging center point O shown in FIG. 5 is an imaging center point of the virtual imaging section V installed in the model coordinate system C1. FIG. 6 is a perspective view when the model M is imaged from the virtual imaging section V shown in FIG. 5. The surgeon or the like sets a region to be treated or observed in laparoscopic surgery as a “region of interest R” in the model M in the preoperative plan. In the present embodiment, the region of interest R is set in a part of the organ of concern (gallbladder G) as shown in FIG. 6.

The model M of the organ of concern generated in the preoperative plan is recorded in the storage section 36 of the control section 33 of the control device 3 (model recording step). The range and position of the region of interest R set in the model M are also recorded in the storage section 36.

FIG. 7 shows a plurality of feature points F extracted from the model M. The control device 3 extracts and stores a plurality of feature points F in the model M (feature point extraction step). The plurality of feature points F are extracted using a known feature point extraction method. Each feature point F is stored in the storage section 36 together with its three-dimensional coordinates in the model coordinate system C1 and a feature amount calculated according to a predetermined standard suitable for representing the feature. The extraction and recording of the plurality of feature points F may be performed before surgery or during surgery.

As shown in FIG. 7, the plurality of feature points F may form a graph in which adjacent feature points F are connected. The control device 3 assigns the label "0" to the feature point F closest to the region of interest R in the organ of concern (the gallbladder G). A distance on the graph ("1", "2", "3", etc.) is then set for each feature point F, based on the three-dimensional information of the model M, using the feature point F labeled "0" as the reference.
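The assignment of graph distances from the reference feature point can be illustrated, for example, as a breadth-first search over an adjacency list; the function name and the toy graph below are illustrative assumptions, not the disclosed implementation:

```python
from collections import deque

def label_graph_distances(adjacency, reference):
    """Breadth-first search assigning each feature point its hop
    distance on the graph from the reference feature point, which
    receives the label 0."""
    distances = {reference: 0}
    queue = deque([reference])
    while queue:
        node = queue.popleft()
        for neighbor in adjacency[node]:
            if neighbor not in distances:
                distances[neighbor] = distances[node] + 1
                queue.append(neighbor)
    return distances

# Feature points F0..F4, with F0 closest to the region of interest:
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(label_graph_distances(graph, 0))  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```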

The operation of the medical system 100 during laparoscopic surgery will now be described. The surgeon forms a plurality of holes (openings) for placing trocars in the abdomen of the patient, and punctures a trocar into each hole. Next, the operator passes the insertion portion 10 of the treatment tool 1 through a trocar punctured in the abdomen of the patient, and introduces the insertion portion 10 into the abdominal cavity.

Next, the scopist operates the mode selection section 52 to set the operation mode of the control section 33 to the manual mode. By operating the operation input section 51 and operating the endoscope 2, the scopist passes the insertion portion 20 of the endoscope 2 through a trocar punctured in the abdomen of the patient, and introduces the insertion portion 20 into the abdominal cavity.

FIG. 8 is a diagram showing the insertion portion 20 of the endoscope 2 inserted into the abdominal cavity and the organ of concern T. FIG. 9 is a display image generated by the control device 3 from a captured image captured by the imaging section 22. In the display image shown in FIG. 9, the region of interest R of the organ of concern T is not displayed. The three-dimensional coordinate system of the display space displayed by the display image is referred to as the "display coordinate system (second coordinate system) C2".

The operator wants to operate the endoscope 2 so that the region of interest R is displayed on the display screen. Therefore, the surgeon or the scopist operates the mode selection section 52 to change the operation mode of the control section 33 to the automatic mode. The operation of the operation input section 51 is invalidated by the control section 33, and a scopist or the like cannot operate the joint 23 or the like of the arm 21 of the endoscope 2. Hereinafter, a description will be given along a control flowchart in the automatic mode shown in FIG. 10.

As shown in FIG. 10, when the operation mode of the control section 33 is set to the automatic mode, the control section 33 starts control in the automatic mode (step S10). Next, the control section 33 executes step S11.

In step S11, the control section 33 extracts a plurality of corresponding points A corresponding to a plurality of feature points F in the display image (corresponding point extracting step). The control section 33 extracts a plurality of corresponding points A in the display image based on the feature amounts of the plurality of feature points F stored in the storage section 36 in advance. For the extraction processing, a technique appropriately selected from known template matching techniques and the like is used. When the control section 33 includes the image calculation section 38 that performs part or all of the image matching processing at high speed, the above-described matching processing can be executed at high speed. Next, the control section 33 executes step S12.
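As one illustrative sketch of the corresponding-point extraction, the stored feature amounts can be matched against descriptors extracted from the display image by nearest-neighbor search; the disclosure only requires some known template matching technique, so the function below and its names are assumptions:

```python
import numpy as np

def match_feature_points(model_descriptors, image_descriptors):
    """Match each stored model feature amount (descriptor) to its
    nearest descriptor extracted from the display image, by Euclidean
    distance. Returns a list of (model_index, image_index) pairs, i.e.
    feature point F_i corresponds to image point A_j."""
    image_descriptors = np.asarray(image_descriptors, float)
    matches = []
    for i, d in enumerate(np.asarray(model_descriptors, float)):
        dists = np.linalg.norm(image_descriptors - d, axis=1)
        matches.append((i, int(np.argmin(dists))))
    return matches
```

In practice a ratio test or mutual-consistency check would usually be added to reject ambiguous matches.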

In step S12, the control section 33 associates the model coordinate system C1 of the model M with the display coordinate system C2 of the display space in which the display image is displayed, based on the plurality of feature points F and the plurality of corresponding points A (association step). In the association process, a method appropriately selected from known coordinate conversion methods and the like is used. When the association step is completed, any coordinate position in the model coordinate system C1 of the model M can be converted into the display coordinate system C2 of the display space. Next, the control section 33 executes step S13.
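The association of the two coordinate systems from paired points can be illustrated with a least-squares rigid alignment (the SVD-based Kabsch method). The disclosure only calls for a known coordinate conversion method, so this is one possible sketch under that assumption:

```python
import numpy as np

def associate_coordinates(model_points, display_points):
    """Estimate a rotation R and translation t mapping model
    coordinates (C1) onto display-space coordinates (C2) from paired
    3-D points, by the SVD-based Kabsch method:
        display_point ~= R @ model_point + t."""
    P = np.asarray(model_points, float)
    Q = np.asarray(display_points, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if one appears:
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

With R and t in hand, any model-coordinate position, such as the region of interest, can be mapped into the display space.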

In step S13, the control section 33 calculates the position of the region of interest R in the display coordinate system C2 from the position of the region of interest R set in the model M in the model coordinate system C1 (relative position calculation step). Next, the control section 33 executes step S14.
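The relative position calculation of step S13 amounts to applying the estimated coordinate conversion to the stored region-of-interest position; a minimal sketch, assuming the conversion is expressed as a rotation R and translation t:

```python
import numpy as np

def region_in_display(R, t, region_model):
    """Convert the region-of-interest position from the model
    coordinate system C1 into the display coordinate system C2,
    given a rotation matrix R and translation vector t obtained
    in the association step."""
    return R @ np.asarray(region_model, float) + t
```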

In step S14, the control section 33 operates the endoscope 2 so that the display image includes the region of interest R (endoscope operation step). The control section 33 uses the position of the region of interest R in the display coordinate system C2 obtained in step S13 to operate the arm 21 of the endoscope 2 so that the position of the region of interest R and the imaging center point O of the display coordinate system C2 approach each other. The control section 33 can make the position of the region of interest R closer to the imaging center point O of the display coordinate system C2 by the shortest path by using the distance on the graph set for each feature point F. After the arm 21 operates for a predetermined distance or a predetermined time, the control section 33 executes step S15.
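One control iteration of step S14 can be pictured as moving the imaging center point a bounded step toward the region of interest in display coordinates; the actual system drives the arm 21 through inverse kinematics, so the function below is only a geometric sketch:

```python
import numpy as np

def step_toward_target(imaging_center, target, step_size):
    """One iteration of the endoscope operation step: advance the
    imaging center point at most step_size toward the region of
    interest in display coordinates (C2)."""
    c = np.asarray(imaging_center, float)
    delta = np.asarray(target, float) - c
    dist = np.linalg.norm(delta)
    if dist <= step_size:      # close enough: land on the target
        return np.asarray(target, float)
    return c + step_size * delta / dist
```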

In step S15, the control section 33 again extracts a plurality of corresponding points A corresponding to the plurality of feature points F in the updated display image, and determines whether there is a deviation in the association between the model coordinate system C1 and the display coordinate system C2, based on the plurality of feature points F and the plurality of corresponding points A.
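The disclosure does not specify the deviation criterion; one plausible sketch is the mean residual between the transformed feature points and their re-extracted corresponding points, with the rotation R and translation t from the association step (names and threshold are assumptions):

```python
import numpy as np

def association_deviation(R, t, model_points, display_points):
    """Mean residual between model feature points mapped into display
    coordinates and the re-extracted corresponding points. A large
    value suggests the C1-to-C2 association has drifted, e.g. because
    the organ deformed during surgery."""
    P = np.asarray(model_points, float)
    Q = np.asarray(display_points, float)
    residuals = np.linalg.norm((P @ R.T + t) - Q, axis=1)
    return float(residuals.mean())

# e.g.: redo the association step when the drift exceeds a threshold:
# if association_deviation(R, t, F, A) > DEVIATION_THRESHOLD: ...
```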

When it is determined that there is a deviation, the control section 33 executes step S12 again.

When it is determined that there is no deviation, the control section 33 next executes step S16.

In step S16, the control section 33 determines whether the position of the region of interest R is sufficiently close to the imaging center point O of the display coordinate system C2 and the display image includes the region of interest R.

When it is determined that it is not included, the control section 33 executes step S14 again.

When it is determined that it is included, the control section 33 next executes step S17 and ends the automatic mode.

FIG. 11 is a diagram showing the insertion portion 20 of the endoscope 2 and the organ of concern T after the automatic mode is completed. FIG. 12 shows a display image after the automatic mode is completed.

After the control in the automatic mode is completed, the region of interest R set in the model M generated by the preoperative plan is displayed on the display image.

According to the medical system 100 of the present embodiment, by using the model M generated in the preoperative plan, it is possible to support the operation of moving the distal end of the insertion portion 20 of the endoscope 2, inserted through an arbitrary hole formed in the abdominal wall, from an arbitrary initial position to a target position at which the region of interest R is displayed in the display image.

In the medical system 100, the operator can set the region of interest R in the model M in advance, so that it is possible to quickly secure the visual field during the operation, and reduce the burden on the operator and the scopist.

In the automatic mode, the medical system 100 performs the association step again when there is a deviation in the association between the model coordinate system C1 and the display coordinate system C2. Even when the actual organ of concern T is deformed with respect to the model M generated in advance during the operation, by repeatedly correcting the association, the region of interest R can be reliably displayed on the display image.

As described above, an embodiment of the present system and method have been described in detail with reference to the drawings. However, the specific configuration is not limited to this embodiment, and may include a design change or the like without departing from the gist of the present disclosure. The components shown in the above-described embodiment and the modifications described below can be appropriately combined and configured.

In the above embodiment, the control device 3 automatically operates the endoscope in the relative position calculation step and the endoscope operation step. However, the control device may change the operation mode from the automatic mode to the manual mode after the association step. The scopist instructs the moving direction and the moving amount of the endoscope in the model coordinate system C1 of the model M, and the control device converts the instructed moving direction and the moving amount into the display coordinate system C2 of the display space, and operates the actual endoscope.

In the above embodiment, the display image is changed by operating the arm 21 of the endoscope 2 to change the imaging position of the endoscope 2, but the method of changing the display image is not limited to this. The image processing section may have a function of generating a display image by cutting out a partial area from the captured image of the endoscope, and may change the display image by changing the position where the image is cut out. The image processing section may change the display image by controlling the zoom function of the imaging section. Even in the case of an endoscope having no arm, the display image can be changed.

FIG. 13 is an overall configuration example showing an endoscope 2B, which is a modification of the endoscope 2. The endoscope 2B has an active bending portion 23B at the distal end of the insertion portion 20. The surgeon can change the position and direction of the imaging section 22 by moving the endoscope 2B while holding it. The position and direction of the imaging section 22 can also be changed by bending the active bending portion 23B. Even when the endoscope 2B is used instead of the endoscope 2, the control section 33 can change the display image by driving the active bending portion 23B.

In the above embodiment, the endoscope 2 is automatically operated so that the region of interest R is located at the center of the display image. However, the target position of the endoscope 2 is not limited to the center of the display image. The target position may be other than the center of the display image. For example, when the display device 4 includes two LCD monitors, the center of the display image on one of the monitors may be set as the target position.

In the above embodiment, the endoscope is automatically operated in the endoscope operation step, but the method of operating the endoscope is not limited to this. The control device may change the operation mode from the automatic mode to the manual mode in the endoscope operation step, and superimpose navigation information indicating the position of the region of interest R on the display image. The scopist may then operate the operation input section 51, guided by the superimposed navigation information, so that the region of interest R is displayed in the display image.
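The disclosure does not specify the form of the navigation information. One plausible form, sketched here as an assumption, is a hint computed from the region of interest's projected position in display space: if R falls inside the display image it is simply reported as visible; otherwise an angle from the image center toward R is returned, which a UI layer could render as an arrow superimposed on the display image.

```python
import math

def navigation_hint(roi_xy, image_w, image_h):
    """Return a navigation hint for the region of interest R.

    If R's projected position lies inside the display image, report it
    as visible; otherwise return the angle (degrees, 0 = rightward,
    counter-clockwise positive) from the image center toward R.
    """
    x, y = roi_xy
    if 0 <= x < image_w and 0 <= y < image_h:
        return {"visible": True}
    dx = x - image_w / 2.0
    dy = y - image_h / 2.0
    return {"visible": False, "angle_deg": math.degrees(math.atan2(dy, dx))}

# Example: R projects off-screen to the right of a 1920x1080 image.
hint = navigation_hint((2500, 540), 1920, 1080)
# hint == {"visible": False, "angle_deg": 0.0}
```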

The present disclosure is applicable to the medical system which has an endoscope, and the operating method of a medical system.

Claims

1. A medical system, comprising:

an endoscope that has an imaging section including a lens and is configured to be operated by being electrically driven;
a controller configured to operate the endoscope and generate a display image from an image captured by the imaging section; and
a display configured to display the display image,
wherein the controller is configured to: record a model of an organ of concern generated in a preoperative plan, the model being arranged with respect to a first coordinate system, extract feature points in the model, extract a distance between each of the feature points and a reference position set in the model before surgery, extract corresponding points in the display image that correspond to the feature points in the model, and associate the first coordinate system of the model with a second coordinate system of a display space displayed by the display image, based on the feature points and the corresponding points, to set a path for the endoscope to move such that a region of interest in the organ of concern that corresponds to the reference position in the model is displayed in the display image.

2. The medical system according to claim 1, wherein the controller is further configured to form a graph in which adjacent feature points are connected, and the distance between each of the feature points and the reference position is set on the graph.

3. The medical system according to claim 1, wherein the controller is further configured to drive the endoscope so that the region of interest is displayed in the display image.

4. The medical system according to claim 1, wherein the controller is further configured to set a shortest path of movement of the endoscope so that the region of interest and an imaging center point of the second coordinate system of the display space approach each other, and drive the endoscope along the shortest path.

5. The medical system according to claim 3, wherein the controller is further configured to disable an operation input device of the endoscope while the controller is driving the endoscope.

6. The medical system according to claim 1, wherein the controller is further configured to set a region in the model that is defined by some of the feature points so as to include the reference position, and drive the endoscope so that a region in the organ of concern that corresponds to the region in the model and includes the region of interest is displayed in the display image.

7. The medical system according to claim 6, wherein the controller is further configured to drive the endoscope along a shortest path of movement so that the corresponding region and an imaging center point of the second coordinate system of the display space approach each other.

8. The medical system according to claim 3, wherein, after driving the endoscope a predetermined distance or for a predetermined time, the controller is further configured to determine whether there is a deviation between the first coordinate system and the second coordinate system, and correct the deviation when present.

9. The medical system according to claim 8, wherein the controller is further configured to determine whether the region of interest is displayed in the display image.

10. The medical system according to claim 1, wherein the controller is further configured to superimpose navigation information with respect to the set path on the display image to guide an operator to move the endoscope along the set path to display the region of interest on the display image.

11. The medical system according to claim 1, wherein the controller is further configured to calculate a position of the region of interest in the second coordinate system of the display space relative to the reference position in the first coordinate system of the model.

12. An operating method of a medical system that includes an endoscope that has an imaging section including a lens and is configured to be driven and operated by electric power, a controller configured to operate the endoscope and generate a display image from an image captured by the imaging section, and a display configured to display the display image, the method comprising:

recording a model of an organ of concern generated in a preoperative plan, the model being arranged with respect to a first coordinate system,
extracting feature points in the model,
extracting a distance between each of the feature points and a reference position set in the model before surgery,
extracting corresponding points in the display image that correspond to the feature points in the model, and
associating the first coordinate system of the model with a second coordinate system of a display space displayed by the display image, based on the feature points and the corresponding points, to set a path for the endoscope to move such that a region of interest in the organ of concern that corresponds to the reference position in the model is displayed in the display image.

13. The operating method according to claim 12, further comprising forming a graph in which adjacent feature points are connected, and the distance between each of the feature points and the reference position is set on the graph.

14. The operating method according to claim 12, further comprising driving the endoscope so that the region of interest is displayed in the display image.

15. The operating method according to claim 12, further comprising setting a shortest path of movement of the endoscope so that the region of interest and an imaging center point of the second coordinate system of the display space approach each other, and driving the endoscope along the shortest path.

16. The operating method according to claim 14, wherein an operation input device of the endoscope is disabled while the endoscope is being driven.

17. The operating method according to claim 14, further comprising setting a region in the model that is defined by some of the feature points so as to include the reference position, and driving the endoscope so that a region in the organ of concern that corresponds to the region in the model and includes the region of interest is displayed in the display image.

18. The operating method according to claim 17, further comprising driving the endoscope along a shortest path of movement so that the corresponding region and an imaging center point of the second coordinate system of the display space approach each other.

19. The operating method according to claim 14, further comprising determining whether there is a deviation between the first coordinate system and the second coordinate system after driving the endoscope a predetermined distance or for a predetermined time, and correcting the deviation when present.

20. The operating method according to claim 19, further comprising determining whether the region of interest is displayed in the display image.

21. The operating method according to claim 12, further comprising superimposing navigation information with respect to the set path on the display image to guide an operator to move the endoscope along the set path to display the region of interest on the display image.

22. The operating method according to claim 12, further comprising calculating a position of the region of interest in the second coordinate system of the display space relative to the reference position in the first coordinate system of the model.

Patent History
Publication number: 20210030476
Type: Application
Filed: Aug 18, 2020
Publication Date: Feb 4, 2021
Applicant: OLYMPUS CORPORATION (Tokyo)
Inventors: Kohei NAKASHIMA (Tokyo), Kosuke KISHI (Tokyo)
Application Number: 16/996,508
Classifications
International Classification: A61B 34/10 (20060101); A61B 34/20 (20060101); A61B 34/30 (20060101); A61B 90/00 (20060101);