ENDOSCOPE SYSTEM, CONTROLLER, CONTROL METHOD, AND RECORDING MEDIUM

- Olympus

An endoscope system includes an endoscope, a moving device that moves the endoscope, a storage unit, and a processor. The storage unit stores, for a first region in a subject, first position information and first rotation angle information that defines a rotation angle of an endoscope image of the first region, and, for a second region in the subject, second position information and second rotation angle information that defines a rotation angle of an endoscope image of the second region. The processor calculates third rotation angle information on a third region in the subject based on the first and second position information, the first and second rotation angle information, and third position information on the third region. If the third region includes the current imaging region, the processor rotates the endoscope image based on the third rotation angle information and outputs the rotated endoscope image to a display device.

Description
TECHNICAL FIELD

The present invention relates to an endoscope system, a controller, a control method, and a recording medium.

The present application claims priority to provisional U.S. patent application No. 63/076408 filed on Sep. 10, 2020, which is incorporated herein by reference. This application is a continuation of International Application PCT/JP2021/033210, which is hereby incorporated by reference herein in its entirety.

BACKGROUND ART

Conventionally, an endoscope system that controls an electric holder so as to move an endoscope held by the holder has been known (for example, see PTL 1).

An endoscope system of PTL 1 stores time series variations in the rotation angle of each joint of the holder in a manual mode while an operator moves the endoscope, and the endoscope system reversely reproduces the time series variations in the rotation angle of each joint in an automatic return mode. Thus, the endoscope moves reversely along a movement path in the manual mode and automatically returns to the initial position and orientation.

CITATION LIST Patent Literature

{PTL 1} The publication of Japanese Patent No. 6161687

SUMMARY OF INVENTION

An aspect of the present invention is an endoscope system including an endoscope that is inserted into a subject (into the body cavity of a patient) and captures an endoscope image in the subject; a moving device that holds the endoscope and moves the endoscope; a storage unit; and a controller including at least one processor, wherein the storage unit stores first position information and first rotation angle information on a first region in the subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the at least one processor calculates third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions, the at least one processor rotates the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and the at least one processor outputs the rotated endoscope image to a display device.

Another aspect of the present invention is a controller configured to control an endoscope image that is captured by an endoscope and is displayed on a display device, the controller including: a storage unit; and at least one processor, wherein the storage unit stores first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the at least one processor calculates third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions, and the at least one processor rotates the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and the at least one processor outputs the rotated endoscope image to the display device.

Another aspect of the present invention is a control method for controlling an endoscope image that is captured by an endoscope and is displayed on a display device, by using first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the control method including the steps of: calculating third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions; rotating the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and outputting the rotated endoscope image to the display device.

Another aspect of the present invention is a computer-readable non-transitory recording medium in which a control program for causing a computer to perform the control method is recorded.

BRIEF DESCRIPTION OF DRAWINGS

FIG. 1A is an external view illustrating the overall configuration of an endoscope system.

FIG. 1B is an explanatory drawing of a movement of an endoscope inserted in an abdominal cavity.

FIG. 1C illustrates the tip portion of a robot arm and the endoscope.

FIG. 2 is a block diagram illustrating the overall configuration of the endoscope system.

FIG. 3A is a sequence diagram of a control method according to a first embodiment and an explanatory drawing of a user operation and the processing of a processor in a manual mode.

FIG. 3B is a flowchart of the control method according to the first embodiment and an explanatory drawing of the processing of the processor in an autonomous mode.

FIG. 4A is an explanatory drawing of an endoscope operation in the step of determining first position information and first rotation angle information.

FIG. 4B is an explanatory drawing of an endoscope operation in the step of determining second position information and second rotation angle information.

FIG. 5A illustrates an endoscope image at O-point.

FIG. 5B illustrates an endoscope image at B-point.

FIG. 5C illustrates the endoscope image of FIG. 5B when the vertical direction is adjusted by a rotation.

FIG. 6A illustrates an endoscope image at A-point.

FIG. 6B illustrates the endoscope image of FIG. 6A when the vertical direction is adjusted by a rotation.

FIG. 7 indicates position information and rotation angle information that are stored in a storage unit in the manual mode.

FIG. 8A is a sequence diagram of a control method according to a second embodiment and an explanatory drawing of a user operation and the processing of a processor in a manual mode.

FIG. 8B is a flowchart of the control method according to the second embodiment and an explanatory drawing of the processing of the processor in an autonomous mode.

FIG. 9 is a flowchart of a control method according to a third embodiment and an explanatory drawing of the processing of a processor in an autonomous mode.

FIG. 10 illustrates an oblique endoscope according to a first modification.

FIG. 11A is a sequence diagram of a control method according to the first modification and an explanatory drawing of a user operation and the processing of the processor in the manual mode.

FIG. 11B is a flowchart of the control method according to the first modification and an explanatory drawing of the processing of the processor in the autonomous mode.

FIG. 12 illustrates an endoscope with a curved portion according to a second modification.

FIG. 13A is a sequence diagram of a control method according to another modification and an explanatory drawing of a user operation and the processing of the processor in the manual mode.

FIG. 13B is a flowchart of the control method according to another modification and an explanatory drawing of the processing of the processor in the autonomous mode.

FIG. 14A is an external view illustrating the overall configuration of a modification of the endoscope system in FIG. 1A.

FIG. 14B is an external view illustrating the overall configuration of another modification of the endoscope system in FIG. 1A.

DESCRIPTION OF EMBODIMENTS

(First Embodiment)

An endoscope system, a controller, a control method, and a recording medium according to a first embodiment of the present invention will be described below with reference to the accompanying drawings.

As illustrated in FIG. 1A, an endoscope system 10 according to the present embodiment is used for a surgical operation in which an endoscope 2 and at least one surgical instrument 6 are inserted into the body of a patient X serving as a subject and an affected part is treated with the surgical instrument 6 while the surgical instrument 6 is observed through the endoscope 2. The endoscope system 10 is used for, for example, laparoscopic surgery.

As illustrated in FIG. 1B, the endoscope 2 is inserted into the subject, for example, into an abdominal cavity, through a hole H formed in the body wall. Thus, the endoscope 2 is fixed to the subject, is supported by the body wall at the position of the hole H serving as a pivot point, and is pivotable about a pivot axis (first pivot axis) P1 passing through the pivot point H. In the laparoscopic surgery illustrated in FIGS. 1A and 1B, the pivot axis P1 extends in the anteroposterior direction of the patient X from the abdomen to the back. The endoscope 2 pivots about the pivot axis P1 so as to move an imaging region of the endoscope 2 between a first region including an aorta F and a second region including a pelvis G.

The endoscope 2 and the surgical instrument 6 may be inserted into the subject through a cannula passing through the hole H. The cannula is a cylindrical instrument open at both ends. In this case, the endoscope 2 is supported by the cannula at the position of the hole H.

As illustrated in FIGS. 1A and 2, the endoscope system 10 includes the endoscope 2, a moving device 3 that holds the endoscope 2 and moves the endoscope 2 in the subject, an endoscope processor 4 that is connected to the endoscope 2 and processes an endoscope image E captured by the endoscope 2, a controller 1 that is connected to the moving device 3 and the endoscope processor 4 and controls the moving device 3, and a display device 5 that is connected to the endoscope processor 4 and displays the endoscope image E.

The endoscope 2 is a direct-view endoscope having a visual axis (optical axis) C coaxial with a longitudinal axis I of the endoscope 2. The endoscope 2 is, for example, a rigid endoscope. The endoscope 2, which includes an image sensor 2a, captures an image inside the subject X, for example, in an abdominal cavity, and acquires the endoscope image E including the tip of the surgical instrument 6 (see FIGS. 5A to 6B). The image sensor 2a is, for example, a three-dimensional camera provided at the tip portion of the endoscope 2 and captures a stereo image as the endoscope image E. The image sensor 2a is an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The image sensor 2a generates an image of a predetermined region by converting light received from the predetermined region into an electric signal through photoelectric conversion. A stereo image serving as the endoscope image E is generated by the endoscope processor 4 or the like performing image processing on two images with a parallax. In this case, the tip portion of the endoscope 2 has a stereo optical system.

The endoscope image E is transmitted from the endoscope 2 to the endoscope processor 4, is subjected to necessary processing in the endoscope processor 4, is transmitted from the endoscope processor 4 to the display device 5, and is displayed on a display screen 5a of the display device 5. The display device 5 may be any display, for example, a liquid crystal display or an organic electroluminescent display. A surgeon operates the surgical instrument 6 in a body while observing the endoscope image E displayed on the display screen 5a. The display device 5 may include an audio system, for example, a speaker.

In addition to the display device 5, a user terminal for communications with the controller 1 and the endoscope processor 4 via a communication network may be provided to display the endoscope image E at the terminal. The terminal is, for example, a notebook computer, a laptop computer, a tablet computer, or a smartphone but is not particularly limited thereto.

The moving device 3 includes a robot arm 3a (including an electric scope holder) that holds the endoscope 2 and three-dimensionally controls the position and orientation of the endoscope 2. The moving device 3 includes a plurality of joints 3b and 3c that operate to move the endoscope 2 with the pivot axis P1 serving as a supporting point, thereby three-dimensionally changing the position and orientation of the endoscope 2.

As illustrated in FIG. 1C, the joint 3c is a rotary joint that rotates the endoscope 2 about the longitudinal axis I and is provided at the tip portion of the robot arm 3a. In response to the rotation of the joint 3c, the endoscope 2 rotates about the optical axis C coaxial with the longitudinal axis I, thereby changing the rotation angle of a subject in the endoscope image E, that is, the vertical direction of the endoscope image E.

The moving device 3 includes a plurality of angle sensors 3d that detect the rotation angles of the joints 3b and 3c. Each angle sensor 3d is, for example, an encoder, a potentiometer, or a Hall sensor provided at the corresponding one of the joints 3b and 3c.

As illustrated in FIG. 2, the controller 1 includes at least one processor 11 such as a central processing unit, a memory 12, a storage unit 13, an input interface 14, an output interface 15, and a user interface 16. The controller 1 may be, for example, a desktop computer, a tablet computer, a laptop computer, a smartphone, or a cellular phone.

The processor 11 may be a single processor, a multiprocessor, or a multicore processor. The processor 11 reads and executes a program stored in the storage unit 13.

The memory 12 is, for example, a semiconductor memory including a ROM (Read-Only Memory) or RAM (Random Access Memory) area. The memory 12 may store data necessary for the processing of the processor 11 (that is, the memory 12 may operate as a storage unit) like the storage unit 13, which will be described later.

The storage unit 13 is a computer-readable non-transitory recording medium, e.g., a hard disk or a nonvolatile recording medium including a semiconductor memory such as flash memory. The storage unit 13 stores various programs, including a follow-up control program (not illustrated) and an image control program (control program) 1a, as well as data necessary for the processing of the processor 11. Processing performed by the processor 11 may be implemented by dedicated logic circuits or hardware, for example, an FPGA (Field Programmable Gate Array), a SoC (System-on-a-Chip), an ASIC (Application Specific Integrated Circuit), or a PLD (Programmable Logic Device).

The storage unit 13 may be a server, e.g., a cloud server connected via a communication network to the controller 1 provided with a communication interface, instead of a recording medium integrated in the controller 1. The communication network may be, for example, a public network such as the Internet, a dedicated line, or a LAN (Local Area Network). The connection of the devices may be wired connection or wireless connection.

The endoscope processor 4, which processes the endoscope image E, may also be provided with a processor like the processor 11. Specifically, like the controller 1, the endoscope processor 4 may be provided with processors, dedicated logic circuits, or hardware that perform processing similar to that of the processor 11, which will be described later. The endoscope processor 4 and the controller 1 may be integrated into one unit. Each of the endoscope processor 4 and the controller 1 may be provided with at least one processor.

Any of the at least one processor 11, the memory 12, the storage unit 13, the input interface 14, the output interface 15, and the user interface 16 of the controller 1 may be provided in a user terminal separate from the endoscope processor 4 and the controller 1. The controller 1 may be integrated with the moving device 3.

The input interface 14 and the output interface 15 are connected to the endoscope processor 4. The controller 1 can acquire the endoscope image E from the endoscope 2 via the endoscope processor 4 and output the endoscope image E to the display device 5 via the endoscope processor 4. The input interface 14 may be directly connected to the endoscope 2 and the output interface 15 may be directly connected to the display device 5 such that the controller 1 can directly acquire the endoscope image E from the endoscope 2 and directly output the endoscope image E to the display device 5.

The input interface 14 and the output interface 15 are connected to the moving device 3. The controller 1 acquires, from the moving device 3, information on rotation angles detected by the angle sensors 3d at the joints 3b and 3c and transmits, to the moving device 3, a control signal for driving the joints 3b and 3c.

The user interface 16 has input devices that receive inputs from users such as a surgeon. The input devices include a button, a mouse, a keyboard, and a touch panel.

Moreover, the user interface 16 has a means that allows a user to switch between a manual mode and an autonomous mode, which will be described later. The means is, for example, a switch.

The user interface 16 is configured to receive a first instruction and a second instruction from a user. The first instruction and the second instruction are instructions for causing the controller 1 to register position information and rotation angle information, which will be described later. For example, the user interface 16 has a button operated by an operator. The user interface 16 receives the first instruction in response to a first button operation and receives the second instruction in response to a second button operation.

The processor 11 can be operated in the manual mode or the autonomous mode.

The manual mode is a mode that permits users such as a surgeon to operate the endoscope 2. In the manual mode, a surgeon can manually move the endoscope 2 with a hand holding the proximal end portion of the endoscope 2. Furthermore, the surgeon can remotely operate the endoscope 2 by using an operating device connected to the moving device 3. The operating device can include a button, a joystick, and a touch panel.

The autonomous mode is a mode that causes the endoscope 2 to automatically follow the surgical instrument 6 by controlling the moving device 3 on the basis of the position of the surgical instrument 6 in the endoscope image E. In the autonomous mode, the processor 11 acquires the three-dimensional position of the tip of the surgical instrument 6 from the endoscope image E and controls the moving device 3 on the basis of the three-dimensional position of the tip of the surgical instrument 6 and the three-dimensional position of a predetermined target point set in the field of view of the endoscope 2. The target point is, for example, a point that is located on the optical axis C and corresponds to the center point of the endoscope image E. Thus, the controller 1 controls a movement of the endoscope 2 and causes the endoscope 2 to follow the surgical instrument 6 such that the tip of the surgical instrument 6 is disposed at the center point of the endoscope image E.
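As an illustration of the follow-up control just described, the sketch below implements a simple proportional step that moves the target point on the optical axis C toward the detected instrument tip. This is a minimal sketch under assumed names and conventions (follow_step, the camera-centered frame, and the gain are not from the patent), not the actual control law of the endoscope system 10.

```python
# Minimal sketch of the follow-up control idea in the autonomous mode.
# Assumptions: tip and target positions are in a camera-centered frame
# in millimeters; the proportional gain is illustrative only.
import numpy as np

def follow_step(tip_pos, target_pos, gain=0.5):
    """Return a velocity command that moves the target point
    (e.g., the center point on the optical axis C) toward the
    three-dimensional position of the surgical instrument tip."""
    error = np.asarray(tip_pos, float) - np.asarray(target_pos, float)
    return gain * error  # proportional control; a real system adds limits/filters

# Example: tip detected 12 mm right of and 5 mm above the target point
velocity = follow_step(tip_pos=[12.0, -5.0, 0.0], target_pos=[0.0, 0.0, 0.0])
print(velocity)  # [ 6.  -2.5  0. ]
```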

In the autonomous mode, by performing a control method in FIGS. 3A and 3B according to the image control program 1a read into the memory 12, the processor 11 controls the rotation angle of the endoscope image E displayed on the display screen 5a.

The control method performed by the processor 11 will be described below.

As indicated in FIGS. 3A and 3B, the control method according to the present embodiment includes step SB2 of setting the initial position of the endoscope 2, steps SB3 and SB4 of determining first position information and first rotation angle information on the first region in a subject, steps SB5 and SB6 of determining second position information and second rotation angle information on the second region in the subject, steps SB7 and SB8 of calculating third position information and third rotation angle information on a third region in the subject, step SB9 of storing the position information and the rotation angle information in the storage unit 13, steps SC4 to SC9 of rotating the endoscope image E according to a current imaging region that is currently being imaged by the endoscope 2, and step SC10 of outputting the rotated endoscope image E to the display device 5.

As indicated in FIG. 3A, steps SB2 to SB9 are performed in the manual mode. As indicated in FIG. 3B, steps SC3 to SC10 are performed in the autonomous mode.

A user, e.g., a surgeon inserts the endoscope 2 held by the moving device 3 into an abdominal cavity, switches to the manual mode (SA1, SB1), and starts panning around by moving the endoscope 2 in the abdominal cavity (SA3). Panning around is an operation for observing the overall abdominal cavity to confirm the positions and the like of organs and tissues. The positions of organs and tissues vary among patients, so this operation is required each time the endoscope is inserted. When panning around, the surgeon rotates the endoscope 2 about the pivot axis P1 so as to observe, through the endoscope 2, a range including at least two specific tissues having anatomical characteristics. In the present embodiment, the specific tissues are the aorta F and the pelvis G.

As indicated in FIG. 3A, the surgeon registers the initial position of the endoscope 2 in the controller 1 before panning around (SA2). For example, the surgeon places the endoscope 2 at a desired initial position and operates a predetermined button of the user interface 16. In response to the button operation, the processor 11 calculates a current position φ of the endoscope 2 and stores the current position φ as an initial position φ=0° in the storage unit 13 (SB2). The position φ is the position of the endoscope 2 in a circumferential direction around the pivot axis P1 and is calculated from the rotation angles detected by the angle sensors 3d at the joints 3b and 3c. The position φ represents the position of an imaging region in the circumferential direction around the pivot axis P1.
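The patent states only that φ is computed from the joint angles detected by the angle sensors 3d; the sketch below shows one plausible way to obtain φ once forward kinematics has produced the endoscope axis direction. The helper and the reference direction stored at φ=0° are assumptions for illustration.

```python
# Hedged sketch: circumferential position phi around the pivot axis P1.
# The endoscope axis direction would come from forward kinematics of the
# joints 3b and 3c (not shown); here it is passed in directly.
import numpy as np

def phi_about_pivot(axis_dir, pivot_axis, ref_dir):
    """Signed angle (degrees) of the endoscope axis around P1,
    measured from the reference direction stored at phi = 0 (step SB2)."""
    p = np.asarray(pivot_axis, float)
    p /= np.linalg.norm(p)
    a = np.asarray(axis_dir, float)
    a = a - np.dot(a, p) * p            # project into the plane normal to P1
    r = np.asarray(ref_dir, float)
    r = r - np.dot(r, p) * p
    return np.degrees(np.arctan2(np.dot(np.cross(r, a), p), np.dot(r, a)))

p1 = np.array([0.0, 0.0, 1.0])          # pivot axis P1 (illustrative frame)
ref = np.array([1.0, 0.0, 0.0])         # axis direction registered at phi = 0
cur = np.array([np.cos(np.radians(20)), np.sin(np.radians(20)), 0.0])
print(round(phi_about_pivot(cur, p1, ref), 1))  # 20.0
```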

Subsequently, as illustrated in FIGS. 4A and 5A, the surgeon places the endoscope 2 at a position (O-point) for imaging the aorta F from the front and adjusts a rotation angle ω of the endoscope 2 about the optical axis C such that the aorta F is placed at a desired rotation angle in the endoscope image E (SA4). In this case, the rotation angle of the aorta F is a position in the circumferential direction around the center point of the endoscope image E. In the present embodiment, as illustrated in FIG. 5A, the rotation angle ω is adjusted such that the aorta F is horizontally placed in the endoscope image E. The surgeon then inputs the first instruction to the user interface 16 (SA5).

After the first instruction is inputted, the surgeon observes the overall aorta F through the endoscope 2 by rotating the endoscope 2 from O-point about the pivot axis P1 while keeping the rotation angle ω adjusted at O-point. As illustrated in FIGS. 5A and 5B, the aorta F makes a rotational movement in the endoscope image E as the endoscope 2 rotates from O-point to B-point. B-point is the end point of the observation range of the aorta F in the endoscope image E.

In response to the first instruction received by the user interface 16, the processor 11 determines, on the basis of the endoscope image E, the first position information and the first rotation angle information on the first region including the aorta (first specific tissue) F (SB3, SB4). The first rotation angle information is information that defines the rotation angle of the endoscope image E of the first region.

Specifically, the storage unit 13 stores a learned model 1b obtained by machine learning of the correspondence between an image including a specific tissue and the type of the specific tissue. In step SB3, the processor 11 recognizes the aorta F in the endoscope image E by using the learned model 1b and determines, as the first position information, the range of the position φ of the endoscope 2 over which the aorta F is included in the endoscope image E. In other words, the first region is the region between O-point and B-point.

For example, the first position information is φ=0° to 20°. As described above, instead of the setting of the initial position in steps SA2 and SB2, the processor 11 may set the position φ of the endoscope 2 at the time of the reception of the first instruction as the initial position φ=0°. In other words, the initial position is determined at a time and a location requested by the user.

Alternatively, the processor 11 may set the position φ of the endoscope 2 at the time of the reception of the first instruction as the first position information, without processing using the learned model 1b. In other words, the first position information is determined at a time and a location requested by the user.
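The following sketch illustrates step SB3 under the assumption that the learned model 1b can be queried per frame; detect_tissue() is a hypothetical stand-in for the actual model, which the patent does not specify.

```python
# Illustrative sketch of step SB3: find the range of phi over which the
# aorta is recognized in the endoscope image. detect_tissue() is assumed.
def scan_region(positions, frames, detect_tissue, tissue="aorta"):
    """Return (min phi, max phi) where the tissue is visible, or None."""
    visible = [phi for phi, frame in zip(positions, frames)
               if detect_tissue(frame) == tissue]
    return (min(visible), max(visible)) if visible else None

positions = list(range(0, 95, 5))               # phi sampled every 5 degrees
frames = [{"phi": p} for p in positions]        # stand-ins for endoscope images
detector = lambda f: "aorta" if f["phi"] <= 20 else "other"
print(scan_region(positions, frames, detector))  # (0, 20), i.e. O-point to B-point
```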

Subsequently, in step SB4, the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16, as a first reference endoscope image and a first reference rotation angle, and the processor 11 determines the first rotation angle information on the basis of the first reference endoscope image and the first reference rotation angle.

Specifically, the processor 11 calculates the first reference rotation angle relative to a predetermined initial rotation angle ω=0°, as a target rotation angle θt of the endoscope image E at the position φ at the time of the reception of the first instruction. The calculated target rotation angle θt represents the rotation amount required of the endoscope image E for the aorta F to be horizontally placed in the endoscope image E at the position φ at the time of the reception of the first instruction. In the present embodiment, the first reference rotation angle ω is set at the initial rotation angle 0°.

The processor 11 then calculates a required rotation amount Δθ of the endoscope image E at another position φ included in the first position information, the rotation amount Δθ being required for the aorta F in the endoscope image E to be aligned with the aorta F in the first reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle θt at the other position φ by adding the rotation amount Δθ to the first reference rotation angle. The calculated target rotation angle θt represents the rotation amount required of the endoscope image E for the aorta F to be horizontally placed in the endoscope image E at the other position φ. FIG. 5C illustrates the endoscope image E of FIG. 5B rotated by the target rotation angle θt at B-point.

As described above, the processor 11 calculates the target rotation angle θt of the endoscope image E when the aorta F is to be horizontally placed at each position φ=0°, . . . , 20° included in the first position information, and the processor 11 determines the target rotation angle θt at each position φ=0°, . . . , 20° as the first rotation angle information. FIG. 7 only indicates representative target rotation angles θt=0°, −10° at φ=0°, 20° as the first rotation angle information.
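A compact sketch of steps SB3 and SB4 follows. It assumes a helper estimate_rotation(image, reference) that returns the in-plane angle Δθ aligning the aorta with the first reference endoscope image; in practice this could be feature matching or phase correlation, which the patent does not name.

```python
# Sketch of building the first rotation angle information (FIG. 7):
# theta_t(phi) = first reference rotation angle + delta_theta(phi).
def build_rotation_table(samples, ref_image, ref_angle, estimate_rotation):
    """samples: list of (phi, image) pairs inside the first region.
    Returns {phi: theta_t} like the table stored in step SB9."""
    table = {}
    for phi, image in samples:
        delta = estimate_rotation(image, ref_image)  # rotation amount delta_theta
        table[phi] = ref_angle + delta               # target rotation angle theta_t
    return table

# Dummy estimator mimicking FIG. 7 (-0.5 deg of image rotation per deg of phi)
est = lambda img, ref: -0.5 * img["phi"]
ref = {"phi": 0}
print(build_rotation_table([(0, {"phi": 0}), (20, {"phi": 20})], ref, 0.0, est))
# {0: 0.0, 20: -10.0}
```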

Thereafter, as illustrated in FIG. 4B, the surgeon places the endoscope 2 at a position (D-point) for imaging the pelvis G. When the pelvis G is observed at the initial rotation angle ω=0°, the pelvis G may be placed at an improper position in the endoscope image E as illustrated in FIG. 6A. The surgeon adjusts the rotation angle ω about the optical axis C of the endoscope 2 such that the pelvis G is placed at a desired rotation angle in the endoscope image E (SA6), and inputs the second instruction to the user interface 16 (SA7). In the present embodiment, as illustrated in FIG. 6B, the rotation angle ω is adjusted such that the pelvis G is placed in an upper part of the endoscope image E.

After the second instruction is inputted, the surgeon observes the overall pelvis G through the endoscope 2 by rotating the endoscope 2 from D-point about the pivot axis P1 while keeping the rotation angle ω adjusted at D-point. Also at this point, the pelvis G makes a rotational movement in the endoscope image E as the endoscope 2 rotates from D-point to A-point. A-point is the end point of the observation range of the pelvis G in the endoscope image E.

In response to the second instruction received by the user interface 16, the processor 11 determines, on the basis of the endoscope image E, the second position information and the second rotation angle information on the second region including the pelvis (second specific tissue) G (SB5, SB6). The second rotation angle information is information that defines the rotation angle of the endoscope image E of the second region.

Specifically, in step SB5, the processor 11 recognizes the pelvis G in the endoscope image E by using the learned model 1b and determines, as the second position information, the range of the position φ of the endoscope 2 over which the pelvis G is included in the endoscope image E. In other words, the second region is the region between D-point and A-point. For example, the second position information is φ=70° to 90°.

Also for the second position information, the processor 11 may set the position φ of the endoscope 2 at the time of the reception of the second instruction as the second position information, without processing using the learned model 1b. In other words, the second position information is determined at a time and a location requested by the user.

Subsequently, in step SB6, the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16, as a second reference endoscope image and a second reference rotation angle, and the processor 11 determines the second rotation angle information on the basis of the second reference endoscope image and the second reference rotation angle.

Specifically, the processor 11 calculates the second reference rotation angle relative to the initial rotation angle ω=0°, as a target rotation angle θt of the endoscope image E at the position φ at the time of the reception of the second instruction. The calculated target rotation angle θt represents the rotation amount required of the endoscope image E for the pelvis G to be placed in an upper part of the endoscope image E at the position φ at the time of the reception of the second instruction.

The processor 11 then calculates a required rotation amount Δθ of the endoscope image E at another position φ included in the second position information, the rotation amount Δθ being required for the pelvis G in the endoscope image E to be aligned with the pelvis G in the second reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle θt at the other position φ by adding the rotation amount Δθ to the second reference rotation angle. The calculated target rotation angle θt represents the rotation amount required of the endoscope image E for the pelvis G to be placed in an upper part of the endoscope image E at the other position φ.

As described above, the processor 11 calculates the target rotation angle θt of the endoscope image E when the pelvis G is to be placed in an upper part at each position φ=70°, . . . , 90° included in the second position information, and the processor 11 determines the target rotation angle θt at each position φ=70°, . . . , 90° as the second rotation angle information. FIG. 7 only indicates representative target rotation angles θt=100°, 90° at φ=70°, 90° as the second rotation angle information.

The processor 11 then calculates third position information and third rotation angle information on a third region on the basis of the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SB7, SB8). The third region is different from the first region and the second region and is located between A-point and B-point in the present embodiment.

In step SB7, the processor 11 determines, as the third position information, the range of the position φ between the first position information and the second position information. For example, the third position information is φ=20° to 70°.

In step SB8, the processor 11 then calculates the third rotation angle information on the basis of the first, second, and third position information and the first and second rotation angle information. The third rotation angle information is information that defines the rotation angle of the endoscope image E of the third region.

Specifically, the processor 11 calculates the positional relationship between the third position information and the first and second position information and calculates the third rotation angle information on the basis of the positional relationship, the first rotation angle information, and the second rotation angle information.

For example, assume that each position φ (M-point) in the third position information is an internally dividing point that divides the path between B-point and A-point in an m:n ratio. The processor 11 calculates a target rotation angle θt at each position φ on the basis of the ratio m:n, the rotation angle of 100° at A-point, and the rotation angle of −10° at B-point. For example, the position φ=45° is the midpoint of the path between B-point and A-point, so the target rotation angle θt at the position φ=45° is 45°, the mean of −10° and 100°.

This calculation yields a target rotation angle θt that gradually changes from −10° to 100° as the position φ changes from B-point to A-point.

The processor 11 determines the target rotation angle θt at each position φ=20°, . . . , 70° as the third rotation angle information. FIG. 7 only indicates a representative target rotation angle θt=45° at φ=45° as the third rotation angle information.
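The internal-division rule above is ordinary linear interpolation; a direct transcription, using the end-point values from FIG. 7, is shown below.

```python
# Linear interpolation for the third rotation angle information (SB7, SB8).
# End points taken from the example: B-point (phi=20, theta_t=-10) and
# A-point (phi=70, theta_t=100).
def third_region_angle(phi, phi_b=20.0, theta_b=-10.0, phi_a=70.0, theta_a=100.0):
    """Target rotation angle theta_t at an internally dividing point phi."""
    t = (phi - phi_b) / (phi_a - phi_b)   # division ratio m / (m + n)
    return theta_b + t * (theta_a - theta_b)

print(third_region_angle(45.0))                            # 45.0, mean of -10 and 100
print(third_region_angle(20.0), third_region_angle(70.0))  # -10.0 100.0
```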

In other words, the third region is a region in which no specific tissue, such as the aorta F in the first region and the pelvis G in the second region, is included in the endoscope image, the specific tissue serving as an index of the rotation angle of the endoscope image E. In such a region, it is difficult for the learned model 1b to recognize a specific tissue and for a user to determine a desired rotation angle. For this reason, the third position information and the third rotation angle information are calculated on the basis of the first and second position information and the first and second rotation angle information of the first region and the second region.

Subsequently, in step SB9, the processor 11 stores the first position information, the first rotation angle information, the second position information, the second rotation angle information, the third position information, and the third rotation angle information, which are determined in steps SB3 to SB8, in the storage unit 13. Thus, as indicated in FIG. 7, data is generated in the storage unit 13, the data including the position φ of the endoscope 2, which indicates the position of the imaging region, and the target rotation angle θt of the endoscope image E at each position φ.

After the completion of panning around, the surgeon switches from the manual mode to the autonomous mode and performs treatment on the aorta F and the pelvis G with the surgical instrument 6. As indicated in FIG. 3B, when the surgeon switches to the autonomous mode (SC2), the processor 11 rotates the rotary joint 3c so as to match the rotation angle ω of the endoscope 2 with the initial rotation angle 0° and causes the endoscope 2 to follow the tip of the surgical instrument 6 by controlling the moving device 3 while keeping the rotation angle ω at 0° (SC3). Moreover, in parallel with the follow-up control of the endoscope 2, the processor 11 controls the vertical direction of the endoscope image E displayed on the display screen 5a (SC4 to SC10).

While the devices 1 and 3 are running, the processor 11 sequentially receives the rotation angles of the joints 3b and 3c from the moving device 3 and calculates the current position φ of the endoscope 2 from the rotation angles of the joints 3b and 3c (SC1).

The processor 11 determines which one of the first region, the second region, and the third region includes the current imaging region on the basis of the current position of the endoscope 2, the first position information, and the second position information (SC4, SC6, SC8).

Specifically, if the current position φ is included in the first position information (φ=0° to 20°), the processor 11 determines that the current imaging region is included in the first region (YES at SC4). The processor 11 then rotates the endoscope image E in the plane of the endoscope image E on the basis of the first rotation angle information stored in the storage unit 13 (SC5). Specifically, the processor 11 reads the target rotation angle θt of the current position φ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5a (SC10).

In the rotated endoscope image E, the aorta F is horizontally placed. Thus, while the endoscope 2 moves in the range of φ=0° to 20° and captures the endoscope image E including the aorta F, the aorta F in the endoscope image E displayed on the display screen 5a is kept in a horizontal position. For example, if the endoscope 2 pivots 20° from O-point to B-point about the pivot axis P1, the endoscope image E rotates from 0° to −10°.

If the current position φ is included in the second position information (φ=70° to 90°), the processor 11 determines that the current imaging region is included in the second region (NO at SC4 and YES at SC6). The processor 11 then rotates the endoscope image E in the plane of the endoscope image E on the basis of the second rotation angle information stored in the storage unit 13 (SC7). Specifically, the processor 11 reads the target rotation angle θt of the current position φ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5a (SC10).

In the rotated endoscope image E, the pelvis G is placed in an upper part. Thus, while the endoscope 2 moves in the range of φ=70° to 90° and captures the endoscope image E including the pelvis G, the pelvis G in the endoscope image E displayed on the display screen 5a is kept in the upper part. For example, if the endoscope 2 pivots 20° from A-point to D-point about the pivot axis P1, the endoscope image E rotates from 100° to 90°.

If the current position φ is included in neither the first position information nor the second position information (NO at SC4 and NO at SC6), the processor 11 determines that the current imaging region is included in the third region (SC8). The processor 11 then rotates the endoscope image E in the plane of the endoscope image E on the basis of the third rotation angle information stored in the storage unit 13 (SC9). Specifically, the processor 11 reads the target rotation angle θt of the current position φ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5a (SC10).
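The branch structure of steps SC4 to SC10 can be summarized as a lookup-and-rotate loop. The sketch below uses the example ranges and the FIG. 7 table; OpenCV is an illustrative choice for the in-plane image rotation, not necessarily what the endoscope processor 4 uses.

```python
# Sketch of the autonomous-mode dispatch (SC4, SC6, SC8) and image
# rotation (SC5/SC7/SC9) using the stored table of FIG. 7.
import cv2
import numpy as np

table = {0: 0, 20: -10, 45: 45, 70: 100, 90: 90}  # phi -> theta_t (degrees)

def classify(phi):
    if 0 <= phi <= 20:
        return "first"    # YES at SC4: first position information
    if 70 <= phi <= 90:
        return "second"   # YES at SC6: second position information
    return "third"        # SC8: third region

def rotate_image(image, theta_t):
    h, w = image.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), theta_t, 1.0)
    return cv2.warpAffine(image, m, (w, h))

frame = np.zeros((480, 640, 3), np.uint8)          # stand-in endoscope image E
phi = 45
print(classify(phi))                               # third
rotated = rotate_image(frame, table[phi])          # rotated by 45 degrees, then SC10
```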

The endoscope image E displayed on the display screen 5a is rotated by the target rotation angle θt corresponding to the position φ. The target rotation angle θt gradually changes from the target rotation angle of the first region to the target rotation angle of the second region as the position φ changes from the first region to the second region. Thus, for example, if the endoscope 2 pivots from B-point to A-point about the pivot axis P1, the endoscope image E displayed on the display screen 5a rotates from −10° to 100° in one direction.

As described above, according to the present embodiment, the storage unit 13 stores the first position information on the first region including the specific tissue F and the first rotation angle information defining the target rotation angle θt of the endoscope image E, the target rotation angle θt being set so as to place the specific tissue F at a rotation angle desired by the surgeon. Furthermore, the storage unit 13 stores the second position information on the second region including the specific tissue G and the second rotation angle information defining the target rotation angle θt of the endoscope image E, the target rotation angle θt being set so as to place the specific tissue G at a rotation angle desired by the surgeon. Moreover, as the third rotation angle information on the third region between the first region and the second region, a target rotation angle θt that gradually changes between the target rotation angle θt of the first rotation angle information and the target rotation angle θt of the second rotation angle information is obtained by interpolation and is stored in the storage unit 13.

Thereafter, in the autonomous mode, the endoscope image E is rotated by the target rotation angle θt corresponding to the position φ of the current imaging region, thereby automatically adjusting the vertical direction of the endoscope image E. Specifically, when the current imaging region is the first or second region including the specific tissue F or G, the endoscope image E is automatically rotated by the target rotation angle θt that places the specific tissue F or G at a predetermined rotation angle. When the current imaging region is the third region that includes neither of the specific tissues F and G, the endoscope image E is automatically rotated by a proper target rotation angle θt that is estimated from the first and second rotation angle information.

As described above, the operator can be provided with the endoscope image E in a proper vertical direction according to the position of the current imaging region in an abdominal cavity.

Moreover, an automatic adjustment of the vertical direction of the endoscope image E can relieve the stress of the surgeon and shorten the treatment time. Specifically, if the surgeon adjusts the vertical direction of the endoscope image E, the surgeon needs to take a hand off the surgical instrument 6 during an operation and then manually rotate the endoscope 2. According to the present embodiment, the surgeon does not need to operate the endoscope 2 to adjust the vertical direction, so that the surgeon can continue treatment without being interrupted.

(Second Embodiment)

An endoscope system, a controller, a control method, and a recording medium according to a second embodiment of the present invention will be described below with reference to the accompanying drawings.

The present embodiment is different from the first embodiment in that a processor 11 rotates an endoscope image E by a rotation of an endoscope 2 instead of image processing. In the present embodiment, configurations different from those of the first embodiment will be described. Configurations in common with the first embodiment are indicated by the same reference numerals and an explanation thereof is omitted.

An endoscope system 10 according to the present embodiment includes a controller 1, the endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5 as in the first embodiment.

FIGS. 8A and 8B indicate a control method performed by the processor 11 in the present embodiment.

As indicated in FIGS. 8A and 8B, the control method according to the present embodiment includes step SB2 of determining the initial position of the endoscope 2, steps SB3 and SB4′ of determining first position information and first rotation angle information on a first region in a subject, steps SB5 and SB6′ of determining second position information and second rotation angle information on a second region in the subject, steps SB7 and SB8′ of determining third position information and third rotation angle information on a third region in the subject, step SB9 of storing the position information and the rotation angle information in the storage unit 13, steps SC4 to SC9′ of rotating the endoscope image E according to a current imaging region that is currently being imaged by the endoscope 2, and step SC10 of outputting the rotated endoscope image E to the display device 5.

As indicated in FIG. 8A, steps SB2 to SB9 are performed in a manual mode. As indicated in FIG. 8B, steps SC4 to SC10 are performed in an autonomous mode.

As in the first embodiment, a user performs steps SA1 to SA5. In response to a first instruction received by a user interface 16, the processor 11 determines the first position information and the first rotation angle information on the first region on the basis of the endoscope image E (SB3, SB4′).

Specifically, in step SB4′ subsequent to step SB3, the processor 11 sets the endoscope image E and a rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16, as a first reference endoscope image and a first reference rotation angle.

Subsequently, the processor 11 calculates the first reference rotation angle relative to a predetermined initial rotation angle ω=0°, as a target rotation angle ωt of the endoscope 2 at a position φ at the time of the reception of the first instruction.

The processor 11 then calculates a required rotation amount Δθ of the endoscope image E at another position φ included in the first position information, the rotation amount Δθ being required for the aorta F in the endoscope image E to be aligned with the aorta F in the first reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle ωt of the endoscope 2 at the other position φ by adding the rotation amount Δθ to the first reference rotation angle.

As described above, the processor 11 calculates the target rotation angle ωt of the endoscope 2 for the aorta F to be horizontally placed at each position φ=0°, . . . , 20° included in the first position information, and the processor 11 determines the target rotation angle ωt at each position φ=0°, . . . , 20° as the first rotation angle information.

The user then performs steps SA6 and SA7. In response to a second instruction received by the user interface 16, the processor 11 determines the second position information and the second rotation angle information on the second region on the basis of the endoscope image E (SB5, SB6′).

Specifically, in step SB6′ subsequent to step SB5, the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16, as a second reference endoscope image and a second reference rotation angle.

Subsequently, the processor 11 calculates the second reference rotation angle relative to the initial rotation angle ω=0°, as a target rotation angle ωt of the endoscope 2 at the position φ at the time of the reception of the second instruction.

The processor 11 then calculates a required rotation amount Δθ of the endoscope image E at another position φ included in the second position information, the rotation amount Δθ being required for the pelvis G in the endoscope image E to be aligned with the pelvis G in the second reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle ωt of the endoscope 2 at the other position φ by adding the rotation amount Δθ to the second reference rotation angle.

As described above, the processor 11 calculates the target rotation angle ωt of the endoscope 2 for the pelvis G to be placed in an upper part at each position φ=70°, . . . , 90° included in the second position information, and the processor 11 determines the target rotation angle ωt at each position φ=70°, . . . , 90° as the second rotation angle information.

The processor 11 then calculates the third position information and the third rotation angle information on the third region on the basis of the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SB7, SB8′). Specifically, in step SB8′ subsequent to step SB7, the processor 11 determines the target rotation angle ωt at each position φ=20°, . . . , 70° of the third position information as the third rotation angle information, as in step SB8.

Subsequently, in step SB9, the processor 11 stores the position information and the rotation angle information, which are determined in steps SB3, SB4′, SB5, SB6′, SB7, and SB8′, in the storage unit 13. Thus, data is generated in the storage unit 13, the data including the position φ of the endoscope 2 and the target rotation angle ωt of the endoscope 2 at each position φ indicating a position of the imaging region.

As indicated in FIG. 8B, the processor 11 then calculates the current position φ of the endoscope 2 (SC1). When switching to the autonomous mode (YES at SC2), the processor 11 determines which one of the first region, the second region, and the third region includes the current imaging region (SC4, SC6, SC8).

If the processor 11 determines that the current imaging region is included in the first region (YES at SC4), the processor 11 rotates the endoscope 2 on the basis of the first rotation angle information stored in the storage unit 13 (SC5′). Specifically, the processor 11 reads the target rotation angle ωt of the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.

If the processor 11 determines that the current imaging region is included in the second region (NO at SC4 and YES at SC6), the processor 11 rotates the endoscope 2 on the basis of the second rotation angle information stored in the storage unit 13 (SC7′). Specifically, the processor 11 reads the target rotation angle ωt of the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.

If the processor 11 determines that the current imaging region is included in the third region (SC8), the processor 11 rotates the endoscope 2 on the basis of the third rotation angle information stored in the storage unit 13 (SC9′). Specifically, the processor 11 reads the target rotation angle ωt of the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.

Subsequent to step SC5′, SC7′, or SC9′, the processor 11 outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on a display screen 5a (SC10).

As described above, in the autonomous mode according to the present embodiment, the endoscope 2 is rotated to the target rotation angle ωt corresponding to the position φ of the current imaging region, thereby automatically adjusting the vertical direction of the endoscope image E as in the first embodiment. Specifically, when the current imaging region is the first or second region including the specific tissue F or G, the endoscope 2 is automatically rotated to the target rotation angle ωt that places the specific tissue F or G at a predetermined rotation angle. When the current imaging region is the third region that includes neither of the specific tissues F and G, the endoscope 2 is automatically rotated to a proper target rotation angle ωt that is estimated from the first and second rotation angle information.
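A minimal sketch of steps SC5′, SC7′, and SC9′ follows: the same table lookup as in the first embodiment, but the correction is sent to the rotary joint 3c instead of the image pipeline. command_joint_angle() is a hypothetical moving-device API, not one defined in the patent.

```python
# Sketch of the second embodiment: level the image by driving joint 3c.
def rotate_endoscope_to(omega_t, current_omega, command_joint_angle):
    """Drive the rotary joint 3c so the endoscope reaches omega_t."""
    delta = omega_t - current_omega
    command_joint_angle("joint_3c", delta)  # relative rotation about axis I
    return omega_t                          # new rotation angle omega

log = []
new_omega = rotate_endoscope_to(45.0, 0.0, lambda joint, d: log.append((joint, d)))
print(new_omega, log)  # 45.0 [('joint_3c', 45.0)]
```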

As described above, an operator can be provided with the endoscope image E in a proper vertical direction according to the position of the current imaging region in an abdominal cavity. Moreover, an automatic adjustment to the vertical direction of the endoscope image E can relieve the stress of the surgeon and shorten the treatment time.

According to the present embodiment, rotating the endoscope image E by rotating the endoscope 2 about an optical axis C eliminates the need for image processing for rotating the endoscope image E, thereby reducing the load on the processor 11. Moreover, the user can intuitively recognize the vertical direction of the endoscope image E by observing the rotation angle ω of the portion of the endoscope 2 outside the body.

In the present embodiment, the endoscope image E is rotated by rotating the overall endoscope 2 about the optical axis C. Alternatively, an image sensor 2a may be rotated about the optical axis C while the rotation angle ω of the endoscope 2 about the optical axis C is kept unchanged. In this case, the endoscope 2 includes a rotating mechanism for rotating the image sensor 2a.

A rotation of the image sensor 2a relative to the body of the endoscope 2 can rotate the endoscope image E like a rotation of the overall endoscope 2.

(Third Embodiment)

An endoscope system, a controller, a control method, and a recording medium according to a third embodiment of the present invention will be described below with reference to the accompanying drawings.

The present embodiment is different from the first and second embodiments in that an endoscope image E is rotated by a combination of a rotation of an endoscope 2 about an optical axis C and image processing. In the present embodiment, configurations different from those of the first and second embodiments will be described. Configurations in common with the first and second embodiments are indicated by the same reference numerals and an explanation thereof is omitted.

An endoscope system 10 according to the present embodiment includes a controller 1, the endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5 as in the first embodiment.

FIG. 9 indicates a control method performed by a processor 11 in an autonomous mode in the present embodiment. The control method according to the present embodiment includes step SC11 of determining whether a rotation angle ω of the endoscope 2 has reached a predetermined critical angle and step SC12 of rotating the endoscope image E by image processing, in addition to steps SB2, SB3, SB4′, SB5, SB6′, SB7, SB8′, SB9, SC1 to SC4, SC5′, SC6, SC7′, SC8, and SC9′ that are described in the second embodiment.

After step SB9, as indicated in FIG. 9, the processor 11 calculates the current position φ of the endoscope 2 (SC1). When switching to the autonomous mode (YES at SC2), the processor 11 performs steps SC1 to SC4, SC5′, SC6, SC7′, SC8, and SC9′.

In steps SC5′, SC7′, and SC9′, the processor 11 determines whether the rotation angle ω of the endoscope 2 has reached the critical angle of the rotatable range of the endoscope 2 on the basis of a rotation angle detected by an angle sensor 3d at a rotary joint 3c (SC11). The rotatable range in which the endoscope 2 is rotatable may be limited by physical constraints or the like. For example, a cable in the endoscope 2 and the moving device 3 is twisted by a rotation of the endoscope 2, and thus the rotatable range of the endoscope 2 is set so as not to cause an excessive twist.

If the endoscope 2 rotates to a target rotation angle ωt before the rotation angle ω reaches the critical angle (NO at SC11), the processor 11 outputs the rotated endoscope image E to the display device 5 (SC10).

If the rotation angle ω reaches the critical angle before the target rotation angle ωt (YES at SC11), the processor 11 stops the rotation of the endoscope 2 at the critical angle, rotates the endoscope image E through image processing by the remaining rotation angle needed to reach the target rotation angle ωt (SC12), and outputs the rotated endoscope image E to the display device 5 (SC10).
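
For illustration only, the following Python sketch shows how the split performed in steps SC11 and SC12 might be computed, with the commanded rotation clamped to the rotatable range and the residual handled by image processing; the function name and the symmetric range are assumptions.

    def split_rotation(target, limit):
        # Split a commanded image rotation (degrees) into a mechanical part,
        # clamped to the rotatable range [-limit, +limit] (step SC11), and a
        # residual part to be applied by image processing (step SC12).
        mechanical = max(-limit, min(limit, target))
        residual = target - mechanical
        return mechanical, residual

    # A 200-degree target with a 170-degree critical angle yields a
    # 170-degree scope rotation plus a 30-degree image-processing rotation.
    print(split_rotation(200.0, 170.0))  # (170.0, 30.0)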

As described above, according to the present embodiment, the endoscope image E can be rotated by a combination of a rotation of the endoscope 2 about the optical axis C and image processing even when the endoscope image E cannot be sufficiently rotated by a rotation of the endoscope 2 alone.

Other effects of the present embodiment are identical to those of the first and second embodiments and thus an explanation thereof is omitted.

(First Modification)

A first modification of the endoscope system 10, the controller 1, the control method, and the recording medium according to the first to third embodiments will be described below.

As illustrated in FIG. 10, the present modification is different from the first to third embodiments in that the endoscope 2 is an oblique type.

The oblique endoscope 2 includes a long insertion portion 2b that is inserted into a subject along the longitudinal axis I, and an imaging portion 2c that includes the image sensor 2a and is connected to the proximal end of the insertion portion 2b. The insertion portion 2b and the imaging portion 2c are integrally rotated about the longitudinal axis I by a rotation of the rotary joint 3c. In the case of a separate-type oblique endoscope, the camera head (imaging portion 2c) and the optical viewing tube (insertion portion 2b) have different pieces of rotation angle information. In the present modification, the camera head and the optical viewing tube are integrally rotated, so that processing is performed using common rotation angle information.

In the case of the direct-view endoscope 2, the visual axis (optical axis) C is coaxial with the longitudinal axis I, so that the visual axis C stays in place even if the endoscope 2 rotates about the longitudinal axis I. In the case of the oblique endoscope 2, the visual axis C tilts with respect to the longitudinal axis I and thus makes a rotational movement about the longitudinal axis I in response to a rotation of the endoscope 2 about the longitudinal axis I, thereby moving the imaging region.

FIGS. 11A and 11B indicate a control method performed by the processor 11 in the present modification. As indicated in FIGS. 11A and 11B, the control method according to the present modification includes steps SB2′ and SB3 to SB9 and steps SC3′ and SC4 to SC10.

In step SB2′, the processor 11 sets the current position φ of the endoscope 2 at the initial position φ=0° and sets the current orientation ω of the endoscope 2 at the initial orientation ω=0°. The orientation ω of the endoscope 2 is a rotation angle about the longitudinal axis I and corresponds to the orientation of the visual axis C with respect to the longitudinal axis I.

In response to the first instruction received by the user interface 16 (SA5), the processor 11 determines the first position information and the first rotation angle information (SB3, SB4) and holds information on a first orientation of the endoscope 2 when the first instruction is received.

Subsequently, in response to the second instruction received by the user interface 16 (SA7), the processor 11 determines the second position information and the second rotation angle information (SB5, SB6) and holds information on a second orientation of the endoscope 2 when the second instruction is received.

In step SB9, the processor 11 stores the first orientation and the second orientation in the storage unit 13 in addition to the position information and the rotation angle information. Thus, data is generated in the storage unit 13, the data including the rotation angle φ of the endoscope 2 indicating the position of the imaging region, the target rotation angle θt of the endoscope image E at each rotation angle φ, and the first and second orientations of the endoscope 2 corresponding to the respective imaging regions.
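
For illustration only, the data generated in step SB9 might be organized as follows; the record type, the dictionary keys, and the numerical values are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class RegionRecord:
        phi: float          # rotation angle of the endoscope (imaging-region position)
        theta_t: float      # target rotation angle of the endoscope image
        orientation: float  # orientation about the longitudinal axis I

    # Hypothetical contents of the storage unit 13 after step SB9.
    storage = {
        "first": RegionRecord(phi=0.0, theta_t=90.0, orientation=0.0),
        "second": RegionRecord(phi=60.0, theta_t=45.0, orientation=30.0),
    }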

Subsequently, in the autonomous mode, the processor 11 controls the position and orientation of the endoscope 2 by controlling the moving device 3 and causes the endoscope 2 to follow the tip of the surgical instrument 6 (SC3′). At this point, the processor 11 controls the position and orientation of the endoscope 2 on the basis of the first and second position information and the first and second orientations stored in the storage unit 13, so that an orientation ω of the endoscope 2 is controlled to the first orientation when the imaging region is included in the first region, whereas the orientation ω of the endoscope 2 is controlled to the second orientation when the imaging region is included in the second region.

As in the first embodiment, the processor 11 rotates the endoscope image E by the target rotation angle θt according to the current imaging region through image processing (SC4 to SC9).

As described above, in the case of the oblique endoscope 2, the imaging region is moved by a rotation of the endoscope 2 about the longitudinal axis I. Thus, the vertical direction of the endoscope image E is difficult to control by the control method of the second embodiment alone, in which the endoscope image E is rotated by a rotation of the endoscope 2.

According to the present modification, in the manual mode, the first orientation of the endoscope 2 at the time of imaging of the first region and the second orientation of the endoscope 2 at the time of imaging of the second region are stored. At the time of imaging of the first region in the autonomous mode, the orientation of the endoscope 2 is controlled to the first orientation and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. At the time of imaging of the second region in the autonomous mode, the orientation of the endoscope 2 is controlled to the second orientation and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. This can properly control the vertical direction of the endoscope image E captured by the oblique endoscope 2.

(Second Modification)

A second modification of the endoscope system 10, the controller 1, the control method, and the recording medium 13 according to the first to third embodiments will be described below.

As illustrated in FIG. 12, the present modification is different from the first to third embodiments in that the endoscope 2 has a curved portion 2d.

The endoscope 2 includes the long insertion portion 2b that is inserted into a subject and the curved portion 2d that is provided at the tip portion of the insertion portion 2b and can be curved in a direction that crosses the longitudinal axis I of the insertion portion 2b. When the curved portion 2d is bent, the visual axis C tilts with respect to the longitudinal axis I and thus makes a rotational movement about the longitudinal axis I in response to a rotation of the endoscope 2 about the longitudinal axis I, thereby moving the imaging region. Moreover, the tilt direction and the tilt angle of the visual axis C with respect to the longitudinal axis I change according to the curving direction and the curving angle of the curved portion 2d.

The control method performed by the processor 11 in the present modification includes steps SB2′ and SB3 to SB9 and steps SC3′ and SC4 to SC10 as in the first modification. As the orientation of the endoscope 2, the curving direction and the curving angle of the curved portion 2d are used instead of the rotation angle ω about the longitudinal axis I.

Specifically, in step SB2′, the processor 11 sets the current curving direction and curving angle of the curved portion 2d as an initial orientation. Subsequently, in step SB9, the processor 11 stores the curving direction and the curving angle of the curved portion 2d at the time of the reception of the first instruction as a first orientation in the storage unit 13, and stores the curving direction and the curving angle of the curved portion 2d at the time of the reception of the second instruction as a second orientation in the storage unit 13.

In step SC3′ of the autonomous mode, the processor 11 controls the position and orientation of the endoscope 2 on the basis of the first and second position information and the first and second orientations stored in the storage unit 13, so that the curving direction and the curving angle of the curved portion 2d are controlled to the first orientation when the imaging region is included in the first region, whereas the curving direction and the curving angle of the curved portion 2d are controlled to the second orientation when the imaging region is included in the second region.

As described above, in the case of the endoscope 2 including the curved portion 2d, the imaging region makes a rotational movement in response to a rotation of the endoscope 2, according to the curving direction and the curving angle of the curved portion 2d. Thus, the vertical direction of the endoscope image E is difficult to control by the control method of the second embodiment alone, in which the endoscope image E is rotated by a rotation of the endoscope 2.

According to the present modification, at the time of imaging of the first region in the autonomous mode, the orientation of the endoscope 2 is controlled to the first orientation stored in the manual mode and the vertical direction of the endoscope image E is adjusted by a rotation through image processing as in the first modification. At the time of imaging of the second region in the autonomous mode, the orientation of the endoscope 2 is controlled to the second orientation stored in the manual mode and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. This can properly control the vertical direction of the endoscope image E captured by the endoscope 2 including the curved portion 2d.

In the embodiments and the modifications, the processor 11 calculates the third rotation angle information in the manual mode and stores the information in the storage unit 13. Alternatively, as indicated in FIGS. 13A and 13B, the processor 11 may calculate the third rotation angle information in real time during the autonomous mode (SC13). In other words, the processor 11 does not determine or store the third position information and the third rotation angle information in the manual mode. In this case, the third region is assumed to be a region other than the first region and the second region.

In the autonomous mode of the embodiments and the modifications, if it is determined that the current imaging region is included in the third region (not included in the first region or the second region), the processor 11 may calculate the target rotation angle θt or ωt at the current position φ of the endoscope 2 in real time on the basis of the current position φ, the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SC13). If the current imaging region is included in one of the first region and the second region (not included in the third region), the processor 11 may match the target rotation angle θt or ωt with the first rotation angle information or the second rotation angle information without calculating the target rotation angle θt or ωt in real time. This can reduce the amount of position information and rotation angle information to be stored in the storage unit 13 during the manual mode and requires only the calculation of the third position information and the third rotation angle information needed for an operation of the autonomous mode, thereby reducing the load on the system.
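
For illustration only, the following sketch combines the region test with the real-time calculation of step SC13; the dictionary layout, the half-width used to delimit each registered region, and the linear estimate are assumptions.

    def target_rotation(phi, first, second, half_width=10.0):
        # first and second are dicts such as {"phi": 0.0, "theta_t": 90.0}
        # holding the stored position and rotation angle information;
        # half_width is a hypothetical extent (degrees) of each region.
        if abs(phi - first["phi"]) <= half_width:   # first region
            return first["theta_t"]
        if abs(phi - second["phi"]) <= half_width:  # second region
            return second["theta_t"]
        # Third region: calculated in real time from the stored information.
        t = (phi - first["phi"]) / (second["phi"] - first["phi"])
        return first["theta_t"] + t * (second["theta_t"] - first["theta_t"])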

If the current imaging region is included in the first region or the second region, the processor 11 may update the stored first position information or second position information, or the stored first rotation angle information or second rotation angle information, to the current position information and rotation angle information. After the update, when the endoscope 2 is moved and it is determined that the current imaging region is included in the first region or the second region, the updated first position information, second position information, first rotation angle information, and second rotation angle information can be used. The user may trigger the update by providing an instruction from the user interface 16. Thus, even if the body of a patient is deformed by, for example, an adjustment to pneumoperitoneum or a change in body posture, the position information and the rotation angle information can be updated to correct information according to the current circumstances.
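
For illustration only, such a user-triggered update might take the following form; the storage layout and the function name are assumptions.

    def update_region(storage, key, phi, theta_t):
        # Overwrite the stored position and rotation angle information for
        # the first or second region (key "first" or "second") with the
        # current values, e.g. after pneumoperitoneum or a posture change
        # has deformed the body.
        storage[key] = {"phi": phi, "theta_t": theta_t}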

In the embodiments and the modifications, the processor 11 recognizes a specific tissue in the endoscope image E and determines the position information and the rotation angle information on the basis of the recognized specific tissue. Alternatively, the position information and the rotation angle information may be determined on the basis of the position φ and the rotation angle ω of the endoscope 2 at the time of the reception of the instruction.

Specifically, in the manual mode, the surgeon places the endoscope 2 at a desired position at a desired rotation angle ω and inputs the first instruction. The processor 11 determines, as the first position information, a range around the position φ of the endoscope 2 at the time of the reception of the first instruction by the user interface 16 and determines, as the first rotation angle information, the rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16.

Similarly, the surgeon places the endoscope 2 at another desired position at a desired rotation angle ω and inputs the second instruction. The processor 11 determines, as the second position information, a range around the position φ of the endoscope 2 at the time of the reception of the second instruction by the user interface 16 and determines, as the second rotation angle information, the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16.
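
For illustration only, the registration triggered by the first and second instructions might be sketched as follows; the range half-width and the storage layout are assumptions.

    def register_region(storage, key, phi, omega, half_width=10.0):
        # On receipt of the first or second instruction, store a range
        # around the current position phi as the position information and
        # the current rotation angle omega as the rotation angle
        # information. The half-width of the range is an assumption.
        storage[key] = {
            "range": (phi - half_width, phi + half_width),
            "omega": omega,
        }

    registry = {}
    register_region(registry, "first", phi=12.0, omega=85.0)
    register_region(registry, "second", phi=70.0, omega=40.0)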

With this configuration, the surgeon can register any regions in a subject as the first region and the second region, thereby determining position information and rotation angle information that better match the surgeon's own sense of orientation. Moreover, even when the first and second regions do not include a specific tissue, any position information and rotation angle information can be determined and stored for the first and second regions without performing the processing of the learned model 1b.

The determination of the position information and the rotation angle information on the basis of a specific tissue in the endoscope image E may be used in combination with the determination of the position information and the rotation angle information on the basis of the position φ and the rotation angle ω of the endoscope 2 at the time of the reception of an instruction.

For example, after determining the first and second position information and the first and second rotation angle information on the basis of the specific tissues F and G in the endoscope image E as described in the first to third embodiments and the modifications thereof, the processor 11 may further determine position information and rotation angle information on any region different from the first and second regions on the basis of an instruction of the surgeon.

In the embodiments and the modifications, specific tissues are the aorta F and the pelvis G. The specific tissues may be any organs or tissues having anatomical characteristics. For example, a uterus may be used.

In the embodiments and the modifications, the position information and the rotation angle information on the two regions are stored. Position information and rotation angle information on three or more regions may be stored instead. This can improve accuracy when position information and rotation angle information are calculated on the basis of stored information.

In the embodiments and the modifications, the position φ of the endoscope 2 is expressed by a two-dimensional polar coordinate system with the pivot point H serving as an origin, the position φ indicating the position of the imaging region. The position φ may be expressed by a three-dimensional polar coordinate system. Specifically, the endoscope 2 may be supported so as to pivot about a second pivot axis P2 that passes through the pivot point H and is orthogonal to the first pivot axis P1, and the position of the imaging region may be expressed as (φ1, φ2), where φ1 is a rotation angle about the first pivot axis P1 and φ2 is a rotation angle about the second pivot axis P2. In this case, the first position information, the second position information, and the third position information are three-dimensional information including rotation angles φ1 and φ2.
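
For illustration only, the following sketch converts the pair of pivot angles (φ1, φ2) into a viewing-direction vector; the axis conventions and the rotation order are assumptions, not taken from the embodiments.

    import math

    def pivot_angles_to_direction(phi1, phi2):
        # Convert the rotation angles phi1 (about the first pivot axis P1)
        # and phi2 (about the second pivot axis P2), in degrees, into a
        # unit direction vector in a frame fixed at the pivot point H:
        # the reference direction (0, 0, 1) is rotated by phi1 about the
        # x-axis and then by phi2 about the y-axis.
        a1, a2 = math.radians(phi1), math.radians(phi2)
        return (math.sin(a2) * math.cos(a1),
                -math.sin(a1),
                math.cos(a2) * math.cos(a1))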

In the embodiments and the modifications, the position of the imaging region may be expressed by other kinds of coordinate systems instead of a polar coordinate system. For example, the position of the imaging region may be expressed by a Cartesian coordinate system with the hole H serving as an origin.

In the embodiments and the modifications, the coordinate system of the position φ of the imaging region is a global coordinate system fixed relative to a subject. A relative coordinate system for the tip of the endoscope 2 may be used instead.

In the embodiments and the modifications, the first and second position information are determined in the manual mode and are stored in the storage unit 13. Alternatively, the first and second position information may be stored in advance in the storage unit 13 before a surgical operation.

Before a surgical operation, an examination image of a range including an affected part, for example, a CT image of an abdominal region may be captured. Three-dimensional reconstruction from multiple CT images generates a three-dimensional image of the abdominal cavity. The first and second position information may be determined and stored in the storage unit 13 on the basis of such a three-dimensional image before a surgical operation. In this case, steps SB4 and SB6 are omitted in the manual mode.

This configuration can reduce the computational complexity of the processor 11 in the manual mode.

In the embodiments and the modifications, the processor 11 in the manual mode may store a first endoscope image and a second endoscope image in the storage unit 13. The first endoscope image is the endoscope image E of the first region, and the second endoscope image is the endoscope image E of the second region. For example, in step SB3, the processor 11 stores at least one endoscope image E, in which the aorta F is recognized, as the first endoscope image in the storage unit 13. In step SB6, the processor 11 stores at least one endoscope image E, in which the pelvis G is recognized, as the second endoscope image in the storage unit 13.

In this case, the processor 11 in the autonomous mode may determine which one of the first region, the second region, and the third region includes the current imaging region on the basis of the first endoscope image and the second endoscope image. In other words, the processor 11 compares the current endoscope image E with the first endoscope image and the second endoscope image. The processor 11 determines that the current imaging region is included in the first region in the presence of a first endoscope image identical or similar to the current endoscope image E. The processor 11 determines that the current imaging region is included in the second region in the presence of a second endoscope image identical or similar to the current endoscope image E.
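
For illustration only, the similarity test might be sketched as follows; the embodiments require only that the images be identical or similar, so the normalized cross-correlation metric and the threshold are assumptions.

    import numpy as np

    def is_same_region(current, stored, threshold=0.8):
        # Compare the current endoscope image with a stored first or second
        # endoscope image. current and stored are equal-shape 2-D grayscale
        # arrays; the score is a normalized cross-correlation in [-1, 1].
        a = (current - current.mean()) / (current.std() + 1e-9)
        b = (stored - stored.mean()) / (stored.std() + 1e-9)
        return float((a * b).mean()) >= threshold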

In the embodiments and the modifications, if a specific tissue is included in the endoscope image E, the processor 11 may read information on the rotation angle of the specific tissue from a database 1c stored in the storage unit 13 and then rotate the endoscope image E on the basis of the read information on the rotation angle. The rotation angle is an angle around the center point of the endoscope image E. This configuration can rotate the endoscope image E such that a specific tissue in the endoscope image E is placed at a predetermined rotation angle.

For example, the database 1c registers the type of at least one specific tissue other than the aorta F and the pelvis G, together with a rotation angle for that type of specific tissue. The processor 11 recognizes a specific tissue in the endoscope image E, reads the rotation angle of the specific tissue from the database 1c, and rotates the endoscope image E such that the specific tissue is placed at the rotation angle.

For example, a uterus J as a specific tissue is preferably placed in an upper part of the endoscope image E and thus 90° equivalent to the 12 o'clock position is registered as a rotation angle of the uterus J. The processor 11 rotates the endoscope image E such that the recognized uterus J is placed at the position of 90°. Thus, if the endoscope image E includes the uterus J, the vertical direction of the endoscope image E is automatically adjusted such that the uterus J is placed at the position of 90°.
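
For illustration only, the lookup in the database 1c and the resulting rotation might be sketched as follows; the dictionary contents and the shortest-rotation convention are assumptions.

    # Hypothetical contents of the database 1c: tissue type mapped to the
    # rotation angle (degrees about the image center, 90 = 12 o'clock).
    TISSUE_DATABASE = {"uterus": 90.0}

    def rotation_for_tissue(tissue_type, detected_angle):
        # Shortest signed rotation (degrees) that places the recognized
        # tissue at its registered rotation angle.
        target = TISSUE_DATABASE[tissue_type]
        return (target - detected_angle + 180.0) % 360.0 - 180.0

    print(rotation_for_tissue("uterus", 300.0))  # 150.0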

In the embodiments and the modifications, the rotation of the endoscope image E is controlled on the basis of the specific tissues F and G in the endoscope image E. Additionally, the rotation of the endoscope image E may be controlled on the basis of the surgical instrument 6 in the endoscope image E.

For example, the processor 11 can operate in a first rotation mode for controlling the rotation of the endoscope image E on the basis of the specific tissues F and G and in a second rotation mode for controlling the rotation of the endoscope image E on the basis of the surgical instrument 6. A user, for example, a surgeon can switch between the first rotation mode and the second rotation mode by using the user interface 16.

In the second rotation mode, the processor 11 detects the angle of the surgical instrument 6 in the current endoscope image E, rotates the endoscope image E by a rotation of the endoscope 2 or image processing such that the angle of the surgical instrument 6 is equal to a predetermined target angle, outputs the rotated endoscope image E to the display device 5, and displays the image on the display screen 5a. The angle of the surgical instrument 6 is, for example, the angle of the longitudinal axis of the shaft of the surgical instrument 6 with respect to the horizon of the endoscope image E.
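
For illustration only, the dispatch between the two rotation modes might be sketched as follows; the dictionary keys and the shortest-rotation convention are assumptions.

    def rotation_correction(mode, detected, targets):
        # First rotation mode: align a specific tissue at its target angle.
        # Second rotation mode: align the surgical instrument shaft at its
        # target angle relative to the image horizon. detected and targets
        # are dicts of angles in degrees.
        key = "tissue" if mode == "first" else "instrument"
        delta = targets[key] - detected[key]
        return (delta + 180.0) % 360.0 - 180.0  # shortest signed rotation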

For a proper operation of the surgical instrument 6 by the surgeon who is observing the endoscope image E, it is important to properly set the angle of the surgical instrument 6 in the endoscope image E displayed on the display screen 5a. However, a movement of the surgical instrument 6 by the surgeon or a change of the orientation of the endoscope 2 following the surgical instrument 6 leads to a change of the angle of the surgical instrument 6 in the endoscope image E.

The surgeon can switch from the first rotation mode to the second rotation mode as needed, so that the surgical instrument 6 in the endoscope image E is displayed at the target angle on the display screen 5a.

In the embodiments and the modifications, the surgeon manually operates the surgical instrument 6 held with his/her hand. Alternatively, as illustrated in FIGS. 14A and 14B, the surgical instrument 6 may be held and controlled by a second moving device 31 that is different from the moving device 3. In this case, the controller 1 may acquire position information on the endoscope 2 and the surgical instrument 6 from the moving device 3 for moving the endoscope 2 and the second moving device 31 for moving the surgical instrument 6. Like the moving device 3, the second moving device 31 holds the surgical instrument 6 with a robot arm or an electric holder and three-dimensionally changes the position and orientation of the surgical instrument 6 under the control of a controller 101. As illustrated in FIG. 14A, the surgical instrument 6 may be connected to the tip of the robot arm and integrated with the robot arm. As illustrated in FIG. 14B, the surgical instrument 6 may be a separate part held by a robot arm.

REFERENCE SIGNS LIST

  • 1 Controller
  • 11 Processor
  • 12 Memory
  • 13 Storage unit, recording medium
  • 14 Input interface
  • 15 Output interface
  • 16 User interface
  • 1a Image control program
  • 1b Learned model
  • 1c Database
  • 2 Endoscope
  • 2a Image sensor
  • 3 Moving device
  • 3a Robot arm
  • 3b, 3c Joint
  • 3d Angle sensor
  • 4 Endoscope processor
  • 5 Display device
  • 5a Display screen
  • 6 Surgical instrument
  • A, B, D, O Position
  • C Optical axis, visual axis
  • P1 First pivot axis
  • P2 Second pivot axis
  • E Endoscope image
  • F Aorta, first specific tissue
  • G Pelvis, second specific tissue
  • H Hole

Claims

1. An endoscope system comprising:

an endoscope that is inserted into a subject and captures an endoscope image in the subject;
a moving device that holds the endoscope and moves the endoscope;
a storage unit; and
a controller including at least one processor,
wherein the storage unit stores first position information and first rotation angle information on a first region in the subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region,
the at least one processor calculates third rotation angle information on a third region in the subject on a basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions,
the at least one processor rotates the endoscope image on a basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and
the at least one processor outputs the rotated endoscope image to a display device.

2. The endoscope system according to claim 1, wherein the at least one processor rotates the endoscope image by image processing.

3. The endoscope system according to claim 1, wherein the moving device is configured to rotate the endoscope about an optical axis of the endoscope, and

the at least one processor rotates the endoscope image by controlling the moving device so as to rotate the endoscope about the optical axis.

4. The endoscope system according to claim 1, wherein the at least one processor is operable in a manual mode that permits a user to operate the endoscope, and

in the manual mode, the at least one processor determines the first position information, the first rotation angle information, the second position information, and the second rotation angle information and stores the first position information, the first rotation angle information, the second position information, and the second rotation angle information in the storage unit.

5. The endoscope system according to claim 4, wherein the at least one processor determines the first position information and the first rotation angle information on a basis of a first specific tissue included in the endoscope image, and

the at least one processor determines the second position information and the second rotation angle information on a basis of a second specific tissue included in the endoscope image.

6. The endoscope system according to claim 5, wherein the storage unit stores a learned model of machine learning of a correspondence between an image including a specific tissue and a type of the specific tissue,

the at least one processor recognizes the first specific tissue and the second specific tissue in the endoscope image by using the learned model stored in the storage unit,
the at least one processor determines the first position information on a basis of a position of an imaging region of the endoscope image in which the first specific tissue is recognized, and determines the first rotation angle information on a basis of a rotation angle of the first specific tissue in the endoscope image, and
the at least one processor determines the second position information on a basis of a position of an imaging region of the endoscope image in which the second specific tissue is recognized, and determines the second rotation angle information on a basis of a rotation angle of the second specific tissue in the endoscope image.

7. The endoscope system according to claim 4, wherein the controller further includes a user interface that receives a user instruction,

the at least one processor determines the first position information on a basis of a position of the imaging region at a time of reception of a first instruction by the user interface, and determines the first rotation angle information on a basis of a rotation angle about an optical axis of the endoscope at a time of reception of the first instruction by the user interface, and
the at least one processor determines the second position information on a basis of a position of the imaging region at a time of reception of a second instruction by the user interface, and determines the second rotation angle information on a basis of a rotation angle about the optical axis of the endoscope at a time of reception of the second instruction by the user interface.

8. The endoscope system according to claim 4, wherein the at least one processor stores a first endoscope image and a second endoscope image in the storage unit, the first endoscope image serving as an endoscope image of the first region, the second endoscope image serving as an endoscope image of the second region.

9. The endoscope system according to claim 8, wherein the at least one processor determines which one of the first region, the second region, and the third region includes the current imaging region on a basis of the first endoscope image and the second endoscope image that are stored in the storage unit.

10. The endoscope system according to claim 1, wherein the first position information and the second position information are stored in advance in the storage unit, the first and second position information being determined on a basis of an examination image in the subject captured before a surgical operation.

11. The endoscope system according to claim 1, wherein the at least one processor rotates the endoscope image on a basis of the first rotation angle information if the first region includes the current imaging region, and

the at least one processor rotates the endoscope image on a basis of the second rotation angle information if the second region includes the current imaging region.

12. The endoscope system according to claim 11, wherein the processor determines which one of the first region, the second region, and the third region includes the current imaging region on a basis of a position of the imaging region, the first position information, and the second position information.

13. The endoscope system according to claim 1, wherein the endoscope is supported so as to pivot about a first pivot axis at a predetermined pivot point fixed to the subject, the endoscope pivots about the first pivot axis so as to move the imaging region between the first region and the second region, and

the first position information, the second position information, and the third position information each include a rotation angle of the endoscope about the first pivot axis.

14. The endoscope system according to claim 13, wherein the endoscope is supported so as to pivot about a second pivot axis at the predetermined pivot point, the second pivot axis being orthogonal to the first pivot axis, and

the first position information, the second position information, and the third position information are each three-dimensional information and further include a rotation angle of the endoscope about the second pivot axis.

15. The endoscope system according to claim 13, wherein the moving device includes at least one joint and at least one angle sensor that detects a rotation angle of the at least one joint, and

the processor calculates a rotation angle of the endoscope about the first pivot axis on a basis of the rotation angle detected by the at least one angle sensor.

16. The endoscope system according to claim 1, wherein the storage unit stores a database in which a type of a specific tissue and rotation angle information are associated with each other, and

if the specific tissue is included in the endoscope image of the third region,
the processor reads, from the database, the rotation angle information corresponding to the type of the specific tissue in the endoscope image and
the processor rotates the endoscope image on a basis of the read rotation angle information.

17. The endoscope system according to claim 1, wherein the at least one processor calculates a positional relationship between the third position information and the first and second position information and

the at least one processor calculates the third rotation angle information on a basis of the positional relationship, the first rotation angle information, and the second rotation angle information.

18. The endoscope system according to claim 3, wherein the at least one processor rotates the endoscope image by image processing if a rotation angle of the endoscope about the optical axis reaches a critical angle of a predetermined rotatable range.

19. The endoscope system according to claim 1, wherein the endoscope is a direct-view endoscope or an oblique endoscope.

20. The endoscope system according to claim 1, wherein the endoscope has a curved portion electrically operated to be bent at a tip portion of the endoscope.

21. The endoscope system according to claim 1, wherein the at least one processor is configured to update the first position information or the first rotation angle information to position information or rotation angle information on the current imaging region if the current imaging region is included in the first region, and

the at least one processor is configured to update the second position information or the second rotation angle information to the position information or the rotation angle information on the current imaging region if the current imaging region is included in the second region.

22. The endoscope system according to claim 4, wherein in the manual mode, the at least one processor calculates the third position information and the third rotation angle information and stores the third position information and the third rotation angle information in the storage unit.

23. The endoscope system according to claim 1, wherein the at least one processor is operable in an autonomous mode for autonomously moving the endoscope by controlling the moving device, and

the at least one processor calculates the third position information and the third rotation angle information during the autonomous mode.

24. A controller configured to control an endoscope image that is captured by an endoscope and is displayed on a display device,

the controller comprising:
a storage unit; and
at least one processor,
wherein the storage unit stores first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region,
the at least one processor calculates third rotation angle information on a third region in the subject on a basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions,
the at least one processor rotates the endoscope image on a basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and
the at least one processor outputs the rotated endoscope image to the display device.

25. A control method for controlling an endoscope image that is captured by an endoscope and is displayed on a display device, by using first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject,

the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region,
the control method comprising the steps of:
calculating third rotation angle information on a third region in the subject on a basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions;
rotating the endoscope image on a basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope; and
outputting the rotated endoscope image to the display device.

26. A computer-readable non-transitory recording medium in which a control program for causing a computer to perform the control method according to claim 25 is stored.

Patent History
Publication number: 20230180998
Type: Application
Filed: Feb 3, 2023
Publication Date: Jun 15, 2023
Applicants: OLYMPUS CORPORATION (Tokyo), National Cancer Center (Tokyo)
Inventors: Chiharu MIZUTANI (Tokyo), Masaru YANAGIHARA (Tokyo), Hiroto OGIMOTO (Tokyo), Hiro HASEGAWA (Tokyo), Daichi KITAGUCHI (Tokyo), Nobuyoshi TAKESHITA (Tokyo), Shigehiro KOJIMA (Tokyo), Yuki FURUSAWA (Tokyo), Yumi KINEBUCHI (Tokyo), Masaaki ITO (Tokyo)
Application Number: 18/105,300
Classifications
International Classification: A61B 1/00 (20060101); A61B 1/045 (20060101);