CONTROLLER, ENDOSCOPE SYSTEM, CONTROL METHOD, AND CONTROL PROGRAM

- Olympus

A controller controls an image that is captured by an image sensor of an endoscope and is displayed on the display screen of a display device. The controller includes a processor. The processor acquires a first image that is an image captured by the image sensor, detects a first angle of a surgical instrument, and generates a second image rotated with respect to the first image on the basis of the first angle and a predetermined target angle. The first angle is an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image. A second angle formed by the surgical instrument in the second image with respect to the predetermined reference line is equal to the predetermined target angle.

Description
Technical Field

The present invention relates to a controller, an endoscope system, a control method, and a control program, and particularly relates to a controller, an endoscope system, a control method, and a control program, by which an endoscope image displayed on a display device is controlled.

The present application claims priority to U.S. Provisional Patent Application No. 63/076,408, filed on Sep. 10, 2020, which is incorporated herein by reference. This application is a continuation of International Application PCT/JP2021/033209, which is hereby incorporated by reference herein in its entirety.

Background Art

Conventionally, a system has been proposed to move the field of view of an endoscope in a semiautonomous manner by causing the endoscope to follow a surgical instrument (for example, see PTL 1 and PTL 2).

Citation List

Patent Literature

  • [PTL 1] Japanese Unexamined Patent Application Publication No. 2003-127076
  • [PTL 2] Japanese Unexamined Patent Application Publication No. 2001-112704

SUMMARY OF INVENTION

An aspect of the present invention is a controller configured to control an image that is captured by an image sensor of an endoscope and is displayed on the display screen of a display device, the controller including a processor, wherein the processor acquires a first image that is an image captured by the image sensor, the processor detects a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image, and the processor generates a second image rotated with respect to the first image on the basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.

Another aspect of the present invention is an endoscope system including an endoscope, a moving device that includes a robot arm and that moves the endoscope in a subject, and the controller.

Another aspect of the present invention is a control method for controlling an image that is captured by an image sensor of an endoscope and is displayed on the display screen of a display device, the control method including: acquiring a first image that is an image captured by the image sensor; detecting a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image, and generating a second image rotated with respect to the first image on the basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.

Another aspect of the present invention is a non-transitory computer-readable medium having a control program stored therein, the program being for controlling an image that is captured by an image sensor of an endoscope and is displayed on the display screen of a display device, the program causing a processor to execute functions of: acquiring a first image that is an image captured by the image sensor; detecting a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image; and generating a second image rotated with respect to the first image on the basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.

BRIEF DESCRIPTION OF DRAWINGS

FIG. 1A is an overall schematic diagram illustrating an example of an endoscope system according to a first embodiment of the present invention.

FIG. 1B is an overall schematic diagram illustrating another example of the endoscope system according to the first embodiment of the present invention.

FIG. 1C is an overall schematic diagram illustrating another example of the endoscope system according to the first embodiment of the present invention.

FIG. 2 is a block diagram of the endoscope system illustrated in FIGS. 1A to 1C.

FIG. 3A illustrates an example of an endoscope image displayed on the display screen of a display device.

FIG. 3B illustrates another example of an endoscope image displayed on the display screen of a display device.

FIG. 3C illustrates another example of an endoscope image displayed on the display screen of a display device.

FIG. 4 is a flowchart of a control method according to the first embodiment of the present invention.

FIG. 5A illustrates an example of a first image.

FIG. 5B illustrates a second image rotated with respect to the first image in FIG. 5A.

FIG. 6A illustrates another example of the first image.

FIG. 6B illustrates a second image rotated with respect to the first image in FIG. 6A.

FIG. 7A illustrates an example of a third image acquired by a controller in an endoscope system according to a second embodiment of the present invention.

FIG. 7B illustrates an example of a first image acquired by the controller in the endoscope system according to the second embodiment of the present invention.

FIG. 7C illustrates a second image rotated with respect to the first image in FIG. 7B.

FIG. 8 is a flowchart of a control method according to the second embodiment of the present invention.

FIG. 9A illustrates an example of a first image acquired by a controller in an endoscope system according to a third embodiment of the present invention.

FIG. 9B illustrates a second image rotated with respect to the first image in FIG. 9A.

FIG. 10 is a flowchart of a control method according to the third embodiment of the present invention.

FIG. 11A illustrates an example of a first image acquired by a controller in an endoscope system according to a fourth embodiment of the present invention.

FIG. 11B illustrates another example of a first image acquired by the controller in the endoscope system according to the fourth embodiment of the present invention.

FIG. 11C illustrates a second image rotated with respect to the first image in FIG. 11B.

FIG. 12 is a flowchart of a control method according to the fourth embodiment of the present invention.

FIG. 13A illustrates an example of the first image acquired by the controller in a second rotation mode.

FIG. 13B illustrates another example of the first image acquired by the controller in the second rotation mode.

FIG. 13C illustrates a fourth image rotated with respect to the first image in FIG. 13B.

DESCRIPTION OF EMBODIMENTS

First Embodiment

A controller, an endoscope system, a control method, and a control program according to a first embodiment of the present invention will be described below with reference to the accompanying drawings.

As illustrated in FIGS. 1A to 1C, an endoscope system 10 according to the present embodiment is used for a surgical operation in which an endoscope 2 and at least one surgical instrument 6 are inserted into the body of a patient P serving as a subject and an affected part is treated with the surgical instrument 6 while the surgical instrument 6 is observed through the endoscope 2. The endoscope system 10 is used for, for example, laparoscopic surgery.

In the endoscope system 10 of FIG. 1A, the surgical instrument 6 is held with a hand of a surgeon and is manually operated by the surgeon. The endoscope system 10 of FIGS. 1B and 1C includes a moving device 31 that holds and moves the surgical instrument 6 and a controller 101 that controls the moving device 31. As illustrated in FIG. 1B, the surgical instrument 6 may be connected to the tip of the robot arm of the moving device 31 and integrated with the robot arm. Alternatively, the surgical instrument 6 may be a separate part held by the robot arm.

As illustrated in FIGS. 1A to 1C and FIG. 2, the endoscope system 10 includes the endoscope 2, a moving device 3 that holds the endoscope 2 and moves the endoscope 2 in the subject, an endoscope processor 4 that is connected to the endoscope 2 and processes an endoscope image A captured by the endoscope 2, a controller 1 that is connected to the moving device 3 and the endoscope processor 4 and controls the moving device 3, and a display device 5 that is connected to the endoscope processor 4 and displays the endoscope image A.

The endoscope 2 is, for example, a rigid endoscope and includes an image sensor 2a. The image sensor 2a is, for example, a three-dimensional camera provided at the tip portion of the endoscope 2 and captures a stereo image, which includes a tip 6a of the surgical instrument 6, as the endoscope image A (for example, see FIGS. 3A and 3B). The image sensor 2a is, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor and generates an image of a predetermined region by converting light received from the predetermined region into an electric signal through photoelectric conversion. A stereo image serving as the endoscope image A is generated by the endoscope processor 4 or the like performing image processing on two images having a parallax. The surgical instrument 6 has a long shaft 6b and may further include an end effector 6c connected to the tip of the shaft 6b.

The endoscope image A is transmitted from the endoscope 2 to the endoscope processor 4, is subjected to necessary processing in the endoscope processor 4, is transmitted from the endoscope processor 4 to the display device 5, and is displayed on a display screen 5a of the display device 5. The display device 5 may be any display, for example, a liquid crystal display or an organic electroluminescent display. A surgeon operates the surgical instrument 6 in a body while observing the endoscope image A displayed on the display screen 5a. The display device 5 may include an audio system, for example, a speaker.

In addition to the display device 5, a user terminal that communicates with the controller 1 and the endoscope processor 4 via a communication network may be provided to display the endoscope image A on the terminal. The terminal is, for example, a notebook computer, a laptop computer, a tablet computer, or a smartphone, but is not particularly limited thereto.

The moving device 3 includes a robot arm 3a (including an electric scope holder) that holds the endoscope 2 and three-dimensionally controls the position and orientation of the endoscope 2. The moving device 3 in FIGS. 1A to 1C includes the robot arm 3a having a plurality of joints 3b that operate to move the endoscope 2, thereby three-dimensionally changing the position and orientation of the endoscope 2. The moving device 3 may further include a mechanism for rotating the endoscope 2 about a visual axis (optical axis).

As illustrated in FIG. 2, the controller 1 includes at least one processor 1a such as a central processing unit, a memory 1b, a storage unit 1c, an input interface 1d, an output interface 1e, and a user interface 1f. The controller 1 may be, for example, a desktop computer, a tablet computer, a laptop computer, a smartphone, or a cellular phone.

The storage unit 1c is a hard disk or a nonvolatile recording medium including a semiconductor memory such as a flash memory, and stores various programs including a follow-up control program (not illustrated) and an image control program (control program) 1g, as well as data necessary for the processing of the processor 1a. Processing performed by the processor 1a may be partially implemented by dedicated logic circuits or hardware, for example, an FPGA (Field Programmable Gate Array), a SoC (System-on-a-Chip), an ASIC (Application Specific Integrated Circuit), or a PLD (Programmable Logic Device). The processing will be described later.

The storage unit 1c may be a server, e.g., a cloud server connected via a communication network to the controller 1 provided with a communication interface, instead of a recording medium integrated in the controller 1. The communication network may be, for example, a public network such as the Internet, a dedicated line, or a LAN (Local Area Network). The connection of the devices may be wired connection or wireless connection.

Any one of the configurations of the at least one processor 1a, the memory 1b, the storage unit 1c, the input interface 1d, the output interface 1e, and the user interface 1f in the controller 1 may be provided for a user terminal, aside from the endoscope processor 4 and the controller 1. The controller 1 may be integrated with the moving device 3.

The processor 1a performs processing according to the follow-up control program (not illustrated) read in the memory 1b to cause the endoscope 2 to follow the surgical instrument 6 to be followed. Specifically, the processor 1a acquires the three-dimensional position of the tip 6a of the surgical instrument 6 from the endoscope image A and controls the moving device 3 on the basis of the three-dimensional position of the tip 6a and the three-dimensional position of a predetermined target point set in the field of view of the endoscope 2. The target point is, for example, a point that is located on an optical axis and is disposed at a predetermined distance from the tip of the endoscope 2 in a direction parallel to the optical axis. The target point corresponds to a center point C of the endoscope image A. Thus, the controller 1 controls a movement of the endoscope 2 and causes the endoscope 2 to automatically follow the surgical instrument 6 such that the target point is disposed at the tip 6a.
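The follow-up control described above reduces to driving the predetermined target point toward the detected tip 6a. A minimal sketch of one control iteration is given below, assuming simple proportional control; the disclosure does not specify a control law, so the function `follow_up_step`, its `gain`, and the coordinate convention are illustrative assumptions.

```python
import numpy as np

def follow_up_step(tip_pos, target_point, gain=0.5):
    """One follow-up iteration: return a displacement command that
    moves the endoscope's field of view so that the target point
    approaches the instrument tip (proportional control; gain is
    illustrative)."""
    error = np.asarray(tip_pos, dtype=float) - np.asarray(target_point, dtype=float)
    return gain * error  # commanded 3D displacement of the target point
```

A real controller would further convert this displacement into joint commands for the robot arm 3a through the arm's inverse kinematics.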

The processor 1a is configured to operate in a rotation mode. As illustrated in FIGS. 3A to 3C, by performing processing in the rotation mode according to the image control program 1g read in the memory 1b, the processor 1a controls the rotation angle of the endoscope image A displayed on the display screen 5a and adjusts the vertical direction of the endoscope image A such that an angle θ of the surgical instrument 6 is equal to a target angle θt on the display screen 5a. FIGS. 3A to 3C illustrate examples that are different from one another in the position of the surgical instrument 6 and are identical to each other in the angle θ of the surgical instrument 6 and the vertical direction of the endoscope image A. In FIGS. 3A to 3C, reference character Lh denotes a horizontal line extending in the lateral direction (horizontal direction) of the display screen 5a, and reference character Lv denotes a vertical line extending in the longitudinal direction (vertical direction) of the display screen 5a.

The input interface 1d and the output interface 1e are connected to the endoscope processor 4. The controller 1 can acquire the endoscope image A from the endoscope 2 via the endoscope processor 4 and output the endoscope image A to the display device 5 via the endoscope processor 4. The input interface 1d may be directly connected to the endoscope 2 and the output interface 1e may be directly connected to the display device 5 such that the controller 1 can directly acquire the endoscope image A from the endoscope 2 and directly output the endoscope image A to the display device 5.

The user interface 1f has input devices, such as a mouse, a button, a keyboard, and a touch panel, through which users such as a surgeon provide inputs, and receives a user input. The user interface 1f also has a means, for example a switch, that allows a user to switch the rotation mode on and off. When the controller 1 starts, the switch is initially off. When the switch is subsequently turned on by a user, the user interface 1f receives the input of the turn-on of the rotation mode; when the switch is turned off by the user, the user interface 1f receives the input of the turn-off of the rotation mode.

For a proper operation of the surgical instrument 6 by the surgeon who is observing the endoscope image A, it is important to properly set the vertical direction of the endoscope image A displayed on the display screen 5a (that is, the orientations of subjects in the endoscope image A, for example, the surgical instrument 6 and a biological tissue). For example, the surgeon may prefer that the surgical instrument 6 operated with the surgeon's right hand protrude at about 30° from the right side of the endoscope image A. During a surgical operation, however, a movement of the surgical instrument 6 by the surgeon, or a change in the orientation of the endoscope 2 following the surgical instrument 6, changes the angle θ of the surgical instrument 6 in the endoscope image A. When the surgical instrument 6 in the endoscope image A is to be displayed at the target angle θt on the display screen 5a, a user, e.g., a surgeon, can start the rotation mode by turning on the switch.

A control method performed by the processor 1a in the rotation mode will be described below.

As indicated in FIG. 4, the control method according to the present embodiment includes step S1 of receiving the input of the turn-on of the rotation mode, step S2 of acquiring a first image A1, step S3 of detecting a first angle θ1 of the surgical instrument 6 in the first image A1, steps S4 to S6 of generating a second image A2 that is rotated with respect to the first image A1 on the basis of the first angle θ1 and the predetermined target angle θt, step S7 of displaying the second image A2 on the display screen 5a, and step S8 of receiving the input of the turn-off of the rotation mode.

When the user interface 1f receives the input of the turn-on of the rotation mode (YES at step S1), the processor 1a acquires the first image A1, which is the latest endoscope image A, from the endoscope 2 as illustrated in FIGS. 5A and 6A (step S2). The top, bottom, left, and right of the first image A1 correspond to the top, bottom, left, and right of the display screen 5a.

The processor 1a then recognizes the surgical instrument 6 and the shaft 6b in the first image A1 and detects the first angle θ1 of the surgical instrument 6 (step S3). For recognizing the surgical instrument 6 and the shaft 6b, for example, a known image recognition technique according to deep learning is used. The first angle θ1 is an angle formed by a longitudinal axis B of the shaft 6b with respect to a predetermined reference line L.

The predetermined reference line L is a straight line that is set with respect to the plane of the first image A1 and forms a predetermined angle with respect to the horizontal line in the first image A1. The reference line L is fixed relative to the first image A1. In the present embodiment, the reference line L is a horizontal line passing through the center point C of the first image A1 and corresponds to a horizontal line Lh of the display screen 5a.
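Detecting the first angle θ1 in step S3 then amounts to measuring the inclination of the longitudinal axis B against the horizontal reference line L. A minimal sketch follows, assuming the axis is already available as two image points (e.g., from the deep-learning recognition mentioned above); the helper name and the sign convention (image y-coordinates grow downward, so dy is negated to obtain the usual on-screen angle) are assumptions, not part of the disclosure.

```python
import math

def instrument_angle(p_proximal, p_tip):
    """Angle (degrees) of the shaft axis, from its proximal end
    toward the tip, measured against a horizontal reference line.
    Image y grows downward, so dy is negated to get the on-screen
    angle (counterclockwise positive)."""
    dx = p_tip[0] - p_proximal[0]
    dy = -(p_tip[1] - p_proximal[1])
    return math.degrees(math.atan2(dy, dx))
```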

The processor 1a then compares the first angle θ1 with the predetermined target angle θt (step S4). The target angle θt is, for example, a value determined in advance by the surgeon according to the preferences of the surgeon, a fixed value set for each case of surgery (e.g., a surgical site), or a fixed value set for each surgical instrument 6. The target angle θt is stored in advance in, for example, the storage unit 1c.

If the first angle θ1 is equal to the target angle θt (YES at step S4), the processor 1a returns to step S2 without performing steps S5 to S7. In this case, the processor 1a outputs the first image A1 to the display device 5 and displays the image on the display screen 5a.

If the first angle θ1 is different from the target angle θt (NO at step S4), the processor 1a then calculates a difference Δθ between the first angle θ1 and the target angle θt (step S5).
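When computing the difference Δθ in step S5, it is prudent to wrap the result into (−180°, 180°] so that the subsequent rotation takes the shorter direction around. This wrapping is not specified in the disclosure; the sketch below is one illustrative way to do the bookkeeping.

```python
def angle_difference(first_angle, target_angle):
    """Signed difference first_angle - target_angle in degrees,
    wrapped into (-180, 180] so the correcting rotation is
    minimal."""
    d = (first_angle - target_angle) % 360.0
    if d > 180.0:
        d -= 360.0
    return d
```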

Subsequently, as illustrated in FIGS. 5B and 6B, the processor 1a generates the second image A2 that is the endoscope image A rotated by the difference Δθ with respect to the first image A1 (step S6). The second image A2 is an image rotated about a predetermined rotation axis D with respect to the first image A1 by the difference Δθ in a direction that eliminates the difference Δθ (clockwise in FIGS. 5B and 6B). Thus, a second angle θ2 formed by the longitudinal axis B of the shaft 6b in the second image A2 with respect to the reference line L is equal to the target angle θt. The rotation axis D is an axis that passes through the center point C of the first image A1 and is parallel to the optical axis of the endoscope 2 (that is, perpendicular to the plane of the first image A1).

FIGS. 5A and 5B illustrate the first image A1 and the second image A2, respectively, when the tip 6a is disposed at the center point C. FIGS. 6A and 6B illustrate the first image A1 and the second image A2, respectively, when the tip 6a is disposed at a position deviated from the center point C.

In an example of step S6, the processor 1a generates the second image A2 by rotating the image sensor 2a about the optical axis by the difference Δθ. In other words, the image sensor 2a captures the second image A2 that is the endoscope image A rotated by the difference Δθ with respect to the first image A1.

In another example of step S6, the processor 1a generates the second image A2 by rotating the first image A1 by the difference Δθ through image processing.

In another example of step S6, the processor 1a generates the second image A2 by rotating the endoscope 2 by using a rotating mechanism provided at the tip of the moving device 3.
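Of these three variants of step S6, the image-processing variant can be sketched directly. The nearest-neighbor inverse mapping below is illustrative only (a practical implementation would use an optimized library routine with interpolation); with image y growing downward, a positive angle rotates the picture clockwise on screen, the direction shown in FIGS. 5B and 6B.

```python
import math
import numpy as np

def rotate_image(img, angle_deg):
    """Rotate a 2D image about its center by angle_deg using
    inverse mapping with nearest-neighbor sampling (positive =
    clockwise on screen, since image y grows downward)."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    t = math.radians(angle_deg)
    cos_t, sin_t = math.cos(t), math.sin(t)
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            dx, dy = x - cx, y - cy
            # inverse rotation: locate the source pixel for (y, x)
            sx = cos_t * dx + sin_t * dy + cx
            sy = -sin_t * dx + cos_t * dy + cy
            sxi, syi = round(sx), round(sy)
            if 0 <= sxi < w and 0 <= syi < h:
                out[y, x] = img[syi, sxi]
    return out
```

Rotating the image sensor 2a or the endoscope 2 itself, as in the other two variants, avoids the corner loss inherent in rotating a rectangular image in software.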

The processor 1a then outputs the generated second image A2 to the display device 5 and displays the image on the display screen 5a (step S7). The surgical instrument 6 in the second image A2 displayed on the display screen 5a forms the target angle θt with respect to the horizontal line Lh of the display screen 5a.

Until the user interface 1f receives the input of the turn-off of the rotation mode in step S8, the processor 1a regularly performs the acquisition of the first image A1 in step S2 and performs steps S3 to S7 each time a new first image A1 is acquired. Thus, steps S2 to S7 are repeated and the angle θ of the surgical instrument 6 on the display screen 5a is kept at the target angle θt while the rotation mode is executed.

As described above, according to the present embodiment, when the first angle θ1 of the surgical instrument 6 in the first image A1, which is the endoscope image A captured by the endoscope 2, is equal to the predetermined target angle θt, the first image A1 is displayed on the display screen 5a. When the first angle θ1 is different from the predetermined target angle θt, the second image A2 is automatically generated by a rotation with respect to the first image A1 such that the second angle θ2 of the surgical instrument 6 is equal to the target angle θt, and the second image A2 is displayed on the display screen 5a instead of the first image A1.

As described above, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a such that the angle θ of the surgical instrument 6 on the display screen 5a is equal to the target angle θt. Thus, the surgeon who is operating the surgical instrument 6 does not need to manually operate the endoscope 2 to adjust the vertical direction of the endoscope image A. In other words, the endoscope images A1 and A2 can be provided in proper vertical directions that facilitate procedures, without the surgeon having to take a hand off the surgical instrument 6.

Second Embodiment

A controller, an endoscope system, a control method, and a control program according to a second embodiment of the present invention will be described below with reference to the accompanying drawings.

As illustrated in FIGS. 7A and 7B, the present embodiment is different from the first embodiment in that a user sets a target angle θt and a rotation angle Δθ of a second image A2 with respect to a first image A1 by using the angle of a surgical instrument 6 in an endoscope image A. In the present embodiment, configurations different from those of the first embodiment will be described. Configurations in common with the first embodiment are indicated by the same reference numerals and an explanation thereof is omitted.

As in the first embodiment, an endoscope system 10 according to the present embodiment includes a controller 1, an endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5.

A user interface 1f is configured to receive the inputs of a first trigger and a second trigger from a user. For example, the user interface 1f includes a first switch for a user input of the first trigger and a second switch for a user input of the second trigger.

In a rotation mode of the present embodiment, a processor 1a performs a control method shown in FIG. 8.

The control method according to the present embodiment includes step S1, step S11 of receiving the first trigger, step S12 of acquiring a third image A3, step S13 of detecting a third angle θ3 of the surgical instrument 6 in the third image A3, step S14 of setting the target angle θt at the third angle θ3, step S15 of receiving the second trigger, and steps S2, S3, and S5 to S8.

When the user interface 1f receives the input of the turn-on of the rotation mode (YES at step S1), the processor 1a waits for the reception of the first trigger by the user interface 1f (step S11). As illustrated in FIG. 7A, a surgeon moves the surgical instrument 6 so as to place the surgical instrument 6 at a desired angle θ3 in the endoscope image A displayed on a display screen 5a and inputs the first trigger to the user interface 1f.

In response to the reception of the first trigger by the user interface 1f (YES at step S11), the processor 1a acquires the third image A3, which is the latest endoscope image A, from the endoscope 2 (step S12).

The processor 1a then detects the third angle θ3 of the surgical instrument 6 in the third image A3 according to the same method as step S3 (step S13). Like a first angle θ1, the third angle θ3 is an angle formed by a longitudinal axis B of a shaft 6b with respect to a predetermined reference line L.

The processor 1a then sets the target angle θt at the third angle θ3 (step S14).

Thereafter, the processor 1a waits for the reception of the second trigger by the user interface 1f (step S15). As illustrated in FIG. 7B, the surgeon moves the surgical instrument 6 so as to change the angle of the surgical instrument 6 by a desired angle Δθ from the third angle θ3 in the endoscope image A displayed on the display screen 5a and inputs the second trigger to the user interface 1f.

In response to the reception of the second trigger by the user interface 1f (YES at step S15), the processor 1a acquires the first image, which is the latest endoscope image A (step S2), and detects the first angle θ1 of the surgical instrument 6 in the first image A1 (step S3).

Subsequently, the processor 1a calculates a difference Δθ between the first angle θ1 and the target angle θt (step S5) and generates, as illustrated in FIG. 7C, the second image A2 that is the endoscope image A rotated by the difference Δθ with respect to the first image A1 (step S6). The second image A2 is the endoscope image A displayed against a background including a biological tissue E present behind the surgical instrument 6, the background being rotated by an angle Δθ desired by a user with respect to the third image A3. A second angle θ2 of the surgical instrument 6 in the second image A2 is equal to the target angle θt that is the third angle θ3 of the surgical instrument 6 in the third image A3.
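The two-trigger logic of steps S14 and S5 can be condensed as follows; the function name and the wrapping of the difference into (−180°, 180°] are illustrative assumptions, not part of the disclosure.

```python
def rotation_from_triggers(third_angle, first_angle):
    """Second-embodiment logic: the angle detected at the first
    trigger becomes the target (step S14), and the rotation applied
    after the second trigger is the difference from it (step S5),
    wrapped into (-180, 180] degrees."""
    target = third_angle  # step S14: theta_t := theta_3
    d = (first_angle - target) % 360.0
    if d > 180.0:
        d -= 360.0
    return target, d
```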

Thereafter, as in the first embodiment, steps S2 to S7 are repeated and the angle θ of the surgical instrument 6 on the display screen 5a is kept at the target angle θt.

As described above, according to the present embodiment, the surgeon can set the target angle θt and the difference Δθ at desired values by using the surgical instrument 6 in the endoscope image A. Moreover, the surgeon can rotate the background of the surgical instrument 6 by a desired angle Δθ and properly adjust the orientation of the background. For example, when the biological tissue E is diagonally laid in the endoscope image A as illustrated in FIG. 7A, the vertical direction of the endoscope image A displayed on the display screen 5a can be adjusted so as to place the biological tissue E in the horizontal direction suitable for a procedure as illustrated in FIG. 7C. Thus, the rotation mode of the present embodiment is effective for adjusting the angle of the background in the endoscope image A displayed on the display screen 5a.

Furthermore, after the target angle θt and the difference Δθ are set, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a as in the first embodiment such that the angle θ of the surgical instrument 6 on the display screen 5a is equal to the target angle θt. Hence, the endoscope images A1 and A2 can be provided in proper vertical directions that facilitate procedures, without the surgeon having to take a hand off the surgical instrument 6.

Third Embodiment

A controller, an endoscope system, a control method, and a control program according to a third embodiment of the present invention will be described below with reference to the accompanying drawings.

As illustrated in FIGS. 9A and 9B, the present embodiment is different from the first embodiment in that an angle θ of a surgical instrument 6 is the deflection angle of a tip 6a. In the present embodiment, configurations different from those of the first embodiment will be described. Configurations in common with the first embodiment are indicated by the same reference numerals and an explanation thereof is omitted.

As in the first embodiment, an endoscope system 10 according to the present embodiment includes a controller 1, an endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5.

In a rotation mode of the present embodiment, a processor 1a controls the rotation angle of an endoscope image A displayed on a display screen 5a such that the tip 6a of the surgical instrument 6 is disposed on a predetermined horizontal line Lh passing through the center of the display screen 5a. Specifically, the processor 1a performs a control method shown in FIG. 10.

The control method according to the present embodiment includes steps S1 and S2, step S21 of determining whether the tip 6a of the surgical instrument 6 is disposed on a predetermined reference line L, and steps S3 and S6 to S8.

As in the first embodiment, in response to the reception of the input of the turn-on of the rotation mode by a user interface 1f (YES at step S1), the processor 1a acquires a first image A1 (step S2).

The processor 1a then recognizes the surgical instrument 6 in the first image A1 and determines whether the tip 6a of the surgical instrument 6 is disposed on the predetermined reference line L (step S21). The predetermined reference line L is a horizontal line passing through a center point C of the first image A1 and corresponds to a predetermined horizontal line Lh passing through the center point of the display screen 5a.

If the tip 6a is disposed on the reference line L (YES at step S21), the processor 1a returns to step S2 without performing steps S3 and S6 to S8. In this case, the processor 1a outputs the first image A1 to the display device 5 and displays the image on the display screen 5a.

As illustrated in FIG. 9A, when the tip 6a is not disposed on the reference line L (NO at step S21), the processor 1a detects a first angle θ1 of the surgical instrument 6 in the first image A1 (step S3). The first angle θ1 is the angle formed by a line F connecting the center point C and the tip 6a with respect to the reference line L.

Subsequently, the processor 1a generates a second image A2 that is the endoscope image A rotated by the first angle θ1 with respect to the first image A1 (step S6). In other words, in the present embodiment, a target angle θt is 0°. In the second image A2, the tip 6a has a deflection angle (second angle) of 0° and is disposed on the predetermined reference line L.

Thereafter, steps S2, S21, S3, S6, and S7 are repeated, so that the angle θ of the surgical instrument 6 on the display screen 5a is kept at 0° and the tip 6a of the surgical instrument 6 is kept on the horizontal line Lh.
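As an illustrative sketch (not part of the specification), the geometry of steps S3 and S6 in this embodiment can be written as follows. The coordinates and image center below are hypothetical, the y-axis is treated as pointing upward for simplicity, and the sign convention for the rotation is an assumption:

```python
import math

def tip_deflection_angle(center, tip):
    # Angle of the line F from the image center C to the instrument tip 6a,
    # measured against the horizontal reference line L (radians, CCW positive).
    return math.atan2(tip[1] - center[1], tip[0] - center[0])

def rotate_point(point, center, angle):
    # Rotate a point about `center` by `angle` radians (CCW positive).
    dx, dy = point[0] - center[0], point[1] - center[1]
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    return (center[0] + dx * cos_a - dy * sin_a,
            center[1] + dx * sin_a + dy * cos_a)

# Hypothetical coordinates: rotating by -theta1 (target angle 0 degrees)
# brings the tip onto the horizontal line through the center.
center = (320.0, 240.0)
tip = (400.0, 300.0)
theta1 = tip_deflection_angle(center, tip)
rotated_tip = rotate_point(tip, center, -theta1)  # lands on y == center y
```

Applying the same rotation to every pixel of the first image A1 (about the rotation axis D through the center point C) would yield the second image A2 with the tip on the reference line L.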

As described above, according to the present embodiment, when the tip 6a of the surgical instrument 6 in the first image A1, which is the endoscope image A captured by the endoscope 2, is disposed on the reference line L, the first image A1 is displayed on the display screen 5a. When the tip 6a is not disposed on the reference line L, the second image A2 rotated with respect to the first image A1 is automatically generated with the tip 6a disposed on the reference line L. The second image A2 is displayed on the display screen 5a instead of the first image A1.

As described above, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a such that the tip 6a of the surgical instrument 6 on the display screen 5a is disposed on the predetermined horizontal line Lh. Thus, the surgeon who is operating the surgical instrument 6 does not need to manually operate the endoscope 2 to adjust the vertical direction of the endoscope image A. In other words, the endoscope images A1 and A2 can be provided in proper vertical directions so as to facilitate procedures for the surgeon, without requiring the surgeon to take a hand off the surgical instrument 6.

In the present embodiment, the predetermined reference line L is a horizontal line passing through the center point C. The direction and position of the predetermined reference line L can be optionally changed.

For example, if the tip 6a is to be disposed on a vertical line Lv passing through the center point of the display screen 5a, the predetermined reference line L may be a vertical line passing through the center point C. Alternatively, if the tip 6a is to be disposed on an inclined straight line passing through the center point of the display screen 5a, the predetermined reference line L may be an inclined straight line passing through the center point C.

Fourth Embodiment

A controller, an endoscope system, a control method, and a control program according to a fourth embodiment of the present invention will be described below with reference to the accompanying drawings.

As illustrated in FIGS. 11A to 11C, the present embodiment is different from the first embodiment in that a second image A2 is generated when a deviation of a first angle θ1 of a surgical instrument 6 in a first image A1 from a target angle θt exceeds a predetermined threshold value X. In the present embodiment, configurations different from those of the first embodiment will be described. Configurations in common with the first embodiment are indicated by the same reference numerals and an explanation thereof is omitted.

As in the first embodiment, an endoscope system 10 according to the present embodiment includes a controller 1, an endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5.

In the present embodiment, while the controller 1 controls the moving device 3, a processor 1a always operates in a rotation mode and performs a control method shown in FIG. 12.

The control method according to the present embodiment includes steps S2 and S3, step S31 of determining whether the absolute value of a difference between the first angle θ1 and the target angle θt is at most a predetermined threshold value X, and steps S5 to S7.

The processor 1a acquires the first image A1, which is the latest endoscope image A, from the endoscope 2 (step S2) and detects the first angle θ1 of the surgical instrument 6 in the first image A1 (step S3).

The processor 1a then determines whether the first angle θ1 is within a range of the target angle θt±X (step S31).

As illustrated in FIG. 11A, if the first angle θ1 is within a range of the target angle θt±X (YES at step S31), that is, if the absolute value of a difference between the angles θ1 and θt is equal to or smaller than the threshold value X, the processor 1a returns to step S2 without performing steps S5 to S7. In this case, the processor 1a outputs the first image A1 to the display device 5 and displays the image on the display screen 5a.

As illustrated in FIG. 11B, if the first angle θ1 is out of a range of the target angle θt±X (NO at step S31), that is, if the absolute value of a difference between the angles θ1 and θt is larger than the threshold value X, the processor 1a then calculates a difference Δθ between the first angle θ1 and the target angle θt (step S5) and generates the second image A2 that is the endoscope image A rotated by the difference Δθ with respect to the first image A1 (step S6).

As in the first embodiment, by repeating steps S2, S3, S31, and S5 to S7, the endoscope image A displayed on the display screen 5a is rotated each time the first angle θ1 deviates from the range of the target angle θt ± X, so that the angle of the surgical instrument 6 on the display screen 5a is kept in a predetermined range including the target angle θt.
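The threshold logic of step S31 amounts to a dead band around the target angle. A minimal sketch, with hypothetical angle values in degrees:

```python
def rotation_update(theta1_deg, theta_t_deg, threshold_deg):
    # Step S31: keep the first image A1 while theta1 stays within theta_t +/- X.
    delta = theta1_deg - theta_t_deg
    if abs(delta) <= threshold_deg:
        return None      # YES at step S31: display A1 unchanged
    return delta         # NO at step S31: rotate by the difference (step S6)
```

With θt = 45° and X = 10°, a detected angle of 50° leaves the image unchanged, while 60° triggers a 15° rotation so that the second angle θ2 equals θt. The dead band prevents the displayed image from rotating continuously with every small movement of the instrument.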

As described above, according to the present embodiment, when the first angle θ1 of the surgical instrument 6 in the first image A1, which is the endoscope image A captured by the endoscope 2, is within a range of the target angle θt±X, the first image A1 is displayed on the display screen 5a. When the first angle θ1 deviates from a range of the target angle θt±X, the second image A2 is automatically generated by a rotation with respect to the first image A1 such that the second angle θ2 of the surgical instrument 6 is equal to the target angle θt, and the second image A2 is displayed on the display screen 5a instead of the first image A1.

As described above, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a such that the angle of the surgical instrument 6 on the display screen 5a is within the range of the target angle θt ± X. Thus, the surgeon who is operating the surgical instrument 6 does not need to manually operate the endoscope 2 to adjust the vertical direction of the endoscope image A. In other words, the endoscope images A1 and A2 can be provided in proper vertical directions so as to facilitate procedures for the surgeon, without requiring the surgeon to take a hand off the surgical instrument 6.

Also in the present embodiment, the control method may include steps S1 and S8 as in the first to third embodiments. The processor 1a may start and end the rotation mode in response to the inputs of the turn-on and turn-off of the rotation mode to a user interface 1f.

In the foregoing embodiments, the predetermined reference line L is a horizontal line of the first image A1 but is not limited thereto. A line extending in any direction can be set as the reference line L.

For example, the predetermined reference line L may be a vertical line extending in the longitudinal direction (vertical direction) of the first image A1 or a line extending in an oblique direction of the first image A1.

In the foregoing embodiments, the processor 1a generates the second image A2 rotated about the rotation axis D that passes through the center point C of the first image A1 and is parallel to the optical axis. The direction and position of the rotation axis D can be optionally changed depending upon a user request or the like. In other words, the rotation axis D may be inclined with respect to the optical axis and may pass through a position deviated from the center point C.

In the foregoing embodiments, the processor 1a may be configured to operate in any one of a first rotation mode in which the second image A2 is generated on the basis of the first angle θ1 of the surgical instrument 6 in the first image A1 and a second rotation mode in which a fourth image A4 is generated on the basis of anatomical characteristics in the first image A1. The processor 1a performs the second rotation mode in response to the reception of the input of the turn-off of the rotation mode by the user interface 1f.

In the first rotation mode, the processor 1a performs any one of the control methods according to the first to fourth embodiments.

As illustrated in FIGS. 13A and 13B, a movement of the endoscope 2 in a subject changes the angle of an anatomical characteristic G of a biological tissue in the endoscope image A displayed on the display screen 5a. The angle of the anatomical characteristic G is a rotation angle around the center point C of the endoscope image A. The second rotation mode is a mode in which the anatomical characteristic G in the endoscope image A is displayed at a predetermined target angle θs on the display screen 5a. The second rotation mode is used in scenes other than a procedure on an affected part, for example, before and after the surgical instrument 6 performs a procedure on the affected part.

In the storage unit 1c, a predetermined target angle θs of the anatomical characteristic G in the endoscope image A is stored for each type of the anatomical characteristic G. In the second rotation mode, the processor 1a acquires the first image A1 from the endoscope 2, detects the anatomical characteristic G of a biological tissue in the first image A1, and recognizes the type of the anatomical characteristic G. Thereafter, as illustrated in FIG. 13C, the processor 1a generates the fourth image A4 that is rotated with respect to the first image A1 and has the anatomical characteristic G with an angle equal to the target angle θs, on the basis of the angle of the anatomical characteristic G in the first image A1 and the target angle θs of the type of the anatomical characteristic G stored in the storage unit 1c. For example, if the anatomical characteristic G is an aorta, the target angle θs is an angle of an aorta horizontally placed in the lower part of the endoscope image A.
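As a sketch of the second rotation mode's lookup, assuming a simple per-type table in place of the storage unit 1c (the entries and angle values below are hypothetical, except that the text's example places the aorta horizontally):

```python
# Hypothetical table standing in for storage unit 1c: one target angle
# theta_s per anatomical-characteristic type (values are illustrative only).
TARGET_ANGLE_DEG = {
    "aorta": 0.0,    # aorta shown horizontally, per the example in the text
    "ureter": 90.0,  # hypothetical entry
}

def second_mode_rotation(feature_type, detected_angle_deg):
    # Rotation (degrees) that takes the detected characteristic G to its
    # stored target angle theta_s; applying it to A1 yields the fourth image A4.
    theta_s = TARGET_ANGLE_DEG[feature_type]
    return theta_s - detected_angle_deg
```

For instance, an aorta detected at 30° would call for a −30° rotation of the first image A1 to produce the fourth image A4.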

When the vertical direction of the endoscope image A displayed on the display screen 5a changes, the layout of organs and biological tissues appears different. Hence, in order to facilitate the recognition of organs and biological tissues in the endoscope image A by a surgeon, it is important to properly set the vertical direction of the endoscope image A displayed on the display screen 5a. When the first rotation mode is turned off, the processor 1a controls the vertical direction of the endoscope image A displayed on the display screen 5a in the second rotation mode. This properly adjusts the vertical direction of the endoscope image A displayed on the display screen 5a, thereby displaying the anatomical characteristic G of the biological tissue at the predetermined target angle θs on the display screen 5a.

The first to fourth embodiments may be implemented in combination as appropriate. For example, the processor 1a may be operable in the four rotation modes described in the first to fourth embodiments. In this case, a user may input selected one of the four rotation modes to the user interface 1f, and the processor 1a may perform the inputted rotation mode.

REFERENCE SIGNS LIST

  • 1 Controller
  • 1a Processor
  • 1c Storage unit
  • 1f User interface
  • 1g Control program
  • 2 Endoscope
  • 3 Moving device
  • 5 Display device
  • 5a Display screen
  • 6 Surgical instrument
  • 6b Shaft
  • 10 Endoscope system
  • θ1 First angle
  • θ2 Second angle
  • θ3 Third angle
  • θt Target angle
  • Δθ Difference
  • A Endoscope image (image)
  • A1 Endoscope image, first image
  • A2 Endoscope image, second image
  • A3 Endoscope image, third image
  • A4 Endoscope image, fourth image
  • B Shaft longitudinal axis
  • C Center point
  • D Rotation axis
  • G Anatomical characteristic
  • L Reference line
  • Lh Horizontal line

Claims

1. A controller configured to control an image that is captured by an image sensor of an endoscope and is displayed on a display screen of a display device,

the controller comprising a processor,
wherein the processor acquires a first image that is an image captured by the image sensor,
the processor detects a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image, and
the processor generates a second image rotated with respect to the first image on a basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.

2. The controller according to claim 1, wherein the processor generates the second image by rotating the image sensor.

3. The controller according to claim 1, wherein the processor generates the second image by rotating the first image through image processing.

4. The controller according to claim 1, wherein the target angle is 0°.

5. The controller according to claim 1, wherein the processor rotates the first image about a rotation axis parallel to an optical axis of the endoscope.

6. The controller according to claim 5, wherein the rotation axis passes through a center point of the first image.

7. The controller according to claim 1, wherein the first angle is an angle formed by a longitudinal axis of a shaft of the surgical instrument in the first image with respect to the reference line, and

the second angle is an angle formed by the longitudinal axis of the shaft of the surgical instrument in the second image with respect to the reference line.

8. The controller according to claim 1, wherein the reference line is a straight line that forms a predetermined angle with respect to a horizontal line in the first image and is fixed relative to the first image.

9. The controller according to claim 1, wherein the reference line is a horizontal line that is set in the first image and corresponds to a horizontal line on the display screen.

10. The controller according to claim 1, wherein the reference line is a vertical line that is set in the first image and corresponds to a vertical line on the display screen.

11. The controller according to claim 1, wherein the processor generates the second image when an absolute value of a difference between the first angle and the target angle is larger than a threshold value.

12. The controller according to claim 1, further comprising a user interface that receives an input of a trigger from a user,

wherein the processor acquires a third image in response to reception of a first trigger by the user interface, the third image being an image captured by the image sensor, and
the processor sets the target angle at a third angle of the surgical instrument in the third image, the third angle being an angle formed by the surgical instrument in the third image with respect to the reference line.

13. The controller according to claim 12, wherein after the target angle is set, the processor acquires the first image in response to reception of a second trigger by the user interface.

14. The controller according to claim 1, wherein the processor sets an angle of the surgical instrument at the target angle, the surgical instrument having a tip disposed on a predetermined horizontal line on the display screen.

15. The controller according to claim 1, wherein the processor is operable in a rotation mode, and

in the rotation mode, the processor regularly acquires the first image, detects the first angle, and generates the second image.

16. The controller according to claim 15, further comprising a user interface that receives an input of turn-on/turn-off of the rotation mode.

17. The controller according to claim 16, further comprising a storage unit that stores the predetermined target angle of an anatomical characteristic for each type of the anatomical characteristic,

wherein when the user interface receives the input of the turn-off of the rotation mode,
the processor recognizes the type of the anatomical characteristic in the first image, and
the processor generates a fourth image rotated with respect to the first image on a basis of the predetermined target angle of the recognized type, the predetermined target angle being stored in the storage unit.

18. An endoscope system comprising:

an endoscope;
a moving device that comprises a robot arm and that moves the endoscope in a subject; and
the controller according to claim 1.

19. A control method for controlling an image that is captured by an image sensor of an endoscope and is displayed on a display screen of a display device,

the control method comprising:
acquiring a first image that is an image captured by the image sensor;
detecting a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image; and
generating a second image rotated with respect to the first image on a basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.

20. A non-transitory computer-readable medium having a control program stored therein, the program being for controlling an image that is captured by an image sensor of an endoscope and is displayed on a display screen of a display device, the program causing a processor to execute functions of:

acquiring a first image that is an image captured by the image sensor;
detecting a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image; and
generating a second image rotated with respect to the first image on a basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.
Patent History
Publication number: 20230180996
Type: Application
Filed: Feb 3, 2023
Publication Date: Jun 15, 2023
Applicants: OLYMPUS CORPORATION (Tokyo), National Cancer Center (Tokyo)
Inventors: Mingxuan DAI (Tokyo), Ryota SASAI (Tokyo), Hiro HASEGAWA (Tokyo), Daichi KITAGUCHI (Tokyo), Nobuyoshi TAKESHITA (Tokyo), Shigehiro KOJIMA (Tokyo), Yuki FURUSAWA (Tokyo), Yumi KINEBUCHI (Tokyo), Masaaki ITO (Tokyo)
Application Number: 18/105,314
Classifications
International Classification: A61B 1/00 (20060101); A61B 1/045 (20060101); G06T 3/60 (20060101); H04N 23/62 (20060101); H04N 23/667 (20060101); G06T 7/70 (20060101);