IMAGE PROCESSING METHOD, PROGRAM, IMAGE PROCESSING DEVICE, AND OPHTHALMIC SYSTEM
An image processing method is provided. The image processing method includes: setting a first analysis point and a second analysis point on a fundus image so as to be symmetrical about a reference line; finding a first blood vessel running direction at the first analysis point and finding a second blood vessel running direction at the second analysis point; and comparing the first blood vessel running direction against the second blood vessel running direction.
This application is a continuation application of International Application No. PCT/JP2019/016651 filed Apr. 18, 2019, the disclosure of which is incorporated herein by reference in its entirety. Further, this application claims priority from Japanese Patent Application No. 2018-080272, filed Apr. 18, 2018, the disclosure of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
Technology disclosed herein relates to an image processing method, a program, an image processing device, and an ophthalmic system.
RELATED ART
Technology is disclosed in Japanese Patent Application Laid-Open (JP-A) No. 2015-202236 for extracting blood vessel regions and measuring blood vessel diameters. There has hitherto been demand to analyze fundus images and to measure blood vessel diameters.
SUMMARY
An image processing method of a first aspect of technology disclosed herein includes setting a first analysis point and a second analysis point on a fundus image so as to be symmetrical about a reference line, finding a first blood vessel running direction at the first analysis point and finding a second blood vessel running direction at the second analysis point, and analyzing asymmetry between the first blood vessel running direction and the second blood vessel running direction.
An image processing method of a second aspect of technology disclosed herein includes setting plural first analysis points in a first region in a fundus image and setting plural second analysis points in a second region in the fundus image, finding a first blood vessel running direction for each of the plural first analysis points, and finding a second blood vessel running direction for each of the plural second analysis points, and defining plural pairs of a first analysis point and a second analysis point that have line symmetry between the plural first analysis points and the plural second analysis points, and finding a symmetry indicator indicating symmetry between the first blood vessel running direction and the second blood vessel running direction for each of the plural defined pairs.
A program of a third aspect of technology disclosed herein causes the image processing method of the first aspect or the second aspect to be executed by a computer.
An image processing device of a fourth aspect of technology disclosed herein includes a storage device configured to store a program causing an image processing method to be executed in a processing device, and a processing device configured to execute the image processing method by executing the program stored in the storage device, wherein the image processing method is the image processing method of the first aspect or the second aspect.
An ophthalmic system of a fifth aspect of technology disclosed herein includes the image processing device of the fourth aspect, and an ophthalmic device configured to image a fundus image.
Detailed explanation follows regarding an exemplary embodiment of the present invention, with reference to the drawings. In the following, for ease of explanation, a scanning laser ophthalmoscope is referred to as an "SLO".
The configuration of an ophthalmic system 100 will now be described with reference to
The ophthalmic device 110, the eye axial length measuring instrument 120, the management server 140, and the image viewer 150 are connected to each other over a network 130.
Note that other ophthalmic instruments (instruments for performing examinations such as optical coherence tomography (OCT) measurement, field of view measurement, and intraocular pressure measurement) and a diagnostic support device that performs image analysis using artificial intelligence may be connected over the network 130 to the ophthalmic device 110, the eye axial length measuring instrument 120, the management server 140, and the image viewer 150.
Explanation follows regarding a configuration of the ophthalmic device 110, with reference to
The control unit 20 includes a CPU 22, memory 24, a communication interface (I/F) 26, and the like. The display/operation unit 30 is a graphical user interface to display images obtained by imaging, and to receive various instructions including an imaging instruction. The display/operation unit 30 also includes a display 32 and an input/instruction device 34 such as a touch panel.
The SLO unit 40 includes a light source 42 for green light (G-light: wavelength 530 nm), a light source 44 for red light (R-light: wavelength 650 nm), and a light source 46 for infrared radiation (IR-light (near-infrared light): wavelength 800 nm). The light sources 42, 44, 46 respectively emit light as commanded by the control unit 20. The SLO unit 40 includes optical systems 50, 52, 54 and 56 that reflect or transmit light from the light sources 42, 44 and 46 in order to guide the light into a single optical path. The optical systems 50 and 56 are mirrors, and the optical systems 52 and 54 are beam splitters. The G-light is reflected by the optical systems 50 and 54, the R-light is transmitted through the optical systems 52 and 54, and the IR-light is reflected by the optical systems 52 and 56, such that all are guided into a single optical path.
The SLO unit 40 includes a wide-angle optical system 80 for two-dimensionally scanning light from the light sources 42, 44, 46 across the posterior segment (fundus) of the examined eye 12. The SLO unit 40 includes a beam splitter 58 that, from out of the light from the posterior segment (fundus) of the examined eye 12, reflects the G-light and transmits light other than the G-light. The SLO unit 40 includes a beam splitter 60 that, from out of the light transmitted through the beam splitter 58, reflects the R-light and transmits light other than the R-light. The SLO unit 40 includes a beam splitter 62 that, from out of the light that has transmitted through the beam splitter 60, reflects IR-light. The SLO unit 40 is provided with a G-light detection element 72 that detects the G-light reflected by the beam splitter 58, an R-light detection element 74 that detects the R-light reflected by the beam splitter 60, and an IR-light detection element 76 that detects IR-light reflected by the beam splitter 62.
The wide-angle optical system 80 includes an X-direction scanning device 82 configured by a polygon mirror to scan the light from the light sources 42, 44, 46 in an X direction, a Y-direction scanning device 84 configured by a galvanometer mirror to scan the light from the light sources 42, 44, 46 in a Y direction, and an optical system 86 including a non-illustrated slit mirror and elliptical mirror to widen the angle over which the light is scanned. The optical system 86 is capable of achieving an ultra-wide-angle field of view (FOV) that extends to the fundus peripheral portion, enabling a fundus region to be imaged over a wide range. More specifically, a fundus region can be imaged over a wide range corresponding to an external light illumination angle of approximately 120 degrees from outside the examined eye 12 (an internal light illumination angle of approximately 200 degrees, taking an eyeball center O of the examined eye 12 as the reference position, capable of being imaged in practice by illuminating the fundus of the examined eye 12 with scanning light). The optical system 86 may be configured employing plural lens sets instead of a slit mirror and elliptical mirror. The X-direction scanning device 82 and the Y-direction scanning device 84 may each also be a scanning device employing a two-dimensional scanner configured by MEMS mirrors.
In cases in which a system including a slit mirror and an elliptical mirror is used as the optical system 86, the configuration may employ the systems using an elliptical mirror described in International Applications PCT/JP2014/084619 or PCT/JP2014/084630. The respective disclosures of International Application PCT/JP2014/084619 (International Publication WO2016/103484) filed on Dec. 26, 2014 and International Application PCT/JP2014/084630 (International Publication WO2016/103489) filed on Dec. 26, 2014 are incorporated by reference herein in their entireties.
Note that when the ophthalmic device 110 is installed on a horizontal plane, the “X direction” corresponds to a horizontal direction and the “Y direction” corresponds to a direction perpendicular to the horizontal plane. A direction joining the center of the pupil of the anterior eye portion of the examined eye 12 and the center of the eyeball is referred to as the “Z direction”. The X direction, the Y direction, and the Z direction are accordingly perpendicular to one another.
A color fundus image is obtained by imaging the fundus of the examined eye 12 simultaneously with G-light and R-light. More specifically, the control unit 20 controls the light sources 42, 44 such that the light sources 42, 44 emit light at the same time, and scans the G-light and R-light across the fundus of the examined eye 12 using the wide-angle optical system 80. G-light reflected from the fundus of the examined eye 12 is detected by the G-light detection element 72, and image data of a second fundus image (a G fundus image) is generated by the CPU 22 of the ophthalmic device 110. Similarly, R-light reflected from the fundus of the examined eye 12 is detected by the R-light detection element 74, and image data of a first fundus image (R fundus image) is generated by the CPU 22 of the ophthalmic device 110. In cases in which IR-light is illuminated, IR-light reflected from the fundus of the examined eye 12 is detected by the IR-light detection element 76, and image data of an IR fundus image is generated by the CPU 22 of the ophthalmic device 110.
The eye has a structure in which the vitreous body is covered by plural layers, each having a different structure. These plural layers include the retina, the choroid, and the sclera, in sequence from the side closest to the vitreous body outward. R-light passes through the retina and travels as far as the choroid. Accordingly, the first fundus image (R fundus image) includes information relating to blood vessels present in the retina (retinal blood vessels) and information relating to blood vessels present in the choroid (choroidal blood vessels). By contrast, G-light only travels as far as the retina. Accordingly, the second fundus image (G fundus image) includes information relating to the blood vessels present in the retina (retinal blood vessels).
The CPU 22 of the ophthalmic device 110 mixes the first fundus image (R fundus image) and the second fundus image (G fundus image) together at a specific ratio, and displays the resulting color fundus image on the display 32. Note that a configuration may be adopted in which instead of the color fundus image, the first fundus image (R fundus image), the second fundus image (G fundus image), or an IR fundus image is displayed.
Image data of the first fundus image (R fundus image), image data of the second fundus image (G fundus image), and image data of the IR fundus image is sent from the ophthalmic device 110 to the management server 140 through a communication IF 166, and stored in a memory 164, described later.
The fundus of the examined eye 12 is accordingly imaged by the G-light and R-light at the same time, and so each position on the first fundus image (R fundus image) and the corresponding position on the second fundus image (G fundus image) represent the same position on the fundus.
The eye axial length measuring instrument 120 in
As one item of data about a patient, the eye axial length is saved in the memory 164 as patient information in the management server 140, and is also utilized in fundus image analysis.
Next, a configuration of the management server 140 will be described with reference to
The configuration of the image viewer 150 is similar to that of the management server 140, and so description thereof is omitted.
Next, with reference to
Next, with reference to
The image processing program is executed by the management server 140 when generating a choroidal vascular image based on the image data of the fundus images imaged by the ophthalmic device 110.
A choroidal vascular image is generated in the following manner. The image processing section 182 of the management server 140 subjects the second fundus image (G fundus image) to black hat filter processing so as to extract the retinal blood vessels from the second fundus image (G fundus image). Next, the image processing section 182 removes the retinal blood vessels from the first fundus image (R fundus image) by performing in-painting processing employing the retinal blood vessels extracted from the second fundus image (G fundus image). Namely, processing is performed that uses position information relating to the retinal blood vessels extracted from the second fundus image (G fundus image) to infill the retinal blood vessel structure in the first fundus image (R fundus image) with the same values as those of the surrounding pixels. The image processing section 182 then subjects the image data of the first fundus image (R fundus image) from which the retinal blood vessels have been removed to contrast-limited adaptive histogram equalization, thereby emphasizing the choroidal blood vessels in the first fundus image (R fundus image). A choroidal vascular image as illustrated in
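The pipeline above (black-hat extraction of the retinal vessels from the G image, in-painting of those positions in the R image, then contrast-limited equalization) might be sketched as follows. This is a simplified illustration only: SciPy grey-scale morphology stands in for the black-hat filter, a local-median infill stands in for true in-painting, and a global histogram equalization stands in for CLAHE; every kernel size and threshold here is an assumed value, not one specified in the source.

```python
import numpy as np
from scipy import ndimage

def generate_choroidal_vascular_image(r_img, g_img):
    """Sketch of the choroidal vascular image generation steps.

    r_img, g_img: 8-bit grayscale R and G fundus images of the same size.
    """
    # 1. Black-hat filtering on the G image highlights the dark retinal vessels.
    blackhat = ndimage.black_tophat(g_img.astype(float), size=15)

    # 2. Threshold the black-hat response into a binary retinal-vessel mask.
    mask = blackhat > 10

    # 3. Crude stand-in for in-painting: replace masked pixels in the R image
    #    with a local median so they match the surrounding values.
    filled = r_img.astype(float)
    filled[mask] = ndimage.median_filter(filled, size=15)[mask]

    # 4. Stand-in for CLAHE: global histogram equalization to emphasize the
    #    remaining choroidal vessels (a real pipeline would use tiled,
    #    clip-limited equalization).
    hist, _ = np.histogram(filled, bins=256, range=(0, 255))
    cdf = hist.cumsum() / hist.sum()
    return (cdf[filled.astype(np.uint8)] * 255).astype(np.uint8)
```

The returned image has the retinal-vessel positions infilled and its contrast stretched, leaving the choroidal vasculature as the dominant structure.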
Moreover, although the choroidal vascular image is generated here from the first fundus image (R fundus image) and the second fundus image (G fundus image), the image processing section 182 may instead generate a choroidal vascular image employing the first fundus image (R fundus image) or the IR fundus image imaged with IR-light. Regarding the method used to generate the choroidal fundus image, the disclosure of Japanese Patent Application No. 2018-052246, filed on Mar. 20, 2018, is incorporated in its entirety by reference herein.
When the image processing program is started, at step 202 of
At step 204, the image processing section 182 detects the optic nerve head ONH (see also
At step 206, the image processing section 182 detects the macular M (see also
At step 208, the image processing section 182 reads the coordinates of the macular M and the coordinates of the optic nerve head ONH computed from the G fundus image as illustrated in
At step 210, the image processing section 182 analyzes the blood vessel running directions of the choroidal blood vessels; at step 212 the image processing section 182 analyzes the symmetry of the blood vessel running directions of the choroidal blood vessels; and at step 214 the image processing section 182 saves the analysis results in the memory 164.
The processing of steps 210 and 212 is described in detail later.
The blood vessel running direction analysis processing of step 210 will next be described, with reference to
As illustrated in
The image processing section 182 arranges analysis points 240KU in the first region 274 so as to be positioned in a grid pattern with uniform spacings and M (natural number) rows in the up-down direction, and N (natural number) columns in the left-right (horizontal) direction. In
The image processing section 182 arranges analysis points 240KD in the second region 272 at positions having line symmetry with reference to the straight line LIN to the analysis points 240KU arranged in the first region 274.
Note that there is no limitation to positioning in a grid pattern of uniform spacing as long as the analysis points 240KU, 240KD are positioned in the first region 274 and the second region 272 at positions having line symmetry with reference to the straight line LIN, and configurations without a uniform spacing, or without a grid pattern, may also be adopted.
The size of the first region 274 and the second region 272 may be changed according to the eye axial length. L, M, and N may also be set to various values without limitation to those of the example described above. Increasing the number thereof increases the resolution.
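The grid arrangement described above, namely M rows by N columns of analysis points in the first region mirrored across the straight line LIN into the second region, might be sketched as follows. All parameter names and values are illustrative assumptions; the source fixes none of them.

```python
def place_analysis_points(lin_y, x0, x_spacing, y_spacing, m_rows, n_cols):
    """Place an M x N grid of analysis points above a horizontal line at
    y = lin_y, and mirror each point across that line.

    Returns (upper_points, lower_points) as lists of (x, y) tuples; the
    k-th entries of the two lists form a line-symmetric pair.
    Image coordinates are assumed (smaller y is higher in the image).
    """
    upper, lower = [], []
    for i in range(m_rows):
        for j in range(n_cols):
            x = x0 + j * x_spacing
            y = lin_y - (i + 1) * y_spacing   # point above the line LIN
            upper.append((x, y))
            lower.append((x, 2 * lin_y - y))  # mirror image below LIN
    return upper, lower
```

With M = 3 and N = 4 this yields 12 points per region, i.e. 12 line-symmetric pairs; increasing M and N raises the resolution as described above.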
At step 224, the image processing section 182 computes the blood vessel running direction of the choroidal blood vessels at each of the analysis points. Specifically, the image processing section 182 repeats the processing described below for all the analysis points. Namely, as illustrated in
The region 244 is illustrated in
The image processing section 182 then calculates the brightness gradient direction for each of the pixels in the cell 244 (expressed as an angle from 0° up to but not including 180°, with 0° defined as the direction of the straight line LIN (horizontal line)) based on the brightness values of the pixels surrounding the pixel being calculated. The gradient direction calculation is performed for all of the pixels in the cell 244.
Next, in order to create a histogram 242H with nine bins (each bin width being 20°) of gradient directions 0°, 20°, 40°, 60°, 80°, 100°, 120°, 140°, and 160° with reference to an angle reference line, the image processing section 182 counts the number of pixels inside the cell 244 with a gradient direction corresponding to each of the bins. The angle reference line is the straight line LIN. The width of a single bin in the histogram is 20°, and the number (count value) of pixels in the cell 244 having a gradient direction of from 0° up to but not including 10°, or a gradient direction of from 170° up to but not including 180°, is set for the 0° bin. The number (count value) of pixels in the cell 244 having a gradient direction of from 10° up to but not including 30° is set for the 20° bin. The count values for the 40°, 60°, 80°, 100°, 120°, 140°, and 160° bins are set in a similar manner. Due to there being nine bins in the histogram 242H, the blood vessel running direction at the analysis point 242 is defined as being in one of nine direction types. Note that the resolution of the blood vessel running direction can be raised by narrowing the width of each bin and increasing the number of bins.
The count values of each of the bins (the vertical axis in the histogram 242H) are normalized, and the histogram 242H is created for the analysis point 242 illustrated in
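The histogram construction described above — gradient directions folded into [0°, 180°), nine 20°-wide bins with the 0° bin collecting [0°, 10°) and [170°, 180°), normalized counts, and the running direction taken as the bin with the smallest count — might be sketched as follows. The use of central differences for the brightness gradient is an assumption; the source does not specify the gradient operator.

```python
import numpy as np

def running_direction(cell):
    """Estimate the blood vessel running direction for one analysis point.

    cell: 2-D array of brightness values centred on the analysis point.
    Returns (direction_deg, histogram), where histogram is the normalized
    9-bin gradient-direction histogram.
    """
    # Brightness gradients from the surrounding pixels (central differences).
    gy, gx = np.gradient(cell.astype(float))

    # Gradient direction expressed as an angle folded into [0°, 180°),
    # with 0° along the horizontal reference line.
    angles = np.degrees(np.arctan2(gy, gx)) % 180.0

    # Bin assignment: the 0° bin collects [0°, 10°) and [170°, 180°),
    # the 20° bin collects [10°, 30°), and so on (nine 20°-wide bins).
    bins = np.floor((angles + 10.0) / 20.0).astype(int) % 9

    hist = np.bincount(bins.ravel(), minlength=9).astype(float)
    hist /= hist.sum()  # normalize the count values

    # Gradients are weakest along a vessel, so the direction whose bin has
    # the smallest count is taken as the running direction.
    return 20 * int(np.argmin(hist)), hist
```

For a cell whose brightness varies only horizontally, all gradients fall in the 0° bin, so that bin dominates the histogram.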
Next, the image processing section 182 identifies the blood vessel running direction at each analysis point from the histogram 242H. Specifically, the bin with the angle having the smallest count value, the 60° bin in the example illustrated in
A histogram 246H is similarly created for a cell 248 set for the analysis point 246. The 160° bin is identified as the bin having the smallest count from out of the bins of histogram 246H. The blood vessel running direction of the analysis point 246 is accordingly identified as 160°.
The histogram 242H and the histogram 246H are examples of a “first histogram” and a “second histogram” of the technology disclosed herein.
By performing the processing described above for all of the analysis points in both the first region and the second region, the blood vessel running directions are identified for each of the analysis points set in the choroidal vascular image. Namely, histograms are derived for each of the analysis points, as illustrated in
At step 226, the image processing section 182 saves the following data. Namely, the image processing section 182 saves in the memory 164 the position of the macular M, the position of the optic nerve head ONH, the rotation angle through which the choroidal vascular image is rotated to make the straight line LIN horizontal, the positions (XY coordinates) of each of the analysis points (L analysis points), pairing information between analysis points that have line symmetry with reference to the straight line LIN (pairs of numbers of respective analysis points in the first and second regions), the blood vessel running direction at each of the analysis points, and the histogram at each of the analysis points.
Next, with reference to
At step 234, the image processing section 182 computes an asymmetric indicator value for each pair of analysis points having line symmetry with reference to the straight line LIN. The asymmetric indicator value is a difference in blood vessel running directions, found from the histograms of the two analysis points of the respective pair. A difference Δh in counts is found between each corresponding bin of the pair of histograms, and each Δh is squared. The sum of the squared differences over all the bins is then found by calculating ΣΔh². The larger the value of ΣΔh², the greater the difference in shape between the histograms and the larger the asymmetry therebetween; the smaller the value, the closer the shapes of the histograms and the smaller the asymmetry.
The histograms at each of the analysis points in the respective pairs are examples of a “first blood vessel running direction” and a “second blood vessel running direction” of the technology disclosed herein.
Note that the asymmetric indicator value is not limited to such a sum of squared differences between the histograms of the analysis points in the respective pairs. For example, a representative angle may be determined from the histogram at each analysis point of the respective pair, and the absolute value of the difference between these representative angles computed as the indicator.
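The ΣΔh² indicator and the threshold test of the following step can be sketched as below; the function names are illustrative.

```python
import numpy as np

def asymmetry_indicator(hist_u, hist_d):
    """Sum of squared per-bin count differences, ΣΔh², between the two
    normalized histograms of a line-symmetric pair of analysis points."""
    dh = np.asarray(hist_u, dtype=float) - np.asarray(hist_d, dtype=float)
    return float(np.sum(dh ** 2))

def is_asymmetric(hist_u, hist_d, threshold):
    """A pair is flagged asymmetric when ΣΔh² is the threshold or greater.
    The threshold is a preset fixed value, or the overall average of the
    indicator values for all pairs."""
    return asymmetry_indicator(hist_u, hist_d) >= threshold
```

Identical histograms give ΣΔh² = 0, and the indicator grows as the two histogram shapes diverge, matching the interpretation described above.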
At step 236, the image processing section 182 detects pairs of asymmetric analysis points. Specifically, the image processing section 182 detects, as pairs of asymmetric analysis points, any pairs for which the asymmetric indicator value is a threshold or greater. The threshold is a fixed value set in advance, or is an overall average value of the asymmetric indicator values for the pairs.
At step 238, the image processing section 182 saves the following data in the memory 164. Namely, for each of the pairs, the image processing section 182 saves in the memory 164 the asymmetric indicator values, a flag indicating whether or not the asymmetric indicator value is the threshold or greater (asymmetric or not), and the angle of the blood vessel running directions at the analysis points of the respective pairs.
Next description follows regarding a display screen in a choroidal vascular analysis mode. The memory 164 of the management server 140 holds data to create a display screen for the following choroidal vascular analysis mode, or contents data for display on this display screen.
This is specifically the following data. Image data for the fundus images (the first fundus image (R fundus image) and the second fundus image (G fundus image)) is transmitted from the ophthalmic device 110 to the management server 140, and the management server 140 holds the image data for the fundus images (the first fundus image (R fundus image) and the second fundus image (G fundus image)). The management server 140 also holds the image data of the choroidal vascular image (see
Moreover, personal information about a patient is also input to the ophthalmic device 110 when the fundus of the patient is being imaged. The personal information includes an ID, name, age, visual acuity, and the like of the patient. Moreover, information indicating whether the eye whose fundus is imaged is either the right eye or the left eye is also input when the fundus of the patient is being imaged. Furthermore, the imaging date/time is also input when the fundus of the patient is being imaged. Data for the personal information, right eye/left eye information, and imaging date/time is transmitted from the ophthalmic device 110 to the management server 140. The management server 140 holds the data for the personal information, right eye/left eye information, and imaging date/time. The management server 140 also holds data for the eye axial length.
Thus as described above, the management server 140 holds data for creating the display screen for the above choroidal vascular analysis mode.
When diagnosing a patient, the ophthalmologist performs the diagnosis while looking at the display screen of the choroidal vascular analysis mode being displayed on the image viewer 150. When doing so, the ophthalmologist uses a non-illustrated menu screen of the image viewer 150 to transmit, to the management server 140, a request to display the choroidal vascular analysis mode screen. On receipt of this request, the display control section 184 of the management server 140 uses the contents data for the specified patient ID to create the choroidal vascular analysis mode display screen, and the processing section 186 transmits the image data for the display screen to the image viewer 150.
Note that the processing section 186 is an example of an “output section” of technology disclosed herein.
On receipt of the data for the choroidal vascular analysis mode display screen, the image viewer 150 displays the choroidal vascular analysis mode display screen 300 illustrated in
Explanation follows regarding the choroidal vascular analysis mode display screen 300 illustrated in
The personal information display field 302 includes a patient ID display field 304, a patient name display field 306, an age display field 308, an eye axial length display field 310, a visual acuity display field 312, and a patient selection icon 314. Various information is displayed in the patient ID display field 304, the patient name display field 306, the age display field 308, the eye axial length display field 310, and the visual acuity display field 312. Note that when the patient selection icon 314 is clicked, a list of patients is displayed on the display 172 of the image viewer 150, so as to let a user (ophthalmologist or the like) select the patient to be analyzed.
The image display field 320 includes imaging date display fields 322N1 to 322N3, a right eye information display field 324R, a left eye information display field 324L, an RG image display field 326, a choroidal vascular image display field 328, and an information display field 342. Note that the RG image is an image obtained by combining the first fundus image (R fundus image) and the second fundus image (G fundus image), with the magnitudes of the respective pixel values mixed at a specific ratio (for example, 1:1).
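Forming the RG image by mixing the R and G pixel values at a given ratio might be sketched as follows; assigning the two images to the red and green channels and leaving the blue channel empty is an assumption, not something the source fixes.

```python
import numpy as np

def make_rg_image(r_img, g_img, ratio=(1.0, 1.0)):
    """Combine 8-bit R and G fundus images into an RGB image, weighting the
    pixel magnitudes by the mixing ratio (1:1 in the example above)."""
    wr, wg = ratio
    rgb = np.zeros(r_img.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = np.clip(r_img.astype(float) * wr, 0, 255).astype(np.uint8)
    rgb[..., 1] = np.clip(g_img.astype(float) * wg, 0, 255).astype(np.uint8)
    return rgb  # blue channel left at zero
```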
The choroid analysis tool display field 330 is a field displaying various icons to select plural choroid analysis tools. These include a vortex vein position icon 332, a symmetry icon 334, a blood vessel diameter icon 336, a vortex vein to macular/optic nerve head icon 338, and a choroid analysis report icon 340.
The vortex vein position icon 332 instructs display of vortex vein positions. The symmetry icon 334 instructs display of analysis point symmetry. The blood vessel diameter icon 336 instructs display of analysis results related to the diameters of the choroidal blood vessels. The vortex vein to macular/optic nerve head icon 338 instructs display of analysis results for the positions of the vortex veins relative to the macular and the optic nerve head. The choroid analysis report icon 340 instructs display of a choroid analysis report.
Icons and buttons for instructing image generation, described later, are displayed on the display screen, also described later, of the image viewer 150. When the user of the image viewer 150 (an ophthalmologist or the like) clicks on one of the icons etc., an instruction signal corresponding to the clicked icon etc. is transmitted from the image viewer 150 to the management server 140. On receipt of the instruction signal from the image viewer 150, the management server 140 generates an image corresponding to the instruction signal, and transmits image data of the generated image to the image viewer 150. The image viewer 150 that has received the image data from the management server 140 then displays an image based on the received image data on a display 172. Display screen generation processing is performed in the management server 140 by the CPU 162 executing a display screen generation program.
When the symmetry icon 334 is clicked in the choroid analysis tool display field 330 of
The asymmetric histogram display icon 346 and the asymmetric color display icon 348 are provided in the image display field 320 of the display screen of
When the asymmetric histogram display icon 346 is clicked in the image display field 320 of the display screen of
Moreover, instead of the RG image display field 326, the image viewer 150 displays the histograms of each of the analysis points in the histogram display field 350, and emphasizes the display of pairs of histograms for which the asymmetric indicator value (ΣΔh²) is the specific value or greater. The analysis point numbers of asymmetric pairs are also displayed in the information display field 342.
In
In the histogram display field 350, the image viewer 150 displays boxes of the same color for the same pair and boxes of different colors for other pairs, such that respective boxes surrounding the histogram U1H of the analysis point U1 and the histogram D1H of the analysis point D1 are both a first color (for example, red), and respective boxes surrounding the histogram U11H of the analysis point U11 and the histogram D11H of the analysis point D11 are both a second color (for example, orange). The color of the boxes surrounding the analysis points in the choroidal vascular image display field 328 and the color of the boxes surrounding the histograms in the histogram display field 350, are the same color for the same analysis point number, thereby raising the visibility thereof.
In the information display field 342, the image viewer 150 displays the fact that the analysis point U1 and the analysis point D1, and the analysis point U11 and the analysis point D11, are asymmetric pairs, specifically by displaying text of “U1 and D1 are asymmetric”, and “U11 and D11 are asymmetric”.
A configuration may be adopted in which, when the asymmetric color display icon 348 is clicked on the image display field 320 of the display screen of
First, predetermined colors are associated with the magnitudes of the asymmetric indicator value (ΣΔh²) and stored in the memory 164 of the image viewer 150. For example, darker colors are associated with asymmetric indicator values (ΣΔh²) of larger magnitude. Moreover, the rectangular regions in the color map display field 360 are determined according to the respective analysis point numbers and positions.
The image viewer 150 then displays each rectangular region corresponding to an analysis point in the predetermined color associated with the magnitude of the asymmetric indicator value (ΣΔh²) for that analysis point.
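The color association might be sketched as follows, assuming a simple white-to-red ramp in which deeper shades correspond to larger asymmetric indicator values; the particular ramp is an assumption, as the source only states that darker colors correspond to larger magnitudes.

```python
def indicator_to_color(value, max_value):
    """Map an asymmetric indicator value (ΣΔh²) to an (R, G, B) shade.

    value near 0 gives white; values at or above max_value give full red,
    so larger asymmetry produces a deeper (darker) color.
    """
    t = min(value / max_value, 1.0) if max_value > 0 else 0.0
    shade = int(255 * (1.0 - t))  # deeper red as asymmetry grows
    return (255, shade, shade)
```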
Moreover, in the color map display field 360, the image viewer 150 displays boxes of the same color around the rectangular regions corresponding to the same pair of asymmetric analysis points, and boxes of a different color for other pairs. For example, boxes are displayed in a first color (for example, red) around the rectangular regions RU1, RD1 corresponding to the analysis points U1, D1, and boxes are displayed in a second color (for example, orange) around the rectangular regions RU11, RD11 corresponding to the analysis points U11, D11.
As described above, in the present exemplary embodiment, the choroidal vascular image is analyzed, the asymmetry of pairs of analysis points having line symmetry with reference to the straight line LIN joining the macula M and the optic nerve head is analyzed, and asymmetric pairs of analysis points are displayed with emphasis. This enables asymmetry in the running direction of the choroidal blood vessels to be ascertained. Furthermore, fundus diagnosis by an ophthalmologist can be supported by this visualization of asymmetry in the choroidal blood vessel running direction.
Moreover, by using an SLO unit employing a wide-angle optical system, UWF-SLO images can be obtained over an ultra-wide angle of 200° or greater, measured about the eyeball center. Employing such UWF-SLO images enables the asymmetry analysis to be performed over a wide range including the peripheral area of the fundus.
Next, description follows regarding various modified examples of the technology disclosed herein.
First Modified Example

In the exemplary embodiment described above, the choroidal vascular image is divided into the first region and the second region by the straight line LIN joining the macula and the optic nerve head, and the analysis points are disposed at positions in the first region and the second region having line symmetry with reference to the straight line LIN. However, the technology disclosed herein is not limited thereto. For example, the choroidal vascular image may be divided into a temporal region and a nasal region by a line (an orthogonal line) orthogonal to the straight line LIN and passing through the midpoint between the macula and the optic nerve head, with the analysis points arranged at positions having line symmetry with reference to the orthogonal line. Furthermore, the choroidal vascular image may be divided by a line (an intersecting line) passing through the midpoint between the macula and the optic nerve head and intersecting the straight line LIN at a specific angle, for example 45° or 135°, with the analysis points arranged at positions having line symmetry with reference to the intersecting line.
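For any of these reference lines, placing the paired analysis point reduces to reflecting a point across a line through a given origin at a given angle. The following geometric sketch illustrates this; the function name and the use of NumPy are illustrative assumptions, not from the disclosure.

```python
import numpy as np

def reflect_across_line(point, origin, angle_deg):
    """Reflect `point` across the line passing through `origin` at
    `angle_deg` (measured from the image x-axis). The result is the
    mirrored position where the paired analysis point would be set."""
    theta = np.deg2rad(angle_deg)
    d = np.array([np.cos(theta), np.sin(theta)])   # unit vector along the line
    v = np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)
    # Decompose v into components parallel and perpendicular to the line,
    # then flip the perpendicular component.
    v_par = np.dot(v, d) * d
    mirrored = 2.0 * v_par - v
    return np.asarray(origin, dtype=float) + mirrored
```

With `angle_deg=0` and the origin on the straight line LIN this reproduces the embodiment's upper/lower pairing; `angle_deg=90` through the macula-to-optic-nerve-head midpoint gives the temporal/nasal pairing of this modified example.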
Second Modified Example

In the exemplary embodiment described above, the blood vessel running direction of the choroidal blood vessels is computed at each of the analysis points. However, the technology disclosed herein is not limited thereto. For example, a three-dimensional position of each pixel of the choroidal vascular image may be identified, and the blood vessel running direction of the choroidal blood vessels computed as a direction in three-dimensional space. Optical coherence tomography (OCT) volume (3D) data obtained using a non-illustrated OCT unit provided to the ophthalmic device 110 may be employed to compute the three-dimensional positions and the directions in three-dimensional space.
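In the two-dimensional case, the disclosure derives each running direction from the brightness gradient of pixels in a region containing the analysis point. The sketch below is one plausible realization, not the embodiment's exact computation: it assumes a 9-bin orientation histogram for display and takes the vessel direction perpendicular to the dominant gradient orientation, which is estimated here via a circular mean of doubled angles (an illustrative choice for axial data).

```python
import numpy as np

def running_direction(patch, n_bins=9):
    """Estimate the dominant vessel running direction (degrees, 0-180)
    in a small image patch around an analysis point.

    The brightness gradient is strongest perpendicular to a vessel, so
    the running direction is taken 90 degrees from the dominant gradient
    orientation. Also returns a magnitude-weighted orientation histogram
    such as could be shown in a histogram display field."""
    patch = np.asarray(patch, dtype=float)
    gy, gx = np.gradient(patch)                 # brightness gradients
    mag = np.hypot(gx, gy)                      # gradient magnitude
    ang = np.arctan2(gy, gx)                    # gradient orientation (rad)
    # Orientation histogram over 0-180 degrees, weighted by magnitude.
    deg = np.degrees(ang) % 180.0
    hist, _ = np.histogram(deg, bins=n_bins, range=(0.0, 180.0), weights=mag)
    # Dominant gradient orientation via the circular mean of doubled
    # angles (angles 0 and 180 describe the same axis).
    s = np.sum(mag * np.sin(2.0 * ang))
    c = np.sum(mag * np.cos(2.0 * ang))
    dominant = 0.5 * np.degrees(np.arctan2(s, c)) % 180.0
    return (dominant + 90.0) % 180.0, hist
```

For a patch whose brightness increases downward (a purely vertical gradient), the estimated running direction is horizontal (0°), as expected for a vessel running left to right.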
Third Modified Example

Although in the exemplary embodiment described above the management server 140 executes the image processing program illustrated in
Fourth Modified Example

In the exemplary embodiment described above, an example has been described in which a fundus image is acquired by the ophthalmic device 110 with an internal light illumination angle of about 200 degrees. However, the technology disclosed herein is not limited thereto; it may be applied even when the fundus image has been captured by an ophthalmic device with an internal illumination angle of 100 degrees or less, and may also be applied to a montage image obtained by combining plural fundus images.
Fifth Modified Example

In the exemplary embodiment described above, the fundus image is captured by the ophthalmic device 110 equipped with an SLO imaging unit. However, the technology disclosed herein may also be applied to a fundus image captured using a fundus camera capable of imaging the choroidal blood vessels, and to images obtained by OCT angiography.
Sixth Modified Example

Although in the above exemplary embodiment the asymmetry is analyzed from the choroidal blood vessel running direction, the technology disclosed herein may also be applied to analyze asymmetry from the retinal blood vessel running direction.
Seventh Modified Example

In the exemplary embodiment described above, the management server 140 executes the image processing program. However, the technology disclosed herein is not limited thereto. For example, the image processing program may be executed by the ophthalmic device 110 or the image viewer 150.
Eighth Modified Example

The exemplary embodiment described above describes an example of the ophthalmic system 100 equipped with the ophthalmic device 110, the eye axial length measuring instrument 120, the management server 140, and the image viewer 150; however, the technology disclosed herein is not limited thereto. For example, as a first example, the eye axial length measuring instrument 120 may be omitted, and the ophthalmic device 110 may be configured so as to further include the functionality of the eye axial length measuring instrument 120. Moreover, as a second example, the ophthalmic device 110 may be configured so as to further include the functionality of one or both of the management server 140 and the image viewer 150. For example, the management server 140 may be omitted in cases in which the ophthalmic device 110 includes the functionality of the management server 140; in such cases, the image processing program is executed by the ophthalmic device 110 or the image viewer 150. Likewise, the image viewer 150 may be omitted in cases in which the ophthalmic device 110 includes the functionality of the image viewer 150. As a third example, the management server 140 may be omitted, and the image viewer 150 may be configured so as to execute the functionality of the management server 140.
Other Modified Examples

The data processing described in the exemplary embodiment described above is merely an example thereof. Obviously, unnecessary steps may be omitted, new steps may be added, and the sequence of processing may be changed within a range not departing from the spirit thereof.
Moreover, although in the exemplary embodiment described above an example has been given of a case in which data processing is implemented by a software configuration utilizing a computer, the technology disclosed herein is not limited thereto. For example, instead of a software configuration utilizing a computer, the data processing may be executed solely by a hardware configuration of field programmable gate arrays (FPGAs) or application specific integrated circuits (ASICs). Alternatively, a portion of processing in the data processing may be executed by a software configuration, and the remaining processing may be executed by a hardware configuration.
Claims
1.-28. (canceled)
29. An image processing method comprising:
- setting a first analysis point and a second analysis point on a fundus image so as to be symmetrical about a reference line;
- computing a direction of a first blood vessel at the first analysis point;
- computing a direction of a second blood vessel at the second analysis point; and
- comparing the first direction against the second direction.
30. The image processing method of claim 29, wherein comparison of the first direction against the second direction comprises analysis of asymmetry between a running direction of the first blood vessel and a running direction of the second blood vessel.
31. The image processing method of claim 29, wherein comparison of the first direction against the second direction is performed by quantifying asymmetry between a running direction of the first blood vessel and a running direction of the second blood vessel.
32. The image processing method of claim 29, wherein the fundus image is a choroidal vascular image.
33. The image processing method of claim 29, wherein the first direction and the second direction relate to running directions of the choroidal blood vessels.
34. The image processing method of claim 29, wherein the first direction is derived based on a brightness gradient of pixels in a first specific region containing the first analysis point, and the second direction is derived based on a brightness gradient of pixels in a second specific region containing the second analysis point.
35. The image processing method of claim 29, further comprising generating an analysis screen displaying a first indicator indicating the first direction, a second indicator indicating the second direction, the fundus image, and an analysis result of analyzing asymmetry between the first direction and the second direction.
36. The image processing method of claim 35, wherein in the analysis screen, the first indicator is overlaid on the first analysis point of the fundus image, and the second indicator is overlaid on the second analysis point of the fundus image.
37. The image processing method of claim 35, wherein:
- the first indicator is a first arrow corresponding to a running direction of the first blood vessel; and
- the second indicator is a second arrow corresponding to a running direction of the second blood vessel.
38. The image processing method of claim 35, wherein:
- the first indicator is a first histogram derived based on a brightness gradient of pixels in a first specific region containing the first analysis point; and
- the second indicator is a second histogram derived based on a brightness gradient of pixels in a second specific region containing the second analysis point.
39. The image processing method of claim 35, wherein:
- the first indicator is a first angle numerical value corresponding to a running direction of the first blood vessel; and
- the second indicator is a second angle numerical value corresponding to a running direction of the second blood vessel.
40. The image processing method of claim 29, wherein:
- the reference line is a straight line passing through a macular position and an optic nerve head position.
41. The image processing method of claim 29, wherein:
- a plurality of first analysis points is set in a first region in the fundus image, and a plurality of second analysis points is set in a second region in the fundus image, the first region and the second region being line symmetric about the reference line;
- the first direction of the first blood vessel is computed for each of the plurality of first analysis points, and the second direction of the second blood vessel is computed for each of the plurality of second analysis points; and
- a plurality of pairs of a first analysis point and a second analysis point that have line symmetry between the plurality of first analysis points and the plurality of second analysis points is defined, and a symmetry indicator indicating symmetry between the first direction and the second direction for each of the plurality of defined pairs is computed.
42. The image processing method of claim 41, further comprising extracting a pair in which the first direction and the second direction exhibit asymmetry based on the symmetry indicator indicating the symmetry of each of the plurality of defined pairs.
43. The image processing method of claim 42, further comprising generating an emphasized display image to emphasize display of the first analysis point and the second analysis point corresponding to the extracted pair exhibiting asymmetry.
44. The image processing method of claim 29, wherein the fundus image is obtained by a wide-angle optical system.
45. A non-transitory computer-readable medium storing an information processing program that causes a computer to:
- set a first analysis point and a second analysis point on a fundus image so as to be symmetrical about a reference line;
- compute a first direction of a first blood vessel at the first analysis point and compute a second direction of a second blood vessel at the second analysis point; and
- compare the first direction against the second direction.
46. An ophthalmic system comprising:
- an ophthalmic device configured to acquire a fundus image;
- a memory that stores a program that causes a processor to execute an image processing method; and
- a processor that executes the program stored in the memory to perform operations comprising:
- setting a first analysis point and a second analysis point on a fundus image so as to be symmetrical about a reference line;
- computing a first direction of a first blood vessel at the first analysis point and computing a second direction of a second blood vessel at the second analysis point; and
- comparing the first direction against the second direction.
Type: Application
Filed: Oct 15, 2020
Publication Date: Jan 28, 2021
Applicant: NIKON CORPORATION (Tokyo)
Inventor: Mariko HIROKAWA (Yokohama-shi)
Application Number: 17/071,495