INSPECTION SYSTEM AND INSPECTION METHOD
An inspection system includes: a light emitting device configured to illuminate a target object; a collimator lens arranged between the light emitting device and the target object; and an imaging device configured to image the target object. The light emitting device is capable of changing a light emission position. The inspection system further includes an image analysis unit configured to generate an analysis image in which a value of each pixel corresponds to a normal direction of a surface of the target object appearing in the pixel, by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other. As a result, time and effort in adjustment for inspection can be reduced, and defect detection accuracy is improved.
The present disclosure relates to an inspection system and an inspection method.
BACKGROUND ART
In a factory automation (FA) field and the like, it is known to image a target object having a glossy surface such as metal while illuminating the target object, and inspect an appearance of the object by using the obtained image.
As a method of illumination, parallel light coaxial illumination is known. With parallel light coaxial illumination, when a defect such as a fine flaw or gentle irregularities exists on the surface of the target object, a gradation distribution of luminance corresponding to the defect appears in the image. Therefore, the presence or absence of the defect can be inspected by checking this gradation distribution.
Utility Model Registration No. 3197766 (PTL 1) discloses a reflection phase shift method. In the reflection phase shift method, a target object is irradiated with slit light while the slit light is shifted by one cycle. Stripes appear in the captured images, and the luminance change differs depending on the presence or absence of a defect. Therefore, a defect can be detected by checking the luminance change. For example, a maximum value and a minimum value of the luminance over one cycle are obtained for each position on the imaged surface of the inspection object, and a defect on the surface of the inspection object is detected on the basis of a difference image between a maximum value image obtained by collecting the maximum values and a minimum value image obtained by collecting the minimum values at each position on the surface of the inspection object.
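As a reference illustration only (not taken from PTL 1), a minimal NumPy sketch of this max/min difference-image step could look as follows, assuming the frames captured over one cycle of the shifted slit light are stacked in a single array; the function and variable names are illustrative.

```python
import numpy as np

def phase_shift_difference_image(images: np.ndarray) -> np.ndarray:
    """images: (num_shifts, height, width), one frame per shift of the slit light."""
    max_image = images.max(axis=0)  # maximum luminance observed at each surface position
    min_image = images.min(axis=0)  # minimum luminance observed at each surface position
    return max_image - min_image    # a defect changes this max-min difference locally
```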
Patent No. 5866586 (PTL 2) discloses an inspection illumination apparatus including a filter means that forms a plurality of solid angle regions having different optical attributes as irradiation solid angles of light applied to each point of a target object. By using this inspection illumination apparatus, a shape, a size, and an inclination of the irradiation solid angle of light and the solid angle region having a specific optical attribute within the irradiation solid angle can be set substantially uniformly in the entire visual field. As a result, even a slight defect or the like can be detected under substantially the same detection condition.
CITATION LIST
Patent Literature
- PTL 1: Utility Model Registration No. 3197766
- PTL 2: Patent No. 5866586
In a case of using the parallel light coaxial illumination, it is necessary to install a target object such that an optical axis of parallel light is parallel to a normal direction of a surface of the target object. Therefore, it takes time and effort to adjust installation of the target object.
In a case of using the reflection phase shift method described in PTL 1, when a diffuse reflectance of the surface of the target object is large, a contrast of observed stripes decreases, and defect detection accuracy is deteriorated.
In a case of using the inspection illumination apparatus described in PTL 2, it takes time and effort to adjust the filter means according to reflection characteristics of a surface of a target object.
The present disclosure has been made in view of the above problems, and an object thereof is to provide an inspection system and an inspection method that can reduce time and effort in adjustment for inspection and have high defect detection accuracy.
Solution to Problem
According to an example of the present disclosure, an inspection system for inspecting a surface of a target object includes a light emitting device configured to illuminate the target object, a collimator lens arranged between the light emitting device and the target object, and an imaging device configured to image the target object. The light emitting device is capable of changing a light emission position. The inspection system further includes an image analysis unit configured to generate a first analysis image in which a value of each pixel corresponds to a normal direction of a surface of the target object appearing in the pixel, by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other.
According to the disclosure described above, in the first analysis image, a pixel in which a defect that changes the normal direction of the surface, such as irregularities or a flaw, appears has a value different from the values of the other pixels. Therefore, the defect can be accurately detected by checking the first analysis image.
Furthermore, a relative positional relationship between the target object and the imaging device is only required to be set such that the luminance shows a peak at any of the plurality of light emission positions. Therefore, it does not take time and effort to adjust the installation of the target object as in PTL 1. Furthermore, since the filter means as described in PTL 2 is not provided, it is not necessary to take time and effort to adjust the filter means.
As described above, according to the inspection system having the above configuration, time and effort in adjustment for inspection can be reduced, and defect detection accuracy can be enhanced.
In the disclosure described above, a value of each pixel of the first analysis image indicates a phase of a waveform indicating a relationship between a luminance of the pixel and a light emission position in a plurality of captured images.
The phase of the waveform indicating the relationship between a luminance of the pixel and the light emission position depends on the normal direction of the surface of the target object appearing in the pixel. Therefore, according to the disclosure described above, the first analysis image can accurately represent a distribution of the normal direction of the surface of the target object.
In the disclosure described above, a value of each pixel of the first analysis image indicates a light emission position at which a luminance peaks in the waveform indicating the relationship between the luminance of the pixel and the light emission position in the plurality of captured images.
The light emission position at which a luminance peaks depends on the normal direction of the surface of the target object. Therefore, according to the disclosure described above, the first analysis image can accurately represent a distribution of the normal direction of the surface of the target object.
In the disclosure described above, the image analysis unit further generates a second analysis image by analyzing a plurality of captured images. A value of each pixel of the second analysis image is an amplitude of a waveform indicating a relationship between a luminance of the pixel and a light emission position in the plurality of captured images.
According to the disclosure described above, by checking the second analysis image, it is possible to accurately detect a stain of a substance that reduces a degree of regular reflection of light or a fine flaw on a surface of a target object made of glass.
In the disclosure described above, the light emitting device includes a plurality of light sources, and sequentially switches a light source to emit light among the plurality of light sources. The plurality of light sources are arranged on a plane perpendicular to an optical axis of the collimator lens or on a spherical surface centered on the optical axis. According to the disclosure described above, the light emitting device can easily change the light emission position.
In the disclosure described above, the image analysis unit further generates a composite image obtained by combining a plurality of captured images.
According to the disclosure described above, by checking the composite image, it is possible to accurately detect a defect such as low-contrast unevenness causing diffuse reflection, without being affected by irregularities on the surface of the target object.
In the disclosure described above, a distance between the light emission position and the collimator lens is greater than or equal to a focal distance of the collimator lens.
According to the disclosure described above, in a case where the distance between the light emission position and the collimator lens is the focal distance of the collimator lens, an irradiation condition on the surface of the target object can be made uniform. In a case where the distance between the light emission position and the collimator lens is longer than the focal distance of the collimator lens, a degree of freedom of an installation location of the imaging device increases.
In the disclosure described above, the inspection system further includes a half mirror arranged between the collimator lens and the target object. Light emitted from the light emitting device and transmitted through the collimator lens is reflected by the half mirror to illuminate the target object. Light reflected by the target object is transmitted through the half mirror to be incident on the imaging device.
According to the disclosure described above, the imaging device can more easily image the target object irradiated with light from the light emitting device.
According to an example of the present disclosure, an inspection system for inspecting a surface of a target object includes a light emitting device configured to illuminate the target object, a collimator lens arranged between the light emitting device and the target object, and an imaging device configured to image the target object. The light emitting device is capable of changing a light emission position. The inspection system further includes an image analysis unit configured to generate an analysis image by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other. A value of each pixel of the analysis image is a feature quantity of a waveform indicating a relationship between a luminance of the pixel and a light emission position in the plurality of captured images.
According to the disclosure described above, a value of each pixel of the analysis image is a feature quantity of a waveform indicating a relationship between a luminance of the pixel and a light emission position in the plurality of captured images. The waveform indicating the relationship between a luminance of the pixel and the light emission position changes according to a normal direction of a surface of the target object appearing in the pixel and a degree of regular reflection of light on the surface. Therefore, a distribution of the normal direction of the surface of the target object or a distribution of the degree of regular reflection of light on the surface of the target object can be grasped by checking the analysis image in which the feature quantity of the waveform is the value of the pixel. As a result, it is possible to accurately detect defects such as irregularities and flaws that affect the normal direction of the surface, or defects such as stains and flaws that affect the degree of regular reflection of light.
Furthermore, a relative positional relationship between the target object and the imaging device is only required to be set such that the luminance shows a peak at any of the plurality of light emission positions. Therefore, it does not take time and effort to adjust the installation of the target object as in PTL 1. Furthermore, since the filter means as described in PTL 2 is not provided, it is not necessary to take time and effort to adjust the filter means.
As described above, according to the inspection system having the above configuration, time and effort in adjustment for inspection can be reduced, and defect detection accuracy can be enhanced.
According to an example of the present disclosure, an inspection method for inspecting a surface of a target object includes imaging the target object while irradiating the target object with light emitted from a light emitting device and transmitted through a collimator lens. The imaging includes switching light emission positions in the light emitting device, and acquiring a plurality of captured images from a plurality of times of imaging where the light emission positions are different from each other. The inspection method further includes generating an analysis image in which a value of each pixel corresponds to a normal direction of a surface of the target object appearing in the pixel by analyzing the plurality of captured images.
According to this disclosure as well, it is possible to reduce time and effort in adjustment for inspection and to improve defect detection accuracy.
Advantageous Effects of Invention
According to the present disclosure, time and effort in adjustment for inspection can be reduced, and the defect detection accuracy can be enhanced.
Embodiments of the present invention will be described in detail with reference to the drawings. Note that the same or corresponding parts in the drawings are denoted by the same reference numerals, and the description thereof will not be repeated.
§1 Application Example
With reference to
As illustrated in
Light emitting device 10 is a device for illuminating target object 2. Light emitting device 10 is capable of changing a light emission position. In the example illustrated in
Collimator lens 12 is arranged between light emitting device 10 and target object 2, on an optical path. In the example illustrated in
Half mirror 14 is arranged between collimator lens 12 and target object 2, on the optical path. As illustrated in
As an example, imaging device 16 includes an imaging element partitioned into a plurality of pixels, such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor, in addition to an optical system such as a lens. Imaging device 16 is arranged on a side opposite to target object 2 with respect to half mirror 14 such that inspection target region 3 of target object 2 is included in a visual field. Specifically, imaging device 16 is arranged such that an optical axis 84 of imaging device 16 is orthogonal to optical axis 80 of collimator lens 12, and an angle formed by optical axis 84 of imaging device 16 and half mirror 14 is 45 degrees. This allows light reflected by target object 2 to be transmitted through half mirror 14 and incident on imaging device 16. Imaging device 16 outputs image data (hereinafter, referred to as a “captured image”) obtained by imaging, to image analysis unit 18.
Image analysis unit 18 analyzes a plurality of captured images obtained individually from a plurality of times of imaging where light emission positions are different from each other.
Light emitted from one light emission position in light emitting device 10 passes through collimator lens 12, is then reflected by half mirror 14, and illuminates inspection target region 3 of target object 2. Since collimator lens 12 is arranged between the light emission position and target object 2, an irradiation condition (a light amount, an irradiation angle, an irradiation solid angle, and the like) of each point in inspection target region 3 of target object 2 is uniform.
When the light emission position changes, the irradiation angle of light to each point of inspection target region 3 changes. In the example illustrated in
When target object 2 is an object having a glossy surface such as metal or glass, most of light applied to inspection target region 3 is regularly reflected. An amount of light (a regular reflection component) that is regularly reflected at each point of inspection target region 3 to be incident on imaging device 16 depends on the irradiation condition at the point and the normal direction of the point. Therefore, when the irradiation angle of light to each point of inspection target region 3 changes, the regular reflection component incident on imaging device 16 also changes. However, as described above, the irradiation condition of each point of inspection target region 3 is uniform. Therefore, at each point of inspection target region 3, a light emission position where the regular reflection component incident on imaging device 16 is maximized depends on the normal direction of the point. Therefore, in order to check a distribution of the normal direction of each point of inspection target region 3, image analysis unit 18 analyzes a change in luminance of each pixel for a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other.
For example, in a case where inspection target region 3 is completely flat, that is, in a case where the normal direction of each point of inspection target region 3 is constant, a change in luminance of each pixel is uniform for the plurality of captured images. In contrast, when a defect such as irregularities or a flaw exists in a part of inspection target region 3, a normal direction of the defect is different from a normal direction of a part other than the defect. Therefore, in the plurality of captured images, a change in luminance of a pixel in which a defect appears is different from a change in luminance of a pixel in which a portion other than the defect appears.
As described above, by analyzing a change in luminance of each pixel, a normal direction of the portion appearing in the pixel can be specified. Therefore, image analysis unit 18 generates an analysis image (hereinafter, referred to as a “normal line image 20”) in which a value of each pixel corresponds to a normal direction of a surface of target object 2 appearing in the pixel, by analyzing the plurality of captured images. In normal line image 20 illustrated in
A relative positional relationship between target object 2 and imaging device 16 is only required to be set such that the luminance shows a peak at any of the plurality of light emission positions. Therefore, unlike the case of using the parallel light coaxial illumination, it is not necessary to take time and effort to adjust installation of target object 2.
Further, inspection system 1 according to the present embodiment does not include a filter means that forms a plurality of solid angle regions having different optical attributes as an irradiation solid angle of light applied to each point of the target object, as described in PTL 2. Therefore, it is not necessary to take time and effort to adjust the filter means.
As described above, according to inspection system 1 of the present embodiment, time and effort in adjustment for inspection can be reduced, and defect detection accuracy can be enhanced.
§2 Specific Example
A. Specific Example 1 of Inspection System
Light emitting device 10A includes a plurality of light sources 101. Light source 101 may be a point light source or a surface light source.
The plurality of light sources 101 are arranged on virtual plane 82 perpendicular to optical axis 80 of collimator lens 12. A distance L between virtual plane 82 and collimator lens 12 coincides with focal distance f of collimator lens 12.
In the example illustrated in
Light emitting device 10A changes the light emission position by sequentially switching light source 101 to emit light among the plurality of light sources 101.
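For illustration, the acquisition sequence of Specific Example 1 could be driven as in the following sketch, which collects one captured image per light emission position; the light-source and camera control callables are hypothetical placeholders, since no device interface is specified here.

```python
import numpy as np
from typing import Callable

def acquire_images(num_positions: int,
                   set_emission_position: Callable[[int], None],
                   capture_image: Callable[[], np.ndarray]) -> np.ndarray:
    """Return a (num_positions, height, width) stack, one frame per emission position."""
    frames = []
    for n in range(num_positions):
        set_emission_position(n)        # e.g., light only light source n on virtual plane 82
        frames.append(capture_image())  # imaging device 16 images inspection target region 3
    return np.stack(frames, axis=0)     # per-pixel waveforms over the emission positions
```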
B. Specific Example 2 of Inspection System
Light emitting device 10B includes one light source 102 and an XY stage 103 that moves light source 102 in the X direction and the Y direction on virtual plane 82 perpendicular to optical axis 80 of collimator lens 12. Light source 102 may be a point light source or a surface light source.
Light emitting device 10B changes the light emission position by moving XY stage 103.
C. Hardware Configuration of Image Analysis Unit
Image analysis unit 18 typically has a structure according to a general-purpose computer architecture, and realizes various types of processing by a processor executing a program installed in advance.
Processor 180 exchanges programs (codes) and the like with system controller 183, and executes them in a predetermined order to implement the intended arithmetic processing.
System controller 183 is connected to each of processor 180, RAM 181, display controller 182, and I/O controller 184 via a bus, exchanges data with each part, and manages the entire processing of image analysis unit 18.
RAM 181 is typically a volatile storage device such as a dynamic random access memory (DRAM), and holds a program read from hard disk 185, a captured image received from imaging device 16, a processing result for the captured image, work data, and the like.
Display controller 182 is connected to a display device 5, and outputs a signal for displaying various types of information to display device 5 according to an internal command from system controller 183. As an example, display device 5 includes a liquid crystal display, an organic electroluminescence (EL) display, or the like.
I/O controller 184 controls data exchange with a recording medium or an external device connected to image analysis unit 18. More specifically, I/O controller 184 is connected to hard disk 185, camera interface 186, input interface 187, communication interface 189, and memory card interface 190.
Hard disk 185 is typically a nonvolatile magnetic storage device, and stores an analysis program 191 and the like to be executed by processor 180. Analysis program 191 to be installed in this hard disk 185 is distributed in a state of being stored in a memory card 6 or the like. Further, hard disk 185 stores captured images. Note that, instead of hard disk 185, a semiconductor storage device such as a flash memory or an optical storage device such as a digital versatile disk random access memory (DVD-RAM) may be adopted.
Camera interface 186 corresponds to an input unit that receives a captured image generated by capturing an image of target object 2, and mediates data transmission between processor 180 and imaging device 16. More specifically, an imaging instruction is outputted from processor 180 to imaging device 16 via camera interface 186. As a result, imaging device 16 images a subject, and outputs a generated captured image to processor 180 via camera interface 186.
Input interface 187 mediates data transmission between processor 180 and an input device 7 such as a keyboard, a mouse, a touch panel, or a dedicated console. That is, input interface 187 receives an operation command given by a user operating input device 7.
Communication interface 189 mediates data transmission between processor 180 and another personal computer (not illustrated), a server device, or the like. Communication interface 189 typically includes Ethernet (registered trademark), a universal serial bus (USB), or the like. Note that, as will be described later, instead of a form of installing a program stored in memory card 6 into image analysis unit 18, a program downloaded from a distribution server or the like may be installed into image analysis unit 18 via communication interface 189.
Memory card interface 190 mediates data transmission between processor 180 and memory card 6 which is a recording medium. That is, analysis program 191 and the like to be executed by image analysis unit 18 are distributed in a state of being stored in memory card 6, and memory card interface 190 reads analysis program 191 from memory card 6. In response to an internal command of processor 180, memory card interface 190 writes a captured image acquired by imaging device 16 and/or a processing result in image analysis unit 18 into memory card 6. Note that memory card 6 includes a general-purpose semiconductor storage device such as secure digital (SD), a magnetic recording medium such as a flexible disk, an optical recording medium such as a compact disk read only memory (CD-ROM), or the like.
In a case where a computer having a structure according to a general-purpose computer architecture as described above is used, an operating system (OS) for providing basic functions of the computer may be installed in addition to an application for providing functions according to the present embodiment. In this case, the program according to the present embodiment may execute processing by calling necessary modules, among the program modules provided as a part of the OS, in a predetermined order and/or at a predetermined timing. That is, in some cases, the program according to the present embodiment does not itself include the modules described above, and processing is executed in cooperation with the OS.
Furthermore, analysis program 191 according to the present embodiment may be provided by being incorporated in a part of another program. Also in this case, the program itself does not include the modules included in the other program with which it is combined, and processing is executed in cooperation with that other program. That is, analysis program 191 according to the present embodiment may take the form of being incorporated in such another program.
Note that, alternatively, some or all of functions provided by executing analysis program 191 may be implemented as a dedicated hardware circuit.
D. Image Analysis Method
With reference to
As illustrated in
As illustrated in
As described above, a phase of a waveform indicating a relationship between a luminance of each pixel and light emission position n depends on a normal direction of a surface of target object 2 appearing in the pixel. By using this point, image analysis unit 18 analyzes the selected seven captured images to generate a normal line image 20Y in which a value of each pixel corresponds to a normal direction of a surface of target object 2 appearing in the pixel.
Image analysis unit 18 performs discrete Fourier transform on the waveform indicating the relationship between a luminance of each pixel and light emission position n, to obtain a phase of a component of frequency 1. Specifically, for each pixel, image analysis unit 18 performs sine wave fitting using the following Equation (1) on the waveform indicating the relationship between a luminance of the pixel and light emission position n, and calculates a phase ϕ. In Equation (1), N represents the number of captured images. For example, in a case where the seven captured images surrounded by the solid line illustrated in
[Formula 1]
Image analysis unit 18 generates normal line image 20Y in which phase ϕ is a pixel value. Phase ϕ corresponds to a normal direction of a surface of target object 2 appearing in the pixel.
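Equation (1) itself is not reproduced in this text, so the following sketch assumes the standard phase of the frequency-1 discrete Fourier transform component, consistent with the description above; the function name and array layout are illustrative.

```python
import numpy as np

def normal_line_image(images: np.ndarray) -> np.ndarray:
    """images: (N, height, width) luminance stack; returns the per-pixel phase in radians."""
    N = images.shape[0]
    n = np.arange(N).reshape(N, 1, 1)
    sin_sum = np.sum(images * np.sin(2.0 * np.pi * n / N), axis=0)
    cos_sum = np.sum(images * np.cos(2.0 * np.pi * n / N), axis=0)
    # Phase of the frequency-1 component; it corresponds to the normal direction
    # of the surface of target object 2 appearing in each pixel.
    return np.arctan2(sin_sum, cos_sum)
```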
As illustrated in
As illustrated in
Note that, as a value of each pixel of normal line image 20Y, image analysis unit 18 may use a value indicating light emission position n at which a luminance peaks in the waveform indicating a relationship between the luminance of the pixel and light emission position n in the plurality of captured images, instead of phase ϕ calculated by Equation (1). Also in this case, the value of each pixel of normal line image 20Y corresponds to the normal direction of the surface of target object 2 appearing in the pixel.
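A sketch of this peak-position variant, under the same assumed (N, height, width) stack, could be as simple as the following.

```python
import numpy as np

def peak_position_image(images: np.ndarray) -> np.ndarray:
    """Per-pixel index of the light emission position giving the peak luminance."""
    return np.argmax(images, axis=0)
```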
Image analysis unit 18 may generate another analysis image in addition to normal line image 20Y or instead of normal line image 20Y. For example, image analysis unit 18 may calculate an amplitude of the waveform indicating the relationship between a luminance of each pixel and light emission position n, and generate an analysis image (hereinafter, referred to as a “direct reflection image”) in which the amplitude is a value of each pixel.
The amplitude of the waveform increases as the regularly reflected component of the light applied to the surface of target object 2 increases (in other words, as the diffusely reflected component decreases). In order for regular reflection light to be incident on imaging device 16, it is necessary for the light emission position in light emitting device 10, the normal direction of the surface, and optical axis 84 of imaging device 16 to satisfy predetermined conditions. For this reason, in a pixel in which a surface that regularly reflects a large portion of the irradiated light appears, the change in luminance when the light emission position is changed is large, and the amplitude is accordingly large. Conversely, in a pixel in which a surface that diffusely reflects a large portion of the irradiated light appears, the change in luminance when the light emission position is changed is small, and the amplitude is accordingly small. Therefore, a direct reflection image having the amplitude as the value of each pixel indicates a distribution of the degree of regular reflection of light on the surface of target object 2 appearing in the image.
Specifically, image analysis unit 18 calculates an amplitude A of the waveform indicating the relationship between a luminance of each pixel and light emission position n according to the following Equation (2). Image analysis unit 18 generates a direct reflection image having the calculated amplitude A as a pixel value.
[Formula 2]
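Equation (2) is likewise not reproduced here; the sketch below assumes the usual amplitude estimate of the frequency-1 discrete Fourier transform component, which matches the description of amplitude A.

```python
import numpy as np

def direct_reflection_image(images: np.ndarray) -> np.ndarray:
    """images: (N, height, width) luminance stack; returns the per-pixel amplitude A."""
    N = images.shape[0]
    n = np.arange(N).reshape(N, 1, 1)
    sin_sum = np.sum(images * np.sin(2.0 * np.pi * n / N), axis=0)
    cos_sum = np.sum(images * np.cos(2.0 * np.pi * n / N), axis=0)
    # Large where the surface regularly reflects the light, small where it diffuses it.
    return (2.0 / N) * np.sqrt(sin_sum ** 2 + cos_sum ** 2)
```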
In a region surrounded by a frame line 62 of direct reflection image 22Y illustrated in
As described above, when a stain of a substance (for example, ink) that reduces the degree of regular reflection of light attaches to the surface of target object 2, the stain can be accurately detected by checking direct reflection image 22Y.
In the examples of
Furthermore, image analysis unit 18 may generate a composite image obtained by combining a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other. Specifically, image analysis unit 18 calculates a value B represented by the following Equation (3) for each pixel. Image analysis unit 18 generates a composite image having the calculated value B as a pixel value.
[Formula 3]
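Equation (3) is also not reproduced in this text; as one plausible combination consistent with 'combining a plurality of captured images', the sketch below simply averages the stack per pixel.

```python
import numpy as np

def composite_image(images: np.ndarray) -> np.ndarray:
    """images: (N, height, width) luminance stack; returns a per-pixel combined value B."""
    return images.mean(axis=0)  # assumed combination rule: per-pixel average
```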
As described above, an analysis image (a normal line image, a direct reflection image, or a composite image) generated by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other is effective for detecting a defect. The analysis image used for inspection is appropriately selected from the normal line image, the direct reflection image, and the composite image according to a type of a defect to be detected.
The direct reflection image is effective not only for a stain of a substance that reduces a degree of regular reflection of light, but also for detection of a fine flaw on a surface of target object 2 made of glass, for example.
As illustrated in
Image analysis unit 18 may generate another analysis image by executing image processing on a normal line image and a direct reflection image.
Next, image analysis unit 18 generates normal line image 20X and a direct reflection image 22X by analyzing a plurality of captured images when the light emission position is changed in the X direction (step S2). Furthermore, image analysis unit 18 generates normal line image 20Y and direct reflection image 22Y by analyzing a plurality of captured images when the light emission position is changed in the Y direction (step S3). Image analysis unit 18 generates composite image 24 by combining a plurality of captured images when the light emission position is changed in the X direction and the Y direction (step S4).
Next, image analysis unit 18 generates a differential image 21X by applying a differential filter in the X direction to normal line image 20X (step S5). Further, image analysis unit 18 generates a differential image 21Y by applying a differential filter in the Y direction to normal line image 20Y (step S6).
Next, image analysis unit 18 generates an image 23X by multiplying differential image 21X and direct reflection image 22X (step S7). Further, image analysis unit 18 generates an image 23Y by multiplying differential image 21Y and direct reflection image 22Y (step S8).
Next, image analysis unit 18 generates an irregularity image 25 by adding image 23X and image 23Y (step S9). Thereafter, image analysis unit 18 performs binarization processing on irregularity image 25 to generate a binary image 26 (step S10).
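An end-to-end sketch of steps S5 to S10 could look as follows; the differential kernel and the binarization threshold are assumptions, since the text does not specify them.

```python
import numpy as np
from scipy.ndimage import convolve

def irregularity_and_binary_images(normal_x, normal_y, direct_x, direct_y, threshold):
    """All image arguments are 2D arrays of identical shape; threshold is an assumed scalar."""
    kernel_x = np.array([[-1.0, 0.0, 1.0]])      # simple X-direction differential filter
    kernel_y = kernel_x.T                        # simple Y-direction differential filter
    diff_x = convolve(normal_x, kernel_x)        # step S5: differential image 21X
    diff_y = convolve(normal_y, kernel_y)        # step S6: differential image 21Y
    image_x = diff_x * direct_x                  # step S7: image 23X
    image_y = diff_y * direct_y                  # step S8: image 23Y
    irregularity = image_x + image_y             # step S9: irregularity image 25
    binary = np.abs(irregularity) > threshold    # step S10: binary image 26
    return irregularity, binary.astype(np.uint8)
```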
In the example illustrated in
For example, in a case of a defect having an irregular shape, a normal direction is different between the defect and a portion other than the defect. Therefore, the defect can be accurately detected by using normal line images 20X and 20Y. Alternatively, in a case of a defect having an irregular shape, the normal direction changes steeply at a boundary between the defect and a portion other than the defect. Therefore, the defect can be accurately detected by using differential images 21X and 21Y.
Depending on a surface state of target object 2, a change in luminance in the captured images when the light emission position is changed in the X direction may be greatly different from a change in luminance in the captured images when the light emission position is changed in the Y direction. For example, in a case of target object 2 in which a hairline is formed on the surface, the change in luminance when the light emission position is changed in the X direction is greatly different from the change in luminance when the light emission position is changed in the Y direction. In such a case, the influence of the hairline is excluded by multiplying differential images 21X and 21Y by direct reflection images 22X and 22Y respectively and adding the resulting images 23X and 23Y. That is, only a change in the normal direction due to the defect is reflected in the value of each pixel of irregularity image 25. Consequently, even in target object 2 having a hairline formed on the surface, a defect having an irregular shape can be accurately detected by using irregularity image 25. Alternatively, also when binary image 26 generated by binarizing irregularity image 25 is used, a defect having an irregular shape can be accurately detected.
E. Method of Changing Light Emission Position of Light Emitting Device 10A
In a case of using light emitting device 10A (see
As illustrated in
Normal line images 20X and 20Y are generated using regular reflection light at each point of inspection target region 3 of target object 2. Therefore, imaging device 16 preferably receives regular reflection light when light is emitted in at least one light emission position among the plurality of light emission positions, for each point of inspection target region 3.
In a case where the optical system included in imaging device 16 is a collimator lens identical to collimator lens 12, imaging device 16 is arranged such that a position of the optical system coincides with a position of a virtual image 96 of collimator lens 12, and a position of an imaging element of imaging device 16 coincides with a position of a virtual image 802 of light emitting device 10A. As a result, imaging device 16 can receive regular reflection light from all points on inspection target region 3.
In a case where a size of the optical system included in imaging device 16 is smaller than that of collimator lens 12 (for example, in a case where the optical system included in imaging device 16 is a pinhole lens), a light emission position at which a luminance peaks may change according to a position of a point of inspection target region 3. Therefore, even if inspection target region 3 is completely flat, a change (error) is observed in a value of each pixel in normal line images 20X and 20Y. However, a degree of change is smaller than a degree of change caused by a defect such as irregularities or a flaw. Therefore, the defect can be detected by checking normal line images 20X and 20Y or differential images 21X and 21Y obtained respectively by differentiating normal line images 20X and 20Y. Note that, as will be described later, the above error can be reduced by making distance L between light emitting devices 10A and 10B and collimator lens 12 larger than focal distance f of collimator lens 12.
In a case where the normal direction of inspection target region 3 is distributed within a certain error range, a region where imaging device 16 can be arranged is further limited according to the error range, in order to receive regular reflection light of each point of inspection target region 3.
In a case where imaging device 16 has a telecentric optical system, imaging device 16 may simply be arranged such that the telecentric optical system overlaps with region 94. Therefore, a degree of freedom of arrangement of imaging device 16 is increased.
G. Action and Effect
As described above, inspection system 1 (1A, 1B) inspects a surface of target object 2. Inspection system 1 (1A, 1B) includes light emitting device 10 (10A, 10B) configured to illuminate target object 2, collimator lens 12 arranged between light emitting device 10 (10A, 10B) and target object 2, and imaging device 16 configured to image target object 2. Light emitting device 10 (10A, 10B) is capable of changing a light emission position. Inspection system 1 (1A, 1B) further includes image analysis unit 18 configured to analyze a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other. Image analysis unit 18 generates normal line image 20 (20X, 20Y) in which a value of each pixel corresponds to a normal direction of a surface of target object 2 appearing in the pixel.
In normal line image 20 (20X, 20Y), a pixel in which a defect that changes the normal direction of the surface, such as irregularities or a flaw, appears has a value different from the values of the other pixels. As a result, the defect can be accurately detected by checking normal line image 20 (20X, 20Y).
Furthermore, a relative positional relationship between target object 2 and imaging device 16 is only required to be set such that a luminance shows a peak at any of the plurality of light emission positions. Therefore, it does not take time and effort to adjust a relative positional and orientational relationship between target object 2 and imaging device 16. Further, since inspection system 1 (1A, 1B) does not include the filter means as described in PTL 2, it is not necessary to take time and effort to adjust the filter means.
As described above, according to inspection system 1 (1A, 1B), time and effort in adjustment for inspection can be reduced, and the defect detection accuracy can be enhanced.
A value of each pixel of normal line image 20 (20X, 20Y) indicates phase ϕ of a waveform indicating a relationship between a luminance of the pixel and a light emission position in the plurality of captured images. As illustrated in
A value of each pixel of normal line image 20 (20X, 20Y) may indicate a light emission position at which a luminance peaks in a waveform indicating a relationship between the luminance of the pixel and the light emission position in the plurality of captured images. As illustrated in
Image analysis unit 18 may further generate direct reflection images 22X and 22Y by analyzing the plurality of captured images. A value of each pixel of direct reflection images 22X and 22Y is amplitude A of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
By checking direct reflection images 22X and 22Y, it is possible to accurately detect a stain of a substance (for example, ink) that reduces a degree of regular reflection of light as illustrated in
Light emitting device 10A includes a plurality of light sources 101, and sequentially switches the light source to emit light among the plurality of light sources 101. The plurality of light sources 101 are arranged on virtual plane 82 perpendicular to optical axis 80 of collimator lens 12. This allows light emitting device 10A to easily change the light emission position.
Image analysis unit 18 may further generate composite image 24 obtained by combining a plurality of captured images. By checking composite image 24, it is possible to accurately detect a defect such as low-contrast unevenness causing diffuse reflection, without being affected by irregularities on the surface of target object 2.
Note that image analysis unit 18 may generate at least one of normal line image 20 (20X, 20Y) or direct reflection images 22X and 22Y. That is, image analysis unit 18 generates an analysis image by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other. A value of each pixel of the analysis image (including at least one of normal line image 20 (20X, 20Y) or direct reflection images 22X and 22Y) is a feature quantity (including a phase and an amplitude) of a waveform indicating a relationship between a luminance of the pixel and the light emission position in a plurality of captured images.
H. Modification
In Specific Example 1 illustrated in
Light emitting device 10C includes a plurality of light sources 101 similarly to light emitting device 10A of Specific Example 1. The plurality of light sources 101 are arranged on virtual plane 82 perpendicular to optical axis 80 of collimator lens 12. Distance L between virtual plane 82 and collimator lens 12 coincides with focal distance f of collimator lens 12.
The plurality of light sources 101 are arranged along circumferences. Specifically, the plurality of light sources 101 include one light source 101a, four light sources 101b, 12 light sources 101c, and 16 light sources 101d. Light source 101a is located on optical axis 80 of collimator lens 12. The four light sources 101b are arranged at equal intervals along a circumference having a radius r1 centered on optical axis 80. The 12 light sources 101c are arranged at equal intervals along a circumference having a radius r2 (> r1) centered on optical axis 80. The 16 light sources 101d are arranged at equal intervals along a circumference having a radius r3 (> r2) centered on optical axis 80.
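For illustration, the XY coordinates of this arrangement on virtual plane 82 could be generated as in the following sketch; the radii r1, r2, and r3 are placeholder parameters, and only the counts and equal spacing follow the text.

```python
import numpy as np

def light_source_positions(r1: float, r2: float, r3: float) -> np.ndarray:
    """Return (33, 2) XY coordinates: 1 on-axis source plus rings of 4, 12, and 16 sources."""
    positions = [(0.0, 0.0)]                                # light source 101a on optical axis 80
    for radius, count in ((r1, 4), (r2, 12), (r3, 16)):     # light sources 101b, 101c, 101d
        angles = 2.0 * np.pi * np.arange(count) / count     # equal intervals along the circumference
        positions.extend(zip(radius * np.cos(angles), radius * np.sin(angles)))
    return np.asarray(positions)
```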
In four captured images obtained individually by imaging in the first to fourth lighting states, a lighting state in which a luminance of each pixel is maximized depends on an angle θ (see
In the example illustrated in
Light emitting device 10D includes a plurality of light sources 101 similarly to light emitting device 10C of Modification 1. However, the plurality of light sources 101 are arranged on a spherical virtual curved surface 88 having a center on optical axis 80 of collimator lens 12. A distance L1 between light source 101 located on optical axis 80 and collimator lens 12 is the same as focal distance f of collimator lens 12. A center and a radius of the sphere having virtual curved surface 88 as a surface are set according to the field curvature of collimator lens 12. As a result, it is possible to suppress an influence of distortion of a light beam caused by the field curvature of collimator lens 12.
In the example illustrated in
As illustrated in
As described above, the present embodiment includes the disclosure as follows.
Configuration 1
An inspection system (1, 1A to 1D) for inspecting a surface of a target object (2), the inspection system (1, 1A to 1D) including:
- a light emitting device (10, 10A to 10D) configured to illuminate the target object (2);
- a collimator lens (12) arranged between the light emitting device (10, 10A to 10D) and the target object (2); and
- an imaging device (16) configured to image the target object (2),
- wherein the light emitting device (10, 10A to 10D) is capable of changing a light emission position, and
- the inspection system (1, 1A to 1D) further includes:
- an image analysis unit (18) configured to generate a first analysis image (20, 20X, 20Y) in which a value of each pixel corresponds to a normal direction of a surface of the target object (2) appearing in the pixel, by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other.
Configuration 2
The inspection system (1, 1A to 1D) according to Configuration 1, wherein a value of each pixel of the first analysis image (20, 20X, 20Y) indicates a phase of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
Configuration 3
The inspection system (1, 1A to 1D) according to Configuration 1, wherein a value of each pixel of the first analysis image (20, 20X, 20Y) indicates the light emission position at which a luminance peaks in a waveform indicating a relationship between the luminance of the pixel and the light emission position in the plurality of captured images.
Configuration 4
The inspection system (1, 1A to 1D) according to Configuration 1, wherein
- the image analysis unit (18) further generates a second analysis image (22X, 22Y) by analyzing the plurality of captured images, and
- a value of each pixel of the second analysis image (22X, 22Y) is an amplitude of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
Configuration 5
The inspection system (1, 1A, 1C, 1D) according to any one of Configurations 1 to 4, wherein
- the light emitting device (10, 10A, 10C, 10D) includes a plurality of light sources (101, 101a to 101d), and sequentially switches a light source to emit light among the plurality of light sources (101, 101a to 101d), and
- the plurality of light sources (101, 101a to 101d) are arranged on a plane (82) perpendicular to an optical axis (80) of the collimator lens (12) or on a spherical surface (88) centered on the optical axis (80).
Configuration 6
The inspection system (1, 1A to 1D) according to any one of Configurations 1 to 5, wherein the image analysis unit (18) further generates a composite image (24) obtained by combining the plurality of captured images.
Configuration 7
The inspection system (1, 1A to 1D) according to any one of Configurations 1 to 6, wherein a distance between the light emission position and the collimator lens (12) is greater than or equal to a focal distance of the collimator lens (12).
Configuration 8
The inspection system (1, 1A to 1D) according to any one of Configurations 1 to 7, further including:
- a half mirror (14) arranged between the collimator lens (12) and the target object (2),
- wherein light emitted from the light emitting device (10, 10A to 10D) and transmitted through the collimator lens (12) is reflected by the half mirror (14) to illuminate the target object (2), and
- light reflected by the target object (2) is transmitted through the half mirror (14) to be incident on the imaging device (16).
Configuration 9
An inspection system (1, 1A to 1D) for inspecting a surface of a target object (2), the inspection system (1, 1A to 1D) including:
- a light emitting device (10, 10A to 10D) configured to illuminate the target object (2);
- a collimator lens (12) arranged between the light emitting device (10, 10A to 10D) and the target object (2); and
- an imaging device (16) configured to image the target object,
- wherein the light emitting device (10, 10A to 10D) is capable of changing a light emission position,
- the inspection system (1, 1A to 1D) further includes:
- an image analysis unit (18) configured to generate an analysis image by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other, and
- a value of each pixel of the analysis image is a feature quantity of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
Configuration 10
An inspection method for inspecting a surface of a target object (2), the inspection method including:
- imaging the target object (2) while irradiating the target object (2) with light emitted from a light emitting device (10, 10A to 10D) and transmitted through a collimator lens (12),
- wherein the imaging includes:
- switching a light emission position in the light emitting device (10, 10A to 10D); and
- acquiring a plurality of captured images from a plurality of times of imaging where the light emission positions are different from each other, and
- the inspection method further includes:
- generating an analysis image (20, 20X, 20Y) in which a value of each pixel corresponds to a normal direction of a surface of the target object (2) appearing in the pixel, by analyzing the plurality of captured images.
Although the embodiment of the present invention has been described, it should be considered that the embodiment disclosed herein is illustrative in all respects and not restrictive. The scope of the present invention is defined by the claims, and it is intended to include all modifications within the meaning and scope equivalent to the claims.
REFERENCE SIGNS LIST1, 1A to 1D: inspection system, 2: target object, 3: inspection target region, 5: display device, 6: memory card, 7: input device, 10, 10A to 10D: light emitting device, 12: collimator lens, 14: half mirror, 16: imaging device, 18: image analysis unit, 20, 20X, 20Y: normal line image, 21X, 21Y: differential image, 22X, 22Y: direct reflection image, 23X, 23Y: image, 24: composite image, 25: irregularity image, 26: binary image, 28: difference image, 50, 51, 52: pixel, 60, 62: frame line, 80, 84: optical axis, 82: virtual plane, 86: normal direction, 88: virtual curved surface, 90, 92: luminous flux, 94: region, 96, 801, 802: virtual image, 101, 101a to 101d, 102: light source, 103: XY stage, 180: processor, 181: RAM, 182: display controller, 183: system controller, 184: I/O controller, 185: hard disk, 186: camera interface, 187: input interface, 189: communication interface, 190: memory card interface, 191: analysis program, F, F1: defect, L, L1: distance, LA, LB, LC: light, P, Q: end point, PA, PB, PC: light emission position
Claims
1. An inspection system for inspecting a surface of a target object, the inspection system comprising:
- a light emitting device configured to illuminate the target object;
- a collimator lens arranged between the light emitting device and the target object; and
- an imaging device configured to image the target object,
- wherein the light emitting device is capable of changing a light emission position, and
- the inspection system further comprises: an image analysis unit configured to generate a first analysis image in which a value of each pixel corresponds to a normal direction of a surface of the target object appearing in the pixel, by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other.
2. The inspection system according to claim 1, wherein a value of each pixel of the first analysis image indicates a phase of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
3. The inspection system according to claim 1, wherein a value of each pixel of the first analysis image indicates the light emission position at which a luminance peaks in a waveform indicating a relationship between the luminance of the pixel and the light emission position in the plurality of captured images.
4. The inspection system according to claim 1, wherein
- the image analysis unit further generates a second analysis image by analyzing the plurality of captured images, and
- a value of each pixel of the second analysis image is an amplitude of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
5. The inspection system according to claim 1, wherein
- the light emitting device includes a plurality of light sources, and sequentially switches a light source to emit light among the plurality of light sources, and
- the plurality of light sources are arranged on a plane perpendicular to an optical axis of the collimator lens or on a spherical surface centered on the optical axis.
6. The inspection system according to claim 1, wherein the image analysis unit further generates a composite image obtained by combining the plurality of captured images.
7. The inspection system according to claim 1, wherein a distance between the light emission position and the collimator lens is greater than or equal to a focal distance of the collimator lens.
8. The inspection system according to claim 1, further comprising a half mirror arranged between the collimator lens and the target object,
- wherein light emitted from the light emitting device and transmitted through the collimator lens is reflected by the half mirror to illuminate the target object, and
- light reflected by the target object is transmitted through the half mirror to be incident on the imaging device.
9. An inspection system for inspecting a surface of a target object, the inspection system comprising:
- a light emitting device configured to illuminate the target object;
- a collimator lens arranged between the light emitting device and the target object; and
- an imaging device configured to image the target object,
- wherein the light emitting device is capable of changing a light emission position, and
- the inspection system further comprises: an image analysis unit configured to generate an analysis image by analyzing a plurality of captured images obtained individually from a plurality of times of imaging where the light emission positions are different from each other, and a value of each pixel of the analysis image is a feature quantity of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
10. An inspection method for inspecting a surface of a target object, the inspection method comprising:
- imaging the target object while irradiating the target object with light emitted from a light emitting device and transmitted through a collimator lens,
- wherein the imaging includes: switching a light emission position in the light emitting device; and acquiring a plurality of captured images from a plurality of times of imaging where the light emission positions are different from each other, and the inspection method further comprises: generating a first analysis image in which a value of each pixel corresponds to a normal direction of a surface of the target object appearing in the pixel, by analyzing the plurality of captured images.
11. The inspection method according to claim 10, wherein a value of each pixel of the first analysis image indicates a phase of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
12. The inspection method according to claim 10, wherein a value of each pixel of the first analysis image indicates the light emission position at which a luminance peaks in a waveform indicating a relationship between the luminance of the pixel and the light emission position in the plurality of captured images.
13. The inspection method according to claim 10, further comprising:
- generating a second analysis image by analyzing the plurality of captured images, a value of each pixel of the second analysis image being an amplitude of a waveform indicating a relationship between a luminance of the pixel and the light emission position in the plurality of captured images.
14. The inspection method according to claim 10, wherein
- the light emitting device includes a plurality of light sources, the plurality of light sources being arranged on a plane perpendicular to an optical axis of the collimator lens or on a spherical surface centered on the optical axis, and
- the switching includes sequentially switching a light source to emit light among the plurality of light sources.
15. The inspection method according to claim 10, further comprising:
- generating a composite image obtained by combining the plurality of captured images.
16. The inspection method according to claim 10, wherein a distance between the light emission position and the collimator lens is greater than or equal to a focal distance of the collimator lens.
17. The inspection method according to claim 10, further comprising:
- arranging a half mirror between the collimator lens and the target object, wherein
- the imaging includes: illuminating the target object with the light by reflecting the light by the half mirror; and allowing light reflected by the target object to be transmitted through the half mirror to be incident on the imaging device.
Type: Application
Filed: Mar 8, 2021
Publication Date: Aug 10, 2023
Applicant: OMRON CORPORATION (Kyoto-shi, Kyoto)
Inventor: Yasuyuki IKEDA (Kyoto-shi, Kyoto)
Application Number: 17/928,564