MEASUREMENT DEVICE THAT MEASURES SHAPE OF OBJECT TO BE MEASURED, MEASUREMENT METHOD, SYSTEM, AND ARTICLE PRODUCTION METHOD
A measurement device that measures a shape of an object includes a processing unit that obtains information on the shape of the object based on an image obtained by imaging the object onto which a pattern light has been projected, the pattern light including a plurality of lines provided with a distinguishing portion that distinguishes the lines from each other. In the measurement device, the processing unit acquires, in a luminosity distribution of the image in a direction intersecting the lines, positions including positions in which the luminance is the largest and positions in which the luminance is the smallest. The processing unit specifies, on a basis of the position of the distinguishing portion, a position to be excluded from the positions in which the luminance is the largest and is the smallest, and obtains the information on the shape of the object based on the positions except for the specified position.
1. Field of the Invention
The present disclosure relates to a measurement device that measures a shape of an object to be measured, a measurement method, a system, and an article production method.
2. Description of the Related Art
As a technique to measure a shape of an object to be measured, an optical measurement device is known. Various methods are used by such optical measurement devices, one of which is referred to as a pattern projection method. In the pattern projection method, the shape of the object to be measured is obtained by projecting a predetermined pattern onto the object to be measured, picking up an image thereof, detecting the pattern in the taken image, and calculating range information at each pixel position using the principle of triangulation. Various patterns are used in the projection method, a representative one of which is a pattern (a dot line pattern) in which disconnection dots (dots) are disposed on a pattern including alternating bright lines and dark lines (see Japanese Patent No. 2517062). Information on the coordinates of the detected dots provides indexes that indicate to which line on the pattern of the mask, which is the pattern generation unit, each of the projected lines corresponds, such that the projected lines can be distinguished from each other. As described above, the dots serve as distinguishing portions that distinguish the lines from each other.
Random noise in the taken image is among the factors that decrease the measuring accuracy of the pattern projection method. In detecting the pattern in the taken image, typically, the coordinates of the pattern are specified by detecting the peak where the luminance value of the image of the pattern is the largest. In the Meeting on Image Recognition and Understanding (MIRU 2009), pp. 222, in addition to such a peak, a negative peak in which the luminance value of the image of the pattern is the smallest is also detected, which increases the density (the number of detection points per unit area) of the detection points. By increasing the detection points when detecting the pattern in the taken image, the S/N ratio is improved and the influence of the random noise of the taken image can be reduced. In the Meeting on Image Recognition and Understanding (MIRU 2009), pp. 222, however, measurement is performed by projecting a grid pattern, and no dot line pattern is disclosed. It has been found that in the pattern projection method using a dot line pattern, when the negative peak is detected as in the Meeting on Image Recognition and Understanding (MIRU 2009), pp. 222, an error occurs in the detected position of the negative peak in an area around the dot (the distinguishing portion). In other words, a positional error may occur at a detection point near the dot (the distinguishing portion).
SUMMARY OF THE INVENTION
A measurement device that is an aspect of the present disclosure and that overcomes the above problem is a measurement device that measures a shape of an object to be measured, including a processing unit that obtains information on the shape of the object to be measured on a basis of an image obtained by imaging the object to be measured on which a pattern light including a plurality of lines in which a distinguishing portion that distinguishes the lines from each other has been provided. In the measurement device, the processing unit acquires, in a luminosity distribution of the image in a direction intersecting the lines, a plurality of positions including a position in which the luminance is the largest and a position in which the luminance is the smallest; the processing unit specifies, on a basis of the position of the distinguishing portion, a position to be excluded from the position in which the luminance is the largest and from the position in which the luminance is the smallest; and the processing unit obtains the information on the shape of the object to be measured based on the plurality of positions except for the position that has been specified.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, preferred embodiments of the present disclosure will be described with reference to the accompanying drawings. Note that in each drawing, the same reference numerals will be attached to the same members and redundant description thereof will be omitted.
First Exemplary Embodiment
The projection unit 2 includes, for example, a light source unit 21, a pattern generation unit 22, and an optical projection system 23, and projects a predetermined pattern onto the object 5 to be measured. The light source unit 21 performs, for example, Kohler illumination such that light radiated from a light source illuminates the pattern generation unit 22 in a uniform manner. The pattern generation unit 22 creates the pattern light that is projected onto the object 5 to be measured and, in the present exemplary embodiment, is a mask on which a pattern is formed by performing chrome etching on a glass substrate. Note that the pattern generation unit 22 may be a digital light processing (DLP) projector, a liquid crystal projector, or a DMD, which is capable of generating any pattern. The optical projection system 23 is an optical system that projects the pattern light generated by the pattern generation unit 22 onto the object 5 to be measured.
The image pickup unit 3 includes, for example, an image-pickup optical system 31 and an image pickup element 32, and obtains an image by imaging the object 5 to be measured. In the present exemplary embodiment, the image pickup unit 3 performs image pickup of the object 5 to be measured on which the dot line pattern PT has been projected to acquire a so-called range image, which is an image that includes the portion corresponding to the dot line pattern PT. The image-pickup optical system 31 is an image-forming optical system that forms an image of the dot line pattern PT projected on the object 5 to be measured on the image pickup element 32. The image pickup element 32 is an image sensor including a plurality of pixels that performs image pickup of the object 5 to be measured on which the pattern has been projected, and is, for example, a CMOS sensor or a CCD sensor.
Based on the image acquired with the image pickup unit 3, the processing unit 4 obtains the shape of the object 5 to be measured. The processing unit 4 includes a control unit 41, a memory 42, a pattern detection unit 43, and a calculation unit 44, and is constituted by a processor such as a CPU, a RAM, a controller chip, and the like. The control unit 41 controls the operations of the projection unit 2 and the image pickup unit 3; specifically, the control unit 41 controls the projection of the pattern onto the object 5 to be measured and the image pickup of the object 5 to be measured on which the pattern has been projected. The memory 42 stores the image acquired by the image pickup unit 3. Using the image stored in the memory 42, the pattern detection unit 43 detects the peaks, the edges, and the dots (the positions subject to detection) of the pattern light in the image to obtain the coordinates of the pattern, in other words, the position of the pattern light in the image. Using the positions (coordinates) subject to detection and the indexes of the lines distinguished from the information of the dots, the calculation unit 44 calculates the range information (three-dimensional information) of the object 5 to be measured at each pixel position of the image pickup element 32 using the principle of triangulation.
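The triangulation step can be illustrated with a minimal sketch. The patent does not specify the projector-camera geometry, so a rectified projector-camera pair is assumed here, where the depth follows Z = f·B/d from the disparity d between the projected and observed line positions; the function name and parameters are illustrative, not the patent's.

```python
def depth_from_triangulation(x_camera, x_projector, baseline, focal_length):
    """Depth of a point by triangulation for an assumed rectified
    projector-camera pair (illustrative model only).

    x_camera:     detected line position on the image sensor (pixels)
    x_projector:  corresponding line position on the pattern mask (pixels),
                  identified via the dot-based line indexes
    baseline:     projector-camera baseline
    focal_length: focal length expressed in pixels
    """
    disparity = x_projector - x_camera
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity")
    # Z = f * B / d for the rectified geometry assumed above
    return focal_length * baseline / disparity
```

With this model, a larger disparity yields a smaller depth, which is why a dense, accurate set of detection points directly improves the range map computed per pixel.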
Hereinafter, pattern detection with the pattern detection unit 43 will be described in detail. The pattern detection unit 43 detects the image of the dot line pattern PT included in the range image and specifies the position of the dot line pattern PT in the range image. Specifically, the pattern detection unit 43 specifies the positions of the lines of the dot line pattern PT in the range image from the optical image information, in other words, from the luminosity distribution (the light intensity distribution) of the evaluation sections, each of which extends in a direction intersecting the lines of the dot line pattern PT, for example, in the X direction.
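As a minimal sketch of this step (the patent does not fix a particular detection algorithm, and sub-pixel refinement is omitted), the peaks and negative peaks of one evaluation section can be located as the local maxima and minima of its 1-D luminosity distribution:

```python
def detect_peaks(profile):
    """Return indices of local maxima (peaks, centres of bright lines) and
    local minima (negative peaks, centres of dark lines) in a 1-D
    luminosity distribution taken across the lines.

    Illustrative sketch under the assumption of a clean profile; real
    detection would add smoothing and sub-pixel interpolation.
    """
    peaks, negative_peaks = [], []
    for i in range(1, len(profile) - 1):
        if profile[i] > profile[i - 1] and profile[i] >= profile[i + 1]:
            peaks.append(i)
        elif profile[i] < profile[i - 1] and profile[i] <= profile[i + 1]:
            negative_peaks.append(i)
    return peaks, negative_peaks
```

Running one such pass per evaluation section yields, per section, the candidate positions of the bright and dark lines that the later steps filter and triangulate.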
A relationship between the distance from the dot in the Y direction in which the lines of the dot line pattern extend and the measurement error will be described next.
As for the peak position P, regardless of the distance from the dot DT, since there is no displacement of the detection point, there is almost no measurement error. As for the edge position E, a measurement error of 42 μm occurs at the position closest to the dot DT due to the displacement of the detection point caused by the dot DT. Note that the occurrence of the same amount of measurement error has been confirmed in the evaluation of the edge with the smallest luminance gradient as well. As for the negative peak position NP, a measurement error of 380 μm occurs at the position closest to the dot DT due to the displacement of the detection point caused by the dot DT.
When the negative peak positions NP are included as detection points of the pattern light detected by the pattern detection unit 43 in addition to the peak positions P, the density of the detection points (the number of detection points per unit area) is doubled. Furthermore, when two more positions, namely, the positions in which the luminance gradient is at its maximum and the positions in which the luminance gradient is at its minimum, are included, the density of the detection points is quadrupled. Accordingly, the data for calculating the distance increases with the increase in the density of the detection points, and the S/N ratio with respect to the random noise of the image pickup element 32 is improved, enabling measurement to be performed with higher accuracy.
However, as described above, regarding the detection points around the dots, the measurement error of each negative peak position NP is larger than the error caused by the random noise of the image pickup element 32, which is on the order of tens of micrometers. Accordingly, depending on the dot density and the number of lines in the dot line pattern PT, there are cases in which the measurement accuracy improves when the negative peak positions NP are not employed as the detection points.
Accordingly, in the present exemplary embodiment, information on the shape of the object to be measured is obtained while the negative peak positions NP near the dots are excluded from the detection points. A flow of the measurement is illustrated in
It has been described that displacement occurs in the detection result of the edges and negative peaks near the dots in the present exemplary embodiment. Since the dot positions are specified by the dot detection described above, the detected negative peaks that are near the dot positions can be selected and excluded. As regards the negative peaks that are not near the dots, almost no displacement occurs in the detection result. Note that since the dots are shorter than the bright portions in the bright lines and the number of detection points in portions other than the vicinities of the dots is larger than the number of detection points in the vicinities of the dots, the advantageous effect obtained through the increase in the detection points can be sufficiently obtained even if the detection points in the vicinities of the dots are excluded.
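The selection-and-exclusion step above can be sketched as follows. The exclusion radius is an assumed tuning parameter: the patent excludes points located on or around the distinguishing portion without fixing a numeric range.

```python
def exclude_near_dots(detection_points, dot_positions, radius):
    """Drop detection points (e.g. negative peak positions along a line)
    that lie within `radius` of a detected dot position, since their
    positions are displaced by the dot.

    Positions are coordinates in the direction in which the lines extend;
    `radius` is an assumed tuning parameter, not a value from the patent.
    """
    return [pt for pt in detection_points
            if all(abs(pt - dot) > radius for dot in dot_positions)]
```

Points far from every dot pass through unchanged, so the density gain from the extra detection points is largely preserved while the displaced points near the dots are removed.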
As described above, in the present exemplary embodiment, measurement accuracy is improved by the increase in the density of the detection points, while information on the shape of the object to be measured is obtained with higher accuracy by not using, as detection points, the negative peak positions with relatively low measurement accuracy. Furthermore, with the increase in the density of the detection points, it is possible to measure a smaller object to be measured.
Second Exemplary Embodiment
Description of a second exemplary embodiment will be given next. In the present exemplary embodiment, the dot line pattern is different from that of the first exemplary embodiment. Note that description that overlaps the first exemplary embodiment will be omitted.
In the present exemplary embodiment, the dot line pattern is a periodic pattern alternately including dark lines, in which dark portions and dots (bright portions) continue in a single direction, and bright lines extending in the single direction. The dots are each provided on a dark line and between the dark portions so as to disconnect the dark portions from each other in the direction in which the dark portions extend. The dots are distinguishing portions that distinguish the dark lines from each other. In other words, the pattern of the present exemplary embodiment is the pattern of the first exemplary embodiment with bright and dark inverted with respect to each other.
As in
Accordingly, in the present exemplary embodiment, based on the positions of the dots, the pattern detection unit 43 specifies the largest (maximum) peak positions that are to be excluded from the detection points among the plurality of detection points obtained from the luminosity distribution of the evaluation sections. The positions that are excluded are positions that are affected by the displacement caused by the dot and are located on the dot or around the dot; for example, the peak positions located between the first dark line in which the dot is provided and the second dark lines that are next to the first dark line are excluded from the detection points. Subsequently, using the positions of the detection points (the negative peaks and the peaks) other than the peak positions that have been excluded, the calculation unit 44 calculates the range information and obtains information on the shape of the object to be measured.
As described above, in the pattern of the second exemplary embodiment as well, by calculating the distance while excluding the detection points in which the measurement errors occur, an advantageous effect that is similar to that of the first exemplary embodiment is obtained.
Third Exemplary Embodiment
Description of a third exemplary embodiment will be given next. Note that description that overlaps the first exemplary embodiment will be omitted.
While in the first exemplary embodiment an example in which the negative peak positions near the dots are excluded from the detection points has been described, in the present exemplary embodiment an example in which the edge positions near the dots are excluded from the detection points will be given.
As illustrated in
In the present exemplary embodiment, the pattern detection unit 43 uses the acquired image to obtain detection points by calculating the peak positions P or the negative peak positions NP, and the edge positions E, at each position in the Y direction from the luminosity distribution (the evaluation sections) in the X direction. Then, the positions of the lines of the pattern light are detected from the detection points. Note that, similar to the first exemplary embodiment, the edge position is not limited to the extremal value of the luminance gradient, but may be a position that is determined from an evaluation value that is an evaluation of the luminance gradient. Furthermore, the edge position may be obtained by calculating a position that takes the median value between the maximum and minimum values of the luminance, or by calculating the intermediate point between the peak position P and the negative peak position NP. In other words, the intermediate position between the peak position P and the negative peak position NP may be detected.
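Two of the edge-position definitions described above can be sketched as follows: the position where the luminance gradient is extremal, and the midpoint between a peak and a negative peak. This is an illustrative sketch; sub-pixel refinement and the evaluation-value variant are omitted.

```python
def gradient_edges(profile):
    """Edge positions from a 1-D luminosity distribution (length >= 2) as
    the points of extremal luminance gradient: the largest positive
    gradient marks a dark-to-bright edge, the most negative gradient a
    bright-to-dark edge."""
    n = len(profile)
    # central-difference gradient, one-sided at the ends
    gradient = [
        (profile[min(i + 1, n - 1)] - profile[max(i - 1, 0)])
        / (min(i + 1, n - 1) - max(i - 1, 0))
        for i in range(n)
    ]
    rising = max(range(n), key=lambda i: gradient[i])
    falling = min(range(n), key=lambda i: gradient[i])
    return rising, falling

def midpoint_edge(peak_position, negative_peak_position):
    """Edge position as the intermediate point between a peak position P
    and the adjacent negative peak position NP."""
    return (peak_position + negative_peak_position) / 2.0
```

Either definition places the edge between a bright line and the neighbouring dark line, which is what lets the edges double the detection-point density on top of the peaks and negative peaks.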
Subsequently, based on the positions of the dots, the pattern detection unit 43 specifies the edge positions that are to be excluded from the detection points among the plurality of detection points obtained from the luminosity distribution of the evaluation sections. Then, using the remaining edge positions and the negative peak positions or the peak positions, which are detection points other than the edge positions that have been excluded, the calculation unit 44 calculates the range information and obtains information on the shape of the object to be measured.
Note that since the measurement errors of the edge positions E are small compared with those of the negative peak positions, depending on conditions, such as when the measurement accuracy is low due to a low density of the detection points and the influence of the random noise of the image pickup element, the edge positions may still be employed as detection points when the range information is calculated.
The following method may be considered for determining whether the edge positions are to be excluded from the detection points. When a comparison between the positions of the dots that have been detected through the dot detection processing and the edge positions near the dots that have been detected through the edge detection processing shows that there is a large deviation therebetween, it can be considered that there are errors in the positions of those detection points. Such edge positions may be determined to be unsuitable detection points and excluded, so that the detection points in which measurement errors occur are removed and, as a result, the measurement accuracy can be increased.
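A minimal sketch of this deviation check follows. The tolerance is an assumed threshold: the patent only says that a "large deviation" marks an unsuitable detection point, without fixing a value, and comparing each detected edge against an expected position derived from the detected dot positions is one possible concrete reading.

```python
def split_edges_by_deviation(edge_positions, expected_positions, tolerance):
    """Partition detected edge positions into (kept, excluded) by comparing
    each one with its expected position (derived, e.g., from the detected
    dot positions).

    An edge deviating by more than `tolerance` is treated as displaced by
    a nearby dot and excluded. `tolerance` is an assumed tuning parameter,
    not a value from the patent.
    """
    kept, excluded = [], []
    for detected, expected in zip(edge_positions, expected_positions):
        if abs(detected - expected) > tolerance:
            excluded.append(detected)
        else:
            kept.append(detected)
    return kept, excluded
```

Only the flagged edges are dropped, so edges far from the dots continue to contribute to the range calculation.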
Exemplary embodiments of the present disclosure have been described above; however, the present disclosure is not limited by the exemplary embodiments, and various modifications can be made without departing from the scope of the disclosure.
In the exemplary embodiments described above, the duty ratio of each bright line and each dark line of the dot line pattern PT is 1:1; however, the duty ratio does not necessarily have to be 1:1. However, it is favorable that the ratio is 1:1 in detecting the edge positions.
According to
Meanwhile,
Furthermore, the pattern that is generated by the pattern generation unit 22 and that is projected onto the object 5 to be measured is not limited to a dot line pattern. Not limited to bright portions and dark portions, the pattern may be any pattern that includes a plurality of lines, such as a tone pattern or a multicolor pattern. Furthermore, the lines may be straight lines or curved lines. Furthermore, the distinguishing portion does not have to be a dot and may be any mark that allows the lines to be distinguished from each other, such as a round-shaped portion or a portion with a narrowed width. Furthermore, in the bright line BP, the area the dots occupy may be larger than the area the bright portions occupy.
Fourth Exemplary Embodiment
The measurement device 1 according to one or more of the exemplary embodiments described above may be used while being supported by a support member. In the present exemplary embodiment, a control system that is used while being attached to a robot arm 300 (holding device) as in
Operation of the processing unit or the control unit according to one or more of the exemplary embodiments described above may be performed with the following configuration.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2015-081064, filed Apr. 10, 2015, which is hereby incorporated by reference herein in its entirety.
Claims
1. A measurement device that measures a shape of an object to be measured, comprising:
- a processing unit that obtains information on the shape of the object to be measured on a basis of an image obtained by imaging the object to be measured on which a pattern light including a plurality of lines in which a distinguishing portion that distinguishes the lines from each other has been provided, wherein
- the processing unit acquires, in a luminosity distribution of the image, in a direction intersecting the lines, a plurality of positions including a position in which the luminance is the largest and a position in which the luminance is the smallest,
- the processing unit specifies a position to be excluded from the position in which the luminance is the largest and from the position in which the luminance is the smallest on a basis of the position of the distinguishing portion, and
- the processing unit obtains the information on the shape of the object to be measured based on the plurality of positions except for the position that has been specified.
2. The measurement device according to claim 1, wherein
- the position that has been specified is at least one of the position in which the luminance is the largest and the position in which the luminance is the smallest, the at least one of the positions being located on the distinguishing portion or around the distinguishing portion.
3. The measurement device according to claim 1, wherein
- the position that has been specified is a position that is affected by a displacement caused by the distinguishing portion.
4. The measurement device according to claim 1, wherein
- the position that is to be excluded from the position in which the luminance is the largest and the position in which the luminance is the smallest is specified based on a number of pixels from a center position of the distinguishing portion in a direction in which the plurality of lines in the image extends.
5. The measurement device according to claim 1, wherein
- the pattern light includes a bright line and a dark line alternating each other, and
- the distinguishing portion is a distinguishing portion that distinguishes the bright line or the dark line.
6. The measurement device according to claim 5, wherein
- the distinguishing portion is a dark portion that is provided in the bright line, and
- the position that has been specified is a position between a first bright line in which the distinguishing portion is provided and a second bright line next to the first bright line.
7. The measurement device according to claim 5, wherein
- the distinguishing portion is a dark portion that is provided in the bright line, and
- the position that has been specified is the position in which the luminance is the smallest that is located around the distinguishing portion.
8. The measurement device according to claim 5, wherein
- the distinguishing portion is a bright portion that is provided in the dark line, and
- the position that has been specified is a position between a first dark line in which the distinguishing portion is provided and a second dark line next to the first dark line.
9. The measurement device according to claim 5, wherein
- the distinguishing portion is a bright portion that is provided in the dark line, and
- the position that has been specified is the position in which the luminance is the largest that is located around the distinguishing portion.
10. The measurement device according to claim 1, wherein
- the plurality of positions include an intermediate position between the position in which the luminance is the largest and the position in which the luminance is the smallest, and
- the position that has been specified includes the intermediate position.
11. The measurement device according to claim 10, wherein
- the intermediate position is a position determined by an evaluation value of a luminance gradient obtained from the luminosity distribution of the image in the direction intersecting the lines.
12. The measurement device according to claim 11, wherein
- the intermediate position is a position where a value of the luminance gradient is extremal.
13. The measurement device according to claim 10, wherein
- the intermediate position is a middle point between the position in which the luminance is the largest and the position in which the luminance is the smallest.
14. A measurement device that measures a shape of an object to be measured, comprising:
- a processing unit that obtains information on the shape of the object to be measured on a basis of an image obtained by imaging the object to be measured on which a pattern light including a plurality of lines in which a distinguishing portion that distinguishes the lines from each other has been provided, wherein
- the processing unit acquires, in a luminosity distribution of the image, in a direction intersecting the lines, a plurality of positions including a position in which the luminance is the largest, a position in which the luminance is the smallest, and an intermediate position between the position in which the luminance is the largest and the position in which the luminance is the smallest,
- the processing unit specifies the intermediate position to be excluded on the basis of the position of the distinguishing portion, and
- the processing unit obtains the information on the shape of the object to be measured based on the plurality of positions except for the position that has been specified.
15. The measurement device according to claim 1, wherein
- the processing unit detects the position of the distinguishing portion from the luminosity distribution of the image, and
- the processing unit specifies the position that is to be excluded on a basis of the position of the distinguishing portion that has been detected.
16. The measurement device according to claim 14, wherein
- the processing unit detects the position of the distinguishing portion from the luminosity distribution of the image, and
- the processing unit specifies the position that is to be excluded on a basis of the position of the distinguishing portion that has been detected.
17. A method of measuring a shape of an object to be measured, the method comprising:
- obtaining information on the shape of the object to be measured on a basis of an image of the object to be measured obtained by imaging the object to be measured on which a pattern light including a plurality of lines in which a distinguishing portion that distinguishes the lines from each other has been provided,
- acquiring, in the obtaining step and in a luminosity distribution of the image, in a direction intersecting the lines, a plurality of positions including a position in which the luminance is the largest and a position in which the luminance is the smallest,
- specifying a position to be excluded from the position in which the luminance is the largest and from the position in which the luminance is the smallest on a basis of the position of the distinguishing portion, and
- obtaining the information on the shape of the object to be measured based on the plurality of positions except for the position that has been specified.
18. A method of measuring a shape of an object to be measured, the method comprising:
- obtaining information on the shape of the object to be measured on a basis of an image obtained by picking up an image of the object to be measured on which a pattern light including a plurality of lines in which a distinguishing portion that distinguishes the lines from each other has been provided,
- acquiring, in the obtaining step and in a luminosity distribution of the image, in a direction intersecting the lines, a plurality of positions including a position in which the luminance is the largest, a position in which the luminance is the smallest, and an intermediate position between the position in which the luminance is the largest and the position in which the luminance is the smallest,
- specifying, on the basis of the position of the distinguishing portion, the intermediate position to be excluded, and
- obtaining the information on the shape of the object to be measured based on the plurality of positions except for the position that has been specified.
19. A system, comprising:
- the measurement device according to claim 1, the measurement device measuring an object to be measured; and
- a robot that moves the object to be measured on a basis of a measurement result of the measurement device.
20. A method of manufacturing an article, comprising:
- moving a component with the robot of the system according to claim 19; and
- manufacturing an article by installing the component to another component with the robot.
21. The measurement device according to claim 1, further comprising:
- a projection unit that projects, onto the object to be measured, the pattern light; and
- an image pickup unit that acquires an image of the object to be measured by imaging the object to be measured on which the pattern light has been projected.
22. The measurement device according to claim 14, further comprising:
- a projection unit that projects, onto the object to be measured, the pattern light; and
- an image pickup unit that acquires an image of the object to be measured by imaging the object to be measured on which the pattern light has been projected.
Type: Application
Filed: Apr 5, 2016
Publication Date: Oct 13, 2016
Inventors: Tsuyoshi Kitamura (Utsunomiya-shi), Takumi Tokimitsu (Utsunomiya-shi)
Application Number: 15/091,374