Pattern edge detection method
The present invention relates to a pattern edge detection method applicable to a semiconductor inspection apparatus that performs a pattern inspection using pattern design data. This method includes: generating an image of a pattern; detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern; repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training-data candidates including a plurality of images and corresponding pattern edges; determining training data by removing pattern edges and corresponding images from the training-data candidates, the pattern edges to be removed being pattern edges satisfying a predetermined disqualification condition; producing an edge detection model by machine learning using the training data; generating an image of another pattern; and detecting an edge of the other pattern on the image using the edge detection model.
This application is a 35 U.S.C. § 371 filing of International Application No. PCT/JP2019/008345 filed Mar. 4, 2019, which claims the benefit of priority to Japanese Patent Application No. 2018-059802 filed Mar. 27, 2018, each of which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present invention relates to a pattern edge detection method applicable to a semiconductor inspection apparatus that performs a pattern inspection using pattern design data.
BACKGROUND ART
An optical pattern inspection apparatus using a die-to-die comparison method is used for wafer pattern inspection in a semiconductor integrated circuit manufacturing process, and for pattern inspection of the photomask used to form the pattern. The die-to-die comparison method detects a defect by comparing an image of a semiconductor device called a die to be inspected with an image obtained from the same position in a neighboring die of the same semiconductor device.
On the other hand, a method called die-to-database comparison is used to inspect a photomask, called a reticle, which has no neighboring die. In this method, mask data is converted into an image, and this image substitutes for the neighboring-die image used in the die-to-die comparison method, so that the same inspection as described above can be performed. The mask data is data obtained by correcting design data for the photomask (see, for example, Patent Document 1).
However, when the die-to-database comparison method is used for wafer inspection, the corner rounding of a pattern formed on the wafer is detected as a defect. In photomask inspection, a smoothing filter is applied to the image converted from the mask data so as to round its corners, so that corner rounding is not detected as a defect. However, since the corner rounding formed by the smoothing filter is not identical to the actual corner rounding formed on the wafer, the actual corner rounding may still be detected as a defect. It is possible to set an allowable amount of pattern deformation so as to ignore this difference between the corner roundings. However, this gives rise to the problem that a minute defect at a location other than a corner cannot be detected.
In semiconductor integrated circuit production, a defect that occurs repeatedly is regarded as more important than a random defect caused by dust or the like. Such a repetitive defect (or systematic defect) is a defect that is repeatedly generated in all dies on a wafer due to a photomask defect or the like. A repetitive defect cannot be detected by the die-to-die comparison, because it occurs both in the die to be inspected and in the neighboring die used for comparison. Therefore, there is a need for wafer inspection using the die-to-database comparison method.
CITATION LIST
Patent Literature
Patent Document 1: U.S. Pat. No. 5,563,702
SUMMARY OF INVENTION
Technical Problem
In the die-to-database method, an image of a pattern formed on a wafer is generated, and an inspection based on design data is performed on this image. Prior to this inspection, pattern edges are detected based on the design data. However, this edge detection may not be performed accurately: an erroneous edge may be detected on the image, or a detection failure may occur. The reason is that the conventional method detects an edge on an image from the design data according to a fixed rule, and such a fixed rule may fail to detect edges appropriately due to image quality or pattern deformation.
Therefore, the present invention provides a method capable of highly accurate edge detection using machine learning.
Solution to Problem
Machine learning is applied to the edge detection of a pattern. In this method, images of patterns to be inspected and images of the detected pattern edges are used as training data for a machine-learning algorithm. The method generates a model for detecting a pattern edge on an image, and uses this model to detect an edge of a pattern on an image to be actually inspected.
Processing by machine learning is more accurate than processing based on the conventional fixed rule, and is robust to a variety of data. Training the machine-learning model with the training data makes it possible to perform highly accurate and robust processing on data similar to the training data. Therefore, the edge detection model generated by the machine learning is expected to detect a pattern edge appropriately.
On the other hand, if the training data is not suitable for training the machine-learning algorithm, the accuracy of the edge detection model is lowered, and the accuracy of edge detection using such an edge detection model is lowered as well.
Cases where the training data is inappropriate for the learning are as follows.
Case 1.1: A pattern contains a defect. In this case, a pattern edge detected by the die-to-database method may not correspond to the actual pattern edge. Therefore, patterns containing defects should not be included in the training data.
Case 1.2: The pattern edge deformation is large, or the amounts of pattern edge deformation are discontinuous. In this case, in the process of generating a pattern edge image for the training data, there is a high possibility that the edge detection has failed or that a wrong pattern edge, not corresponding to an actual edge, has been detected.
Case 1.3: The numbers of pattern edges included in the training data are highly unbalanced across pattern types. In this case, an edge detection model generated from only a small number of pattern edges of some type may not detect edges of that type accurately or precisely.
Further, since an edge detected by the machine learning is not associated with the design data, it is necessary to associate the detected edge with the design data. In this association process, in some cases, the association between the detected edge and the design data may not be performed correctly. This is referred to as case 2.1.
In an embodiment, there is provided a method of detecting an edge of a pattern, comprising: generating an image of a pattern; detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern; repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training-data candidates including a plurality of images and corresponding pattern edges; determining training data by removing pattern edges and corresponding images from the training-data candidates, the pattern edges to be removed being pattern edges satisfying a predetermined disqualification condition; producing an edge detection model by machine learning using the training data; generating an image of another pattern; and detecting an edge of the other pattern on the image using the edge detection model.
The predetermined disqualification condition is that a pattern includes a defect.
The predetermined disqualification condition is that a bias inspection value of a pattern edge is larger than a preset upper limit or smaller than a preset lower limit, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
The predetermined disqualification condition is that an edge of a pattern is not detected correctly in producing of the training-data candidates.
The predetermined disqualification condition is that a bias inspection value of a pattern edge is out of a preset range, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
The predetermined disqualification condition is that a bias inspection value of a pattern edge varies discontinuously, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
The predetermined disqualification condition is that a bias inspection value of a pattern edge is out of a preset range and that the bias inspection value varies discontinuously, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
The method further comprises: classifying the plurality of images included in the training-data candidates into a plurality of image groups according to pattern type; and removing images from the training-data candidates such that the numbers of images belonging to the respective image groups are equal to each other.
In an embodiment, there is provided a method of detecting an edge of a pattern, comprising: generating an image of a pattern; detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern; repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training data including a plurality of images and corresponding pattern edges; producing edge detection models by machine learning using the training data, wherein producing the edge detection models by the machine learning comprises (i) dividing each of the pattern edges included in the training data into a plurality of edges according to edge attribute based on the design data, and (ii) producing the edge detection models respectively for the plurality of edges by the machine learning using the training data; generating an image of another pattern; and detecting an edge of the other pattern on the image using the edge detection models.
Detecting the edge of the other pattern on the image comprises: detecting edges of the other pattern on the image using the edge detection models, respectively; and associating the detected edges with design data of the other pattern.
According to the above-mentioned embodiments, the problems of cases 1.1, 1.2, 1.3, and 2.1 described above are mitigated. Specifically, the training data is improved, and as a result, the accuracy of pattern edge detection using the machine learning is improved.
Embodiments of the present invention will now be described below with reference to the drawings.
The main controller 1 includes a CPU (Central Processing Unit) and controls the entire apparatus. The storage device 2 is coupled to the main controller 1. The storage device 2 may be in the form of a hard disk, a solid state drive, a flexible disk, or an optical disk. The input device 4 (e.g., a keyboard and a mouse), the display device 5 (e.g., a display for displaying input data and calculation results), and the printing device 6 (e.g., a printer) are coupled to the main controller 1 via the input and output controller 3.
The main controller 1 has an internal memory (internal storage device) storing a control program such as an OS (Operating System), a program for pattern edge detection, and required data, so that the main controller 1 performs the pattern edge detection and generation of an edge detection model according to the programs. These programs can be stored in a flexible disk, an optical disk, etc., and can be read into the memory, the hard disk, etc. before execution.
The irradiation system 10 includes an electron gun 11, a converging lens 12 for converging primary electrons emitted by the electron gun 11, an X deflector 13 and a Y deflector 14 for deflecting an electron beam (i.e., a charged particle beam) in an X direction and a Y direction, and an objective lens 15. The specimen chamber 20 includes an XY stage 21 configured to be movable in the X direction and the Y direction. A wafer W, which is a specimen, is carried in and out of the specimen chamber 20 by a wafer transporting device 40.
In the irradiation system 10, the primary electrons emitted by the electron gun 11 are converged by the converging lens 12, then deflected by the X deflector 13 and the Y deflector 14, and focused by the objective lens 15 onto the surface of the wafer W as a specimen.
When the wafer W is irradiated with the primary electrons, secondary electrons are emitted from the wafer W and detected by the secondary electron detector 30. The converging lens 12 and the objective lens 15 are coupled to a lens controller 16, which is coupled to a control computer 50. The secondary electron detector 30 is coupled to an image acquisition device 17, which in turn is also coupled to the control computer 50. Intensities of the secondary electrons detected by the secondary electron detector 30 are converted into a voltage contrast image by the image acquisition device 17. The largest irradiation area of the primary electrons in which a distortion-free voltage contrast image can be obtained is defined as the field of view.
The X deflector 13 and the Y deflector 14 are coupled to a deflection controller 18, which is also coupled to the control computer 50. The XY stage 21 is coupled to an XY stage controller 22, which is likewise coupled to the control computer 50. The wafer transporting device 40 is also coupled to the control computer 50. The control computer 50 is coupled to an operation computer 60.
The threshold method, which is one method of detecting an edge from the brightness profile, will now be described.
If no sampling point on the brightness profile has exactly the determined edge brightness value, the edge position is obtained by linear interpolation between the two adjacent sampling points whose brightness values straddle it.
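The threshold method with interpolation can be sketched as follows. This is an illustrative sketch, not the apparatus's actual implementation; the `fraction` parameter (the relative height of the edge brightness value between the profile's minimum and maximum) is an assumed convention.

```python
def edge_position(profile, fraction=0.5):
    """Locate an edge along a 1-D brightness profile by the threshold
    method: find where the profile crosses a brightness value lying at
    `fraction` of the way between its minimum and maximum, interpolating
    linearly between sampling points for a sub-pixel position."""
    lo, hi = min(profile), max(profile)
    threshold = lo + fraction * (hi - lo)  # the edge brightness value
    for i in range(len(profile) - 1):
        a, b = profile[i], profile[i + 1]
        # The crossing lies between sampling points i and i+1.
        if a != b and min(a, b) <= threshold <= max(a, b):
            return i + (threshold - a) / (b - a)
    return None  # no crossing: no edge in this profile
```

For a profile rising from 10 to 90, `edge_position` returns the fractional index at which the brightness reaches the midpoint value.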
In this way, the main controller 1 can distinguish between "thickening deformation" and "thinning deformation" of the wafer pattern based on the bias inspection value. For example, a positive bias inspection value means that the wafer pattern is in a state of thickening deformation, and a negative bias inspection value means that the wafer pattern is in a state of thinning deformation. An upper limit and a lower limit may be predetermined for the bias inspection value. In this case, the main controller 1 can detect a thickening defect where the bias inspection value exceeds the upper limit, and a thinning defect where the bias inspection value falls below the lower limit (step 8).
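The decision logic above can be sketched as follows; the function name and the default limit values are hypothetical, chosen only for illustration.

```python
def classify_bias(bias_value, lower_limit=-10.0, upper_limit=10.0):
    """Classify a pattern edge from its bias inspection value.
    Positive bias: the wafer pattern is thicker than the reference;
    negative bias: thinner. Values beyond the preset limits are
    flagged as defects."""
    if bias_value > upper_limit:
        return "thickening defect"
    if bias_value < lower_limit:
        return "thinning defect"
    if bias_value > 0:
        return "thickening deformation"
    if bias_value < 0:
        return "thinning deformation"
    return "no deviation"
```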
In step 2, the pattern inspection apparatus performs the above-described die-to-database inspection on each generated sample image, thereby producing the training-data candidates comprising a plurality of sample images and their detected pattern edges.
Next, the main controller 1 determines the training data to be used for the machine learning by excluding, from the training-data candidates, pattern edges that satisfy predetermined disqualification conditions, together with the corresponding sample images (step 3). The training data is determined using the sample images and the pattern edge detection results. In the present embodiment, the disqualification conditions include (i) the pattern has a defect, and (ii) an edge of the pattern was not correctly detected in producing the training-data candidates.
One example of a pattern edge for which the edge detection may not have been performed correctly in step 2 is a pattern edge whose bias inspection value is out of the preset range, i.e., larger than the preset upper limit or smaller than the preset lower limit.
Another example of the pattern edge in which the pattern edge detection may not have been correctly performed in the die-to-database inspection is a pattern edge whose bias inspection value varies discontinuously. The bias inspection value is an index value indicating the magnitude and direction of the deviation of the pattern edge from an edge of the reference pattern generated from the corresponding design data. Therefore, when the edge detection is not correctly performed, the bias inspection values are discontinuous, and as a result, the bias inspection values vary abruptly.
When the bias inspection values vary abruptly as in this example (e.g., when a difference between the bias inspection values of two adjacent bias lines is not less than a predetermined threshold value), the pattern edge and the corresponding sample image are removed from the training-data candidates. For example, when the difference between the bias inspection values of the two adjacent bias lines 305 and 306 in
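This removal rule can be sketched as follows, assuming each training-data candidate carries a list of bias inspection values, one per bias line; the data layout and function names are illustrative assumptions.

```python
def has_discontinuity(bias_values, threshold):
    """True if any two adjacent bias lines differ in bias inspection
    value by at least the predetermined threshold."""
    return any(abs(b - a) >= threshold
               for a, b in zip(bias_values, bias_values[1:]))

def remove_discontinuous(candidates, threshold):
    """candidates: list of (sample_image_id, bias_values) pairs.
    Keep only candidates whose bias inspection values vary
    continuously along the pattern edge."""
    return [(image_id, values) for image_id, values in candidates
            if not has_discontinuity(values, threshold)]
```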
In step 4, the main controller 1 classifies the sample images included in the training-data candidates into a plurality of image groups according to pattern type, and removes images from the training-data candidates such that the numbers of images belonging to the respective image groups are equal to each other.
Such an operation improves the accuracy of the machine learning by reducing the imbalance among pattern types (i.e., increasing the comprehensiveness of the pattern types used for the training data). Moreover, reducing this imbalance also reduces the total amount of training data.
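The classify-and-balance operation can be sketched as follows. Truncating each group to the size of the smallest group is one simple way of equalizing the counts (random subsampling would serve equally well); the data layout is an assumption for illustration.

```python
from collections import defaultdict

def balance_by_pattern_type(candidates):
    """candidates: list of (pattern_type, sample_image) pairs.
    Group the images by pattern type, then equalize the group
    sizes by truncating every group to the smallest group's size."""
    groups = defaultdict(list)
    for pattern_type, image in candidates:
        groups[pattern_type].append(image)
    n = min(len(images) for images in groups.values())
    return {ptype: images[:n] for ptype, images in groups.items()}
```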
In step 5, the main controller 1 produces the edge detection models by machine learning using the training data. In the present embodiment, a feedforward neural network is used, with an activation function f(u) given by
f(u) = 0 if u < 0; f(u) = u if u ≥ 0 (that is, f(u) = max(0, u))
In the present embodiment, the input feature of the feedforward neural network is a brightness value of a pixel forming a sample image included in the training data.
In the present embodiment, the main controller 1 uses the training data to produce a plurality of edge detection models by the machine learning for a plurality of corresponding edges divided according to the edge attribute. Where X1 denotes a brightness vector of the patch extracted from the sample image, y denotes a brightness value of the patch X1 output from the feedforward neural network, and t denotes a brightness value of a patch T1 of the pattern edge image corresponding to the position of the patch X1, it is desirable that the brightness value y output from the feedforward neural network be as close to the brightness value t of the pattern edge image as possible.
Brightness vector: Xi = [xi1, xi2, xi3, xi4, xi5, xi6, xi7, xi8, xi9]
where i is the number that identifies the patch.
In one embodiment, the mean squared error is used as an evaluation index for the closeness between yi and ti. It is defined as
E = Σi (yi − ti)² / N
where N is the number of patches evaluated at the same time.
When the mean squared error E is small, the output image is expected to be close to the pattern edge image.
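The mean squared error E = Σ(yi − ti)²/N above can be transcribed directly:

```python
def mean_squared_error(y, t):
    """y: brightness values output by the network, one per patch;
    t: corresponding brightness values of the pattern edge image.
    Returns E = sum((y_i - t_i)^2) / N over the N patches."""
    n = len(y)
    return sum((yi - ti) ** 2 for yi, ti in zip(y, t)) / n
```

A smaller E means the network's output image is closer to the pattern edge image.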
The main controller 1 updates the synaptic weights such that the mean squared error E becomes smaller. For example, for a weight parameter wαβ between a unit α of the M-th layer and a unit β of the (M+1)-st layer, the main controller 1 determines the partial derivative ∂E/∂wαβ of E with respect to wαβ, and uses a gradient method that replaces the parameter wαβ with wαβ − ε·∂E/∂wαβ. The symbol ε is a learning coefficient, which is a predetermined numerical value. The Adam method or stochastic gradient descent may be used to advance the learning. The backpropagation method is known as an algorithm for efficiently calculating the partial derivatives ∂E/∂wαβ using the chain rule.
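The plain gradient-method update wαβ ← wαβ − ε·∂E/∂wαβ can be sketched as follows; in practice the gradients would come from backpropagation, which is omitted here.

```python
def gradient_step(weights, grads, eps=0.01):
    """One gradient-descent update per weight parameter:
    w <- w - eps * dE/dw, where eps is the learning coefficient
    and grads holds the partial derivatives dE/dw."""
    return [w - eps * g for w, g in zip(weights, grads)]
```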
As described above, the synaptic weights are optimized, and the plurality of edge detection models are produced for the plurality of divided edges, respectively.
In step 2, the main controller 1 applies the plurality of edge detection models for the plurality of divided edges to the image of the pattern to perform the edge detection of the pattern. For example, the main controller 1 detects an upper edge using the edge detection model for the upper edge, a right edge using the model for the right edge, a left edge using the model for the left edge, and a lower edge using the model for the lower edge. The synaptic weights included in each edge detection model have already been optimized by the learning using the training data. As the edge detection results, a plurality of images of the detected edges are generated, each having the same size as the sample image. In each image of the detected edges, a value of 1 is assigned to pixels corresponding to an edge, and a value of 0 to pixels that do not correspond to an edge.
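The binary edge-map representation described above can be illustrated with a pixel-wise OR that merges the per-attribute maps into a single edge image. The merge itself is an assumed illustration, not a step the method prescribes; the method instead goes on to associate the detected edges with the design data.

```python
def combine_edge_maps(edge_maps):
    """edge_maps: per-attribute binary maps (1 = edge pixel,
    0 = non-edge pixel), all of the same size as the sample image.
    A pixel is an edge in the result if any per-attribute model
    marked it as an edge."""
    rows, cols = len(edge_maps[0]), len(edge_maps[0][0])
    return [[1 if any(m[r][c] for m in edge_maps) else 0
             for c in range(cols)] for r in range(rows)]
```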
In step 3, the detected edges are associated with the design data for the pattern whose image was generated in step 1.
In step 2, bias lines are created for the right edges of the patterns classified by the edge attribute based on the design data.
Similarly, bias lines are created for the upper edges, left edges, and lower edges of the patterns classified by the edge attribute based on the design data, and the information of these bias lines is integrated based on the design data to constitute the die-to-database inspection results.
The previous description of embodiments is provided to enable a person skilled in the art to make and use the present invention. Moreover, various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles and specific examples defined herein may be applied to other embodiments. Therefore, the present invention is not intended to be limited to the embodiments described herein but is to be accorded the widest scope as defined by the limitations of the claims.
INDUSTRIAL APPLICABILITY
The present invention can be used for a pattern edge detection method applicable to a semiconductor inspection device that performs a pattern inspection using pattern design data.
REFERENCE SIGNS LIST
-
- 1 main controller
- 2 storage device
- 3 input and output controller
- 4 input device
- 5 display device
- 6 printing device
- 7 image generating device
- 10 irradiation system
- 11 electron gun
- 12 converging lens
- 13 X deflector
- 14 Y deflector
- 15 objective lens
- 16 lens controller
- 17 image acquisition device
- 20 specimen chamber
- 21 XY stage
- 22 XY stage controller
- 30 secondary electron detector
- 40 wafer transporting device
- 50 control computer
- 60 operation computer
Claims
1. A method of detecting an edge of a pattern, comprising:
- generating an image of a pattern;
- detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern;
- repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training-data candidates including a plurality of images and corresponding pattern edges;
- determining training data by removing pattern edges and corresponding images from the training-data candidates, the pattern edges to be removed being pattern edges satisfying a predetermined disqualification condition;
- producing an edge detection model by machine learning using the training data;
- generating an image of another pattern; and
- detecting an edge of the other pattern on the image using the edge detection model,
- wherein the predetermined disqualification condition is that a bias inspection value of a pattern edge is larger than a preset upper limit or smaller than a preset lower limit, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
2. A method of detecting an edge of a pattern, comprising:
- generating an image of a pattern;
- detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern;
- repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training-data candidates including a plurality of images and corresponding pattern edges;
- determining training data by removing pattern edges and corresponding images from the training-data candidates, the pattern edges to be removed being pattern edges satisfying a predetermined disqualification condition;
- producing an edge detection model by machine learning using the training data;
- generating an image of another pattern; and
- detecting an edge of the other pattern on the image using the edge detection model,
- wherein the predetermined disqualification condition is that a bias inspection value of a pattern edge is out of a preset range, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
3. A method of detecting an edge of a pattern, comprising:
- generating an image of a pattern;
- detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern;
- repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training-data candidates including a plurality of images and corresponding pattern edges;
- determining training data by removing pattern edges and corresponding images from the training-data candidates, the pattern edges to be removed being pattern edges satisfying a predetermined disqualification condition;
- producing an edge detection model by machine learning using the training data;
- generating an image of another pattern; and
- detecting an edge of the other pattern on the image using the edge detection model,
- wherein the predetermined disqualification condition is that a bias inspection value of a pattern edge varies discontinuously, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
4. A method of detecting an edge of a pattern, comprising:
- generating an image of a pattern;
- detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern;
- repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training-data candidates including a plurality of images and corresponding pattern edges;
- determining training data by removing pattern edges and corresponding images from the training-data candidates, the pattern edges to be removed being pattern edges satisfying a predetermined disqualification condition;
- producing an edge detection model by machine learning using the training data;
- generating an image of another pattern; and
- detecting an edge of the other pattern on the image using the edge detection model,
- wherein the predetermined disqualification condition is that a bias inspection value of a pattern edge is out of a preset range and that the bias inspection value varies discontinuously, the bias inspection value being an index value indicative of magnitude and direction of a deviation of the pattern edge from an edge of a reference pattern generated from corresponding design data.
5. A method of detecting an edge of a pattern, comprising:
- generating an image of a pattern;
- detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern;
- repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training-data candidates including a plurality of images and corresponding pattern edges;
- classifying the plurality of images included in the training-data candidates into a plurality of image groups according to pattern type;
- removing images from the training-data candidates such that the numbers of images belonging to the respective image groups are the same;
- determining training data by removing pattern edges and corresponding images from the training-data candidates, the pattern edges to be removed being pattern edges satisfying a predetermined disqualification condition;
- producing an edge detection model by machine learning using the training data;
- generating an image of another pattern; and
- detecting an edge of the other pattern on the image using the edge detection model.
6. A method of detecting an edge of a pattern, comprising:
- generating an image of a pattern;
- detecting an edge of the pattern on the image based on a reference pattern generated from design data for the pattern;
- repeating generating of an image of a pattern and detecting of an edge of the pattern on the image to produce training data including a plurality of images and corresponding pattern edges;
- producing edge detection models by machine learning using the training data, wherein producing the edge detection models by the machine learning comprises (i) dividing each of the pattern edges included in the training data into a plurality of edges according to edge attribute based on the design data, and (ii) producing the edge detection models respectively for the plurality of edges by the machine learning using the training data;
- generating an image of another pattern; and
- detecting an edge of the other pattern on the image using the edge detection models.
7. The method according to claim 6, wherein detecting the edge of the other pattern on the image comprises:
- detecting edges of the other pattern on the image using the edge detection models, respectively; and
- associating the detected edges with design data of the other pattern.
5475766 | December 12, 1995 | Tsuchiya |
5563702 | October 8, 1996 | Emery et al. |
20010012390 | August 9, 2001 | Watanabe |
20070165938 | July 19, 2007 | Matsumura |
20080130982 | June 5, 2008 | Kitamura |
20090238441 | September 24, 2009 | Yamashita |
20100158345 | June 24, 2010 | Kitamura |
20130234019 | September 12, 2013 | Miyamoto |
20130336574 | December 19, 2013 | Nasser-Ghodsi |
20160154922 | June 2, 2016 | Kuncha |
20180005363 | January 4, 2018 | Nagatomo |
4-125779 | April 1992 | JP |
5-046764 | February 1993 | JP |
2001-175857 | June 2001 | JP |
2006-012069 | January 2006 | JP |
2013-058036 | March 2013 | JP |
- International Patent Application No. PCT/JP2019/008345; International Search Report and Written Opinion dated May 21, 2019, 9 pgs.
Type: Grant
Filed: Mar 4, 2019
Date of Patent: Sep 6, 2022
Patent Publication Number: 20210027473
Assignee: TASMIT, INC. (Yokohama)
Inventor: Masahiro Oya (Yokohama)
Primary Examiner: Samir A Ahmed
Application Number: 17/040,533
International Classification: G06K 9/00 (20220101); G06T 7/13 (20170101); G01N 23/2251 (20180101); G06K 9/62 (20220101); G06T 7/00 (20170101);