Patents by Inventor Robert Kamil Bryll
Robert Kamil Bryll has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20170078549
Abstract: A method for providing an extended depth of field (EDOF) image includes: periodically modulating an imaging system focus position at a high frequency; using an image exposure comprising discrete image exposure increments acquired at discrete focus positions during an image integration time comprising a plurality of modulation periods of the focus position; and using strobe operations having controlled timings configured to define a set of evenly spaced focus positions for the image exposure increments. The timings are configured so that adjacent focus positions in the set are acquired at times that are separated by at least one reversal of the direction of change of the focus position during its periodic modulation. This solves practical timing problems that may otherwise prevent obtaining closely spaced discrete image exposure increments during high frequency focus modulation. Deconvolution operations may be used to improve clarity in the resulting EDOF image.
Type: Application
Filed: November 23, 2016
Publication date: March 16, 2017
Inventors: Casey Edward Emtman, Robert Kamil Bryll
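The alternating-half-cycle timing idea from this abstract can be sketched in a few lines. This is a hypothetical illustration, not the patented implementation: the sinusoidal modulation model, the amplitude, the frequency, and the function name `strobe_timings` are all assumptions. For a sinusoidal focus sweep, each target focus height is reachable twice per period, once on the rising and once on the falling half-cycle; alternating between the two half-cycles for adjacent focus positions guarantees at least one direction reversal between them.

```python
import math

def strobe_timings(focus_positions, amplitude, frequency):
    """For a sinusoidal focus modulation z(t) = amplitude * sin(2*pi*frequency*t),
    return one strobe time (within the first modulation period) per target focus
    position, alternating between the rising and falling half-cycles so that
    adjacent positions are separated by at least one direction reversal."""
    period = 1.0 / frequency
    timings = []
    for i, z in enumerate(focus_positions):
        phase = math.asin(z / amplitude)   # rising-half solution, in [-pi/2, pi/2]
        if i % 2 == 1:                     # odd entries: use the falling half-cycle
            phase = math.pi - phase        # sin(pi - x) == sin(x)
        timings.append((phase / (2 * math.pi)) % 1.0 * period)
    return timings

# Five evenly spaced focus positions within an assumed +/-100 um modulation range
positions = [-80, -40, 0, 40, 80]
times = strobe_timings(positions, amplitude=100.0, frequency=70e3)
```

Each returned time lands the modulated focus exactly on its target position, while consecutive targets sit on opposite half-cycles of the sweep.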
-
Publication number: 20170078532
Abstract: An image acquisition system is operated to provide an image that is relatively free of the effect of longitudinal chromatic aberration. The system includes a variable focal length lens (e.g., a tunable acoustic gradient index of refraction lens) that is operated to periodically modulate a focus position. First, second, third, etc., wavelength image exposure contributions are provided by operating an illumination system to provide instances of strobed illumination of first, second, third, etc., wavelengths (e.g., green, blue, red, etc.) timed to correspond with respective phase timings of the periodically modulated focus position which focus the respective wavelength image exposure contributions at the same focus plane. The respective phase timings of the periodically modulated focus position compensate for longitudinal chromatic aberration of at least the variable focal length lens.
Type: Application
Filed: September 15, 2015
Publication date: March 16, 2017
Inventors: Robert Kamil Bryll, Mark Lawrence Delaney
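The per-wavelength phase-timing compensation described above can be illustrated with a small sketch. This is an assumption-laden toy, not the patented method: the sinusoidal modulation model, the chromatic offset values, and the name `per_wavelength_phases` are invented for illustration. The idea is to strobe each wavelength at the modulation phase where the lens's instantaneous focus cancels that wavelength's longitudinal chromatic focus offset.

```python
import math

def per_wavelength_phases(target_z, chromatic_offsets, amplitude):
    """Return the modulation phase (radians) at which to strobe each wavelength
    so that, after that wavelength's longitudinal chromatic focus offset is
    added, all wavelengths focus at the same plane target_z.
    chromatic_offsets: e.g. {'green': 0.0, 'blue': -12.0, 'red': 9.0} in um
    (illustrative values, not from the patent)."""
    phases = {}
    for color, dz in chromatic_offsets.items():
        # The lens must contribute target_z - dz so that lens focus + dz == target_z
        phases[color] = math.asin((target_z - dz) / amplitude)
    return phases

phases = per_wavelength_phases(
    target_z=0.0,
    chromatic_offsets={'green': 0.0, 'blue': -12.0, 'red': 9.0},
    amplitude=100.0,
)
```

With these phase timings, every wavelength's exposure contribution lands on the same focus plane despite their differing chromatic focus shifts.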
-
Publication number: 20170061601
Abstract: A method is provided for defining operations for acquiring a multi-exposure image of a workpiece including first and second regions of interest at different Z heights. The multi-exposure image is acquired by a machine vision inspection system including strobed illumination and a variable focal length lens (e.g., a tunable acoustic gradient index of refraction lens) used for periodically modulating a focus position. During a learn mode, first and second multi-exposure timing values for instances of strobed illumination are determined that correspond with first and second phase timings of the periodically modulated focus position that produce sufficient image focus for the first and second regions of interest. Data indicative of the multi-exposure timing difference is recorded and is subsequently utilized (e.g., during a run mode) to define operations for acquiring a multi-exposure image of first and second regions of interest on a current workpiece that is similar to the representative workpiece.
Type: Application
Filed: August 31, 2015
Publication date: March 2, 2017
Inventor: Robert Kamil Bryll
-
Publication number: 20160103443
Abstract: A method for programming a three-dimensional (3D) workpiece scan path for a metrology system comprising a 3D motion control system, a first type of Z-height sensing system, and a second type of Z-height sensing system that provides less precise surface Z-height measurements over a broader Z-height measuring range. The method comprises: placing a representative workpiece on a stage of the metrology system, defining at least a first workpiece scan path segment for the representative workpiece, determining preliminary actual surface Z-height measurements along the first workpiece scan path segment, and determining a precise 3D scan path for moving the first type of Z-height sensing system to perform precise surface Z-height measurements. The precise 3D scan path is based on the determined preliminary actual surface Z-height measurements. The precise 3D scan path may be used for performing precise surface Z-height measurements or stored to be used in an inspection program.
Type: Application
Filed: October 9, 2014
Publication date: April 14, 2016
Inventor: Robert Kamil Bryll
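One plausible way to derive a precise scan path from the coarse preliminary Z-heights is to smooth them and offset by the precise sensor's working standoff. The patent abstract does not specify this particular smoothing; the moving-average window, the standoff offset, and the name `precise_scan_path` are all assumptions made for this sketch.

```python
def precise_scan_path(coarse_points, standoff, window=3):
    """Build a precise 3D scan path from coarse (x, y, z) surface samples by
    smoothing the coarse Z heights with a moving average and offsetting by an
    assumed sensor working standoff distance. Illustrative only."""
    path = []
    n = len(coarse_points)
    for i, (x, y, _) in enumerate(coarse_points):
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        z_avg = sum(p[2] for p in coarse_points[lo:hi]) / (hi - lo)
        path.append((x, y, z_avg + standoff))
    return path

# Coarse samples along one scan segment (x, y, z), with one noisy outlier at x=2
coarse = [(0, 0, 5.0), (1, 0, 5.2), (2, 0, 9.0), (3, 0, 5.1), (4, 0, 5.0)]
path = precise_scan_path(coarse, standoff=10.0)
```

The smoothing keeps the precise sensor's narrow measuring range centered on the surface even when individual coarse samples are noisy.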
-
Patent number: 9177222
Abstract: A user interface for setting parameters for an edge location video tool is provided. In one implementation, the user interface includes a multi-dimensional parameter space representation with edge zones that allows a user to adjust a single parameter combination indicator in a zone in order to adjust multiple edge detection parameters for detecting a corresponding edge. The edge zones indicate the edge features that are detectable when the parameter combination indicator is placed within the edge zones. In another implementation, representations of multiple edge features that are detectable by different possible combinations of the edge detection parameters are automatically provided in one or more windows. When a user selects one of the edge feature representations, the corresponding combination of edge detection parameters is set as the parameters for the edge location video tool.
Type: Grant
Filed: December 18, 2012
Date of Patent: November 3, 2015
Assignee: Mitutoyo Corporation
Inventors: Yuhua Ding, Robert Kamil Bryll, Mark Lawrence Delaney, Michael Nahum
-
Patent number: 9060117
Abstract: A method of automatically adjusting lighting conditions improves the results of points from focus (PFF) 3D reconstruction. Multiple lighting levels are automatically found based on brightness criteria and an image stack is taken at each lighting level. In some embodiments, the number of light levels and their respective light settings may be determined based on trial exposure images acquired at a single global focus height which is a best height for an entire region of interest, rather than the best focus height for just the darkest or brightest image pixels in a region of interest. The results of 3D reconstruction at each selected light level are combined using a Z-height quality metric. In one embodiment, the PFF data point Z-height value that is to be associated with an X-Y location is selected based on that PFF data point having the best corresponding Z-height quality metric value at that X-Y location.
Type: Grant
Filed: December 21, 2012
Date of Patent: June 16, 2015
Assignee: Mitutoyo Corporation
Inventors: Robert Kamil Bryll, Shannon Roy Campbell
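The per-location selection step described in the last sentence of the abstract can be sketched directly. The quality metric itself is not reproduced here; the data layout and the name `combine_pff_results` are assumptions for illustration only.

```python
def combine_pff_results(reconstructions):
    """Given per-light-level PFF results as lists of (z_height, quality) tuples,
    one tuple per (x, y) location, pick for each location the Z height from the
    reconstruction with the best (highest) quality metric value there.
    A minimal sketch of the selection step only."""
    n_points = len(reconstructions[0])
    combined = []
    for i in range(n_points):
        best_z, best_q = None, float('-inf')
        for recon in reconstructions:
            z, q = recon[i]
            if q > best_q:
                best_z, best_q = z, q
        combined.append(best_z)
    return combined

# Two light levels, three X-Y locations; dark regions reconstruct better under
# high light, bright regions under low light (illustrative numbers)
low_light  = [(1.0, 0.9), (2.0, 0.2), (3.0, 0.5)]
high_light = [(1.1, 0.3), (2.2, 0.8), (3.3, 0.4)]
merged = combine_pff_results([low_light, high_light])
```

Each output Z height comes from whichever lighting level produced the most trustworthy focus measurement at that location.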
-
Publication number: 20150145980
Abstract: A method for operating an imaging system of a machine vision inspection system to provide an extended depth of field (EDOF) image. The method comprises (a) placing a workpiece in a field of view; (b) periodically modulating a focus position of the imaging system without macroscopically adjusting the spacing between elements in the imaging system, wherein the focus position is periodically modulated over a plurality of positions along a focus axis direction in a focus range including a workpiece surface height; (c) exposing a first preliminary image during an image integration time while modulating the focus position in the focus range; and (d) processing the first preliminary image to remove blurred image contributions occurring in the focus range during the image integration time to provide an EDOF image that is focused throughout a larger depth of field than the imaging system provides at a single focal position.
Type: Application
Filed: November 27, 2013
Publication date: May 28, 2015
Applicant: Mitutoyo Corporation
Inventor: Robert Kamil Bryll
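Step (d), removing blurred contributions from the focus-swept preliminary image, is commonly done by deconvolving with the integrated blur kernel. The following 1D Wiener-deconvolution toy illustrates that general idea under stated assumptions: it is not the patent's processing, the naive DFT is for self-containment only, and the 8-sample signal, blur kernel, and noise power are invented.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def wiener_deconvolve(blurred, psf, noise_power=1e-3):
    """1D Wiener deconvolution: divide out the blur kernel in the frequency
    domain, regularized by an assumed noise power. A toy sketch only."""
    B, H = dft(blurred), dft(psf)
    G = [Bk * Hk.conjugate() / (abs(Hk) ** 2 + noise_power) for Bk, Hk in zip(B, H)]
    return idft(G)

# Blur an impulse with a circular 3-tap kernel, then recover it
signal = [0.0] * 8
signal[3] = 1.0
psf = [0.5, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25]   # circular blur kernel
blurred = idft([s * p for s, p in zip(dft(signal), dft(psf))])
restored = wiener_deconvolve(blurred, psf)
```

The regularization term keeps the division stable at frequencies where the blur kernel's response is near zero, at the cost of not fully recovering those components.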
-
Patent number: 8995749
Abstract: A method is provided for enhancing edge detection for edges of irregular surfaces in a machine vision inspection system. The inspection system comprises an edge feature video tool configured to determine profile data for an edge feature based on a plurality of differently focused images. An edge-referenced alignment compensation is provided related to substantially minimizing a respective offset amount of the edge feature at respective locations along a directional filtering direction used for directionally filtering the plurality of differently focused images prior to determining the profile data for the edge feature. In some embodiments, the plurality of differently focused images may be directionally filtered using a directional filtering sub region (DFS) defined relative to a point corresponding to a PFF basis pixel location in each of the plurality of images, each DFS having a relatively longer dimension along the directional filtering direction.
Type: Grant
Filed: March 28, 2013
Date of Patent: March 31, 2015
Assignee: Mitutoyo Corporation
Inventor: Robert Kamil Bryll
-
Patent number: 8917940
Abstract: A reliable method for discriminating between a plurality of edges in a region of interest of an edge feature video tool in a machine vision system comprises determining a scan direction and an intensity gradient threshold value, and defining associated gradient prominences. The gradient threshold value may be required to fall within a maximum range that is based on certain characteristics of an intensity gradient profile derived from an image of the region of interest. Gradient prominences are defined by limits at sequential intersections between the intensity gradient profile and the edge gradient threshold. A single prominence is allowed to include gradient extrema corresponding to a plurality of respective edges. A gradient prominence-counting parameter is automatically determined that is indicative of the location of the selected edge in relation to the defined gradient prominences. The gradient prominence-counting parameter may correspond to the scan direction.
Type: Grant
Filed: April 26, 2013
Date of Patent: December 23, 2014
Assignee: Mitutoyo Corporation
Inventor: Robert Kamil Bryll
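The prominence-counting idea can be sketched briefly. This is a simplified illustration under assumptions, not the patented procedure: the threshold validation against the profile's characteristics is omitted, and the name `select_edge_by_prominence` is invented.

```python
def select_edge_by_prominence(gradient_profile, threshold, prominence_index):
    """Partition an intensity-gradient profile (sampled along the scan
    direction) into 'prominences': maximal runs where |gradient| exceeds the
    threshold. Return the position of the strongest gradient within the
    prominence selected by prominence_index (the counting parameter)."""
    prominences, run = [], []
    for i, g in enumerate(gradient_profile):
        if abs(g) > threshold:
            run.append(i)
        elif run:
            prominences.append(run)
            run = []
    if run:
        prominences.append(run)
    chosen = prominences[prominence_index]
    return max(chosen, key=lambda i: abs(gradient_profile[i]))

# Two prominences; the second contains extrema for two closely spaced edges
profile = [0.1, 2.0, 0.2, 0.1, 1.5, 3.0, 1.8, 0.2]
edge_pos = select_edge_by_prominence(profile, threshold=1.0, prominence_index=1)
```

Note how the second prominence spans several above-threshold samples: a single prominence may contain extrema for multiple edges, which is exactly what makes counting prominences more robust than counting individual gradient peaks.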
-
Patent number: 8885945
Abstract: A method for improving repeatability in edge location measurement results of a machine vision inspection system comprises: placing a workpiece in a field of view of the machine vision inspection system; providing an edge measurement video tool comprising an edge-referenced alignment compensation defining portion; operating the edge measurement video tool to define a region of interest of the video tool which includes an edge feature of the workpiece; operating the edge measurement video tool to automatically perform scan line direction alignment operations such that the scan line direction of the edge measurement video tool is aligned along a first direction relative to the edge feature, wherein the first direction is defined by predetermined alignment operations of the edge-referenced alignment compensation defining portion; and performing edge location measurement operations with the region of interest in that position.
Type: Grant
Filed: December 27, 2012
Date of Patent: November 11, 2014
Assignee: Mitutoyo Corporation
Inventors: Robert Kamil Bryll, Yuhua Ding
-
Publication number: 20140321731
Abstract: A reliable method for discriminating between a plurality of edges in a region of interest of an edge feature video tool in a machine vision system comprises determining a scan direction and an intensity gradient threshold value, and defining associated gradient prominences. The gradient threshold value may be required to fall within a maximum range that is based on certain characteristics of an intensity gradient profile derived from an image of the region of interest. Gradient prominences are defined by limits at sequential intersections between the intensity gradient profile and the edge gradient threshold. A single prominence is allowed to include gradient extrema corresponding to a plurality of respective edges. A gradient prominence-counting parameter is automatically determined that is indicative of the location of the selected edge in relation to the defined gradient prominences. The gradient prominence-counting parameter may correspond to the scan direction.
Type: Application
Filed: April 26, 2013
Publication date: October 30, 2014
Applicant: Mitutoyo Corporation
Inventor: Robert Kamil Bryll
-
Publication number: 20140294284
Abstract: A method is provided for enhancing edge detection for edges of irregular surfaces in a machine vision inspection system. The inspection system comprises an edge feature video tool configured to determine profile data for an edge feature based on a plurality of differently focused images. An edge-referenced alignment compensation is provided related to substantially minimizing a respective offset amount of the edge feature at respective locations along a directional filtering direction used for directionally filtering the plurality of differently focused images prior to determining the profile data for the edge feature. In some embodiments, the plurality of differently focused images may be directionally filtered using a directional filtering sub region (DFS) defined relative to a point corresponding to a PFF basis pixel location in each of the plurality of images, each DFS having a relatively longer dimension along the directional filtering direction.
Type: Application
Filed: March 28, 2013
Publication date: October 2, 2014
Inventor: Robert Kamil Bryll
-
Publication number: 20140185910
Abstract: A method for improving repeatability in edge location measurement results of a machine vision inspection system comprises: placing a workpiece in a field of view of the machine vision inspection system; providing an edge measurement video tool comprising an edge-referenced alignment compensation defining portion; operating the edge measurement video tool to define a region of interest of the video tool which includes an edge feature of the workpiece; operating the edge measurement video tool to automatically perform scan line direction alignment operations such that the scan line direction of the edge measurement video tool is aligned along a first direction relative to the edge feature, wherein the first direction is defined by predetermined alignment operations of the edge-referenced alignment compensation defining portion; and performing edge location measurement operations with the region of interest in that position.
Type: Application
Filed: December 27, 2012
Publication date: July 3, 2014
Applicant: Mitutoyo Corporation
Inventors: Robert Kamil Bryll, Yuhua Ding
-
Publication number: 20140126804
Abstract: A user interface for setting parameters for an edge location video tool is provided. In one implementation, the user interface includes a multi-dimensional parameter space representation with edge zones that allows a user to adjust a single parameter combination indicator in a zone in order to adjust multiple edge detection parameters for detecting a corresponding edge. The edge zones indicate the edge features that are detectable when the parameter combination indicator is placed within the edge zones. In another implementation, representations of multiple edge features that are detectable by different possible combinations of the edge detection parameters are automatically provided in one or more windows. When a user selects one of the edge feature representations, the corresponding combination of edge detection parameters is set as the parameters for the edge location video tool.
Type: Application
Filed: December 18, 2012
Publication date: May 8, 2014
Applicant: Mitutoyo Corporation
Inventors: Yuhua Ding, Robert Kamil Bryll, Mark Lawrence Delaney, Michael Nahum
-
Publication number: 20130162806
Abstract: A method for operating an edge focus tool to focus the optics of a machine vision inspection system proximate to an edge adjacent to a beveled surface feature is provided. The method comprises defining a region of interest (ROI) including the edge in a field of view of the machine vision inspection system; acquiring an image stack of the ROI over a Z range including the edge; generating a point cloud including a Z height for a plurality of points in the ROI, based on determining a best focus Z height measurement for the plurality of points; defining a proximate subset of the point cloud comprising points proximate to the beveled surface feature and corresponding to the shape of the beveled surface feature; defining a Z-extremum subset of the proximate subset of the point cloud; and focusing the optics at a Z height corresponding to the Z-extremum subset.
Type: Application
Filed: December 23, 2011
Publication date: June 27, 2013
Applicant: Mitutoyo Corporation
Inventors: Yuhua Ding, Shannon Roy Campbell, Mark Lawrence Delaney, Robert Kamil Bryll
-
Patent number: 8055466
Abstract: A method for global calibration of a multi-view vision-based touch probe measurement system is provided which encompasses calibrating camera frame distortion errors as well as probe form errors. The only required features in the calibration images are the markers on the touch probe. The camera frame distortion calibration comprises a process that depends on a portable calibration jig and the touch probe, but that process is unaffected by probe form distortion errors in the touch probe and/or tip. The probe tip position calibration depends on applying the results of the camera frame distortion calibration. When the same probe tip is used throughout the global calibration, the probe tip position calibration uses images from the set of images used by the camera frame distortion calibration. The global calibration method is particularly advantageous for low cost portable versions of multi-view vision-based touch probe measurement systems.
Type: Grant
Filed: March 18, 2008
Date of Patent: November 8, 2011
Assignee: Mitutoyo Corporation
Inventor: Robert Kamil Bryll
-
Patent number: 7885480
Abstract: A correlation function peak finding method for an image correlation displacement sensing system is provided. The method may include capturing a reference image and a displaced image, and searching for an initial peak correlation function value point (CFVP) with pixel-level resolution based on correlating the reference image and displaced images at a plurality of offset positions. Then a complete set of CFVPs may be determined at each offset position in an analysis region surrounding the initial peak CFVP. Then a preliminary correlation function peak location is estimated with sub-pixel resolution, based on CFVPs included in the analysis region. Finally, curve-fitting operations are performed to estimate a final correlation function peak location with sub-pixel accuracy, wherein the curve-fitting operations apply a windowing function and/or a set of weighting factors that is/are located with sub-pixel resolution based on the preliminary sub-pixel correlation function peak location.
Type: Grant
Filed: October 31, 2006
Date of Patent: February 8, 2011
Assignee: Mitutoyo Corporation
Inventors: Robert Kamil Bryll, Benjamin Keith Jones, Karl Gustav Masreliez
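A common building block for this kind of sub-pixel peak estimation is a three-point parabola fit around the integer-resolution peak. The sketch below shows that standard fit as a stand-in; it is not the patented windowed curve-fitting procedure, and the sample correlation values are invented.

```python
def subpixel_peak(cfvp_row):
    """Estimate a sub-pixel correlation peak location along one axis by fitting
    a parabola through the integer-resolution peak CFVP and its two neighbors.
    Assumes the peak is not at either end of the row."""
    i = max(range(len(cfvp_row)), key=lambda k: cfvp_row[k])
    y0, y1, y2 = cfvp_row[i - 1], cfvp_row[i], cfvp_row[i + 1]
    # Vertex of the parabola through (-1, y0), (0, y1), (1, y2)
    denom = y0 - 2 * y1 + y2
    offset = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    return i + offset

# Correlation values sampled near a peak lying between pixel offsets 2 and 3
row = [0.1, 0.6, 0.95, 0.8, 0.2]
peak = subpixel_peak(row)
```

The patented method goes further by locating a windowing function and weighting factors with sub-pixel resolution around a preliminary estimate like this one, which reduces the bias that a plain three-point fit exhibits.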
-
Patent number: 7724942
Abstract: A high-accuracy optical aberration correction system and method. Z-heights determined by an auto-focus tool, which would otherwise vary depending on the orientation angle of surface features or edges in the focus region of interest, and on the location of the focus region of interest in the field of view, are corrected based on a novel error calibration method. Error calibration data includes a set of Z-corrections for a range of different feature orientation angles (e.g., 180 degrees), for multiple locations in the field of view. Error calibration data may be interpolated to correspond to the location of the current focus region of interest. During auto-focus measurements, a Z-correction may be adapted for a current measurement by weighting orientation-dependent error calibration data based on a histogram of the gradient (edge) directions present in the current focus region of interest.
Type: Grant
Filed: October 31, 2005
Date of Patent: May 25, 2010
Assignee: Mitutoyo Corporation
Inventor: Robert Kamil Bryll
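The histogram-weighted adaptation described in the last sentence reduces to a weighted average of the per-orientation calibration corrections. The sketch below illustrates that step only; the calibration values, histogram bins, and the name `weighted_z_correction` are invented for illustration.

```python
def weighted_z_correction(correction_by_angle, gradient_angle_histogram):
    """Blend orientation-dependent Z-correction calibration data using a
    histogram of gradient (edge) directions found in the current focus region
    of interest: corrections for dominant edge orientations get more weight."""
    total = sum(gradient_angle_histogram.values())
    return sum(correction_by_angle[a] * n
               for a, n in gradient_angle_histogram.items()) / total

# Calibrated Z-corrections (um) for edge orientations at one field-of-view
# location, and a gradient-direction histogram dominated by 45-degree edges
corrections = {0: 0.10, 45: 0.25, 90: -0.05, 135: 0.00}
histogram = {0: 10, 45: 70, 90: 15, 135: 5}
dz = weighted_z_correction(corrections, histogram)
```

Because the weights come from the actual edge content of the region being focused, a region full of 45-degree edges is corrected mostly by the 45-degree calibration value rather than an orientation-blind average.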
-
Patent number: 7636478
Abstract: A method is provided that increases throughput and decreases the memory requirements for matching multiple templates in an image. The method includes determining a set of inter-template early elimination values that characterize the degree of matching between various templates and the image, at various locations in the image. A later-analyzed template may be rejected as a potential match at a location in the image based on comparing a value characterizing its degree of match at that location to an inter-template early elimination value corresponding to the degree of match of an earlier-analyzed template at that location. The compared values may be determined by different sets of operations, and may be normalized such that they are properly comparable. The inter-template early elimination conditions may be stored in a shared correlation map. The shared correlation map may be analyzed to determine the matching locations for multiple templates in the image.
Type: Grant
Filed: July 31, 2006
Date of Patent: December 22, 2009
Assignee: Mitutoyo Corporation
Inventor: Robert Kamil Bryll
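The shared-correlation-map idea can be sketched as follows. This is a simplification under assumptions, not the patented method: the scores are taken as already normalized and precomputed, and the data layout and function name are invented.

```python
def match_templates_with_early_elimination(image_scores_by_template, margin=0.0):
    """For each image location, keep a shared map of the best normalized match
    score seen so far; later-analyzed templates are rejected at any location
    where they cannot beat that score by `margin`. Returns the winning template
    per location."""
    shared_map = {}   # location -> (best_score, best_template)
    for template, scores in image_scores_by_template.items():
        for loc, score in scores.items():
            best = shared_map.get(loc)
            if best is None or score > best[0] + margin:
                shared_map[loc] = (score, template)  # this template survives here
            # else: early elimination -- skip further work for this template here
    return {loc: tmpl for loc, (_, tmpl) in shared_map.items()}

# Normalized match scores for two templates at two candidate locations
scores = {
    'template_A': {(0, 0): 0.9, (5, 5): 0.3},
    'template_B': {(0, 0): 0.4, (5, 5): 0.8},
}
winners = match_templates_with_early_elimination(scores)
```

Because later templates can be rejected against the shared map before full scoring, the real method avoids both redundant correlation work and a separate full correlation map per template.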
-
Publication number: 20080239327
Abstract: A method for global calibration of a multi-view vision-based touch probe measurement system is provided which encompasses calibrating camera frame distortion errors as well as probe form errors. The only required features in the calibration images are the markers on the touch probe. The camera frame distortion calibration comprises a process that depends on a portable calibration jig and the touch probe, but that process is unaffected by probe form distortion errors in the touch probe and/or tip. The probe tip position calibration depends on applying the results of the camera frame distortion calibration. When the same probe tip is used throughout the global calibration, the probe tip position calibration uses images from the set of images used by the camera frame distortion calibration. The global calibration method is particularly advantageous for low cost portable versions of multi-view vision-based touch probe measurement systems.
Type: Application
Filed: March 18, 2008
Publication date: October 2, 2008
Applicant: Mitutoyo Corporation
Inventor: Robert Kamil Bryll