Abstract:
A method is provided for operating a tunable acoustic gradient (TAG) lens imaging system. The method includes: (a) providing a smart lighting pulse control routine/circuit (SLPCRC) that provides a first mode of exposure control corresponding to a points from focus (PFF) mode of the TAG lens imaging system and a second mode of exposure control corresponding to an extended depth of focus (EDOF) mode of the TAG lens imaging system; (b) placing a workpiece in a field of view of the TAG lens imaging system; and (c) periodically modulating a focus position of the TAG lens imaging system without macroscopically adjusting the spacing between elements in the TAG lens imaging system, wherein the focus position is periodically modulated over a plurality of focus positions along a focus axis direction in a focus range including a surface height of the workpiece, at a modulation frequency of at least 30 kHz.
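The abstract describes strobing exposures while the focus position is modulated periodically (at 30 kHz or more). As a rough illustrative sketch, not taken from the patent, assume the focus sweep is sinusoidal, z(t) = z_mid + A·sin(2πft); the phase (and hence the pulse delay) at which the sweep passes through a given surface height can then be computed as follows. All names and the sinusoidal model are assumptions for illustration.

```python
import math

def strobe_phase(z_surface, z_mid, amplitude):
    """Phase (radians) of an assumed sinusoidal focus sweep
    z(t) = z_mid + amplitude * sin(phase) at which the focus
    position equals z_surface.  All heights in the same units."""
    s = (z_surface - z_mid) / amplitude
    if not -1.0 <= s <= 1.0:
        raise ValueError("surface height lies outside the focus sweep range")
    return math.asin(s)

def strobe_delay(z_surface, z_mid, amplitude, mod_freq_hz):
    """Delay (seconds) from the sweep's rising zero-crossing to the
    strobe pulse, for a modulation frequency of mod_freq_hz."""
    return strobe_phase(z_surface, z_mid, amplitude) / (2 * math.pi * mod_freq_hz)
```

For example, with a ±10 µm sweep at 70 kHz, a surface at mid-height would be strobed with zero delay, while the sweep extremes correspond to a quarter-period delay.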
Abstract:
An image acquisition system is operated to provide an image that is relatively free of the effect of longitudinal chromatic aberration. The system includes a variable focal length lens (e.g., a tunable acoustic gradient index of refraction lens) that is operated to periodically modulate a focus position. First, second, third, etc., wavelength image exposure contributions are provided by operating an illumination system to provide instances of strobed illumination of first, second, third, etc., wavelengths (e.g., green, blue, red, etc.) timed to correspond with respective phase timings of the periodically modulated focus position which focus the respective wavelength image exposure contributions at the same focus plane. The respective phase timings of the periodically modulated focus position compensate for longitudinal chromatic aberration of at least the variable focal length lens. An image is produced that is relatively free of the effect of longitudinal chromatic aberration by combining the image exposure contributions.
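The compensation scheme above times each wavelength's strobe to a different phase of the focus sweep so that all channels focus at the same plane. A minimal sketch of that per-channel phase offset, again assuming a sinusoidal sweep and using purely illustrative chromatic offset values (the channel names, offsets, and units are assumptions, not figures from the patent):

```python
import math

# Hypothetical longitudinal chromatic focus offsets (micrometers) of the
# optics for each illumination channel, relative to green.  Illustrative only.
CHROMATIC_OFFSET_UM = {"green": 0.0, "blue": -3.0, "red": 4.0}

def phase_for_wavelength(channel, z_plane_um, z_mid_um, amplitude_um):
    """Phase of the sinusoidal focus sweep at which `channel` actually
    focuses at z_plane_um, after subtracting that channel's chromatic
    offset.  Strobing each channel at its own phase makes all channels
    contribute exposures focused at the same plane."""
    z_cmd = z_plane_um - CHROMATIC_OFFSET_UM[channel]
    return math.asin((z_cmd - z_mid_um) / amplitude_um)
```

Combining the exposures captured at these per-channel phase timings then yields an image in which the channels share a common focus plane.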
Abstract:
A user interface for setting parameters for an edge location video tool is provided. In one implementation, the user interface includes a multi-dimensional parameter space representation with edge zones that allows a user to adjust a single parameter combination indicator in a zone in order to adjust multiple edge detection parameters for detecting a corresponding edge. The edge zones indicate the edge features that are detectable when the parameter combination indicator is placed within the edge zones. In another implementation, representations of multiple edge features that are detectable by different possible combinations of the edge detection parameters are automatically provided in one or more windows. When a user selects one of the edge feature representations, the corresponding combination of edge detection parameters is set as the parameters for the edge location video tool.
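The single parameter combination indicator described above maps one on-screen position to several edge detection parameters at once. As a hypothetical sketch (the parameter names, the two-parameter choice, and the ranges are all assumptions for illustration, not details from the patent), a normalized indicator position could be mapped to a pair of thresholds like this:

```python
def indicator_to_params(x, y, intensity_range=(0.0, 255.0),
                        gradient_range=(0.0, 50.0)):
    """Map a normalized indicator position (x, y) in [0, 1]^2 to a pair of
    edge detection parameters: an intensity threshold and an edge-gradient
    threshold.  Dragging the single indicator thus adjusts both parameters
    simultaneously."""
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("indicator position must be normalized to [0, 1]")
    lo_i, hi_i = intensity_range
    lo_g, hi_g = gradient_range
    return (lo_i + x * (hi_i - lo_i), lo_g + y * (hi_g - lo_g))
```

An edge zone would then correspond to the region of (x, y) positions whose resulting parameter pairs detect a given edge feature.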