Abstract:
A method for optically communicating a command from a user to a laser tracker to control tracker operation includes providing a rule of correspondence between commands and temporal patterns, and selecting, by the user, a first command. The method further includes projecting a first light from the tracker to a retroreflector; reflecting a second light from the retroreflector, the second light being a portion of the first light; and obtaining first sensed data by sensing a third light, which is a portion of the second light. Between first and second times, the user creates a first temporal pattern that includes a decrease in the optical power of the third light followed by an increase in that optical power, the first temporal pattern corresponding to the first command. The method concludes by determining the first command by processing the first sensed data according to the rule of correspondence and executing the first command with the tracker.
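The "rule of correspondence" above can be illustrated with a minimal sketch: the user briefly blocks and unblocks the retroreflected beam, and the tracker maps the resulting pattern of dips in sensed optical power to a command. All names, thresholds, and commands below are illustrative assumptions, not the patented implementation.

```python
# Hypothetical rule of correspondence: patterns of short/long power
# dips map to tracker commands (names are illustrative only).
RULE_OF_CORRESPONDENCE = {
    ("short",): "SET_REFERENCE_POINT",
    ("short", "short"): "START_MEASUREMENT",
    ("long",): "STOP_TRACKING",
}

def classify_dips(power_samples, sample_dt=0.01, threshold=0.5, long_dip=0.5):
    """Find intervals where sensed optical power drops below `threshold`
    and classify each dip as 'short' or 'long' by its duration (seconds)."""
    dips, start = [], None
    for i, p in enumerate(power_samples):
        if p < threshold and start is None:
            start = i                                  # dip begins
        elif p >= threshold and start is not None:
            duration = (i - start) * sample_dt         # dip ends
            dips.append("long" if duration >= long_dip else "short")
            start = None
    return tuple(dips)

def decode_command(power_samples):
    """Apply the rule of correspondence to the sensed-power samples."""
    return RULE_OF_CORRESPONDENCE.get(classify_dips(power_samples))
```

For example, a single 0.1 s interruption of the beam decodes to `"SET_REFERENCE_POINT"`, while a 0.6 s interruption decodes to `"STOP_TRACKING"`.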
Abstract:
A method for verifying performance of a light projector includes establishing a reference artifact that includes reflective markers and an interior edge line, determining, with a laser-tracker-based three-dimensional (3D) measuring system, 3D coordinates of the reflective markers and the interior edge line, determining with the light projector angles to the reflective markers, and projecting with the light projector a pattern of light onto the interior edge line.
Abstract:
A three-dimensional (3D) coordinate measurement device combines tracker and scanner functionality. The tracker function is configured to send light to a retroreflector and determine distance to the retroreflector based on the reflected light. The tracker is also configured to track the retroreflector as it moves, and to determine 3D coordinates of the retroreflector. The scanner is configured to send a beam of light to a point on an object surface and to determine 3D coordinates of the point. In addition, the scanner is configured to adjustably focus the beam of light.
Abstract:
A method and system are provided for controlling a measurement device remotely through gestures performed by a user. The method includes providing a relationship between commands and gestures. The user performs, with the user's body, a gesture that corresponds to a command. The gesture performed by the user is detected. The command is determined based at least in part on the detected gesture. The command is then executed with the laser tracker.
Abstract:
An apparatus includes a kinematic nest that supports an element having a spherical surface, a rotation mechanism that rotates the element, and a processor that activates the rotation mechanism.
Abstract:
A dimensional measuring device includes an overview camera and a triangulation scanner. A six-DOF tracking device tracks the dimensional measuring device as the triangulation scanner measures three-dimensional (3D) coordinates on an exterior of an object. Cardinal points identified by the overview camera are used to register, in a common frame of reference, 3D coordinates measured by the triangulation scanner on the interior and exterior of the object.
Abstract:
A three-dimensional (3D) scanner having two cameras and a projector is detachably coupled to a device selected from the group consisting of: an articulated arm coordinate measuring machine, a camera assembly, a six degree-of-freedom (six-DOF) tracker target assembly, and a six-DOF light point target assembly.
Abstract:
A measurement device having a camera captures two-dimensional (2D) images of an object at three or more different poses. A processor determines 3D coordinates of an edge point of the object based at least in part on the captured 2D images and pose data provided by the measurement device.
Abstract:
A method and system are provided for combining 2D images into a 3D image. The method includes providing a coordinate measurement device and a triangulation scanner having an integral camera associated therewith, the scanner being separate from the coordinate measurement device. In a first instance, the coordinate measurement device determines the position and orientation of the scanner and the integral camera captures a first 2D image. In a second instance, the scanner is moved, the coordinate measurement device determines the position and orientation of the scanner, and the integral camera captures a second 2D image. A common feature point in the first and second images is found and is used, together with the first and second images and the positions and orientations of the scanner in the first and second instances, to create the 3D image.
Abstract:
A method of combining 2D images into a 3D image includes providing a coordinate measurement device and a triangulation scanner having an integral camera associated therewith, the scanner being separate from the coordinate measurement device. In a first instance, the coordinate measurement device determines the position and orientation of the scanner and the integral camera captures a first 2D image. In a second instance, the scanner is moved, the coordinate measurement device determines the position and orientation of the scanner, and the integral camera captures a second 2D image. A cardinal point common to the first and second images is found and is used, together with the first and second images and the positions and orientations of the scanner in the first and second instances, to create the 3D image.
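The geometric core of the two abstracts above, recovering a 3D point from a cardinal point seen in two images taken at known poses, can be sketched as a two-ray triangulation: each camera pose yields a ray toward the cardinal point, and the 3D coordinate is taken as the midpoint of closest approach between the rays. This is a generic illustrative sketch, not the patented method; the function name and inputs are assumptions.

```python
import numpy as np

def triangulate_cardinal_point(c1, d1, c2, d2):
    """Midpoint of closest approach between rays c_i + t_i * d_i.

    c1, c2: camera centers for the two poses, expressed in the common
            frame provided by the coordinate measurement device.
    d1, d2: ray directions from each camera toward the cardinal point
            identified in the corresponding 2D image.
    """
    c1, d1, c2, d2 = map(np.asarray, (c1, d1, c2, d2))
    w0 = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # zero when the rays are parallel
    t1 = (b * e - c * d) / denom   # closest-approach parameter on ray 1
    t2 = (a * e - b * d) / denom   # closest-approach parameter on ray 2
    return (c1 + t1 * d1 + c2 + t2 * d2) / 2
```

For instance, rays from centers (0, 0, 0) and (2, 0, 0) that both pass through (1, 1, 5) triangulate back to that point; with real, noisy image measurements the rays miss slightly, and the midpoint gives a least-squares style compromise.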