Abstract:
A laser scanner measures 3D coordinates from a first position and a second position and uses a sensor unit that includes at least an accelerometer and gyroscope to register the 3D coordinates, the registration based at least in part on a comparison of a measured sensor displacement to a preferred displacement value.
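The abstract does not specify how the sensor displacement is obtained; a minimal sketch, assuming the displacement is estimated by double-integrating accelerometer samples and then checked against the preferred value, might look like the following (the function names, sample layout, and tolerance are all illustrative assumptions, not from the source):

```python
import numpy as np

def displacement_from_accel(accel, dt):
    """Estimate net sensor displacement by double-integrating acceleration.

    accel: (N, 3) array of acceleration samples in m/s^2 (gravity assumed
           already removed); dt: sample period in seconds.
    Returns the net displacement vector in meters. In practice such a
    naive integration drifts quickly and would be fused with gyroscope
    and other data; this only illustrates the idea.
    """
    velocity = np.cumsum(accel, axis=0) * dt          # first integration: m/s
    displacement = np.cumsum(velocity, axis=0) * dt   # second integration: m
    return displacement[-1]

def within_preferred(measured_disp, preferred_disp, tol=0.05):
    """Accept the registration only if the measured displacement magnitude
    is close enough to the preferred displacement value (tol is assumed)."""
    return abs(np.linalg.norm(measured_disp) - preferred_disp) <= tol
```

With a constant 1 m/s² acceleration over one second, the integration recovers roughly the expected 0.5 m of travel (up to discretization error of the cumulative sums).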
Abstract:
A laser scanner scans an object by measuring first and second angles with angle measuring devices, sending light onto an object and capturing the reflected light to determine distances and gray-scale values to points on the object, capturing a sequence of color images with a color camera at different exposure times, determining 3D coordinates and gray-scale values to points on the object, determining from the sequence of color images an enhanced color image having a higher dynamic range than available from any single color image, and superimposing the enhanced color image on the 3D gray-scale image to obtain an enhanced 3D color image.
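The abstract does not say how the exposure sequence is combined; one common way to obtain a higher dynamic range than any single exposure is a weighted average in a common radiance scale, sketched below (the triangle weighting and normalization are illustrative assumptions, not the patented method):

```python
import numpy as np

def fuse_exposures(images, exposure_times):
    """Fuse differently exposed images into one higher-dynamic-range image.

    images: list of (H, W, 3) float arrays with values in [0, 1].
    exposure_times: matching list of exposure times in seconds.
    Pixels near 0 or 1 are down-weighted (triangle weight) because they are
    under- or over-exposed; each pixel is divided by its exposure time so
    all exposures are averaged in a common radiance-like scale.
    """
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # weight peaks at mid-gray
        num += w * img / t
        den += w
    return num / np.maximum(den, 1e-6)
```

Two consistent exposures of the same scene (one twice as long, reading twice as bright) fuse back to a single radiance estimate.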
Abstract:
A system and method for measuring three-dimensional (3D) coordinate values of an environment is provided. The system includes a movable base unit, a first scanner, and a second scanner. One or more processors perform a method that includes causing the first scanner to determine a first plurality of 3D coordinate values in a first frame of reference based at least in part on a measurement by at least one sensor. The second scanner determines a second plurality of 3D coordinate values in a second frame of reference as the base unit is moved from a first position to a second position. The determining of the first plurality of 3D coordinate values and the second plurality of 3D coordinate values is performed simultaneously. The second plurality of 3D coordinate values is registered in a common frame of reference based on the first plurality of 3D coordinate values.
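Registering one set of coordinates into a common frame defined by another amounts to estimating a rigid motion between frames. As a minimal sketch, assuming point correspondences between the two sets are already known (the abstract does not say how they are obtained), the classic Kabsch/SVD solution is:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t mapping src points onto dst
    points with known one-to-one correspondences, so dst ~= src @ R.T + t.
    Both src and dst are (N, 3) arrays. Kabsch method via SVD."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                         # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - src.mean(axis=0) @ R.T
    return R, t
```

Applying the recovered transform to the second scanner's points would express them in the first scanner's (common) frame of reference.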
Abstract:
A method for optically scanning and measuring an environment using a 3D measurement device is provided. The method includes steps that are performed prior to operation. These steps include positioning a near-field communication (NFC) device adjacent the 3D measurement device. An NFC link is established between the NFC device and the 3D measurement device. An identifier is transmitted from the NFC device to the 3D measurement device. It is determined that the NFC device is authorized to communicate with the 3D measurement device based at least in part on the identifier. Commands are transferred to the 3D measurement device from the NFC device based at least in part on determining the NFC device is authorized. At least one communication path is activated. The 3D measurement device is connected to a network of computers and measurement data is transmitted from the 3D measurement device to the network of computers.
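The identifier-then-commands sequence above can be sketched as a small state machine; everything here (the whitelist, class, and method names) is a hypothetical illustration of the flow, not the patented protocol:

```python
# Assumed whitelist of identifiers the measurement device will accept;
# the source does not specify how authorization is decided.
AUTHORIZED_IDS = {"nfc-0042", "nfc-0107"}

class ScannerLink:
    """Minimal model of the NFC link: an identifier must be presented and
    accepted before any command is executed."""

    def __init__(self):
        self.authorized = False

    def receive_identifier(self, identifier):
        """Step: identifier transmitted from the NFC device is checked."""
        self.authorized = identifier in AUTHORIZED_IDS
        return self.authorized

    def receive_command(self, command):
        """Commands are accepted only from an authorized NFC device."""
        if not self.authorized:
            raise PermissionError("NFC device not authorized")
        return f"executing {command}"
```

An unauthorized identifier leaves the link closed to commands; presenting an authorized one unlocks command transfer, mirroring the ordering of steps in the abstract.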
Abstract:
A method for scanning and measuring using a 3D measurement device is provided. The method includes providing the 3D measurement device having a light emitter, a light receiver and a command and evaluation device. The 3D measurement device further includes a first near-field communication (NFC) device having a first antenna. A second NFC device having a second antenna is positioned adjacent the 3D measurement device. An NFC link is established between the first NFC device and the 3D measurement device. An identifier is transmitted from the second NFC device to the 3D measurement device. It is determined that the second NFC device is authorized to communicate with the 3D measurement device. Commands are transferred to the 3D measurement device from the second NFC device based at least in part on the determination that the second NFC device is authorized to communicate with the 3D measurement device.
Abstract:
A three-dimensional (3D) measuring instrument includes a registration camera and a surface measuring system having a projector and an autofocus camera. For the instrument in a first pose, the registration camera captures a first registration image of first registration points. The autofocus camera captures a first surface image of first light projected onto the object by the projector and determines first 3D coordinates of points on the object. For the instrument in a second pose, the registration camera captures a second registration image of second registration points. The autofocus camera adjusts the autofocus mechanism and captures a second surface image of second light projected by the projector. A compensation parameter is determined based at least in part on the first registration image, the second registration image, the first 3D coordinates, the second surface image, and the projected second light.
Abstract:
A three-dimensional (3D) measurement system, a method of measuring 3D coordinates, and a method of generating dense 3D data are provided. The method of measuring 3D coordinates includes using a first 3D measurement device and a second 3D measurement device in a cooperative manner. The method includes acquiring a first set of 3D coordinates with the first 3D measurement device. The first set of 3D coordinates is transferred to the second 3D measurement device. A second set of 3D coordinates is acquired with the second 3D measurement device. The second set of 3D coordinates is registered to the first set of 3D coordinates in real-time while the second 3D measurement device is acquiring the second set of 3D coordinates.
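Registering incoming points against an already-acquired set while acquisition continues is commonly done with iterative-closest-point (ICP) style updates. A minimal sketch, assuming small motion between batches so that nearest-neighbor matching finds the right pairs (the abstract does not name ICP or any specific algorithm):

```python
import numpy as np

def icp_step(src, dst):
    """One point-to-point ICP iteration: pair each src point with its
    nearest dst point, then solve the rigid alignment of those pairs."""
    # Brute-force nearest neighbors; fine for the small batches sketched here.
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
    matched = dst[d2.argmin(axis=1)]
    # Kabsch/SVD solve on the matched pairs.
    src_c = src - src.mean(axis=0)
    dst_c = matched - matched.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = matched.mean(axis=0) - src.mean(axis=0) @ R.T
    return src @ R.T + t

def register_stream(batches, reference):
    """Align each incoming batch against the reference set as it arrives,
    mimicking registration performed while acquisition is still running."""
    return [icp_step(np.asarray(b), reference) for b in batches]
```

When the second device's batch is only slightly offset from the first set, a single step already snaps it back onto the reference; a real pipeline would iterate and grow the reference map as batches stream in.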
Abstract:
A system and method of determining three-dimensional coordinates is provided. The method includes, with a projector, projecting onto an object a projection pattern that includes a collection of object spots. With a first camera, a first image is captured that includes first-image spots. With a second camera, a second image is captured that includes second-image spots. Each first-image spot is divided into first-image spot rows. Each second-image spot is divided into second-image spot rows. Central values are determined for each first-image and second-image spot row. A correspondence is determined among first-image and second-image spot rows, the corresponding first-image and second-image spot rows being a spot-row image pair. Each spot-row image pair has a corresponding object spot row on the object. Three-dimensional (3D) coordinates of each object spot row are determined based on the central values of the corresponding spot-row image pairs. The 3D coordinates of the object spot rows are stored.
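The central value of a spot row is naturally an intensity-weighted (subpixel) centroid, and with a corresponding row pair in two cameras the 3D coordinate follows from triangulation. A minimal sketch, assuming rectified stereo geometry so the row pair's disparity alone gives depth (the source does not state this simplification):

```python
import numpy as np

def row_centroids(spot):
    """Divide one imaged spot into rows and return the intensity-weighted
    central (subpixel column) value of each row.

    spot: (rows, cols) intensity patch covering a single spot.
    Returns an array of length `rows`: centroid column per spot row."""
    cols = np.arange(spot.shape[1])
    sums = spot.sum(axis=1)
    return (spot * cols).sum(axis=1) / np.maximum(sums, 1e-12)

def triangulate_rows(c1, c2, baseline, focal):
    """Depth per spot row from the central values of a spot-row image pair,
    assuming rectified cameras: Z = f * B / (c1 - c2).

    c1, c2: centroid columns of corresponding rows in the first and second
    image; baseline in meters, focal length in pixels."""
    disparity = c1 - c2
    return focal * baseline / disparity
```

For example, a row pair whose central values differ by two pixels, seen by cameras 0.1 m apart with a 500-pixel focal length, lies at a depth of 25 m.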