Abstract:
A method for scanning and obtaining three-dimensional (3D) coordinates is provided. The method includes providing a 3D measuring device having a projector, a first camera and a second camera. The method records images of a light pattern emitted by the projector onto an object. A deviation in a measured parameter from an expected parameter is determined. The calibration of the 3D measuring device may be changed when the deviation is outside of a predetermined threshold.
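The deviation check described above can be sketched as a simple comparison against the predetermined threshold. This is an illustrative sketch only; the function and parameter names (`measured`, `expected`, `threshold`) are assumptions, not terminology from the abstract.

```python
# Hypothetical sketch of the calibration deviation check: the names below
# are assumptions, not taken from the source abstract.
def needs_recalibration(measured: float, expected: float, threshold: float) -> bool:
    """Return True when the deviation between a measured and an expected
    calibration parameter falls outside the predetermined threshold."""
    deviation = abs(measured - expected)
    return deviation > threshold

# Example: a calibration parameter expected to be 0.0 with a 0.5 tolerance.
print(needs_recalibration(measured=0.8, expected=0.0, threshold=0.5))  # prints True
```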
Abstract:
A device for scanning and obtaining three-dimensional coordinates is provided. The device may be a hand-held scanner that includes a carrying structure having a front and reverse side, the carrying structure having a first arm, a second arm and a third arm arranged in a T-shape or a Y-shape. A housing is coupled to the reverse side and a handle is positioned opposite the carrying structure, the housing and carrying structure defining an interior space. At least one projector is configured to project at least one pattern on an object, the projector being positioned within the interior space and oriented to project the at least one pattern from the front side. At least two cameras are provided spaced apart from each other, the cameras being configured to record images of the object. The cameras and projector are spaced apart from each other by a predetermined distance.
Abstract:
A device for optically scanning and measuring an environment is provided. The device includes at least one projector for producing at least one uncoded pattern on an object in the environment. A first camera is provided for recording at least one first image of the object provided with the uncoded pattern, the first camera having a first image plane. A second camera is provided for recording at least one second image of the object provided with the uncoded pattern, the second camera being spaced apart from the first camera in order to acquire the uncoded pattern on a second image plane. A controller is provided having a processor configured to determine the three-dimensional coordinates of points on the surface of the object based at least in part on the uncoded pattern, the at least one first image and the at least one second image.
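The two-camera arrangement above supports triangulation: each camera defines a ray from its perspective center through an image point, and the 3D point lies where the rays (nearly) meet. A minimal sketch, under assumed names and with the 3D point taken as the midpoint of the shortest segment joining the two rays (measurement noise means the rays rarely intersect exactly):

```python
# Illustrative triangulation sketch; all names are assumptions.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(p1, d1, p2, d2):
    """Midpoint of the common perpendicular between rays p1+t*d1 and p2+s*d2,
    where p1, p2 are the cameras' perspective centers and d1, d2 the ray
    directions through the matched image points."""
    w0 = tuple(a - b for a, b in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero only for parallel rays
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = tuple(p + t * u for p, u in zip(p1, d1))
    q2 = tuple(p + s * u for p, u in zip(p2, d2))
    return tuple((x + y) / 2 for x, y in zip(q1, q2))

# Two rays that meet at (1, 1, 1):
print(triangulate((0, 0, 0), (1, 1, 1), (2, 0, 0), (-1, 1, 1)))  # (1.0, 1.0, 1.0)
```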
Abstract:
A three-dimensional (3D) coordinate measuring system includes an external projector that projects a pattern of light onto an object and an aerial drone attached to a 3D imaging device, the 3D imaging device and the external projector cooperating to obtain 3D coordinates of the object.
Abstract:
A system includes a measurement device configured to measure a distance, a first angle, and a second angle to a retroreflector target. The system further includes a probe having the retroreflector target, an inclinometer sensor, a camera, and a processor, the inclinometer sensor configured to determine a two-dimensional inclination of the probe relative to a gravity vector, the camera configured to capture an image of a light emitted from or reflected by the measurement device, the processor configured to determine six degrees of freedom of the probe based at least in part on the distance, the first angle, the second angle, the two-dimensional inclination, and the captured image of the camera.
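The six degrees of freedom described above combine three position values (from the distance and the two angles, i.e. spherical coordinates) with three orientation values (two inclination angles from the inclinometer plus a rotation about the gravity vector recovered from the camera image). A hedged sketch under assumed names; the angle conventions (azimuth, zenith, roll, pitch, yaw) are illustrative assumptions, not terminology from the abstract:

```python
import math

# Illustrative sketch only; parameter names and angle conventions are assumptions.
def probe_position(distance, azimuth, zenith):
    """Cartesian position of the retroreflector computed from the measurement
    device's distance and two angles (spherical-to-Cartesian conversion)."""
    x = distance * math.sin(zenith) * math.cos(azimuth)
    y = distance * math.sin(zenith) * math.sin(azimuth)
    z = distance * math.cos(zenith)
    return (x, y, z)

def probe_six_dof(distance, azimuth, zenith, roll, pitch, yaw):
    """Combine the three position values with roll/pitch (the two-dimensional
    inclination relative to gravity) and yaw (recovered from the camera image)
    into a six-degree-of-freedom pose."""
    return probe_position(distance, azimuth, zenith) + (roll, pitch, yaw)

pose = probe_six_dof(10.0, 0.0, math.pi / 2, 0.01, -0.02, 1.2)
```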
Abstract:
A method for scanning and measuring an environment is provided. The method includes providing a three-dimensional (3D) measurement device having a controller. Images of the environment are recorded and a 3D scan of the environment is produced with a three-dimensional point cloud. A video image of the environment is recorded. The video image is displayed on a first portion of a display. A portion of the three-dimensional point cloud is displayed on a second portion of the display, the second portion of the display being arranged about the periphery of the first portion of the display. The portion of the 3D point cloud displayed in the second portion represents a portion of the environment outside of a field of view of the video image.
Abstract:
A method for calibration of a system for tracking a measurement system is provided. The method includes recording a path of relative poses that the system assumes while a measurement system acquires three-dimensional (3D) coordinates of an object. Recording the path includes moving the system sequentially through relative poses between the measurement system and the object, at each of which the measurement system measures the 3D coordinates and the pose tracking system records a relative pose. The pose manipulation system is then caused to move sequentially through the poses in the path, and the measurement system measures the 3D coordinates while applying the recorded poses. The relative pose between the tracked manipulated coordinate system and one of a coordinate system of the object and the measurement system is calibrated. The measured positions are transformed into the common coordinate system based on the calibrated relative pose and the tracked relative pose.
Abstract:
A system includes a pose manipulation system that operationally sets a pose of a position measurement system with respect to an object that is to be measured. The system further includes a pose tracking system configured to record a relative pose between a coordinate system associated with the position measurement system and a coordinate system of the object. The pose tracking system records a path along which the position measurement system is enabled to measure 3D coordinates of a surface of a type of an object, wherein recording the path comprises moving the pose manipulation system sequentially through a plurality of poses and recording, at each pose, the relative pose to measure the 3D coordinates. The pose manipulation system follows the path again, and the position measurement system measures the 3D coordinates by applying one or more of the recorded poses.
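The record-then-replay behavior described in the two abstracts above can be sketched as a small class: poses are appended during a teaching pass, then the same path is traversed again while measurements are taken. A hypothetical sketch; the class and method names are assumptions, not taken from the source.

```python
# Hypothetical sketch of pose-path recording and replay; names are assumptions.
class PoseTracker:
    """Records a path of relative poses during a teaching pass, then
    replays the path while a measurement is taken at each pose."""

    def __init__(self):
        self.path = []

    def record(self, relative_pose):
        # Called at each pose while the manipulation system moves
        # sequentially through the teaching path.
        self.path.append(relative_pose)

    def replay(self, measure):
        # Follow the recorded path again; at each pose the measurement
        # system measures 3D coordinates, and the recorded pose is applied
        # to bring them into the common coordinate system.
        return [measure(pose) for pose in self.path]

tracker = PoseTracker()
for pose in [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.0, 0.0)]:
    tracker.record(pose)
results = tracker.replay(lambda pose: ("scan", pose))
```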
Abstract:
According to one aspect of the disclosure, a three-dimensional coordinate scanner is provided. The scanner includes a projector configured to emit a pattern of light; a sensor arranged in a fixed predetermined relationship to the projector, the sensor having a photosensitive array comprised of a plurality of event-based pixels, each of the event-based pixels being configured to transmit a signal in response to a change in irradiance exceeding a threshold. One or more processors are electrically coupled to the projector and the sensor, the one or more processors being configured to modulate the pattern of light and determine a three-dimensional coordinate of a surface based at least in part on the pattern of light and the signal.
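The event-based pixel behavior described above can be sketched as a per-pixel comparator: a signal is transmitted only when the change in irradiance since the last event exceeds a threshold. A minimal sketch, assuming (as event cameras commonly do, though the abstract does not specify it) that the change is measured in log irradiance and signed by polarity:

```python
import math

# Illustrative sketch of an event-based pixel; the log-irradiance model
# and polarity convention are assumptions, not stated in the abstract.
class EventPixel:
    def __init__(self, threshold):
        self.threshold = threshold
        self.last_log = None  # log irradiance at the last transmitted event

    def update(self, irradiance):
        """Return +1 or -1 when the log-irradiance change exceeds the
        threshold (an 'event'), or None when nothing is transmitted."""
        log_i = math.log(irradiance)
        if self.last_log is None:
            self.last_log = log_i
            return None
        delta = log_i - self.last_log
        if abs(delta) > self.threshold:
            self.last_log = log_i
            return 1 if delta > 0 else -1
        return None

pixel = EventPixel(threshold=0.2)
events = [pixel.update(i) for i in (1.0, 1.05, 2.0, 2.0, 0.5)]
# events == [None, None, 1, None, -1]: only large brightness changes fire.
```

Because unchanged pixels transmit nothing, only the regions swept by the modulated pattern of light generate data, which is what makes pairing such a sensor with a modulated projector attractive.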
Abstract:
A projector projects an uncoded pattern of uncoded spots onto an object, which is imaged by a first camera and a second camera. 3D coordinates of the spots on the object are determined by a processor based on triangulation, the processor further determining correspondence among the projected and imaged spots based at least in part on a nearness of intersection of lines drawn from the projector spots and imaged spots through their respective perspective centers.
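The "nearness of intersection" criterion above can be sketched directly: each projected spot and each imaged spot defines a line through its device's perspective center, and the correspondence chosen is the one whose lines pass closest to each other in space. A hedged sketch under assumed names:

```python
# Illustrative correspondence sketch; function and parameter names are assumptions.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def ray_distance(p1, d1, p2, d2):
    """Shortest distance between the lines p1+t*d1 and p2+s*d2."""
    w0 = tuple(a - b for a, b in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero only for parallel lines
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    diff = tuple((p1[i] + t * d1[i]) - (p2[i] + s * d2[i]) for i in range(3))
    return dot(diff, diff) ** 0.5

def best_match(proj_center, proj_dir, cam_center, candidate_dirs):
    """Index of the camera-spot ray that passes nearest the projector ray."""
    dists = [ray_distance(proj_center, proj_dir, cam_center, d)
             for d in candidate_dirs]
    return dists.index(min(dists))

# Projector ray from the origin toward (1,1,1); three candidate camera rays
# from (2,0,0). The first candidate's ray actually meets the projector ray.
match = best_match((0, 0, 0), (1, 1, 1), (2, 0, 0),
                   [(-1, 1, 1), (0, 0, 1), (0, 1, 0)])  # match == 0
```

With an uncoded pattern the spots carry no identifying code, so this geometric test is what disambiguates which imaged spot corresponds to which projected spot.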