Abstract:
A method and system are provided for controlling a laser tracker from a location remote from the laser tracker through gestures performed by a user. The method includes providing a rule of correspondence between each of a plurality of commands and each of a plurality of user gestures. The user performs, with the user's body, a gesture that corresponds to one of the plurality of user gestures. The gesture performed by the user is detected. A gesture recognition engine determines a first command from among the plurality of commands that corresponds to the detected gesture. The first command is then executed by the laser tracker.
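The rule of correspondence amounts to a one-to-one lookup from recognized gestures to tracker commands. The following is a minimal sketch in Python, with hypothetical gesture labels and placeholder command functions; none of these names come from the abstract, and the actual mapping and command set are defined by the tracker implementation.

from typing import Callable, Dict

def acquire_target() -> None:
    print("tracker: acquire retroreflector target")  # placeholder action

def start_measurement() -> None:
    print("tracker: start measurement")  # placeholder action

def stop_measurement() -> None:
    print("tracker: stop measurement")  # placeholder action

# Rule of correspondence: each detected gesture label maps to exactly one command.
RULE_OF_CORRESPONDENCE: Dict[str, Callable[[], None]] = {
    "raise_right_arm": acquire_target,
    "wave_left_hand": start_measurement,
    "cross_arms": stop_measurement,
}

def execute_gesture(detected_gesture: str) -> None:
    """Look up the command corresponding to the detected gesture and execute it."""
    command = RULE_OF_CORRESPONDENCE.get(detected_gesture)
    if command is not None:
        command()

execute_gesture("wave_left_hand")  # -> tracker: start measurement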
Abstract:
A three-dimensional (3D) coordinate measurement device, and a method of operating it, combine tracker and scanner functionality. The method includes selecting an operating mode on the coordinate measurement device. A first light is emitted from the coordinate measurement device. At least two angles associated with the emitting of the first light are measured. A second light is received with an optical detector, wherein the second light is a reflection of the first light off of a retroreflector or a surface in the environment. A distance is determined based at least in part on the selected mode, the emitting of the first light, and the receiving of the second light. Three-dimensional coordinates of at least one point in the environment are determined based at least in part on the measured angles and the determined distance.
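The final step, computing 3D coordinates from the two measured angles and the determined distance, reduces to a spherical-to-Cartesian conversion in the device frame of reference. Below is a minimal sketch assuming an azimuth/zenith angle convention, which the abstract does not specify.

import math

def to_cartesian(distance: float, azimuth_rad: float, zenith_rad: float) -> tuple:
    """Return (x, y, z) for a point at the given distance and measured angles."""
    x = distance * math.sin(zenith_rad) * math.cos(azimuth_rad)
    y = distance * math.sin(zenith_rad) * math.sin(azimuth_rad)
    z = distance * math.cos(zenith_rad)
    return (x, y, z)

# Example: a point 2.5 m away at 30 degrees azimuth and 60 degrees zenith.
print(to_cartesian(2.5, math.radians(30.0), math.radians(60.0)))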
Abstract:
A device is provided that includes a housing and a first motor. The first motor rotates about a first axis. A second motor is coupled to rotate the housing, the second motor rotating about a second axis. A device frame of reference is defined by the first and second axes. A mirror is rotated about the first axis by the first motor. First and second angle measuring devices measure first and second angles of rotation, respectively. A 3D time-of-flight camera is arranged within the housing coaxially with the first axis. The camera acquires an image of an object reflected from the mirror. A processor determines at least one first 3D coordinate of at least one point on the object, the first 3D coordinate based at least in part on the image acquired by the camera, the first angle of rotation, and the second angle of rotation.
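Determining the first 3D coordinate in the device frame of reference amounts to combining a point from the camera's range image with the two measured rotation angles. The following is a minimal sketch that assumes the two angles correspond to rotations about orthogonal x- and z-axes and ignores the fold introduced by the mirror; this is an illustrative geometry, not the device's actual optical model.

import numpy as np

def rot_x(angle_rad: float) -> np.ndarray:
    """Rotation about the x-axis (stand-in for the first, mirror axis)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(angle_rad: float) -> np.ndarray:
    """Rotation about the z-axis (stand-in for the second, housing axis)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def to_device_frame(point_camera: np.ndarray, first_angle: float, second_angle: float) -> np.ndarray:
    """Rotate a camera-frame point by the first and second measured angles."""
    return rot_z(second_angle) @ rot_x(first_angle) @ point_camera

point = np.array([0.0, 0.0, 1.2])  # 1.2 m along the camera optical axis
print(to_device_frame(point, np.radians(15.0), np.radians(40.0)))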
Abstract:
An articulated arm coordinate measurement machine (AACMM) is provided that includes a noncontact 3D measurement device, position transducers, a camera, and a processor operable to project a spot of light onto an object point, to measure first 3D coordinates of the object point based on readings of the noncontact 3D measurement device and the position transducers, to capture the spot of light with the camera in a camera image, and to attribute the first 3D coordinates to the spot of light in the camera image.
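Attributing the measured 3D coordinates to the spot of light in the camera image is essentially a point-to-pixel association. Below is a minimal sketch using a simple pinhole projection with made-up camera intrinsics; the AACMM's actual calibration and data structures are not given in the abstract.

from dataclasses import dataclass

@dataclass
class TaggedPixel:
    u: float      # image column of the spot
    v: float      # image row of the spot
    xyz: tuple    # first 3D coordinates measured by the arm and noncontact device

def project_to_image(xyz: tuple, fx: float = 1200.0, fy: float = 1200.0,
                     cx: float = 640.0, cy: float = 480.0) -> TaggedPixel:
    """Project a camera-frame 3D point to pixel coordinates and keep the 3D tag."""
    x, y, z = xyz
    u = fx * x / z + cx
    v = fy * y / z + cy
    return TaggedPixel(u, v, xyz)

spot = project_to_image((0.10, -0.05, 0.80))
print(spot)  # pixel location of the spot with its attributed 3D coordinates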
Abstract:
A three-dimensional (3D) coordinate measurement device combines tracker and scanner functionality. The tracker function is configured to send light to a retroreflector and determine the distance to the retroreflector based on the reflected light. The tracker function is also configured to track the retroreflector as it moves, and to determine 3D coordinates of the retroreflector. The scanner function is configured to send a beam of light to a point on an object surface and to determine 3D coordinates of the point. In addition, the scanner function is configured to adjustably focus the beam of light.
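One way to understand adjustable focusing of the scanner beam is through the thin lens relation 1/f = 1/d_o + 1/d_i, which gives the lens-to-focus spacing needed for a given object distance. The sketch below uses an assumed 40 mm focal length purely for illustration; the device's actual focusing mechanism is not described in the abstract.

def image_distance_mm(object_distance_mm: float, focal_length_mm: float = 40.0) -> float:
    """Return the lens-to-image spacing that focuses the beam at the object distance."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# Example: refocus as the measured surface distance changes.
for d in (500.0, 1000.0, 2000.0):  # object distances in mm
    print(d, round(image_distance_mm(d), 3))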