Abstract:
Apparatus and method are disclosed for determining the position of a robot relative to objects in a workspace, using a camera, scanner, or other suitable device in conjunction with object recognition. The device receives information from which a point cloud of the viewed scene can be developed. The point cloud will be appreciated to be in a camera-centric frame of reference. Information about a known datum is compared to the point cloud through object recognition. For example, a link of the robot could be the identified datum so that, once recognized, the coordinates of the point cloud can be converted to a robot-centric frame of reference, since the position of the datum is known relative to the robot.
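A minimal sketch of the frame conversion this abstract describes, assuming the object-recognition step has already matched the known datum (e.g., a robot link) in the point cloud and returned its pose in the camera frame, and that the datum's pose in the robot frame is known a priori. Names such as T_cam_datum and T_robot_datum are illustrative, not from the source.

```python
import numpy as np

def to_homogeneous(points):
    """Convert an (N, 3) point cloud to (N, 4) homogeneous coordinates."""
    return np.hstack([points, np.ones((points.shape[0], 1))])

def camera_to_robot(points_cam, T_cam_datum, T_robot_datum):
    """Re-express a camera-centric point cloud in the robot frame.

    T_cam_datum:   4x4 pose of the recognized datum in the camera frame.
    T_robot_datum: 4x4 pose of the same datum in the robot frame (known).
    """
    # Transform taking camera coordinates to robot coordinates.
    T_robot_cam = T_robot_datum @ np.linalg.inv(T_cam_datum)
    points_robot = (T_robot_cam @ to_homogeneous(points_cam).T).T
    return points_robot[:, :3]
```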
Abstract:
A calibration article is provided for calibrating a robot and a 3D camera. The calibration article includes side surfaces that are angled inward toward a top surface. The robot and camera are calibrated by capturing positional data of the calibration article relative to the robot and the camera. The captured data is used to generate correlation data between the robot and the camera. The correlation data is used by the controller to align the robot with the camera during operational use of the robot and camera.
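One standard way to turn such captured positional data into "correlation data" is a rigid-transform fit between matched points. The sketch below assumes N feature points of the calibration article were measured both in the robot frame (e.g., by touching them with the robot) and in the camera frame; the Kabsch/least-squares fit is a plausible reading, not a confirmed detail of the source.

```python
import numpy as np

def fit_rigid_transform(points_cam, points_robot):
    """Return R (3x3) and t (3,) such that points_robot ~= R @ points_cam + t."""
    c_cam = points_cam.mean(axis=0)
    c_rob = points_robot.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (points_cam - c_cam).T @ (points_robot - c_rob)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_rob - R @ c_cam
    return R, t
```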
Abstract:
A teleoperated robotic system utilizes a graphical user interface (GUI) to perform work on one or more workpieces using a robot. A coordinate system of the GUI can be correlated to the tool center point (TCP) of the robot and to the TCP or workspace of a teleoperated member, such as a haptic joystick. Operable manipulation of the teleoperated member can be correlated to a movement at a particular location in the robot station, such as movement of the TCP of the robot. The GUI can also provide digital representations of the workpiece, which can be based on inputted and/or scanned information relating to a reference workpiece and/or the particular workpiece on which the robot is performing work. The GUI can further provide indications of the various stages of assembly of the workpiece, as well as an indication of work already performed, or to be performed, on the workpiece.
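A minimal sketch of correlating the workspace of a teleoperated member (e.g., a haptic joystick) with the robot's TCP. The scale factor, the per-cycle safety clamp, and the axis-alignment rotation are illustrative assumptions.

```python
import numpy as np

JOYSTICK_TO_TCP_SCALE = 0.5   # meters of TCP motion per meter of joystick motion (assumed)
MAX_TCP_STEP = 0.01           # safety clamp per control cycle, in meters (assumed)

def joystick_to_tcp_delta(joystick_delta, R_robot_joystick):
    """Map a joystick displacement (3,) to a TCP displacement in the robot frame.

    R_robot_joystick: 3x3 rotation aligning the joystick axes with the
    robot/GUI coordinate system, established during correlation.
    """
    delta = JOYSTICK_TO_TCP_SCALE * (R_robot_joystick @ joystick_delta)
    norm = np.linalg.norm(delta)
    if norm > MAX_TCP_STEP:
        delta *= MAX_TCP_STEP / norm   # limit the commanded step
    return delta
```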
Abstract:
Three-dimensional visual servoing for positioning a robot in an environment is facilitated. Three-dimensional point cloud data of a scene of the environment is obtained, the scene including a feature. The three-dimensional point cloud data is converted into a two-dimensional image, and a three-dimensional position of the feature is identified based on the two-dimensional image. An indication of the identified three-dimensional position of the feature is then provided.
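A minimal sketch of the conversion described above: a 3D point cloud is rendered into a 2D depth image, a feature is located in that image (by whatever 2D detector is in use), and its 3D position is read back from the corresponding point. The pinhole intrinsics (fx, fy, cx, cy) are assumptions.

```python
import numpy as np

def cloud_to_depth_image(points, fx, fy, cx, cy, width, height):
    """Project an (N, 3) camera-frame cloud into a depth image, plus an
    index map remembering which point produced each pixel."""
    depth = np.full((height, width), np.inf)
    index = np.full((height, width), -1, dtype=int)
    for i, (x, y, z) in enumerate(points):
        if z <= 0:
            continue
        u = int(round(fx * x / z + cx))
        v = int(round(fy * y / z + cy))
        if 0 <= u < width and 0 <= v < height and z < depth[v, u]:
            depth[v, u] = z          # keep the closest point per pixel
            index[v, u] = i
    return depth, index

def feature_position_3d(points, index, pixel):
    """Recover the 3D position of a feature detected at a 2D pixel."""
    u, v = pixel
    i = index[v, u]
    return points[i] if i >= 0 else None
```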
Abstract:
Automatic scanning and representation of an environment having a plurality of features includes, for example, scanning the environment along a scanning path; interspersing a plurality of localized scans of the features in the environment during the scanning along the scanning path, the interspersed localized scans being different from the scanning of the environment along the scanning path; and obtaining a representation of at least a portion of the environment based on the scanning of the environment and the interspersed localized scans of the features.
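A minimal sketch of interleaving a global scan along a path with localized scans of features encountered along the way. The scanner interface (move_to, capture, localized_capture) and the trigger distance are hypothetical placeholders, not an API from the source.

```python
import math

FEATURE_TRIGGER_DISTANCE = 0.05   # meters; assumed proximity threshold

def scan_environment(scanner, path_waypoints, feature_locations):
    captures = []
    remaining = list(feature_locations)
    for waypoint in path_waypoints:
        scanner.move_to(waypoint)
        captures.append(scanner.capture())   # global scan along the path
        # Intersperse a localized scan whenever a known feature is near,
        # using different (e.g., slower, higher-resolution) scan settings.
        for feature in list(remaining):
            if math.dist(waypoint, feature) < FEATURE_TRIGGER_DISTANCE:
                captures.append(scanner.localized_capture(feature))
                remaining.remove(feature)
    return captures   # fused downstream into a representation of the environment
```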
Abstract:
Robot positioning is facilitated by obtaining, for each time of a first sampling schedule, a respective indication of a pose of a camera system of a robot relative to a reference coordinate frame, each indication being based on a comparison of multiple three-dimensional images of a scene of the environment, thereby providing a plurality of indications of poses of the camera system; obtaining, for each time of a second sampling schedule, a respective indication of a pose of the robot, thereby providing a plurality of indications of poses of the robot; and determining, using the plurality of indications of poses of the camera system and the plurality of indications of poses of the robot, an indication of the reference coordinate frame and an indication of a reference point of the camera system relative to the pose of the robot.
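A minimal sketch of the data-collection side of such a method: camera-system poses and robot poses arrive on their own sampling schedules and are paired by nearest timestamp, so that a downstream calibration solver can estimate the reference coordinate frame and the camera's reference point relative to the robot. The pairing tolerance is an assumption.

```python
import numpy as np

PAIRING_TOLERANCE = 0.02   # seconds; assumed maximum timestamp mismatch

def pair_pose_samples(camera_samples, robot_samples):
    """camera_samples, robot_samples: lists of (timestamp, 4x4 pose).
    Returns a list of (camera_pose, robot_pose) pairs matched in time."""
    robot_times = np.array([t for t, _ in robot_samples])
    pairs = []
    for t_cam, pose_cam in camera_samples:
        j = int(np.argmin(np.abs(robot_times - t_cam)))
        if abs(robot_times[j] - t_cam) <= PAIRING_TOLERANCE:
            pairs.append((pose_cam, robot_samples[j][1]))
    return pairs
```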
Abstract:
An industrial robot uses a simulated force vector to allow a work piece held by the robot's end effector to be mated with a work piece whose location and orientation are not precisely known to the robot. When the end effector makes contact at the location and orientation in which the other work piece is held, the robot issues a velocity command to minimize the contact force and also executes a search pattern in all directions and orientations to bring the work piece it is holding into contact with the other work piece. The search pattern and the velocity command continue until the two pieces mate.
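A minimal sketch of the mating strategy described above: an admittance-style velocity term pushes against the measured contact force to keep it small, while a spiral search pattern perturbs the end effector until the parts mate. The gains and spiral parameters are illustrative assumptions, as is the restriction of the search to the x-y plane.

```python
import numpy as np

FORCE_GAIN = 0.002         # m/s per newton; admittance-style compliance (assumed)
SPIRAL_RADIUS_RATE = 1e-5  # m/s of radius growth (assumed)
SPIRAL_ANGLE_RATE = 0.2    # rad/s of angular advance (assumed)

def search_velocity(t):
    """Lateral velocity whose time integral traces an outward spiral."""
    r = SPIRAL_RADIUS_RATE * t
    a = SPIRAL_ANGLE_RATE * t
    # Time derivative of the spiral position (r*cos(a), r*sin(a)).
    vx = SPIRAL_RADIUS_RATE * np.cos(a) - r * SPIRAL_ANGLE_RATE * np.sin(a)
    vy = SPIRAL_RADIUS_RATE * np.sin(a) + r * SPIRAL_ANGLE_RATE * np.cos(a)
    return np.array([vx, vy, 0.0])

def velocity_command(t, measured_force):
    """Combine force minimization with the spiral search pattern."""
    v_compliance = -FORCE_GAIN * np.asarray(measured_force)
    return v_compliance + search_velocity(t)
```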
Abstract:
There is described a technique for detecting the success of an automated process that produces an article of manufacture. A statistically significant number of successful and failed articles are produced by the automated process. Each of these articles interacts with a test platform, which measures interaction signatures that indicate successful and failed articles. A correlation of the difference between the interaction signatures is calculated. An interaction signature is then obtained for an article manufactured by the process after the earlier articles. This new interaction signature is analyzed against the calculated correlation of the difference to automatically categorize the additional article of manufacture as either a success or a failure. There is also described a technique for optimizing the motion used to test the manufactured articles, to improve the correlation of the difference between the interaction signatures of successful articles and those of failed articles.
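A minimal sketch of categorizing a new article from its interaction signature. It assumes the signatures are equal-length 1D signals, and that the new signature is compared against mean success and failure templates built from the training articles; normalized correlation is one plausible reading of the described correlation analysis, not a confirmed detail of the source.

```python
import numpy as np

def normalized_correlation(a, b):
    """Pearson-style correlation between two equal-length 1D signals."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def classify(signature, success_signatures, failure_signatures):
    """Label a new article by which template its signature correlates with more."""
    success_template = np.mean(success_signatures, axis=0)
    failure_template = np.mean(failure_signatures, axis=0)
    c_ok = normalized_correlation(signature, success_template)
    c_bad = normalized_correlation(signature, failure_template)
    return "success" if c_ok >= c_bad else "failure"
```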