Abstract:
A motion determination system is disclosed. The system may receive a first and a second camera image from a camera, the first camera image received earlier than the second camera image. The system may identify corresponding features in the first and second camera images. The system may receive range data comprising at least one of a first and a second range data from a range detection unit, corresponding to the first and second camera images, respectively. The system may determine first positions and second positions of the corresponding features using the first camera image and the second camera image. The first positions or the second positions may be determined by also using the range data. The system may determine a change in position of the machine based on differences between the first and second positions, and a visual odometry (VO)-based velocity of the machine based on the determined change in position.
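As a rough illustration of the velocity step described above (not the disclosed implementation), the sketch below assumes the matched features have already been back-projected to 3D positions using the range data, recovers the rigid motion between the two feature sets with a standard Kabsch alignment, and divides the resulting displacement by the frame interval to obtain a VO-based velocity. All function and variable names are illustrative.

```python
# Minimal sketch: VO-based velocity from 3D positions of matched features
# at two times, using a centroid/SVD rigid alignment (Kabsch).
import numpy as np

def estimate_vo_velocity(first_positions, second_positions, dt):
    """first_positions, second_positions: (N, 3) arrays of the same features'
    3D positions in the camera frame (pixels back-projected with range data).
    dt: time between the first and second camera images, in seconds."""
    p1 = np.asarray(first_positions, dtype=float)
    p2 = np.asarray(second_positions, dtype=float)

    # Rigid alignment: rotation R and translation t with p2 ~ R @ p1 + t.
    c1, c2 = p1.mean(axis=0), p2.mean(axis=0)
    H = (p1 - c1).T @ (p2 - c2)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c2 - R @ c1

    # The machine's motion is the inverse of the apparent scene motion,
    # expressed here in the first camera frame.
    machine_displacement = -R.T @ t
    return machine_displacement / dt        # VO-based velocity vector

# Example: features that all shifted by (-1, 0, 0) m over 0.5 s imply the
# machine moved +1 m along x, i.e. a velocity of roughly (2, 0, 0) m/s.
pts1 = np.random.rand(20, 3) * 10
pts2 = pts1 + np.array([-1.0, 0.0, 0.0])
print(estimate_vo_velocity(pts1, pts2, dt=0.5))
```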
Abstract:
A method and system for determining an alignment error between sensors mounted to a machine are disclosed. The method may include calculating a first orientation value based on a signal received from a first sensor. The method may further include calculating a second orientation value based on a signal received from a second sensor. The method may further include calculating an alignment error between the first sensor and the second sensor based on a difference between the first orientation value and the second orientation value.
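A minimal sketch of the comparison idea, assuming both sensors are static accelerometers whose readings imply a pitch angle; the alignment error is then simply the difference of the two orientation values. The helper names and the 9.81 m/s^2 gravity constant are assumptions, not part of the disclosure.

```python
import math

def pitch_from_accel(ax, ay, az):
    """Orientation (pitch, radians) implied by a static accelerometer reading."""
    return math.atan2(-ax, math.hypot(ay, az))

def alignment_error(first_reading, second_reading):
    """First orientation minus second orientation, i.e. the mounting misalignment."""
    return pitch_from_accel(*first_reading) - pitch_from_accel(*second_reading)

def reading_for_pitch(theta_rad, g=9.81):
    """Synthetic accelerometer reading for a sensor pitched by theta_rad."""
    return (-math.sin(theta_rad) * g, 0.0, math.cos(theta_rad) * g)

# Two sensors on the same rigid machine should agree; here they differ by ~2 deg.
err = alignment_error(reading_for_pitch(math.radians(5.0)),
                      reading_for_pitch(math.radians(3.0)))
print(math.degrees(err))   # ~2.0 deg alignment error between the sensors
```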
Abstract:
A light detection and ranging (LIDAR) controller is disclosed. The LIDAR controller may determine, based on a position of an implement of a machine, a scan area of a LIDAR sensor, wherein the scan area has an increased point density relative to another area of a field of view of the LIDAR sensor that includes the implement. The LIDAR controller may cause the LIDAR sensor to capture, with the increased point density, LIDAR data associated with the scan area. The LIDAR controller may process the LIDAR data to determine whether an object of interest is in an environment of the machine that is associated with the scan area. The LIDAR controller may perform an action based on the environment of the machine.
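The sketch below illustrates the general idea under stated assumptions: a denser scan window is centered on the implement's bearing, and returns inside that window are checked against a crude range threshold. The ScanArea fields, densities, and thresholds are all hypothetical.

```python
# Illustrative sketch (not the disclosed controller): pick a scan area around
# the implement, request denser sampling there, and flag objects inside it.
from dataclasses import dataclass

@dataclass
class ScanArea:
    azimuth_min_deg: float
    azimuth_max_deg: float
    points_per_deg: int          # increased point density inside the area

def scan_area_for_implement(implement_azimuth_deg, base_density=2, boost=4):
    """Center a denser scan window on the implement's current position."""
    half_width = 15.0            # assumed window half-width, degrees
    return ScanArea(implement_azimuth_deg - half_width,
                    implement_azimuth_deg + half_width,
                    base_density * boost)

def object_of_interest(lidar_points, area, min_points=5):
    """Rough stand-in for detection: count returns closer than a threshold
    that fall inside the scan area."""
    hits = [p for p in lidar_points
            if area.azimuth_min_deg <= p["azimuth_deg"] <= area.azimuth_max_deg
            and p["range_m"] < 5.0]
    return len(hits) >= min_points

area = scan_area_for_implement(implement_azimuth_deg=30.0)
points = [{"azimuth_deg": 28.0 + i, "range_m": 3.5} for i in range(6)]
if object_of_interest(points, area):
    print("object of interest near the implement: slow down or alert operator")
```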
Abstract:
A vehicle pose sharing diagnostic system for a first machine includes a first communication module and a first pose module in communication with the first communication module. The first pose module is configured to generate a pose signal corresponding to the first machine. The system further includes a first sensing module configured to generate a pose signal corresponding to at least one of a second machine and an infrastructure. The system includes a control module communicably coupled to the first communication module. The control module is configured to determine an operational error in the first communication module and the first pose module. The control module is also configured to generate diagnosis information corresponding to the determined operational error. Further, the system includes a feedback device communicably coupled to the control module. The feedback device is configured to receive the diagnosis information from the control module and display the diagnosis information thereon.
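One way to picture the diagnostic logic, as a hedged sketch rather than the disclosed system: compare the pose shared over the communication module with the pose observed by the local sensing module, and report stale messages or large disagreements to the feedback device. The thresholds and field layout are assumptions.

```python
import math

def diagnose(shared_pose, sensed_pose, last_msg_age_s,
             max_offset_m=2.0, max_age_s=1.0):
    """Return a human-readable diagnosis string, or None if no error is found."""
    if last_msg_age_s > max_age_s:
        return "communication module error: shared pose messages are stale"
    offset = math.dist(shared_pose[:2], sensed_pose[:2])   # planar position gap
    if offset > max_offset_m:
        return f"pose module error: shared and sensed poses differ by {offset:.1f} m"
    return None

# Shared pose (x, y, heading) vs. the pose the local sensing module observes.
diagnosis = diagnose(shared_pose=(105.0, 50.0, 0.1),
                     sensed_pose=(100.0, 50.0, 0.1),
                     last_msg_age_s=0.2)
if diagnosis:
    print(diagnosis)   # stands in for the feedback device display
```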
Abstract:
A swing control assembly for a first machine is provided. The swing control assembly includes a position detection module configured to generate a signal indicative of a relative position of a second machine with respect to the first machine. The swing control assembly includes a controller communicably coupled to the position detection module. The controller is configured to receive the signal indicative of the relative position of the second machine with respect to the first machine. The controller is configured to determine a direction of swing associated with the first machine based on the received signal. The controller is configured to provide an instruction to initiate a swing operation of the first machine based on the determined direction of swing.
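A minimal sketch of the direction decision, assuming the relative position of the second machine arrives as a bearing in the first machine's frame; the controller then swings toward it along the shorter arc. The angle convention and function names are assumptions for illustration.

```python
def swing_direction(current_swing_deg, target_bearing_deg):
    """Return 'clockwise' or 'counterclockwise' toward the second machine,
    choosing the shorter arc."""
    delta = (target_bearing_deg - current_swing_deg + 180.0) % 360.0 - 180.0
    return "clockwise" if delta >= 0 else "counterclockwise"

def initiate_swing(current_swing_deg, target_bearing_deg):
    """Stand-in for the controller instruction that starts the swing operation."""
    direction = swing_direction(current_swing_deg, target_bearing_deg)
    print(f"initiating {direction} swing toward bearing {target_bearing_deg} deg")

# Second machine detected at bearing -80 deg while the implement faces +10 deg.
initiate_swing(current_swing_deg=10.0, target_bearing_deg=-80.0)
# -> initiating counterclockwise swing toward bearing -80.0 deg
```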
Abstract:
A motion determination system is disclosed. The system may receive a first camera image and a second camera image. The system may receive a first range image corresponding to the first camera image. The system may generate a first range map by fusing the first camera image and the first range image. The system may iteratively process a plurality of first features in the first range map and a plurality of second features in the second camera image to determine a change in position of the machine. The plurality of second features may correspond to the plurality of first features, and each of the plurality of first and second features is denoted by feature points in an image space of the camera.
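The fusion step might look like the sketch below, assuming the range image is co-registered pixel-for-pixel with the camera image and a pinhole camera model with assumed intrinsics fx, fy, cx, cy; each detected feature pixel is back-projected and scaled to its range, yielding 3D feature points that later processing can track.

```python
# Illustrative sketch: attach a range value to each pixel feature, producing
# 3D points in the camera frame. All parameter values are assumptions.
import numpy as np

def fuse_range_map(feature_pixels, range_image, fx=700.0, fy=700.0,
                   cx=640.0, cy=360.0):
    """feature_pixels: list of (u, v) image coordinates of detected features.
    range_image: 2D array of range (m) aligned pixel-for-pixel with the camera.
    Returns an (N, 3) array of 3D feature points in the camera frame."""
    points = []
    for u, v in feature_pixels:
        r = range_image[int(v), int(u)]                 # range at this feature
        # Back-project through the pinhole model and scale to the range.
        ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        points.append(r * ray / np.linalg.norm(ray))
    return np.array(points)

range_image = np.full((720, 1280), 12.0)                # flat 12 m scene (demo)
features = [(320, 200), (900, 500), (640, 360)]
print(fuse_range_map(features, range_image))
```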
Abstract:
A motion estimation system is disclosed. The motion estimation system may include one or more memories storing instructions, and one or more processors configured to execute the instructions to receive, from a scanning device, scan data representing at least one object obtained by a scan over at least one of a plurality of sub-scanning regions, and generate, from the scan data, a sub-pointcloud for one of the sub-scanning regions. The sub-pointcloud includes a plurality of surface points of the at least one object in the sub-scanning region. The one or more processors may be further configured to execute the instructions to estimate motion of a machine relative to the at least one object by comparing the sub-pointcloud with a reference sub-pointcloud.
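As a rough sketch of the sub-pointcloud comparison (a real system would likely use ICP or a similar registration), points are binned by azimuth into sub-scanning regions and the motion relative to the object is taken from the centroid shift between a sub-pointcloud and its reference. The region bounds and binning scheme are assumptions.

```python
import numpy as np

def sub_pointcloud(points, azimuths_deg, region):
    """Keep only surface points whose azimuth falls inside the sub-scanning region."""
    lo, hi = region
    return points[(azimuths_deg >= lo) & (azimuths_deg < hi)]

def estimate_motion(sub_cloud, reference_sub_cloud):
    """Coarse translation estimate: difference of the two sub-pointcloud centroids."""
    return sub_cloud.mean(axis=0) - reference_sub_cloud.mean(axis=0)

rng = np.random.default_rng(0)
ref = rng.uniform(-1.0, 1.0, size=(200, 3)) + np.array([10.0, 0.0, 0.0])
cur = ref + np.array([-0.5, 0.0, 0.0])       # the object appears 0.5 m closer
region = (-45.0, 45.0)                       # one assumed sub-scanning region
az_ref = np.degrees(np.arctan2(ref[:, 1], ref[:, 0]))
az_cur = np.degrees(np.arctan2(cur[:, 1], cur[:, 0]))
print(estimate_motion(sub_pointcloud(cur, az_cur, region),
                      sub_pointcloud(ref, az_ref, region)))   # ~(-0.5, 0, 0)
```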
Abstract:
A method, system, and non-transitory computer-readable storage medium for calibrating an implement actuation sensor of a machine are disclosed. The method may include calculating a first elevation value of an implement of the machine in a gravity reference frame of the machine. The method may further include calculating a second elevation value of a ground-engaging device of the machine in the gravity reference frame of the machine. The method may further include determining a difference between the first elevation value and the second elevation value. The method may further include calibrating the implement actuation sensor based on the determined difference.
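A minimal sketch under an assumed geometry: the implement elevation comes from a boom angle sensor and boom length, the ground-engaging device defines the ground elevation, and the calibration offset is whatever angle correction makes the two elevation values agree when the implement rests on the ground. The dimensions and helper names are illustrative only.

```python
import math

def implement_elevation(boom_angle_rad, boom_length_m, pivot_height_m):
    """First elevation value: implement tip height in the gravity reference frame."""
    return pivot_height_m + boom_length_m * math.sin(boom_angle_rad)

def calibrate_offset(measured_angle_rad, boom_length_m=6.0, pivot_height_m=2.0,
                     ground_elevation_m=0.0):
    """With the implement resting on the ground, the first and second elevation
    values should match; the residual difference becomes the sensor's calibration
    offset, expressed here as a small-angle correction."""
    first_elevation = implement_elevation(measured_angle_rad, boom_length_m,
                                          pivot_height_m)
    difference = first_elevation - ground_elevation_m
    return -difference / boom_length_m       # angle correction that zeros the gap

# Example: the sensor reads -18.5 deg while ground contact corresponds to about
# -19.47 deg, so a small negative angle correction (~-0.9 deg) is returned.
print(math.degrees(calibrate_offset(math.radians(-18.5))))
```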
Abstract:
A machine navigation system and method for estimating velocity of a machine are disclosed. The method may include receiving, from an odometer, a first signal indicative of a distance traveled by the machine and calculating a scale factor to compensate for an error associated with the first signal. The method may further include determining whether a second signal indicative of a location of the machine is received by the machine and selectively adjusting the scale factor using machine parameters to generate an adjusted scale factor, where selectively adjusting may be performed based on whether the second signal is received by the machine. The method may further include estimating the velocity of the machine based on the first signal and the adjusted scale factor.
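A hedged sketch of the scale-factor idea: while the second (positioning) signal is available, the scale factor is nudged toward the ratio of position-derived distance to odometer-reported distance; when it drops out, the last adjusted factor keeps correcting the odometer-based velocity estimate. The smoothing constant and interfaces are assumptions.

```python
def update_scale_factor(scale_factor, odometer_distance_m, position_distance_m,
                        position_available, smoothing=0.1):
    """Selectively adjust the scale factor only when the second signal is present."""
    if not position_available or odometer_distance_m <= 0.0:
        return scale_factor                        # keep the previous factor
    observed = position_distance_m / odometer_distance_m
    return (1.0 - smoothing) * scale_factor + smoothing * observed

def estimate_velocity(odometer_distance_m, dt_s, scale_factor):
    """Velocity from the odometer signal, corrected by the adjusted scale factor."""
    return (odometer_distance_m / dt_s) * scale_factor

scale = 1.0
# Positioning signal present: odometer over-reports distance by ~5 %, so the
# scale factor settles near 0.95.
for _ in range(50):
    scale = update_scale_factor(scale, odometer_distance_m=1.05,
                                position_distance_m=1.0, position_available=True)
# Positioning signal lost: velocity is still corrected with the adjusted factor.
print(estimate_velocity(odometer_distance_m=2.1, dt_s=1.0, scale_factor=scale))
# ~2.0 m/s rather than the raw 2.1 m/s
```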