Abstract:
An autonomous inspector mobile platform robot for inspecting a pipe or network of pipes. The robot includes a locomotion device that enables it to progress autonomously through the pipe while accurately tracking its pose and odometry. At the same time, image data is autonomously captured to detail the interior portions of the pipe. Images are taken at periodic intervals using a wide-angle lens, and additional video images may be captured at locations of interest. Either onboard or offboard the device, each captured image is unwarped (if necessary) and combined with images of adjacent pipe sections to create a complete image of the interior features of the inspected pipe. Optional features include additional sensors and measurement devices, various communications systems to communicate with an end node or the surface, and/or image compression software.
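The combining step above can be sketched as follows. This is a minimal, hypothetical Python illustration of indexing periodic captures by odometry and resampling each to a common width so adjacent sections line up; the function names, the ring-of-pixels representation, and the `circumference` parameter are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch: stitch periodic pipe-interior captures into one mosaic,
# ordered by the robot's odometry reading at capture time.

def unwarp_ring(ring_pixels, circumference):
    """Resample a captured ring of pixels (e.g. from a wide-angle frame) to a
    fixed circumferential width so adjacent rings align in the mosaic."""
    n = len(ring_pixels)
    return [ring_pixels[int(i * n / circumference)] for i in range(circumference)]

def build_mosaic(captures, circumference=8):
    """captures: list of (odometry_mm, ring_pixels) pairs.
    Returns rows of the flattened pipe interior, ordered along the pipe."""
    mosaic = []
    for odo, ring in sorted(captures):          # odometry gives the ordering
        mosaic.append((odo, unwarp_ring(ring, circumference)))
    return mosaic
```

In practice the unwarping would be a full fisheye-to-rectilinear remapping rather than a 1-D resample, but the odometry-indexed assembly is the same idea.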
Abstract:
A sensor suite for a vehicle, the sensor suite comprising a 3D imaging system, a video camera, and one or more environmental sensors. Data from the sensor suite is combined to detect and identify threats during a structure clearing or inspection operation. Also disclosed is a method for detecting and identifying threats during a structure clearing or inspection operation. The method comprises: gathering 3D image data including object range, volume, and geometry; gathering video data in the same physical geometry as the 3D image; gathering non-visual environmental characteristic data; and combining and analyzing the gathered data to detect and identify threats.
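The combining-and-analyzing step might look something like the following minimal Python sketch. The thresholds (5 m range, 0.01 m³ volume, normalized environmental alarm level) and the per-object dictionary layout are invented for illustration only.

```python
# Hypothetical fusion rule: flag an object as a threat if it is close, sizeable,
# and moving in the co-registered video, or close while an environmental
# sensor reads above its alarm level.

def detect_threats(objects_3d, motion_flags, env_readings, env_limit=0.5):
    """objects_3d: list of dicts with 'id', 'range_m', 'volume_m3'.
    motion_flags: dict id -> bool (motion seen in co-registered video).
    env_readings: dict sensor name -> normalized reading in [0, 1].
    Returns the ids of objects flagged as potential threats."""
    env_alarm = any(v > env_limit for v in env_readings.values())
    threats = []
    for obj in objects_3d:
        close = obj["range_m"] < 5.0
        sizeable = obj["volume_m3"] > 0.01
        moving = motion_flags.get(obj["id"], False)
        if (close and sizeable and moving) or (close and env_alarm):
            threats.append(obj["id"])
    return threats
```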
Abstract:
A smart camera system provides focused images to an operator at a host computer by processing digital images at the imaging location prior to sending them to the host computer. The smart camera has a resident digital signal processor for preprocessing digital images prior to transmitting the images to the host. The preprocessing includes image feature extraction and filtering, convolution and deconvolution methods, correction of parallax and perspective image error, and image compression. Compression of the digital images in the smart camera at the imaging location permits the transmission of very high resolution color or high resolution grayscale images at real-time frame rates, such as 30 frames per second, over a high-speed serial bus to a host computer or to any other node on the network, including any remote address on the Internet.
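The onboard-compression idea can be illustrated with a toy run-length coder in Python. Run-length encoding stands in here for whatever compression the DSP actually performs; the function names and the row-of-grayscale-pixels format are assumptions for the sketch.

```python
# Toy onboard compression: run-length encode a row of grayscale pixels before
# sending it over the serial bus, and decode it at the host.

def run_length_encode(pixels):
    """Collapse consecutive equal pixel values into (value, count) runs."""
    if not pixels:
        return []
    runs, current, count = [], pixels[0], 1
    for p in pixels[1:]:
        if p == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = p, 1
    runs.append((current, count))
    return runs

def run_length_decode(runs):
    """Expand (value, count) runs back into the original pixel row."""
    return [value for value, count in runs for _ in range(count)]
```

The payoff is that a mostly uniform row (common in pipe or wall imagery) transmits as a handful of runs instead of thousands of raw samples, which is what makes real-time rates over a serial bus feasible.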
Abstract:
Methods and tools for automatically performing work within a pipe or pipe network based on sensed impedance information. A robot, which may be tethered or untethered, includes a computer controller running control software and a work tool for performing work within the pipe. With or without impedance-based calibration, the robot senses environmental and tool-based impedance characteristics and determines, using the control software, ways in which the current work performance can be altered or improved based on the impedance information. The operation of the work tool is then adjusted in accordance with the control software. Many different types of work related to the inspection, cleaning, and rehabilitation of pipes can be accomplished with the present robots, including reinstating laterals after lining, cutting or clearing debris, sealing pipe joints, and/or other heretofore manual pipe-based processes.
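One simple form such impedance-based alteration could take is a proportional feed-rate adjustment, sketched below in Python. The specific control law, the gain, and the minimum-feed clamp are illustrative assumptions, not the claimed method.

```python
# Hypothetical control rule: if sensed impedance rises above the calibrated
# baseline (tool meeting harder material or stalling), slow the feed;
# if it falls below the baseline, speed the feed back up.

def adjust_feed_rate(feed_mm_s, impedance_ohm, baseline_ohm,
                     gain=0.5, min_feed=0.1):
    """Return a new feed rate scaled by the relative impedance error,
    clamped so the tool never stops advancing entirely."""
    error = (impedance_ohm - baseline_ohm) / baseline_ohm
    new_feed = feed_mm_s * (1.0 - gain * error)
    return max(min_feed, new_feed)
```

The baseline would come from the impedance-based calibration step the abstract mentions; without calibration, a running average of recent readings could serve the same role.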
Abstract:
A system and method for controlling a remote vehicle comprises a hand-held controller including a laser generator for generating a laser beam. The hand-held controller is manipulable to aim and actuate the laser beam to designate a destination for the remote vehicle. The remote vehicle senses a reflection of the laser beam and moves toward the designated destination. The hand-held controller allows single-handed control of the remote vehicle and one or more of its payloads. A method for controlling a remote vehicle via a laser beam comprises: encoding control signals for the remote vehicle into a laser beam that is aimed and sent to a designated destination; sensing a reflection of the laser beam; decoding the control signals; and moving toward the designated destination.
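The encode/decode steps can be sketched as simple on/off pulse modulation of the beam. The two-bit command codes and the start/stop framing bits below are entirely hypothetical; the patent does not specify a modulation scheme.

```python
# Hypothetical laser modulation: '1' = beam on, '0' = beam off. A command is
# framed by fixed start ("11") and stop ("00") markers so the vehicle can
# synchronize on the sensed reflection.

COMMANDS = {"GOTO": "01", "STOP": "10"}   # invented 2-bit command codes

def encode(command):
    """Modulate a named command into a laser pulse train."""
    return "11" + COMMANDS[command] + "00"

def decode(pulse_train):
    """Recover the command name from a sensed reflection, or None."""
    body = pulse_train[2:-2]              # strip the framing markers
    for name, code in COMMANDS.items():
        if code == body:
            return name
    return None
```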
Abstract:
A portable network for connecting and utilizing functional modules to create an upgradable and reconfigurable device for controlling a remote vehicle. The portable network connects a processor configured to control a remote vehicle with recesses configured to receive functional modules.
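The plug-in architecture described above can be sketched as a small registry in Python. The class and method names, and the representation of a functional module as a plain callable, are assumptions made for illustration.

```python
# Hypothetical sketch: the processor keeps a registry of functional modules
# plugged into its recesses and dispatches control commands to all of them,
# making the controller upgradable and reconfigurable at runtime.

class ControllerNetwork:
    def __init__(self):
        self.modules = {}                 # recess/slot name -> module

    def plug_in(self, slot, module):
        """Register a functional module inserted into a recess."""
        self.modules[slot] = module

    def unplug(self, slot):
        """Remove a module; the device keeps working without it."""
        self.modules.pop(slot, None)

    def dispatch(self, command):
        """Send a command to every plugged-in module; collect their results."""
        return {slot: m(command) for slot, m in self.modules.items()
                if callable(m)}
```

Upgrading the device then amounts to plugging a new module into a free recess; no change to the processor's dispatch logic is needed.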