Abstract:
Apparatus and associated methods relate to ranging object(s) nearby an aircraft using triangulation. A light projector (28L, 28R) mounted at a projector location on the aircraft projects pulses of polarized light onto the scene external to the aircraft. The projected pulses of polarized light are polarized in a first polarization state. A camera (30L, 30R) mounted at a camera location on the aircraft has a shutter synchronized to the projector output pulse and receives a portion of the projected pulses of polarized light reflected by the object(s) in the scene and polarized in a second polarization state orthogonal to the first polarization state. Location(s) and/or range(s) of the object(s) are calculated, based on the projector location, the camera location, and the pixel location(s) upon which the portion of light is imaged.
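As an illustrative sketch of the triangulation step (the function names, pinhole-camera mapping, and angle conventions are assumptions, not taken from the abstract), the range to a reflecting point can be recovered from the projector-camera baseline and the two ray angles:

```python
import math

def pixel_to_angle(pixel_x, image_width_px, horizontal_fov_rad):
    """Viewing angle of a pixel column relative to the camera boresight for an
    idealized pinhole camera (add the camera mounting angle to reference the
    result to the projector-camera baseline)."""
    focal_px = (image_width_px / 2) / math.tan(horizontal_fov_rad / 2)
    return math.atan2(pixel_x - image_width_px / 2, focal_px)

def triangulate_range(baseline_m, projector_angle_rad, camera_angle_rad):
    """Camera-to-object range by the law of sines; both angles are measured
    from the baseline toward the object."""
    apex = math.pi - projector_angle_rad - camera_angle_rad  # angle at the object
    if apex <= 0:
        raise ValueError("rays do not converge in front of the baseline")
    return baseline_m * math.sin(projector_angle_rad) / math.sin(apex)
```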
Abstract:
Apparatus and associated methods relate to calculating position and/or range data of object(s) in a scene external to an aircraft. A light projector (34) is configured to project, from an aircraft projector location, a collimated beam of light in a controllable direction onto the scene. The light projector (34) is further configured to control the intensity of the projected light based on the controlled direction of the collimated beam of light. The reflected beam is detected by a camera (36, 38) located apart from the light projector (34). An image processor (44) is configured to use triangulation to calculate position values and/or range data of the object(s) in the scene. The image processor (44) can be further configured to identify the object(s) in the scene and to produce, based on the object(s) in the scene, one or more maps of the scene. The intensity of the collimated beam can be controlled based on the produced maps.
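A minimal sketch of direction-dependent intensity control, assuming a hypothetical per-direction range map produced by the image processor (the binning, reference range, and inverse-square power law are illustrative, not specified by the abstract):

```python
def beam_intensity(azimuth_deg, elevation_deg, range_map, max_power_w=1.0,
                   reference_range_m=10.0):
    """Choose projector output power for a given beam direction.

    range_map: dict mapping (azimuth_deg, elevation_deg) bins to the most
    recent triangulated range (meters) in that direction; entries here are
    hypothetical. Power is scaled with the square of range so irradiance at
    the reflecting surface stays roughly constant, clamped to max_power_w.
    """
    key = (round(azimuth_deg), round(elevation_deg))
    rng = range_map.get(key)
    if rng is None:
        return max_power_w            # no prior return in this direction: probe at full power
    scale = (rng / reference_range_m) ** 2
    return min(max_power_w, max_power_w * scale)
```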
Abstract:
Apparatus and associated methods relate to using an image of a fiducial (62) indicating a parking location for the aircraft (18) to provide docking guidance data to a pilot of an aircraft (18). The fiducial (62) has vertically-separated indicia (68, 70) and laterally-separated indicia (64, 66). A camera (54) having a two-dimensional array of light-sensitive pixels is configured to mount at a camera location so as to be able to capture two-dimensional images (72A-I) of a scene external to the aircraft (18). Each two-dimensional image (72A-I) includes pixel data generated by the two-dimensional array of light-sensitive pixels. A digital processor (56) identifies first and second sets of pixel coordinates corresponding to the two vertically-separated (68, 70) and the two laterally-separated (64, 66) indicia, respectively. The digital processor (56) then calculates, based at least in part on the identified first set of pixel coordinates corresponding to the two vertically-separated indicia (68, 70), a range (R) to the parking location.
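As a sketch of the range calculation, assuming a pinhole-camera model in which the known physical separation of the vertically-separated indicia (68, 70) subtends a measurable pixel separation (the function and parameter names are hypothetical):

```python
def range_from_fiducial(pixel_row_top, pixel_row_bottom,
                        indicia_separation_m, focal_length_px):
    """Estimate range R to the parking fiducial from one image.

    Under a pinhole model, the vertical pixel separation of the two
    vertically-separated indicia is focal_length_px * indicia_separation_m / R,
    so R = focal_length_px * indicia_separation_m / pixel_separation.
    """
    pixel_separation = abs(pixel_row_bottom - pixel_row_top)
    if pixel_separation == 0:
        raise ValueError("indicia are not resolved as separate pixel rows")
    return focal_length_px * indicia_separation_m / pixel_separation
```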
Abstract:
Apparatus and associated methods relate to ranging object(s) nearby an aircraft (12) using triangulation of pulses of spatially-patterned light projected upon and reflected by the object(s). The projected pulses provide rapidly-changing illumination of a spatially-patterned portion of the scene. A camera (36, 38) receives a reflected portion of the projected pulse and focuses the received portion onto a plurality of light-sensitive pixels, thereby forming a pulse image. The pulse image includes pixel data indicative of a rate of change of light intensity focused thereon exceeding a predetermined threshold. Pixel coordinates corresponding to a subset of the plurality of light-sensitive pixels indicative of the rate of change of light intensity exceeding the predetermined threshold are identified. Trajectory and/or range data of object(s) in the scene are calculated, based on a projector location, a camera location, and the identified pixel coordinates.
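One way to sketch the rate-of-change test, assuming two frames bracketing a pulse edge are available (the frame names, sampling interval, and threshold units are assumptions):

```python
import numpy as np

def pulse_pixels(frame_before, frame_during, dt_s, rate_threshold):
    """Identify pixel coordinates whose intensity rises fast enough to be
    attributed to the projected pulse.

    frame_before / frame_during: 2-D intensity arrays captured just before
    and during the pulse. Pixels whose rate of change of intensity (counts
    per second) exceeds rate_threshold are returned as (row, column) pairs.
    """
    rate = (frame_during.astype(float) - frame_before.astype(float)) / dt_s
    rows, cols = np.nonzero(rate > rate_threshold)
    return list(zip(rows.tolist(), cols.tolist()))
```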
Abstract:
Apparatus and associated methods relate to ranging an object nearby an aircraft (12) by triangulation of spatially-patterned light projected upon and reflected from the object. The spatially-patterned light can have a wavelength corresponding to infrared light and/or to an atmospheric absorption band. In some embodiments, images of the object are captured both with and without illumination by the spatially-patterned light. A difference between these two images can be used to isolate the spatially-patterned light. The two images can also be used to identify pixel boundaries of the object and to calculate ranges of portions of the object corresponding to pixels imaging these portions. For pixels imaging reflections of the spatially-patterned light, triangulation can be used to calculate range. For pixels not imaging reflections of the spatially-patterned light, ranges can be calculated using one or more of the triangulation-based ranges corresponding to nearby pixels.
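A minimal sketch of the difference-image isolation and the fill-in of ranges for pixels not imaging the pattern; the row-wise nearest-neighbor fill is a deliberately simple stand-in for whatever interpolation an embodiment might use, and all names here are illustrative:

```python
import numpy as np

def isolate_pattern(image_lit, image_unlit, threshold):
    """Boolean mask of pixels dominated by the projected pattern."""
    diff = image_lit.astype(float) - image_unlit.astype(float)
    return diff > threshold

def fill_ranges(object_mask, pattern_mask, triangulated_range):
    """Assign a range to every object pixel.

    Pattern pixels keep their triangulated range; remaining object pixels
    take the range of the nearest pattern pixel in the same row.
    """
    ranges = np.full(object_mask.shape, np.nan)
    ranges[pattern_mask] = triangulated_range[pattern_mask]
    for r in range(object_mask.shape[0]):
        cols = np.nonzero(pattern_mask[r])[0]
        if cols.size == 0:
            continue
        for c in np.nonzero(object_mask[r] & ~pattern_mask[r])[0]:
            nearest = cols[np.argmin(np.abs(cols - c))]
            ranges[r, c] = ranges[r, nearest]
    return ranges
```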
Abstract:
Apparatus and associated methods relate to ranging an object (70) in a scene external to an aircraft (12). A light projector (34) and two cameras (36, 38) are mounted on the aircraft (12), the cameras (36, 38) at two locations distinct from one another. The light projector (34) and the two cameras (36, 38) are coordinated so that the light projector (34) projects a linear-patterned beam of light while the cameras (36, 38) simultaneously capture a row or column of image data corresponding to an active row or column of pixels upon which the linear-patterned beam of light, reflected by the scene, is focused. Range to the object (70) is calculated using triangulation based on the captured rows or columns of image data and the distinct locations of the two cameras (36, 38) from which the image data are simultaneously captured.
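For rectified, parallel-axis cameras, two-camera triangulation reduces to the familiar disparity relation; this sketch assumes that geometry, which the abstract does not specify:

```python
def stereo_range(col_left_px, col_right_px, baseline_m, focal_length_px):
    """Range from the disparity between the two simultaneously captured rows
    of image data (rectified, parallel-axis stereo assumed).

    R = focal_length_px * baseline_m / disparity, where disparity is the
    pixel-column difference of the same reflected feature in the two cameras.
    """
    disparity = abs(col_left_px - col_right_px)
    if disparity == 0:
        return float("inf")          # feature effectively at infinity
    return focal_length_px * baseline_m / disparity
```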
Abstract:
Apparatus and associated methods relate to ranging object(s) (54) nearby an aircraft (12) using triangulation of pulses of linearly-patterned light projected upon and reflected by the object(s) (54). A light projector (28) projects the pulses of linearly-patterned light in a controllable direction onto the scene external to the aircraft (12). A reflected portion of the projected pulses is focused onto a selected row or column of light-sensitive pixels of a two-dimensional array, thereby forming a row or column of image data (541, 581). The direction of the projected light and the row or column of pixels selected are coordinated so that the portion of the projected pulses reflected by the scene is received by the camera (30) and focused onto the selected row or column of light-sensitive pixels. Location(s) and/or range(s) of object(s) (54) in the scene are calculated, based on a projector location, a camera location, and the row or column of image data (541, 581).
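As a sketch of the coordination between projection direction and the selected pixel row, assuming the projector (28) and camera (30) are separated only horizontally so that a line projected at a given elevation falls on a single camera row regardless of range (this geometry, and every name below, is an assumption):

```python
import math

def expected_row(projection_elevation_rad, camera_boresight_elevation_rad,
                 image_height_px, vertical_fov_rad):
    """Camera row on which a horizontally projected line is expected, so
    only that row need be read out when the pulse is projected."""
    focal_px = (image_height_px / 2) / math.tan(vertical_fov_rad / 2)
    offset = math.tan(projection_elevation_rad -
                      camera_boresight_elevation_rad) * focal_px
    return int(round(image_height_px / 2 - offset))
```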
Abstract:
Apparatus and associated methods relate to a seeker for a Semi-Active Laser (SAL) guided missile. The seeker has a Short-Wave InfraRed (SWIR) camera (24) and a Pulse Timing Logic (PTL) detector (26). The PTL detector has a SWIR photodetector (38) axially aligned with a lens stack of the SWIR camera. The SWIR photodetector is configured to detect a sequence of SWIR pulses generated by a SAL target designator and reflected by a designated target. The PTL detector has a pulse timer (40) configured to identify a sequence pattern of the detected sequence of SWIR pulses, and to predict the timing of the next SWIR pulse in the identified sequence pattern so as to synchronize exposure of the SWIR camera to capture a next image of the designated target at the predicted timing of the next SWIR pulse. Such exposure timing can advantageously improve the signal-to-noise ratio of the next image.
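A minimal sketch of the pulse-timer prediction, assuming the sequence pattern reduces to a stable pulse repetition interval estimated from detected pulse timestamps (the seeker's actual sequence identification may be richer):

```python
def predict_next_pulse(pulse_times_s):
    """Predict the arrival time of the next designator pulse.

    pulse_times_s: list of timestamps (seconds) of previously detected SWIR
    pulses. The pulse repetition interval is estimated as the median spacing,
    and the next pulse is predicted one interval after the last detection.
    """
    if len(pulse_times_s) < 2:
        raise ValueError("need at least two pulses to estimate the interval")
    intervals = sorted(b - a for a, b in zip(pulse_times_s, pulse_times_s[1:]))
    pri = intervals[len(intervals) // 2]      # median pulse repetition interval
    return pulse_times_s[-1] + pri
```

The camera exposure would then be opened shortly before the predicted time and closed shortly after it, so that the frame integrates the designator return rather than ambient background.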