Abstract:
An optical device includes: a first light emitting device that emits, toward a target area, a plurality of line pattern lights spaced apart from each other; at least one sensor device that detects received light, i.e., the emitted line pattern light reflected from the target area; and a controller that calculates a parallax distance from a pattern of the detected received light and acquires depth information of the target area by using the calculated parallax distance, wherein the first light emitting device includes: at least one light emitting unit that includes at least one light emitting element and emits light of a specific wavelength; and a first optical member that shapes the light emitted from the at least one light emitting unit into at least one line pattern light.
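As a rough illustration of the depth calculation described above, the sketch below converts a measured parallax (disparity) for one detected line-pattern feature into depth via standard triangulation. The baseline, focal length, and function name are illustrative assumptions, not values from the abstract.

```python
# Minimal triangulation sketch: depth from parallax distance.
# baseline_mm, focal_length_px, and the simple pinhole model are
# illustrative assumptions, not specifics from the abstract.

def depth_from_parallax(parallax_px: float,
                        baseline_mm: float = 50.0,
                        focal_length_px: float = 800.0) -> float:
    """Return depth in mm for one detected line-pattern feature."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    # Pinhole-camera triangulation: depth = f * B / disparity.
    return focal_length_px * baseline_mm / parallax_px

# Example: depth for each detected line, given its measured parallax.
parallaxes_px = [12.5, 8.0, 5.2]
depth_map = [depth_from_parallax(p) for p in parallaxes_px]
print(depth_map)  # [3200.0, 5000.0, ~7692.3] mm
```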
Abstract:
A camera apparatus and an electronic device including the same are disclosed. The camera apparatus according to the present disclosure includes: a light source; a lens module configured to output light from the light source to the outside; an actuator configured to move the light source or the lens module; and an image sensor configured to convert external light into an electrical signal. Accordingly, a high-quality 3D volume can be implemented by dynamically moving the output light.
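One plausible reading of "dynamically moving the output light" is that the actuator shifts the projected pattern between frames so that successive depth frames can be combined into a denser 3D volume. The sketch below is only that interpretation; the frame sizes, number of shifts, and column-interleaving fusion are invented for illustration.

```python
# Hypothetical sketch: if the actuator shifts the projected pattern by a
# fraction of a sample between frames, the per-frame depth samples can be
# interleaved into a horizontally denser grid. All details here are
# illustrative assumptions, not the disclosed method.
import numpy as np

def interleave_shifted_depth(frames: list[np.ndarray]) -> np.ndarray:
    """Interleave N equally shifted depth frames column by column."""
    n = len(frames)
    h, w = frames[0].shape
    dense = np.empty((h, w * n), dtype=frames[0].dtype)
    for i, frame in enumerate(frames):
        dense[:, i::n] = frame  # frame i fills every n-th column
    return dense

frames = [np.full((2, 3), fill) for fill in (1.0, 2.0, 3.0)]
print(interleave_shifted_depth(frames))
# [[1. 2. 3. 1. 2. 3. 1. 2. 3.]
#  [1. 2. 3. 1. 2. 3. 1. 2. 3.]]
```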
Abstract:
Disclosed are a mobile terminal having a plurality of light emitting devices and a method for controlling the same. The mobile terminal includes: a camera; a light emitting portion including a plurality of light emitting units; and a controller configured to control the light emitting portion to emit light such that depth information of an image received through the camera is extracted, wherein the controller determines the number of light emitting units, among the plurality of light emitting units, that emit light, based on a distance between the camera and a subject corresponding to the image.
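The abstract does not detail the selection logic; the sketch below is only a hedged illustration of choosing how many light emitting units to drive as a function of the estimated subject distance, with made-up thresholds and a made-up "farther subject, more emitters" policy.

```python
# Hedged illustration: pick how many of the light emitting units to drive
# based on subject distance. The thresholds and the monotonic policy are
# assumptions, not taken from the abstract.

def active_emitter_count(distance_m: float, total_units: int = 8) -> int:
    """Return how many emitters to fire for a subject at distance_m."""
    # Illustrative bands: nearer subjects need fewer emitters.
    if distance_m < 0.5:
        share = 0.25
    elif distance_m < 1.5:
        share = 0.5
    elif distance_m < 3.0:
        share = 0.75
    else:
        share = 1.0
    return max(1, round(total_units * share))

for d in (0.3, 1.0, 2.0, 5.0):
    print(d, active_emitter_count(d))  # 2, 4, 6, 8
```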
Abstract:
The present invention relates to an RGB-IR sensor, and to a method and an apparatus for obtaining a 3D image by using the same. The RGB-IR sensor according to the present invention comprises: a first pixel basic unit including one each of R, G, B and IR pixels; and a second pixel basic unit in which the R, G, B and IR pixels are arranged in a different order from that in the first pixel basic unit, wherein the RGB-IR sensor is formed by alternately arranging the first pixel basic unit and the second pixel basic unit in a horizontal direction, and wherein the R, G, B and IR pixel arrangements in the first pixel basic unit and the second pixel basic unit are determined so that the IR pixels in the RGB-IR sensor are not equidistant from one another.
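To make the arrangement concrete, the sketch below tiles two hypothetical 2x2 pixel basic units alternately in the horizontal direction. The specific channel orders are assumptions chosen only to demonstrate IR pixels that are not equally spaced; they are not the orders claimed in the patent.

```python
# Illustrative RGB-IR mosaic: two 2x2 basic units with different channel
# orders, tiled alternately in the horizontal direction. The exact orders
# are assumptions; they merely show IR pixels with unequal spacing.

UNIT_A = [["R", "G"],
          ["IR", "B"]]
UNIT_B = [["G", "R"],
          ["B", "IR"]]

def build_mosaic(units_across: int, units_down: int) -> list[list[str]]:
    rows: list[list[str]] = []
    for uy in range(units_down):
        for py in range(2):
            row: list[str] = []
            for ux in range(units_across):
                unit = UNIT_A if ux % 2 == 0 else UNIT_B  # alternate horizontally
                row.extend(unit[py])
            rows.append(row)
    return rows

for row in build_mosaic(units_across=3, units_down=1):
    print(" ".join(f"{p:>2}" for p in row))
#  R  G  G  R  R  G
# IR  B  B IR IR  B   <- IR columns 0, 3, 4: spacing is not uniform
```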
Abstract:
The present invention relates to a mobile terminal including a lighting device and a method for controlling the same. A mobile terminal according to an embodiment of the present invention includes: a lighting device; a camera; and a controller configured to capture a 3D image using the camera, wherein the lighting device includes a pattern light source configured to emit light having a predetermined pattern and a surface light source configured to emit uniform light, and wherein the controller controls the lighting device so that the pattern light source and the surface light source emit light alternately.
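A minimal sketch of the alternating control described above, assuming a frame-synchronized loop in which even frames use the pattern light source and odd frames use the surface light source; the class, loop, and scheduling are hypothetical, since the abstract only states that the two sources emit light alternately.

```python
# Minimal sketch of alternating the two light sources per frame. The
# LightSource class and even/odd scheduling are hypothetical.

class LightSource:
    def __init__(self, name: str):
        self.name = name

    def fire(self) -> None:
        print(f"{self.name} emitting")

pattern_source = LightSource("pattern light source")   # structured light for depth
surface_source = LightSource("surface light source")   # uniform light for texture

def capture_sequence(num_frames: int) -> None:
    for frame in range(num_frames):
        source = pattern_source if frame % 2 == 0 else surface_source
        source.fire()
        # a frame capture synchronized with the active source would run here

capture_sequence(4)
# pattern light source emitting
# surface light source emitting
# pattern light source emitting
# surface light source emitting
```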
Abstract:
The present invention provides an apparatus and a method for obtaining a 3D image. The apparatus for obtaining the 3D image, according to one embodiment of the present invention, comprises: a light transmitting portion for emitting infrared (IR) structured light onto a recognized object; a light receiving portion comprising an RGB-IR sensor for receiving infrared light and visible light reflected from the recognized object; a processor for obtaining 3D image information, including depth information and a visible light image of the recognized object, by using the infrared light and the visible light received by the light receiving portion; and a lighting portion for controlling a lighting cycle of the IR structured light. The apparatus further comprises an image recovery portion for recovering a 3D image of the recognized object by using the 3D image information obtained by the processor, and a display portion for presenting the recovered 3D image on a visual screen. By means of this method and apparatus for obtaining the 3D image, the brightness of ambient light can be adapted to so as to eliminate interference at the RGB-IR sensor. As a result, more accurate 3D images can be obtained regardless of the time or place of image capture, such as night, day, a dark space, or a bright space.
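The abstract does not spell out how the lighting cycle adapts to ambient brightness; the sketch below is a hedged guess in which the IR structured-light duty cycle is raised as measured ambient brightness increases, with invented thresholds and duty-cycle values.

```python
# Hedged guess at the adaptive lighting-cycle idea: raise the IR
# structured-light emission duty cycle as ambient brightness increases so
# the pattern stays detectable. Thresholds and values are invented.

def ir_duty_cycle(ambient_lux: float) -> float:
    """Return the fraction of each lighting cycle spent emitting IR structured light."""
    if ambient_lux < 10:       # dark room / night
        return 0.25
    if ambient_lux < 1000:     # typical indoor lighting
        return 0.5
    return 0.9                 # bright daylight: emit almost continuously

for lux in (1, 300, 20000):
    print(lux, ir_duty_cycle(lux))  # 0.25, 0.5, 0.9
```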
Abstract:
The present disclosure relates to a mobile terminal having a lighting unit and a control method thereof. A mobile terminal according to one implementation includes a lighting unit, a camera, and a controller configured to control the lighting unit to irradiate illumination light onto a subject to be captured through the camera, and to control the camera to capture the subject irradiated with the illumination light, wherein the controller is configured to determine a material of the subject based on information related to the illumination light irradiated onto the subject, as captured through the camera.
Abstract:
The present invention relates to a mobile terminal comprising a lighting device. The lighting device according to one embodiment of the present invention comprises multiple light-emitting elements and a diffractive optical element (DOE) for diffracting a part of the light output from each of the multiple light-emitting elements, wherein the light that has been output from the multiple light-emitting elements and has passed through the diffractive optical element comprises multiple first kinds of light not diffracted by the diffractive optical element and multiple second kinds of light diffracted by the diffractive optical element, and the diffractive optical element diffracts the part of the light output from the multiple light-emitting elements such that at least some of the multiple second kinds of light are radiated into an area formed by connecting the multiple first kinds of light.
Abstract:
An autonomous driving apparatus and a vehicle including the same are disclosed. The autonomous driving apparatus includes a plurality of cameras and a processor configured to verify an object around a vehicle based on a plurality of images acquired from the plurality of cameras, calculate a hazard severity of the object based on at least one of a movement speed, direction, distance, and size of the object, and output hazard severity information at a level corresponding to the calculated hazard severity when the speed of the vehicle is lower than or equal to a first speed or the vehicle is reversing. Thereby, hazard information may be provided based on verification of objects around the vehicle.
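As a hedged illustration of the severity computation described above (the abstract names the inputs and the output gate but not the formula), the sketch below combines an object's movement speed, distance, and size into a severity level and only reports it when the vehicle is slow or reversing; the weights, thresholds, and level definitions are invented.

```python
# Illustrative hazard-severity sketch. Inputs (speed, distance, size of the
# object) and the output condition (vehicle slow or reversing) follow the
# abstract; the weighting, thresholds, and levels are invented.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    speed_mps: float      # object movement speed
    distance_m: float     # distance from the vehicle
    size_m2: float        # apparent size of the object

def hazard_level(obj: DetectedObject) -> int:
    # Closer, faster, larger objects score higher (hypothetical weighting).
    score = obj.speed_mps * 2.0 + obj.size_m2 * 0.5 + 10.0 / max(obj.distance_m, 0.1)
    if score > 15:
        return 3   # high
    if score > 8:
        return 2   # medium
    return 1       # low

def report_hazard(obj: DetectedObject, vehicle_speed_kmh: float,
                  reversing: bool, first_speed_kmh: float = 10.0):
    # Output only when the vehicle is slow or reversing, as in the abstract.
    if vehicle_speed_kmh <= first_speed_kmh or reversing:
        return hazard_level(obj)
    return None

obj = DetectedObject(speed_mps=1.5, distance_m=2.0, size_m2=1.0)
print(report_hazard(obj, vehicle_speed_kmh=5.0, reversing=False))  # 2
```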