Abstract:
At least one example embodiment discloses an apparatus for sensing an object, the apparatus including a display, a light source configured to emit light toward the display, a pattern layer on the display, the pattern layer including a first pattern for identifying a position on the display, a camera configured to generate a pattern image based on light reflected off of the pattern layer, and a controller configured to adjust an operation of at least one of the light source and the camera based on a second pattern displayed in the pattern image.
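The feedback loop described above can be sketched in a few lines: compare the pattern detected in the camera's pattern image against the expected first pattern, and adjust the light source when the match is weak. The cell-based pattern representation, the function names, and the threshold values are illustrative assumptions, not the patented method.

```python
# Hypothetical sketch of a controller adjusting the light source based on
# how well the detected (second) pattern matches the expected (first) pattern.
def pattern_match_score(expected, detected):
    # Fraction of pattern cells that agree between expected and detected.
    hits = sum(1 for e, d in zip(expected, detected) if e == d)
    return hits / len(expected)

def adjust_light(intensity, expected, detected, threshold=0.8, step=0.1):
    # Raise light-source intensity while the detected pattern is too weak.
    if pattern_match_score(expected, detected) < threshold:
        return min(1.0, intensity + step)
    return intensity

expected = [1, 0, 1, 1, 0, 1]
weak = [1, 0, 0, 0, 0, 1]   # only 4 of 6 cells match
new_intensity = adjust_light(0.5, expected, weak)
```

A real controller could equally adjust camera exposure instead of light intensity; the abstract covers either operation.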
Abstract:
A user authentication method includes extracting a facial landmark from each of a first input image and a second input image; generating a first sparse code of the facial landmark extracted from the first input image and a second sparse code of the facial landmark extracted from the second input image; and determining whether a user is to be authenticated based on the first sparse code of the first input image and the second sparse code of the second input image.
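As a rough sketch of the comparison step above, one could sparse-code each landmark vector and match the two codes by similarity. The top-k thresholding rule, the cosine-similarity test, and the names `sparse_code` and `authenticate` are assumptions for illustration; the actual sparse-coding and decision rules are defined in the claims, not here.

```python
import math

def sparse_code(landmark, k=3):
    # Keep only the k largest-magnitude coefficients; zero out the rest.
    top = set(sorted(range(len(landmark)),
                     key=lambda i: abs(landmark[i]), reverse=True)[:k])
    return [v if i in top else 0.0 for i, v in enumerate(landmark)]

def authenticate(code_a, code_b, threshold=0.9):
    # Accept the user when the cosine similarity of the two sparse codes
    # meets the threshold.
    dot = sum(a * b for a, b in zip(code_a, code_b))
    na = math.sqrt(sum(a * a for a in code_a))
    nb = math.sqrt(sum(b * b for b in code_b))
    if na == 0 or nb == 0:
        return False
    return dot / (na * nb) >= threshold

first = sparse_code([0.9, 0.1, 0.8, 0.05, 0.7])
second = sparse_code([0.85, 0.2, 0.75, 0.1, 0.65])
result = authenticate(first, second)
```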
Abstract:
A display apparatus and method may be used to estimate a depth distance from an external object to a display panel of the display apparatus. The display apparatus may acquire a plurality of images by detecting lights that are input from an external object and passed through apertures formed in a display panel, may generate one or more refocused images, and may calculate a depth from the external object to the display panel using the acquired plurality of images and the one or more refocused images.
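Images captured through separate apertures differ by a disparity that encodes depth. A minimal sketch, assuming a pinhole model and triangulation (the abstract's actual depth calculation via refocused images is more involved):

```python
# Classic triangulation: depth = focal_length * baseline / disparity.
# The parameter names and units are illustrative assumptions.
def depth_from_disparity(disparity_px, baseline_mm, focal_px):
    if disparity_px <= 0:
        raise ValueError("object at infinity or invalid disparity")
    return focal_px * baseline_mm / disparity_px

# A feature seen 8 px apart between two apertures 10 mm apart,
# with an effective focal length of 800 px:
depth = depth_from_disparity(8, 10.0, 800)  # 1000.0 mm
```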
Abstract:
An apparatus for processing a depth image using a relative angle between an image sensor and a target object includes an object image extractor to extract an object image from the depth image, a relative angle calculator to calculate a relative angle between an image sensor used to photograph the depth image and a target object corresponding to the object image, and an object image rotator to rotate the object image based on the relative angle and a reference angle.
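The rotation step above can be illustrated on a point set: rotate the extracted object by the difference between the reference angle and the calculated relative angle, so the object appears upright regardless of the sensor's tilt. Representing the object image as 2D points and the function name are assumptions for this sketch.

```python
import math

def rotate_points(points, relative_deg, reference_deg=0.0):
    # Rotate each (x, y) by (reference - relative) degrees about the origin.
    theta = math.radians(reference_deg - relative_deg)
    c, s = math.cos(theta), math.sin(theta)
    return [(x * c - y * s, x * s + y * c) for x, y in points]

# An object photographed at a 90-degree relative angle, aligned back
# to the 0-degree reference:
upright = rotate_points([(0.0, 1.0)], relative_deg=90.0)
```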
Abstract:
A display device, and a method of operating and manufacturing the display device, may receive input light from an object to be scanned that is positioned in front of a display for displaying an image, and may perform scanning of the object.
Abstract:
A method and apparatus for a user interface using a gaze interaction are disclosed. The method for the user interface using the gaze interaction may include obtaining an image including eyes of a user, estimating a gaze position of the user using the image including the eyes of the user, and determining whether to activate a gaze adjustment function for controlling a device by a gaze of the user, based on the gaze position of the user with respect to at least one toggle area on a display.
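The activation decision above reduces to a containment test: the gaze adjustment function turns on only when the estimated gaze position falls inside a toggle area on the display. The rectangle representation, pixel coordinates, and function names below are illustrative assumptions.

```python
def in_area(gaze, area):
    # area = (x, y, width, height) in display pixels; gaze = (gx, gy).
    x, y, w, h = area
    gx, gy = gaze
    return x <= gx < x + w and y <= gy < y + h

def should_activate(gaze, toggle_areas):
    # Activate gaze control when the gaze lands in any toggle area.
    return any(in_area(gaze, a) for a in toggle_areas)

# Two hypothetical toggle areas in the top corners of a 1920-wide display:
toggles = [(0, 0, 100, 50), (1820, 0, 100, 50)]
active = should_activate((40, 25), toggles)
```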
Abstract:
A display apparatus includes a sensor panel configured to sense a first incident light, which is incident to the sensor panel subsequent to passing through a display, and configured to at least partially block transmission of light through the sensor panel. The display apparatus further includes a controller configured to selectively control the display to pass the first incident light toward the sensor panel, and configured to control the display to pass light out of the display to display an image. The sensor panel includes a first sensor unit and a second sensor unit configured to sense the first incident light, and the first sensor unit and the second sensor unit include respective unit color filters having different colors.
Abstract:
A method of obtaining depth information and a display apparatus may adjust a sensor area of a sensor panel based on a distance from an object, and may obtain depth information of the object based on the adjusted sensor area.
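One plausible reading of the adjustment above: the sensed area scales with the object's distance so the object's light footprint stays covered. The inverse scaling rule, the reference distance, and the names below are assumptions for illustration, not the patented formula.

```python
def adjusted_sensor_area(base_area_px, distance_mm, reference_mm=300.0):
    # Scale the sensed area inversely with distance, clamped to the
    # full sensor area for near objects.
    scale = reference_mm / max(distance_mm, 1.0)
    return min(base_area_px, int(base_area_px * scale))

near = adjusted_sensor_area(10000, 150.0)   # closer than the reference
far = adjusted_sensor_area(10000, 600.0)    # farther than the reference
```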