Abstract:
A power-saving sensing module includes a light source, a first and a second sensor, a first and a second detection unit, and a controller. The first sensor detects a touch of an external object to generate a first sensing signal corresponding to the touch. The first detection unit generates a touch signal corresponding to the first sensing signal. The second sensor generates a second sensing signal corresponding to the external object in response to a light ray emitted by the light source. When the touch signal is greater than a touch threshold value, the second detection unit outputs a displacement signal corresponding to the second sensing signal. The controller outputs a control signal in response to the touch signal of the first detection unit and the touch threshold value, so that the second detection unit operates in a dormant state or a sensing state in response to the control signal.
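The power-saving behavior described above can be sketched as a tiny state machine: the controller compares the touch signal against the touch threshold and the second detection unit only reports displacement while in the sensing state. This is a minimal illustrative sketch, not the patented circuit; all names (`control_signal`, `SecondDetectionUnit`, the threshold value) are assumptions.

```python
# Illustrative sketch of the controller/second-detection-unit interaction.
# All identifiers and threshold values are assumptions for illustration.

DORMANT, SENSING = "dormant", "sensing"

def control_signal(touch_signal, touch_threshold):
    """Return the state the second detection unit should enter."""
    return SENSING if touch_signal > touch_threshold else DORMANT

class SecondDetectionUnit:
    def __init__(self):
        self.state = DORMANT

    def apply(self, signal):
        # The unit changes state in response to the controller's signal.
        self.state = signal

    def output_displacement(self, second_sensing_signal):
        # A displacement signal is only output while sensing.
        if self.state == SENSING:
            return second_sensing_signal
        return None
```

While the touch signal stays at or below the threshold, the unit remains dormant and produces no displacement output, which is where the power saving comes from.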
Abstract:
An electronic apparatus includes a circuit system, a camera sensing circuit, and an object sensing circuit. The circuit system is utilized for controlling an operation of the electronic apparatus. The camera sensing circuit is coupled to the circuit system and utilized for sensing at least a portion of a portrait of a user. The object sensing circuit is coupled to the circuit system and utilized for sensing whether any object is near the electronic apparatus. The operation of the object sensing circuit is different from the operation of the camera sensing circuit. The camera sensing circuit is used for determining whether to notify the circuit system to switch from a first operation mode to a second operation mode. The object sensing circuit is used for determining whether to notify the circuit system to switch from the second operation mode to the first operation mode.
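The mode-switching protocol above can be pictured as a two-state machine in which each sensing circuit triggers one direction of the switch. This is a toy sketch under stated assumptions; the mode names and trigger conditions are invented for illustration and are not the patented design.

```python
# Toy state machine for the described mode switching; trigger semantics
# (portrait detected, object near) are assumptions for illustration.

FIRST, SECOND = "first", "second"

class CircuitSystem:
    def __init__(self):
        self.mode = FIRST

    def notify_from_camera(self, portrait_detected):
        # The camera sensing circuit decides the first -> second switch.
        if self.mode == FIRST and portrait_detected:
            self.mode = SECOND

    def notify_from_object_sensor(self, object_near):
        # The object sensing circuit decides the second -> first switch.
        if self.mode == SECOND and object_near:
            self.mode = FIRST
```

Splitting the two transitions across two different circuits matches the abstract's point that the circuits operate differently and each governs only one direction of the switch.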
Abstract:
A user recognition and confirmation device includes an image sensing unit, a face recognition unit, a display unit and an expression recognition unit. The image sensing unit captures a first image frame and a second image frame. The face recognition unit is configured to recognize a user ID according to the first image frame. The display unit is configured to show ID information of the user ID. The expression recognition unit is configured to confirm a user expression according to the second image frame and output a confirm signal. There is further provided a user recognition and confirmation method and a central control system for vehicles.
Abstract:
There is provided an optical navigation device including an image sensor, a processing unit, a storage unit and an output unit. The image sensor is configured to successively capture images. The processing unit is configured to calculate a current displacement according to the images and to compare the current displacement or an accumulated displacement with a threshold so as to determine an outputted displacement. The storage unit is configured to save the accumulated displacement. The output unit is configured to output the outputted displacement at a report rate.
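One plausible reading of the compare-and-accumulate step is that sub-threshold displacements are stored rather than output, and released once the accumulated value reaches the threshold. The sketch below illustrates that reading; the function name and the decision to clear the accumulator on output are assumptions, not the patented implementation.

```python
# Hedged sketch: accumulate small displacements and only output once the
# accumulated value reaches a threshold; names are illustrative assumptions.

def report(current_displacement, accumulated, threshold):
    """Return (outputted_displacement, new_accumulated)."""
    total = accumulated + current_displacement
    if abs(total) >= threshold:
        return total, 0   # output the whole displacement and clear storage
    return 0, total       # hold the displacement for a later report
```

Called once per report period, this would suppress jitter below the threshold while preserving the total travel, since held displacement is never discarded.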
Abstract:
Disclosed are a distance measuring method and a distance measuring apparatus. During distance measurement, an image of a projection point on an object is obtained. If the location of the center of gravity of the image is within a first segment, a calculating unit calculates the distance between the object and the distance measuring apparatus corresponding to the projection point according to a first mapping relationship and the center-of-gravity location. If the location of the center of gravity of the image is within a second segment, the calculating unit calculates the distance according to a second mapping relationship and the center-of-gravity location.
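The segment-dependent mapping can be sketched as a piecewise function of the centroid position: which mapping is applied depends on which side of a segment boundary the centroid falls. The two linear mappings and the boundary below are made-up example values, not the mappings of the disclosed apparatus.

```python
# Hedged sketch of the two-segment centroid-to-distance mapping.
# The boundary and both linear mappings are invented example values.

def distance_from_centroid(centroid, boundary=100.0):
    """Map a centroid position to a distance via one of two mappings."""
    if centroid < boundary:              # first segment
        return 2.0 * centroid + 10.0     # first mapping relationship
    return 1.5 * centroid + 60.0         # second mapping relationship
```

Using two mappings lets each segment of the sensor be calibrated separately, which is useful when a single relationship cannot fit the whole measurement range accurately.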
Abstract:
An optical touch device includes a sensing area, at least one light source assembly and a light sensing component. The light source assembly is disposed beside the sensing area and includes a plurality of first point light sources configured to sequentially emit a first beam into the sensing area. The light sensing component has a field of view covering the entire sensing area and is configured to sense the first beams. A light source assembly and a display module used in the optical touch device are also disclosed.
Abstract:
There is provided a pointing system including an image sensor, a plurality of reference marks and a processing unit. The image sensor is configured to capture image frames containing at least one reference mark image of the reference marks. The processing unit is configured to recognize the number of reference mark images and calculate an aiming point coordinate according to a positioning algorithm associated with that number.
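Selecting a positioning algorithm by the number of visible reference mark images can be sketched as a simple dispatch table. The stub algorithms below (midpoint for two marks, pass-through for one) are placeholder assumptions purely to show the dispatch structure, not the system's actual positioning math.

```python
# Hedged sketch: dispatch on how many reference mark images were found.
# Both algorithms are placeholder stubs, not the patented computations.

def aim_with_one_mark(marks):
    # With a single mark, fall back to that mark's position (assumed).
    (x, y), = marks
    return (x, y)

def aim_with_two_marks(marks):
    # With two marks, use their midpoint as the aiming point (assumed).
    (x1, y1), (x2, y2) = marks
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def aiming_point(marks):
    """Choose a positioning algorithm by the number of mark images."""
    algorithms = {1: aim_with_one_mark, 2: aim_with_two_marks}
    return algorithms[len(marks)](marks)
```

The point of the dispatch is robustness: the system can still compute an aiming point when some reference marks fall outside the captured frame, at the cost of using a less constrained algorithm.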
Abstract:
There is provided an optical touch system including at least one lighting unit, at least one image sensing module and a processing unit. The image sensing module is configured to capture light of a pointer and the lighting unit to generate a two-dimensional image, and to convert the entire two-dimensional image into a one-dimensional feature. The processing unit positions the pointer according to the one-dimensional feature.
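One plausible reading of converting an entire two-dimensional image into a one-dimensional feature is a column projection: collapse each image column into a single value and locate the pointer at the peak. The choice of a sum projection and peak search below is an assumption for illustration, not necessarily the system's conversion.

```python
# Hedged sketch: project a 2-D image to a 1-D feature by summing columns,
# then take the brightest column as the pointer position (assumed method).

def one_dimensional_feature(image):
    """Sum every column of a 2-D image (list of rows) into a 1-D feature."""
    return [sum(column) for column in zip(*image)]

def position_pointer(feature):
    """Return the index of the strongest feature value."""
    return max(range(len(feature)), key=lambda i: feature[i])
```

Reducing the image to one dimension before positioning cuts the amount of data the processing unit must handle from width x height values to just width values per frame.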
Abstract:
A portable interactive electronic apparatus includes a shell and a touch control panel having a cover plate. The cover plate includes a first surface area and a second surface area, and the touch control panel is positioned on the shell. The first surface area is utilized for sensing a touch of a user's finger, and the second surface area is utilized for leading liquid components out from the cover plate.
Abstract:
A device for determining a gesture includes a light emitting unit, an image sensing device and a processing circuit. The light emitting unit emits a light beam. The image sensing device captures an image of a hand reflecting the light beam. The processing circuit obtains the image and determines a gesture of the hand by performing an operation on the image; wherein the operation includes: selecting pixels in the image having a brightness greater than or equal to a brightness threshold; dividing the selected pixels into groups; and determining the gesture of the hand according to the number of groups of divided pixels. A method for determining a gesture and an operation method of the aforementioned device are also provided.
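The select-divide-count operation above can be sketched as connected-component counting on the thresholded image: keep bright pixels, split them into connected groups, and use the group count (for example, as a finger count) to decide the gesture. The 4-connectivity choice and the breadth-first grouping are assumptions for illustration.

```python
# Hedged sketch of the described operation: threshold, group, count.
# 4-connected BFS grouping is an assumed, illustrative choice.
from collections import deque

def count_bright_groups(image, threshold):
    """Count 4-connected groups of pixels with brightness >= threshold."""
    rows, cols = len(image), len(image[0])
    seen, groups = set(), 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and (r, c) not in seen:
                groups += 1                      # a new group starts here
                queue = deque([(r, c)])
                seen.add((r, c))
                while queue:                     # flood the whole group
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
    return groups
```

A gesture table keyed on the returned count (e.g. two groups for a two-finger gesture) would then complete the determination step, though the abstract does not specify that mapping.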