Abstract:
An optical sensing method capable of changing a sensing direction of an optical sensing module is applied to a portable device that includes a housing, an optical sensing module and an optical diverting mechanism. The optical sensing module is disposed inside the housing. The optical sensing module includes an optical emitter adapted to emit an optical sensing signal out of the housing and an optical receiver adapted to receive an optical modulated signal reflected from an external object. The optical diverting mechanism is adjacent to the optical sensing module. The optical sensing signal is projected directly in a first direction when it is not diverted by the optical diverting mechanism, and is transmitted in a second direction different from the first direction when it is diverted by the optical diverting mechanism.
Abstract:
A data processing system has a first data processing apparatus and a second data processing apparatus. The first data processing apparatus has at least a camera sensor, a compressor and an output interface. The camera sensor generates first input multimedia data. The compressor compresses the first input multimedia data into compressed multimedia data. The output interface packs the compressed multimedia data into a bitstream. The second data processing apparatus has at least an input interface, a data access circuit, and a de-compressor. The input interface un-packs the bitstream into second input multimedia data. The data access circuit stores the second input multimedia data into a multimedia buffer and reads buffered multimedia data from the multimedia buffer. The de-compressor de-compresses the buffered multimedia data. Alternatively, one of the compressor and the de-compressor may be implemented in a third data processing apparatus coupled between the first data processing apparatus and the second data processing apparatus.
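The compress, pack, un-pack, buffer, de-compress flow can be illustrated with a minimal sketch. The toy run-length codec, the length-prefixed bitstream format, and all function names below are illustrative assumptions; the abstract does not specify a compression algorithm or packet format.

```python
def run_length_compress(data: bytes) -> bytes:
    """Toy stand-in for the compressor: run-length encode the input."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def run_length_decompress(data: bytes) -> bytes:
    """Toy stand-in for the de-compressor."""
    out = bytearray()
    for i in range(0, len(data), 2):
        out += bytes([data[i + 1]]) * data[i]
    return bytes(out)

def pack_bitstream(payload: bytes) -> bytes:
    """Output interface: prepend a simple length header to form the bitstream."""
    return len(payload).to_bytes(4, "big") + payload

def unpack_bitstream(bitstream: bytes) -> bytes:
    """Input interface: strip the header to recover the compressed payload."""
    length = int.from_bytes(bitstream[:4], "big")
    return bitstream[4:4 + length]

# First apparatus: camera sensor output -> compressor -> output interface.
raw_frame = bytes([10] * 32 + [20] * 32)           # stand-in for camera sensor data
bitstream = pack_bitstream(run_length_compress(raw_frame))

# Second apparatus: input interface -> multimedia buffer -> de-compressor.
multimedia_buffer = [unpack_bitstream(bitstream)]  # data access circuit stores ...
restored = run_length_decompress(multimedia_buffer.pop(0))  # ... then reads back
assert restored == raw_frame
```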
Abstract:
A data processing apparatus at a transmitter end has an output interface and a camera controller. The output interface packs compressed multimedia data into an output bitstream transmitted via a camera interface. The camera controller refers to a compression characteristic of the compressed multimedia data to configure a transmission setting of the output interface over the camera interface. A data processing apparatus at a receiver end has an input interface and a controller. The input interface un-packs an input bitstream received via the camera interface into compressed multimedia data. The controller configures a reception setting of the input interface over the camera interface in response to a compression characteristic of the compressed multimedia data. In addition, the data processing apparatus at the transmitter end may selectively enable a compression mode by checking the de-compression capability of the data processing apparatus at the receiver end.
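A minimal sketch of the implied control flow, assuming the "compression characteristic" is a compression ratio and the "transmission setting" is a camera-interface clock; the abstract defines neither, and all names and values below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ReceiverCaps:
    supports_decompression: bool

def choose_link_clock_mhz(compression_ratio: float, base_clock_mhz: float) -> float:
    """One possible transmission setting: scale the camera-interface clock down
    by the bandwidth saved through compression."""
    return base_clock_mhz * min(1.0, 1.0 / compression_ratio)

def configure_transmitter(caps: ReceiverCaps, compression_ratio: float) -> dict:
    if not caps.supports_decompression:
        # Receiver cannot de-compress: fall back to the non-compression mode.
        return {"compression_enabled": False, "link_clock_mhz": 400.0}
    return {
        "compression_enabled": True,
        "link_clock_mhz": choose_link_clock_mhz(compression_ratio, 400.0),
    }

print(configure_transmitter(ReceiverCaps(True), compression_ratio=2.0))
# {'compression_enabled': True, 'link_clock_mhz': 200.0}
```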
Abstract:
A data processing apparatus has a first compressor, a second compressor, a first output interface, and a second output interface. The first compressor generates first compressed display data by performing compression upon display data of a first partial region of a frame according to a first compression order. The second compressor generates second compressed display data by performing compression upon display data of a second partial region of the frame according to a second compression order. There is a first boundary between the first partial region and the second partial region. In a horizontal direction, the first compression order on one side of the first boundary is opposite to the second compression order on the other side of the first boundary. The first and second output interfaces output the first and second compressed display data via a first display port and a second display port of a display interface, respectively.
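The "opposite compression orders" can be pictured with a small sketch: the first compressor traverses its region left-to-right while the second traverses its region right-to-left, so both scans are mirrored about the boundary. The frame width, boundary position, and traversal rule below are illustrative assumptions only.

```python
def compression_orders(frame_width: int, boundary: int):
    """Column visit order for each compressor on one display line: the first
    compressor scans its region left-to-right, the second right-to-left, so the
    two orders are opposite in the horizontal direction."""
    first_region = list(range(0, boundary))                         # columns 0 .. boundary-1
    second_region = list(range(frame_width - 1, boundary - 1, -1))  # columns W-1 .. boundary
    return first_region, second_region

left_order, right_order = compression_orders(frame_width=8, boundary=4)
print(left_order)   # [0, 1, 2, 3]
print(right_order)  # [7, 6, 5, 4]
```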
Abstract:
A perception-based image processing apparatus includes an image analyzing circuit and an application circuit. The image analyzing circuit obtains training data, sets a perception model according to the training data, performs an object detection of at least one frame, and generates an object detection information signal based at least partly on a result of the object detection of said at least one frame. The application circuit operates in response to the object detection information signal.
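A minimal, self-contained sketch of that flow, using a toy threshold "model" purely for illustration; the abstract does not specify how the perception model is set from the training data or what the application circuit does in response.

```python
def set_perception_model(training_data):
    """Derive a brightness threshold from labelled (value, is_object) samples."""
    positives = [value for value, is_object in training_data if is_object]
    return {"threshold": min(positives)}

def detect_objects(model, frame):
    """Object detection of one frame: report pixel indices above the threshold."""
    return [i for i, value in enumerate(frame) if value >= model["threshold"]]

def application_circuit(detection_info):
    """Operate in response to the object detection information signal."""
    return "adjust_exposure" if detection_info else "idle"

model = set_perception_model([(200, True), (220, True), (40, False)])
info = detect_objects(model, frame=[10, 30, 210, 250, 20])
print(application_circuit(info))  # adjust_exposure
```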
Abstract:
A processor of an apparatus receives sensor data from an inertial measurement unit (IMU). The processor also receives image data. The processor performs a fusion process on the sensor data and the image data to provide a translation output. The processor then performs one or more six-degrees-of-freedom (6DoF)-related operations using the translation output.
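One way to picture the fusion step, assuming it blends an IMU-integrated translation estimate with a camera-derived one in a simple complementary-filter style; the abstract does not name a fusion algorithm, and the weights, samples, and function names below are illustrative.

```python
def integrate_imu(accel_samples, dt):
    """Double-integrate acceleration samples into a rough translation estimate."""
    velocity, position = 0.0, 0.0
    for a in accel_samples:
        velocity += a * dt
        position += velocity * dt
    return position

def fuse_translation(imu_translation, image_translation, image_weight=0.7):
    """Blend the drift-prone IMU estimate with the slower but stable visual one."""
    return image_weight * image_translation + (1.0 - image_weight) * imu_translation

imu_t = integrate_imu([0.1, 0.1, 0.0, -0.1], dt=0.01)  # sensor data from the IMU
visual_t = 0.0004                                      # translation from image data
translation = fuse_translation(imu_t, visual_t)        # fed to 6DoF-related operations
print(translation)
```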
Abstract:
An image processing method is applied to an image processing device and a related depth estimation system. The image processing device includes a receiving unit and a processing unit. The receiving unit is adapted to receive a captured image. The processing unit is electrically connected with the receiving unit to determine a first sub-image and a second sub-image on the captured image, to compute a relationship between a feature of the first sub-image and a corresponding feature of the second sub-image, and to compute a depth map for the captured image via disparity of the aforesaid relationship. The feature of the first sub-image is correlated with the corresponding feature of the second sub-image, and a scene of the first sub-image is at least partly overlapped with a scene of the second sub-image.
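The disparity-to-depth step can be sketched with the classic stereo relation depth = focal length × baseline / disparity. The focal length, baseline, and feature coordinates below are illustrative assumptions; the abstract only states that depth is computed from the disparity of corresponding features.

```python
def disparity_to_depth(x_first, x_second, focal_length_px, baseline_m):
    """Classic stereo relation: depth = focal_length * baseline / disparity."""
    disparity = abs(x_first - x_second)
    return float("inf") if disparity == 0 else focal_length_px * baseline_m / disparity

# A corresponding feature at x=310 in the first sub-image and x=300 in the second.
depth_m = disparity_to_depth(310, 300, focal_length_px=800, baseline_m=0.06)
print(round(depth_m, 2))  # 4.8; repeating this per matched feature yields a depth map
```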
Abstract:
A head-mounted display (HMD) and variations thereof are described. An HMD may include a mobile device having a display unit, a camera, a light source and a processing unit that controls operations of the display unit, camera and light source. The processing unit receives data associated with one or more optical images captured by the camera, and renders a visual image displayable by the display unit. The HMD may also include an eyewear piece having a holder, one or more lenses and a reflective unit. The holder is wearable by a user and holds the mobile device in front of the eyes of the user. The user can view the display unit through the one or more lenses. The reflective unit reflects an image of at least one eye of the user. The camera is oriented to capture the reflected image of the eye through the reflective unit.
Abstract:
A data processing apparatus has a camera sensor, a camera buffer, a compressor, and an output interface. The camera sensor generates input multimedia data. The camera buffer stores first data derived from the input multimedia data. The compressor generates compressed multimedia data by compressing second data derived from the input multimedia data. The output interface packs the compressed multimedia data into a bitstream, and outputs the bitstream via a camera interface. The camera interface is coupled between the data processing apparatus and another data processing apparatus located on different chips, and is a chip-to-chip interface that provides direct pin connections between the different chips.
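A minimal sketch of the two data paths described here, assuming the first data is a downscaled copy kept in the camera buffer and the second data is the full-resolution line that is compressed and packed; both choices, and the toy delta coder, are illustrative only.

```python
def derive_preview(line):
    """First data: a 2:1 downscaled copy retained in the camera buffer."""
    return line[::2]

def compress_line(line):
    """Second data: toy delta coding standing in for the real compressor."""
    return [line[0]] + [b - a for a, b in zip(line, line[1:])]

def pack(compressed):
    """Output interface: frame the compressed data for the chip-to-chip camera interface."""
    return {"length": len(compressed), "payload": compressed}

sensor_line = [8, 9, 10, 10, 12, 13, 13, 14]   # input multimedia data from the camera sensor
camera_buffer = [derive_preview(sensor_line)]  # stored first data
bitstream = pack(compress_line(sensor_line))   # compressed second data, packed into a bitstream
print(camera_buffer, bitstream)
```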
Abstract:
Methods and apparatuses pertaining to a simulated transparent display may involve capturing a first image of a surrounding of the display with a first camera, as well as capturing a second image of the user with a second camera. The methods and apparatuses may further involve constructing a see-through window from the first image, wherein, when presented on the display, the see-through window substantially matches the surrounding and creates a visual effect in which at least a portion of the display appears substantially transparent to the user. The methods and apparatuses may further involve presenting the see-through window on the display. The constructing of the see-through window may involve computing a set of cropping parameters, a set of deforming parameters, or a combination of both, based on a spatial relationship among the surrounding, the display, and the user.
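A minimal sketch of deriving cropping parameters from that spatial relationship, using a centred one-dimensional pinhole geometry; all distances, widths, and names below are illustrative assumptions, as the abstract does not give a concrete formula.

```python
def crop_parameters(display_w_m, eye_dist_m, scene_dist_m, camera_view_w_m, image_w_px):
    """Pixel span of the first image hidden behind the display from the user's
    viewpoint (centred, 1-D geometry). Similar triangles from the eye through the
    display edges onto a scene plane scene_dist_m behind the display, where the
    first camera sees a strip camera_view_w_m wide across image_w_px pixels."""
    occluded_w_m = display_w_m * (eye_dist_m + scene_dist_m) / eye_dist_m
    span_px = int(image_w_px * occluded_w_m / camera_view_w_m)
    left_px = (image_w_px - span_px) // 2
    return left_px, left_px + span_px

print(crop_parameters(0.15, 0.4, 2.0, 1.5, 1920))  # (384, 1536): columns to keep
```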