Abstract:
A three-dimensional (3D) display system is provided. The 3D display system includes a display device and an optical device. The display device is configured to display sets of images with parallax for a 3D display. The optical device has electric-signal-controllable optical parameters and is coupled with the display device. Further, the optical device is configured to enable directional light transmission so as to separate light of the sets of images into predetermined viewing directions to effect the 3D display.
Abstract:
A 2D/3D switching system contains a 2D/3D switching device having a display area for selectively processing light from 2D images and 3D images. The 2D/3D switching device includes a first substrate, a plurality of first electrodes formed on the first substrate, a second substrate, a plurality of second electrodes formed on the second substrate, arranged corresponding to the plurality of first electrodes, and separated by a distance, and a liquid crystal layer placed between the first substrate and the second substrate to provide the display area. A driving unit is configured to provide driving voltages to the plurality of first electrodes and the plurality of second electrodes. The driving unit applies a plurality of voltages to the first electrodes and the second electrodes to enable the liquid crystal layer to operate in one of a full-screen 2D mode, a full-screen 3D mode, and a 2D/3D mode.
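As a rough illustration of the mode selection described above, the following Python sketch maps a requested mode to a per-electrode voltage pattern. The function name, voltage levels, and the region-based handling of the mixed 2D/3D mode are all assumptions for illustration; the abstract does not specify the actual driving scheme.

def electrode_voltages(mode, n_electrodes, region_3d=None, v_on=5.0, v_off=0.0):
    """Hypothetical driving pattern: drive electrodes 'on' where the liquid
    crystal layer should act as a 3D light-splitting structure and 'off'
    where it should pass the image unchanged (2D)."""
    if mode == "2D":
        return [v_off] * n_electrodes          # full-screen 2D mode
    if mode == "3D":
        return [v_on] * n_electrodes           # full-screen 3D mode
    # Mixed 2D/3D mode: drive only the electrodes covering the 3D region.
    start, end = region_3d
    return [v_on if start <= i < end else v_off for i in range(n_electrodes)]

# Example: 3D window over electrodes 8..16 of a 24-electrode panel.
print(electrode_voltages("2D/3D", 24, region_3d=(8, 16)))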
Abstract:
A three-dimensional display device is provided in the present disclosure. The three-dimensional display device includes an image displaying device, an optical control element, a direction sensor and a control unit. The optical control element is positioned on a surface of the image displaying device, the direction sensor measures an inclination angle of the display panel, and the control unit adjusts images displayed by the image displaying device according to the inclination angle. The three-dimensional display device of the present disclosure has the advantages of fast tracking speed, high tracking precision and low cost. A mobile terminal and a three-dimensional display tracking method are also provided in the present disclosure.
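To make the inclination-based adjustment concrete, here is a minimal Python sketch. The linear mapping from tilt angle to a sub-pixel interleaving offset, and the parameter names (lens_pitch_px, adjust_view_offset), are assumptions for illustration only; the abstract does not state the control law used.

import math

def adjust_view_offset(inclination_deg, lens_pitch_px=6.0):
    """Hypothetical rule: convert the measured inclination angle into a
    horizontal sub-pixel offset for the interleaved left/right images."""
    # Clamp the sensor reading to a plausible tilt range.
    angle = max(-45.0, min(45.0, inclination_deg))
    # Assume the required offset grows with the tangent of the tilt.
    offset_px = math.tan(math.radians(angle)) * lens_pitch_px
    # Wrap the offset into one lens pitch so the interleaving pattern repeats.
    return offset_px % lens_pitch_px

# Example: a 10-degree tilt reported by the direction sensor.
print(adjust_view_offset(10.0))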
Abstract:
A three-dimensional (3D) display system is provided. The 3D display system includes a backlight plate, a display panel, a light-splitting device, and a polarization state controller. The display panel is configured to display a two-dimensional (2D) image in a 2D mode or to display a 3D image in a 3D mode. The light-splitting device includes an arrangement module configured to pass the 2D image in the 2D mode, and to separate the 3D image into a left image and a right image. Further, the polarization state controller is disposed between the display panel and the light-splitting device and is configured to rotate a polarization direction of light emitted from the display panel in the 2D mode, and to keep the polarization direction of the light emitted from the display panel in the 3D mode.
Abstract:
A method is provided for a three-dimensional (3D) image processing system including a stereoscopic display device. The method includes providing a stereoscopic image and obtaining a parallax range of the stereoscopic image and a parallax range supported by the stereoscopic display device. The method also includes determining a parallax operation to adjust the parallax range of the stereoscopic image based on the relationship between the parallax range of the stereoscopic image and the parallax range supported by the stereoscopic display device. Further, the method includes determining an offset value and an offset direction of a horizontal coordinate of each pixel of the stereoscopic image, and shifting the horizontal coordinate of each pixel of the stereoscopic image by the offset value and in the offset direction.
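The following Python sketch illustrates the parallax-adjustment step under a simplifying assumption: a single uniform horizontal shift is used to bring the image's parallax range toward the range the display supports, whereas the abstract determines an offset value and direction for each pixel. Function and parameter names are illustrative.

def compute_offset(img_range, display_range):
    """Determine a global horizontal offset (value and sign giving the
    direction) that shifts the stereoscopic image's parallax range toward
    the range supported by the stereoscopic display device."""
    img_min, img_max = img_range
    disp_min, disp_max = display_range
    if img_max > disp_max:            # parallax too large: shift views closer
        return disp_max - img_max     # negative value -> shift left
    if img_min < disp_min:            # parallax too small or too negative
        return disp_min - img_min     # positive value -> shift right
    return 0                          # already within the supported range

def shift_view(pixels, offset):
    """Shift the horizontal coordinate of each (x, y) pixel by the offset."""
    return [(x + offset, y) for (x, y) in pixels]

# Example: image parallax spans [-30, 60] px, display supports [-20, 40] px.
offset = compute_offset((-30, 60), (-20, 40))
shifted = shift_view([(100, 50), (101, 50)], offset)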
Abstract:
A method, apparatus and smart wearable device for fusing augmented reality and virtual reality are provided. The method for fusing augmented reality (AR) and virtual reality (VR) comprises: acquiring, in real time from an AR operation, real-world scene information collected by dual cameras mimicking human eyes; generating a fused scene based on virtual reality scene information from a VR operation and the acquired real-world scene information; and displaying the fused scene.
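As one possible reading of the fusion step, the sketch below blends each real-world camera frame with the corresponding VR-rendered view. The fixed alpha blend and the function name fuse_frames are assumptions for illustration; the abstract only states that a fused scene is generated from the two sources and displayed.

import numpy as np

def fuse_frames(real_left, real_right, vr_left, vr_right, alpha=0.5):
    """Hypothetical fusion: blend the frames from the dual cameras mimicking
    human eyes with the VR scene rendered for the corresponding eye."""
    fused_left = (alpha * real_left + (1.0 - alpha) * vr_left).astype(np.uint8)
    fused_right = (alpha * real_right + (1.0 - alpha) * vr_right).astype(np.uint8)
    return fused_left, fused_right

# Example with dummy 480x640 RGB frames.
shape = (480, 640, 3)
left, right = fuse_frames(np.zeros(shape), np.zeros(shape),
                          np.full(shape, 255.0), np.full(shape, 255.0))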
Abstract:
An object tracking method is provided. The method includes obtaining current position coordinates of a tracked object for a first time using a tracking algorithm and initializing a filter acting on a time-varying system based on the obtained current position coordinates. The method further includes updating a system state through the filter acting on the time-varying system when the tracking algorithm outputs new current position coordinates of the tracked object with a spatial delay, and comparing a speed of the tracked object with a compensation determination threshold. Further, the method includes, when the speed of the tracked object is greater than the compensation determination threshold, compensating the new current position coordinates of the tracked object with the spatial delay through the filter acting on the time-varying system, and outputting the compensated current position coordinates.
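A minimal Python sketch of this flow follows. A constant-velocity, alpha-beta style filter stands in for the "filter acting on a time-varying system", and the delay is compensated by extrapolating along the estimated velocity when the speed exceeds the threshold; both the filter form and the compensation rule are assumptions, and all names and numbers are illustrative.

import numpy as np

class DelayCompensatingTracker:
    """Sketch: initialize from the first position, update on each new
    (delayed) measurement, and compensate when the object moves fast."""

    def __init__(self, first_xy, speed_threshold, delay):
        # State: position (px, py) and velocity (vx, vy).
        self.x = np.array([first_xy[0], first_xy[1], 0.0, 0.0])
        self.speed_threshold = speed_threshold
        self.delay = delay  # assumed estimate of the tracker's delay

    def update(self, measured_xy, dt):
        # Predict, then correct with the new measurement (alpha-beta gains).
        pred = self.x[:2] + self.x[2:] * dt
        residual = np.asarray(measured_xy) - pred
        self.x[:2] = pred + 0.5 * residual
        self.x[2:] = self.x[2:] + 0.3 * residual / dt

        speed = np.linalg.norm(self.x[2:])
        if speed > self.speed_threshold:
            # Compensate the delayed coordinates by extrapolating forward.
            return self.x[:2] + self.x[2:] * self.delay
        return self.x[:2]

# Example usage with hypothetical numbers.
tracker = DelayCompensatingTracker((10.0, 20.0), speed_threshold=5.0, delay=0.05)
print(tracker.update((12.0, 21.0), dt=0.033))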
Abstract:
An image processing method and an image processing device are provided. The image processing method includes acquiring at least one to-be-processed image, where the to-be-processed image is a 3D image having a first view image and a second view image, and the first view image and the second view image have a horizontal parallax between them. The image processing method also includes receiving a user instruction and determining special-effect data to be inserted into the to-be-processed image, and, based on special-effect attribute information, respectively combining the special-effect data with the first view image and the second view image of the to-be-processed image to obtain a 3D special-effect image. The image processing method also includes storing the 3D special-effect image. A corresponding horizontal parallax is formed between the special-effect data combined with the first view image and the special-effect data combined with the second view image. The special-effect attribute information includes position information of the special effect in the to-be-processed image and a number of frames of the 3D special-effect images to be generated.
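The Python sketch below shows the core combining step: the same special-effect patch is pasted into the first and second view images at horizontally offset positions so that a corresponding parallax is formed. Opaque pasting and the function name insert_effect are assumptions; the abstract does not fix the blending rule.

import numpy as np

def insert_effect(first_view, second_view, effect, x, y, parallax):
    """Combine the special-effect patch with both views, offsetting its
    horizontal position by 'parallax' pixels between the two views."""
    h, w = effect.shape[:2]
    out_first = first_view.copy()
    out_second = second_view.copy()
    out_first[y:y + h, x:x + w] = effect
    out_second[y:y + h, x + parallax:x + parallax + w] = effect
    return out_first, out_second

# Example: a 20x20 white patch inserted with a 4-pixel horizontal parallax.
L = np.zeros((480, 640, 3), dtype=np.uint8)
R = np.zeros((480, 640, 3), dtype=np.uint8)
patch = np.full((20, 20, 3), 255, dtype=np.uint8)
first_3d, second_3d = insert_effect(L, R, patch, x=100, y=100, parallax=4)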
Abstract:
The present invention provides a 3D display method, which comprises: obtaining an interference value of a display unit displaying at least one pair of images with parallax under the coverage of a spectroscopic device; adjusting the display unit based on the interference value; and displaying, by the adjusted display unit, the at least one pair of images with parallax. The present invention also provides a 3D display device. Through the technical solution of the present invention, the crosstalk phenomenon during the 3D display process can be alleviated, thereby improving the 3D display effect.
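One common way an interference (crosstalk) value can be used to adjust the displayed images is to pre-subtract the estimated leakage of the opposite view, as in the hedged Python sketch below. This particular adjustment rule and the name reduce_crosstalk are assumptions for illustration; the abstract only states that the display unit is adjusted based on the interference value.

import numpy as np

def reduce_crosstalk(left_img, right_img, interference):
    """Pre-compensate each view by subtracting a fraction (the measured
    interference value, clipped to a safe range) of the other view."""
    k = float(np.clip(interference, 0.0, 0.5))
    adj_left = np.clip(left_img - k * right_img, 0, 255).astype(np.uint8)
    adj_right = np.clip(right_img - k * left_img, 0, 255).astype(np.uint8)
    return adj_left, adj_right

# Example: a measured interference value of 0.08 (8% leakage).
L = np.full((480, 640, 3), 200, dtype=np.uint8)
R = np.full((480, 640, 3), 50, dtype=np.uint8)
adj_L, adj_R = reduce_crosstalk(L, R, 0.08)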
Abstract:
A microlens array imaging device includes a main lens, a microlens array containing a plurality of microlenses, and an image acquisition unit. The main lens projects a first image of an object for the plurality of microlenses, each of which projects a second image of the first image on the image acquisition unit. Each second image includes an image circle and a circle of confusion around the image circle. A distance between an image circle of one microlens and an image circle of an adjacent microlens is equal to or greater than a sum of a radius of the second image corresponding to the one microlens and a radius of the image circle of the adjacent microlens, and is equal to or less than a sum of the radius of the second image corresponding to the one microlens and a radius of the second image corresponding to the adjacent microlens.
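The stated spacing condition can be written as a single inequality. Let $d$ denote the distance between the image circles of two adjacent microlenses (taken here between their centers, which is an assumption), $r_2$ the radius of the image circle of the adjacent microlens, and $R_1$, $R_2$ the radii of the second images (image circle plus surrounding circle of confusion) of the one microlens and the adjacent microlens, respectively. Then the condition reads:

$$ R_1 + r_2 \le d \le R_1 + R_2. $$

The lower bound keeps the adjacent image circle outside the one microlens's second image, while the upper bound keeps the two second images from drifting apart by more than their combined radii.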