Abstract:
Described herein is a polymorphic display, which is a unitary apparatus constructed such that a wide variety of electro-optic functions are enabled. The polymorphic display is used in an intelligent container system for monitoring and evaluating actions taken at the container system. In this way, relationships between displayed information and actions may be determined. The polymorphic display, even when having multiple pixels, enables sharing of selected structures among the pixels. In a multi-pixel construction, there is a set of pixels in the display that exhibit one set of operable properties, such as particular stability, sequencing, and switching properties, and another set of pixels that are different from the first set; that is, they have different stability, sequencing, or switching properties. In such a way, a highly flexible polymorphic display may be constructed to satisfy a wide range of display needs, including novel applications on intelligent container systems.
Abstract:
An input system for a 3D display device includes a pen and a tablet computing device to receive input via interaction with the pen. The tablet computing device includes a display dock to dock the 3D display device and a pen dock to dock the pen, wherein docking of the 3D display device in the display dock facilitates a determination of a display position of a visual representation of the tablet computing device on the 3D display device.
Abstract:
A hybrid display (120, 600) includes a first display (605) with a high resolution having a first interface and a second display (610) with a low resolution having a second interface. A third interface is configured to receive a first command that includes a first value indicating a modification of pixels in the hybrid display. A finite state machine is configured to translate the first value to a second value indicating a modification of pixels in the first display and a third value indicating a modification of pixels in the second display. The third interface transmits a second command including the second value to the first interface and a third command including the third value to the second interface. The second and third commands are transmitted at times determined by a relative delay between the first display and the second display in order to synchronise the frame rates, scan lines, luminance, and chroma of the two composite display units.
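The translation step described above can be sketched as follows. This is a hedged, minimal sketch, not the patented implementation: the split of the first value into per-display values (high bits vs. low bits) and all function and field names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Command:
    target: str          # "high" (first display) or "low" (second display)
    value: int           # pixel-modification value for that display
    send_time_ms: float  # transmit time, offset by the relative delay

def translate(first_value: int, t_now_ms: float, relative_delay_ms: float):
    """Map one hybrid-display value into two per-display commands.

    The bit split below is an assumed encoding: high bits drive the
    high-resolution display, low bits drive the low-resolution display.
    """
    second_value = first_value >> 8    # for the first (high-res) display
    third_value = first_value & 0xFF   # for the second (low-res) display
    return [
        Command("high", second_value, t_now_ms),
        Command("low", third_value, t_now_ms + relative_delay_ms),
    ]

cmds = translate(0x1234, t_now_ms=0.0, relative_delay_ms=2.5)
```

The relative delay appears here only as an offset between the two send times, which is the property the abstract relies on for keeping the two composite display units in step.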
Abstract:
The present invention relates to an apparatus for displaying, manipulating, and interacting with multimedia content among mobile devices, with the aim of displaying multimedia content, preferably between a teacher and students in a classroom. The present invention comprises a data processing system equipped with an intelligent command interface that enables the interaction with and manipulation of multimedia content between the teacher and the students' mobile devices. The data processing system is connected by flat cables to the two infrared emitter bars and two infrared receiver bars (25), which are fixed to the edges of the LCD panel (27). A metal frame (1) surrounds said emitter and receiver bars (25) and is fixed to the front of the LCD panel by means of fastening devices known in the art. The metal frame has the same dimensions as the edges of the LCD panel, thus forming a single assembly.
Abstract:
To manage dynamic adjustment of the refresh rate of a computer display, the operating system defines at least two playback modes: one or more custom modes that can be selected by applications, and a standard mode, which is the default setting for the system that applications can expect. The operating system provides an application programming interface that enables an application to request use of a custom mode. If the request is approved, the application presents frames for display based on the custom mode. The operating system stores timing data for each buffered frame indicating how to play the frame in both the standard mode and the custom mode. If a transition back to the standard mode occurs, the operating system uses the timing data to properly present frames of video until the application stops generating frames of video in the custom mode.
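The dual-timestamp mechanism described above can be illustrated with a small sketch. This is an assumed model, not the operating system's actual API: each buffered frame carries a present time for both the custom mode and the standard mode, so a fallback to standard mode can replay already-buffered frames correctly.

```python
from dataclasses import dataclass

@dataclass
class FrameTiming:
    custom_present_ms: float    # when to show the frame in the custom mode
    standard_present_ms: float  # when to show it in the default mode

class PresentationQueue:
    """Illustrative queue: buffered frames keep timing for both modes."""

    def __init__(self):
        self.frames = []   # list of (frame_id, FrameTiming)
        self.mode = "custom"

    def queue_frame(self, frame_id, timing: FrameTiming):
        self.frames.append((frame_id, timing))

    def next_present_time(self):
        # Pick the timestamp matching the currently active mode.
        frame_id, timing = self.frames[0]
        if self.mode == "custom":
            return frame_id, timing.custom_present_ms
        return frame_id, timing.standard_present_ms

    def fall_back_to_standard(self):
        # On a transition back to standard mode, buffered frames are
        # replayed using their standard-mode timestamps.
        self.mode = "standard"
```

Storing both timestamps at queue time is what makes the transition safe: no frame needs to be re-timed after the mode switch.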
Abstract:
A computer-driven wearable display device, which incorporates a display engine for presenting virtual images to its wearers, is interconnected with a computer-driven portable display device that incorporates a display screen for presenting real images to those same wearers. An applications management and communication system, including an external manager application residing on the portable display device, provides for managing and launching applications residing on the wearable display device through the user interface of the portable display device.
Abstract:
Ambient light control and calibration systems and methods are provided herein. According to some embodiments, exemplary systems may include a console that includes a processor that executes logic to control a plurality of nodes to reproduce a virtual lighting scheme of a virtual environment in a physical user environment. Additionally, the system may include a plurality of nodes that each includes a light emitting device, a receiver that communicatively couples the node to the console, and a processor that executes logic to control the light emitting device.
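The console-to-node flow described above can be sketched as follows. All names and the representation of the virtual lighting scheme are assumptions for illustration: the console samples the virtual environment and drives each node's light emitting device to reproduce it in the physical room.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    color: tuple = (0, 0, 0)   # current RGB of the light emitting device

    def receive(self, rgb):
        # In the described system this would arrive over the node's
        # receiver, which communicatively couples the node to the console.
        self.color = rgb

class Console:
    """Illustrative console driving a plurality of nodes."""

    def __init__(self, nodes):
        self.nodes = nodes

    def reproduce(self, virtual_scheme):
        # virtual_scheme maps node ids to RGB values sampled from the
        # virtual environment (an assumed representation).
        for node in self.nodes:
            if node.node_id in virtual_scheme:
                node.receive(virtual_scheme[node.node_id])

nodes = [Node(1), Node(2)]
Console(nodes).reproduce({1: (255, 120, 0)})
```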
Abstract:
Disclosed is an application interface that takes into account the user's gaze direction relative to who is speaking in an interactive multi-participant environment where audio-based contextual information and/or visual-based semantic information is being presented. In various implementations, two different types of microphone array devices (MADs) may be used. The first type of MAD is a steerable microphone array (a.k.a. a steerable array), which is worn by a user in a known orientation with regard to the user's eyes, and multiple users may each wear a steerable array. The second type of MAD is a fixed-location microphone array (a.k.a. a fixed array), which is placed in the same acoustic space as the users (one or more of whom are using steerable arrays).
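The gaze-versus-speaker matching can be sketched as a simple bearing comparison. This is a hedged illustration under assumed names: the gaze bearing would come from the worn steerable array's known orientation, and the speaker bearings from directions estimated by the fixed array; the tolerance value is arbitrary.

```python
def angular_distance(a_deg, b_deg):
    # Smallest absolute difference between two bearings, in degrees.
    d = abs(a_deg - b_deg) % 360
    return min(d, 360 - d)

def speaker_in_gaze(gaze_deg, speaker_bearings_deg, tolerance_deg=15.0):
    """Return the index of the speaker closest to the user's gaze,
    or None if no speaker lies within the tolerance."""
    best, best_d = None, tolerance_deg
    for i, bearing in enumerate(speaker_bearings_deg):
        d = angular_distance(gaze_deg, bearing)
        if d <= best_d:
            best, best_d = i, d
    return best

# With speakers at bearings 10° and 180°, a gaze of 5° selects speaker 0.
speaker_in_gaze(5.0, [10.0, 180.0])
```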