Abstract:
A system for video conferencing is disclosed. The system comprises a data processor which receives from a remote location a stream of imagery data of a remote user, and displays an image of the remote user on a display device. The data processor also receives a stream of imagery data of an individual in a local scene in front of the display device, and extracts a gaze direction and/or a head orientation of the individual. The data processor varies a view of the image responsively to the gaze direction and/or the head orientation.
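A minimal sketch of the view-varying step described above, under assumed conventions: it pans a horizontal crop of the remote user's frame according to the local viewer's head yaw. The function name, the yaw-to-offset mapping, and all parameter values are illustrative, not taken from the disclosure.

```python
import numpy as np

def select_view(remote_frame: np.ndarray, head_yaw_deg: float,
                view_width: int, max_yaw_deg: float = 30.0) -> np.ndarray:
    """Return a horizontal crop of the remote frame shifted by the viewer's head yaw."""
    h, w, _ = remote_frame.shape
    # Map yaw in [-max_yaw, +max_yaw] to a crop offset across the frame width.
    yaw = float(np.clip(head_yaw_deg, -max_yaw_deg, max_yaw_deg))
    max_offset = w - view_width
    offset = int((yaw + max_yaw_deg) / (2 * max_yaw_deg) * max_offset)
    return remote_frame[:, offset:offset + view_width]

# Synthetic example: a 480x1280 "remote" frame and a viewer whose head is turned 10 degrees.
frame = np.zeros((480, 1280, 3), dtype=np.uint8)
view = select_view(frame, head_yaw_deg=10.0, view_width=640)
print(view.shape)  # (480, 640, 3)
```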
Abstract:
A system for computerized simulation of a make-up process is provided herein. The system includes: one or more capturing devices configured to capture images of a face of a human user under controlled lighting conditions; a face reconstruction module configured to generate a 3D model of the face, based on the captured images; a display configured to present the reconstructed 3D model to the human user; a touch/3D user interface configured to receive a sequence of hand gestures and postures forming a virtual make-up session which imitates a real make-up process; and a virtual make-up simulator configured to apply virtual make-up features to the 3D model, based on said received hand gestures and postures, to yield a 3D make-up model, wherein said virtual make-up simulator repeatedly presents an updated appearance of the 3D make-up model on the display, responsive to changes made during said virtual make-up session.
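A minimal sketch, under assumed data structures (a vertex array with per-vertex colors), of applying one virtual make-up stroke received from the touch/3D interface and returning the updated model for re-display. Function and field names are illustrative, not taken from the disclosed system.

```python
import numpy as np

def apply_makeup_stroke(vertex_colors: np.ndarray, vertices: np.ndarray,
                        stroke_center: np.ndarray, stroke_radius: float,
                        makeup_rgb: np.ndarray, opacity: float) -> np.ndarray:
    """Blend a make-up color into the vertex colors near the gesture's contact point."""
    dist = np.linalg.norm(vertices - stroke_center, axis=1)
    mask = dist < stroke_radius
    updated = vertex_colors.copy()
    updated[mask] = (1 - opacity) * updated[mask] + opacity * makeup_rgb
    return updated

# Synthetic face model: 1000 vertices with a uniform skin-tone color.
verts = np.random.rand(1000, 3)
colors = np.full((1000, 3), [0.8, 0.6, 0.5])
# One "lipstick" stroke received from the touch/3D user interface.
colors = apply_makeup_stroke(colors, verts,
                             stroke_center=np.array([0.5, 0.2, 0.5]),
                             stroke_radius=0.1,
                             makeup_rgb=np.array([0.7, 0.1, 0.2]),
                             opacity=0.6)
```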
Abstract:
A system and a method that implement a user interface are provided herein. The system includes a touch interface, a gesture sensor, and a processing element arranged to generate an interface command that corresponds to a combination of a touch detected by the touch interface and a gesture identified by the gesture sensor, wherein the correspondence is determined according to specified rules. The method implements the logic of the aforementioned system.
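A minimal sketch, with assumed event names and an illustrative rule table, of the rule-based mapping: an interface command is generated only for specified combinations of a detected touch and an identified gesture.

```python
from typing import Optional

# Illustrative rule table: (touch event, gesture) -> interface command.
RULES = {
    ("tap", "swipe_left"): "previous_item",
    ("tap", "swipe_right"): "next_item",
    ("long_press", "pinch"): "zoom_out",
    ("long_press", "spread"): "zoom_in",
}

def interface_command(touch_event: str, gesture: str) -> Optional[str]:
    """Return the command for a touch/gesture combination, or None if no rule matches."""
    return RULES.get((touch_event, gesture))

print(interface_command("long_press", "pinch"))  # zoom_out
print(interface_command("tap", "pinch"))         # None (no specified rule)
```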
Abstract:
An electronic device is provided which comprises: a plurality of different sensors, each configured to retrieve data relating to at least one characteristic of a user; a processor configured to receive data retrieved by the different sensors and to establish features that characterize the user based upon data received from at least two of the different sensors; and a storage configured to store information that relates to the features that characterize the user; wherein the processor is further configured to: receive new data that has been retrieved by the different sensors and that relates to the features that characterize the user; retrieve information from the storage that relates to the features that characterize the user and compare the stored information with the newly received data; and, based on the comparison, determine whether to generate a user-related output and/or replace stored information with information derived from the newly received data.
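A minimal sketch of the comparison step, using assumed thresholds and a plain dictionary standing in for the storage: newly derived features are compared against the stored profile, and the device either generates a user-related output or refreshes the stored information. All names and values are illustrative.

```python
import numpy as np

storage = {"user_features": np.array([0.2, 0.7, 0.1])}  # e.g., fused face/voice features

def process_new_features(new_features: np.ndarray,
                         alert_threshold: float = 0.5,
                         update_threshold: float = 0.1) -> None:
    stored = storage["user_features"]
    distance = float(np.linalg.norm(new_features - stored))
    if distance > alert_threshold:
        # Large deviation from the stored profile: generate a user-related output.
        print("user-related output: features deviate significantly from stored profile")
    elif distance > update_threshold:
        # Small drift: replace stored information with data derived from the new features.
        storage["user_features"] = 0.8 * stored + 0.2 * new_features

process_new_features(np.array([0.25, 0.65, 0.12]))
```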
Abstract:
A peripheral electronic device is described which is configured to communicate with a computing device comprising a display having a screen configured to display a virtual gaze cursor; wherein the peripheral electronic device comprises at least one user interface configured to trigger at least one operational command in response to interaction with a user, wherein the at least one operational command is associated with a current location of the virtual gaze cursor on the screen, and wherein a change in the current location of the displayed virtual gaze cursor is determined based on a shift of the user's gaze from a first location on said screen to a different location thereon, based on a tilt of the user's head, or based on any combination thereof.
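A minimal sketch, under assumed screen coordinates and update logic, of how the peripheral's operational command could be tied to the current gaze-cursor location, with the cursor repositioned by a gaze shift and fine-adjusted by a head tilt. Class, method, and parameter names are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class GazeCursor:
    x: int = 0
    y: int = 0

    def move_to_gaze(self, gaze_x: int, gaze_y: int) -> None:
        self.x, self.y = gaze_x, gaze_y        # shift of the user's gaze to a new screen location

    def nudge_by_head_tilt(self, tilt_deg: float, px_per_deg: float = 20.0) -> None:
        self.x += int(tilt_deg * px_per_deg)   # head tilt adjusts the cursor position

cursor = GazeCursor()

def on_peripheral_button_press() -> None:
    # The operational command is associated with the cursor's current location.
    print(f"activate element at ({cursor.x}, {cursor.y})")

cursor.move_to_gaze(820, 430)
cursor.nudge_by_head_tilt(2.5)
on_peripheral_button_press()   # activate element at (870, 430)
```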
Abstract:
An imaging system is disclosed. The system comprises a first imaging device and a second imaging device which are spaced apart and configured to provide partially overlapping fields-of-view of a scene over a spectral range from infrared to visible light. The system also comprises at least one infrared light source configured to illuminate at least the overlap region with patterned infrared light, and a computer system configured to receive image data pertaining to infrared and visible light acquired by the imaging devices and to compute three-dimensional information of the scene based on the image data. The image data optionally and preferably comprises the patterned infrared light as acquired by both imaging devices.
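A minimal sketch, under a standard rectified-stereo assumption, of the triangulation step implied above: disparities measured between the patterned-IR features seen by both spaced-apart imaging devices are converted to depth via Z = f * B / d. The parameter values are illustrative, not from the disclosure.

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_m: float) -> np.ndarray:
    """Depth in meters for each matched IR-pattern feature: Z = f * B / d."""
    d = np.where(disparity_px > 0, disparity_px, np.nan)  # guard against invalid matches
    return focal_length_px * baseline_m / d

# Example: three pattern features matched between the two imaging devices.
disparities = np.array([40.0, 25.0, 10.0])   # pixels
depths = depth_from_disparity(disparities, focal_length_px=800.0, baseline_m=0.1)
print(depths)   # [2.0, 3.2, 8.0] meters
```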