Abstract:
Virtual controllers for visual displays are described. In one implementation, a camera captures an image of hands against a background. The image is segmented into hand areas and background areas. Various hand and finger gestures isolate parts of the background into independent areas, which are then assigned control parameters for manipulating the visual display. Multiple control parameters can be associated with attributes of multiple independent areas formed by two hands, for advanced control including simultaneous functions of clicking, selecting, executing, horizontal movement, vertical movement, scrolling, dragging, rotational movement, zooming, maximizing, minimizing, executing file functions, and executing menu choices.
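The core operation in the abstract above is segmenting an image into hand and background areas, then finding background regions the hand has isolated (e.g., the hole formed by a thumb-and-forefinger circle) so each can carry a control parameter. A minimal sketch of that isolation step, assuming a precomputed binary hand mask (the actual segmentation method and any control-parameter assignment are not specified here):

```python
from collections import deque

def independent_areas(mask):
    """Find background regions fully enclosed by hand pixels.

    mask: 2D list, 1 = hand pixel, 0 = background pixel.
    Returns a list of (area, (cy, cx)) per enclosed region; the
    centroid could then drive a control parameter such as cursor
    position, and the area could drive zooming or scrolling.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]

    def flood(sy, sx):
        # Breadth-first flood fill over connected background pixels.
        q = deque([(sy, sx)])
        seen[sy][sx] = True
        pix = []
        while q:
            y, x = q.popleft()
            pix.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w \
                        and not seen[ny][nx] and mask[ny][nx] == 0:
                    seen[ny][nx] = True
                    q.append((ny, nx))
        return pix

    # Mark "outer" background reachable from the image border.
    for y in range(h):
        for x in range(w):
            if (y in (0, h - 1) or x in (0, w - 1)) \
                    and mask[y][x] == 0 and not seen[y][x]:
                flood(y, x)

    # Remaining background pixels form independent (enclosed) areas.
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] == 0 and not seen[y][x]:
                pix = flood(y, x)
                cy = sum(p[0] for p in pix) / len(pix)
                cx = sum(p[1] for p in pix) / len(pix)
                regions.append((len(pix), (cy, cx)))
    return regions
```

With two hands, each enclosed region would be tracked separately, which is how multiple attributes (position, area, orientation) could map to simultaneous controls.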
Abstract:
Disclosed is a unique system and method that facilitates gesture-based interaction with a user interface. The system involves an object sensing configuration that includes a sensing plane, oriented vertically or horizontally, located between at least two imaging components on one side and a user on the other. The imaging components can acquire input images taken of a view of and through the sensing plane. The images can include objects which are on the sensing plane and/or in the background scene, as well as the user as he interacts with the sensing plane. By processing the input images, one output image can be returned which shows only the objects that are in contact with the plane. Thus, objects located at a particular depth can be readily determined. Any other objects located beyond the plane can be "removed" and not seen in the output image.
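One way the processing described above can work: when both imaging components are rectified to the sensing plane, an object touching the plane projects to the same pixel in both views, while anything behind the plane appears shifted between them. A minimal sketch under that assumption (rectification itself, and the actual image-processing pipeline, are outside this abstract):

```python
def on_plane_image(img_a, img_b, tol=10):
    """Keep only pixels where two plane-rectified views agree.

    img_a, img_b: 2D lists of grayscale intensities from the two
    imaging components, rectified to the sensing plane.
    Pixels whose intensities match within tol are treated as lying
    on the plane; mismatched pixels (background objects, the user's
    body behind the plane) are zeroed out of the output image.
    """
    h, w = len(img_a), len(img_a[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if abs(img_a[y][x] - img_b[y][x]) <= tol:
                # Agreement: keep (average) the on-plane pixel.
                out[y][x] = (img_a[y][x] + img_b[y][x]) // 2
    return out
```

The tolerance `tol` is an illustrative parameter trading off noise robustness against depth selectivity.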
Abstract:
Disclosed is a unique system and method that facilitates cursor control based in part on computer vision activated by a capacitive touch sensor. When turned on, user hand gestures or movements can be tracked by a monitoring component and those movements can be converted in real time to control or drive cursor movements and/or position on a user interface. The system comprises a monitoring component or camera that can be activated by touch or pressure applied to a capacitive touch sensor. A circuit within the sensor determines when the user is touching a button (e.g., on a keyboard or mouse) that activates the monitoring component and cursor control mechanism. Thus, intentional hand movements by the user can readily be determined.
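The gating behavior described above can be sketched as a small state machine: camera-tracked hand motion drives the cursor only while the capacitive sensor reports touch, so motion with the sensor released is ignored. This is a minimal illustration assuming a per-frame `(touching, hand_pos)` input; the tracking method and sensor circuit are not specified in the abstract:

```python
class GatedCursorControl:
    """Cursor control gated by a capacitive touch sensor.

    update() is called once per camera frame with the sensor state
    and the tracked hand position; relative hand motion moves the
    cursor only while the sensor is touched.
    """

    def __init__(self, gain=2.0):
        self.gain = gain            # hand-to-cursor motion scaling
        self.cursor = [0.0, 0.0]
        self.prev_hand = None       # last hand position while touching

    def update(self, touching, hand_pos):
        if not touching:
            # Sensor released: drop tracking state, ignore motion.
            self.prev_hand = None
            return tuple(self.cursor)
        if self.prev_hand is not None:
            dx = hand_pos[0] - self.prev_hand[0]
            dy = hand_pos[1] - self.prev_hand[1]
            self.cursor[0] += self.gain * dx
            self.cursor[1] += self.gain * dy
        self.prev_hand = hand_pos
        return tuple(self.cursor)
```

Resetting `prev_hand` on release means the first frame of each touch establishes a reference rather than producing a jump, which is how intentional movements are separated from incidental ones.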
Abstract:
The detection of touch on an optical touch-sensitive device is disclosed. For example, one disclosed embodiment comprises a touch-sensitive device including a display screen, a laser, and a scanning mirror configured to scan light from the laser across the screen. The touch-sensitive device also includes a position-sensitive device and optics configured to form an image of at least a portion of the screen on the position-sensitive device. A location of an object relative to the screen may be determined by detecting a location on the position-sensitive device of laser light reflected by the object.
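One plausible reading of the geometry above: the mirror's drive signal determines where on the screen the laser is pointing at any instant, so the time at which the position-sensitive device registers a reflection gives one screen coordinate, while the location of the imaged spot on the device gives the other. A hypothetical sketch of that mapping (the timing model, `psd_scale` calibration, and a simple linear sweep are all assumptions, not details from the abstract):

```python
def touch_location(t, scan_period, screen_width, psd_y, psd_scale):
    """Recover a touch point from scan timing and a PSD reading.

    t: time of the detected reflection.
    scan_period: duration of one full mirror sweep across the screen.
    screen_width: screen extent covered by one sweep.
    psd_y: spot position reported by the position-sensitive device.
    psd_scale: assumed calibration factor mapping PSD units to
    screen units through the imaging optics.
    """
    # Linear sweep: fraction of the period elapsed gives x.
    x = (t % scan_period) / scan_period * screen_width
    # Imaged spot position on the PSD gives y.
    y = psd_y * psd_scale
    return (x, y)
```

A real device would calibrate both mappings rather than assume linearity, but the principle, location from reflected laser light detected on the position-sensitive device, is the same.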
Abstract:
Compensation of the effects of uncontrolled light in an imaging system using a controlled light source is described. Light from the controlled light source reflected by an object, together with uncontrolled light, is detected in a plurality of frequency ranges. The intensity of the uncontrolled light is determined based on the varying sensitivity of an image sensor to light in the different frequency ranges and the known emission characteristics of the controlled light source in those ranges. Once the intensity of the uncontrolled light is determined, the total light detected at each point is adjusted to reduce the effects of the uncontrolled light in the resulting imaging data produced by the imaging system.
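With two frequency ranges, the separation described above reduces to a 2x2 linear system per pixel: each measurement is a known mixture of the controlled reflection and the uncontrolled light, and the known sensitivities let both unknowns be solved for. A minimal per-pixel sketch, assuming the sensitivity coefficients are given (how they are obtained, and the actual number of ranges used, are not specified here):

```python
def remove_uncontrolled_light(m1, m2, s_ctrl, s_amb):
    """Separate controlled and uncontrolled light at one pixel.

    m1, m2: intensities measured in two frequency ranges.
    s_ctrl: sensor sensitivity to the controlled source in each
    range (known from the source's emission characteristics).
    s_amb: sensor sensitivity to the uncontrolled light in each
    range.
    Solves  m_i = s_ctrl[i]*R + s_amb[i]*A  for R (controlled
    reflection) and A (uncontrolled intensity) by Cramer's rule.
    """
    det = s_ctrl[0] * s_amb[1] - s_ctrl[1] * s_amb[0]
    R = (m1 * s_amb[1] - m2 * s_amb[0]) / det
    A = (s_ctrl[0] * m2 - s_ctrl[1] * m1) / det
    return R, A
```

Subtracting each range's ambient contribution `s_amb[i] * A` from the measurement then yields imaging data dominated by the controlled source, which is the adjustment the abstract describes.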