Abstract:
Examples disclosed herein relate to identifying a target touch region of a touch-sensitive surface based on an image. Examples include detecting a touch input at a location of the touch-sensitive surface, capturing an image representing an object disposed between the touch-sensitive surface and the camera that captures the image, identifying at least one target touch region of the touch-sensitive surface based on the image, and rejecting the detected touch input when its location is not within any of the at least one identified target touch region.
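The rejection logic described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the rectangle representation of a region and the function names are assumptions.

```python
# Hedged sketch of the touch-rejection step: a touch is accepted only if its
# location falls within at least one identified target touch region.
# Regions are modeled as axis-aligned rectangles (an assumption).

def within(region, x, y):
    """Return True if point (x, y) lies inside a region given as
    (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = region
    return x_min <= x <= x_max and y_min <= y <= y_max

def accept_touch(target_regions, x, y):
    """Reject the detected touch unless it lies within any identified region."""
    return any(within(r, x, y) for r in target_regions)

regions = [(0, 0, 50, 50), (100, 100, 200, 150)]
print(accept_touch(regions, 25, 25))   # inside the first region
print(accept_touch(regions, 75, 75))   # in no region, so rejected
```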
Abstract:
Examples disclosed herein relate to aligning content displayed from a projector onto a touch-sensitive mat. Examples include detecting a border of the mat, wherein the mat includes a surface area of a first spectral reflectance characteristic onto which the projector is to project the content, and a border of a second spectral reflectance characteristic, different from the first, surrounding a perimeter of the surface area. As an example, detecting the border of the mat generally includes differentiating the second spectral reflectance characteristic of the border from the first spectral reflectance characteristic of the surface area. Examples further include detecting a border of the content displayed onto the mat, and adjusting projector settings so that the border of the displayed content fits within the detected border of the mat.
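The adjustment step above can be sketched numerically. This is a minimal sketch under assumptions: both borders are taken as axis-aligned rectangles (x, y, width, height) in a shared coordinate frame, and the returned scale/offset stand in for whatever projector settings the system actually adjusts.

```python
# Hedged sketch: compute a uniform scale and an offset that move the detected
# content border inside the detected mat border. The tuple layout and the
# function name are illustrative, not from the source.

def fit_content_to_mat(mat, content):
    mx, my, mw, mh = mat
    cx, cy, cw, ch = content
    # Uniform scale so the content is no wider and no taller than the mat.
    scale = min(mw / cw, mh / ch)
    # Offset that moves the scaled content's origin onto the mat's origin.
    dx = mx - cx * scale
    dy = my - cy * scale
    return scale, dx, dy

scale, dx, dy = fit_content_to_mat((10, 10, 100, 80), (0, 0, 200, 160))
print(scale, dx, dy)  # 0.5 10.0 10.0
```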
Abstract:
A projector includes a lens, an image capturing device, and a processor connected to the projector and the image capturing device. The processor executes instructions for illuminating a work space using the lens, defocusing the lens, and, after defocusing the lens, capturing an image of the work space.
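The sequence above (illuminate, defocus, then capture) can be sketched as a simple controller. The class and method names here are hypothetical stand-ins, not a real device API.

```python
# Illustrative sketch of the capture sequence: illuminate the work space,
# defocus the lens, then capture an image. All interfaces are assumptions.

class CaptureController:
    def __init__(self):
        self.log = []

    def illuminate(self):
        self.log.append("illuminate")

    def defocus_lens(self):
        # Defocusing blurs the projected illumination so that projected
        # content does not appear sharply in the captured image.
        self.log.append("defocus")

    def capture(self):
        self.log.append("capture")
        return "image"

ctl = CaptureController()
ctl.illuminate()
ctl.defocus_lens()
image = ctl.capture()
print(ctl.log)  # ['illuminate', 'defocus', 'capture']
```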
Abstract:
A projection capture system includes a camera to capture video of objects in a capture space, and a light emitting diode (LED) projector to illuminate the objects in the capture space and to project images captured by the camera into a display space. The projector includes a sequential display mode for sequentially displaying red, green, and blue light to project images captured by the camera into the display space, and a camera flash mode for simultaneously displaying red, green, and blue light to provide white light for illuminating the objects in the capture space during video capture.
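The two LED drive modes above can be sketched as follows. This is an illustrative model only: the dict-of-channels representation of each display interval is an assumption.

```python
# Hedged sketch of the two modes: sequential display mode cycles one color
# channel at a time (R, G, B), while camera flash mode drives all three
# simultaneously to produce white light for video capture.

def sequential_frames(n):
    """Sequential display mode: one channel on per interval, cycling R, G, B."""
    order = ("red", "green", "blue")
    return [{c: (c == order[i % 3]) for c in order} for i in range(n)]

def flash_frame():
    """Camera flash mode: all channels on at once, yielding white light."""
    return {"red": True, "green": True, "blue": True}

print(sequential_frames(3))
print(flash_frame())
```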
Abstract:
A digital light projector has a plurality of color channels including at least one visible color channel providing visible light and at least one invisible color channel providing invisible light. The digital light projector includes a projecting device that projects light from the plurality of color channels onto an environment as an array of pixels which together form a video image including a visible image and an invisible image. The video image comprises a series of frames, with each frame formed by the array of pixels. To form each pixel of each frame, the projecting device sequentially projects a series of light pulses from light provided by each of the plurality of color channels, with light pulses from the at least one visible color channel forming the visible image and light pulses from the at least one invisible color channel forming the invisible image.
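The per-pixel pulse sequencing above can be sketched as follows. This is a crude model under assumptions: the channel names, the pulse count per channel, and the encoding of intensity as the fraction of pulses that are on (a simple PWM model rather than true bit-plane modulation) are all illustrative.

```python
# Hedged sketch: for each pixel of each frame, emit pulses sequentially from
# every color channel; visible-channel pulses build the visible image and
# invisible-channel (e.g. infrared) pulses build the invisible image.

VISIBLE = ("red", "green", "blue")
INVISIBLE = ("infrared",)

def pixel_pulse_train(intensities, pulses_per_channel=4):
    """Emit (channel, on) pulses for one pixel of one frame.

    intensities maps each channel to a 0..1 level; the level is encoded as
    the fraction of that channel's pulses that are on.
    """
    train = []
    for channel in VISIBLE + INVISIBLE:
        on_count = round(intensities.get(channel, 0.0) * pulses_per_channel)
        for i in range(pulses_per_channel):
            train.append((channel, i < on_count))
    return train

train = pixel_pulse_train({"red": 1.0, "infrared": 0.5})
print(len(train))  # 4 channels x 4 pulses = 16
```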
Abstract:
Examples disclosed herein relate to determining a segmentation boundary based on images representing an object. Examples include capturing an IR image based on IR light reflected by an object disposed between an IR camera and an IR-absorbing surface, capturing a color image representing the object disposed between a color camera and the IR-absorbing surface, and determining a segmentation boundary for the object based on the IR image and the color image.
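One way the two images could be combined is sketched below. This is a minimal illustration under assumptions: because the surface absorbs IR, the background appears dark in the IR image, so thresholding it gives a coarse object mask, and a per-pixel color-difference mask can refine it. The thresholds, the OR combination, and the list-of-lists image format are illustrative, not from the source.

```python
# Hedged sketch: mark a pixel as part of the object if it is bright in the
# IR image (reflective object over an IR-absorbing surface) or differs
# strongly from the background in the color image.

def segmentation_mask(ir_image, color_diff, ir_threshold=50, color_threshold=30):
    """Return a boolean mask; True marks object pixels inside the boundary."""
    rows, cols = len(ir_image), len(ir_image[0])
    return [[ir_image[r][c] > ir_threshold or color_diff[r][c] > color_threshold
             for c in range(cols)] for r in range(rows)]

ir = [[0, 200], [0, 180]]    # object reflects IR; the surface absorbs it
diff = [[0, 10], [40, 90]]   # per-pixel color difference from the background
print(segmentation_mask(ir, diff))  # [[False, True], [True, True]]
```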