Abstract:
A method includes detecting, with a passive infrared sensor (PIR), a level of infrared radiation in a field of view (FOV) of the PIR, generating a signal based on detected levels over a period of time, the signal having values that exhibit a change in the detected levels, extracting a local feature from a sample of the signal, wherein the local feature indicates a probability that a human in the FOV caused the change in the detected levels, extracting a global feature from the sample of the signal, wherein the global feature indicates a probability that an environmental radiation source caused the change in the detected levels, determining a score based on the local feature and the global feature, and determining that a human motion has been detected in the FOV based on the score.
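The abstract above describes scoring a PIR signal with a "local" feature (abrupt change, suggestive of a human) and a "global" feature (slow environmental drift). A minimal sketch of that idea follows; the specific features (sample-to-sample jump for local, linear drift for global), the ratio-based score, and the threshold are all assumptions for illustration, not the patented method.

```python
import numpy as np

def human_motion_score(signal):
    """Illustrative score combining a local and a global feature.

    Local feature: sharpest sample-to-sample jump, since a human moving
    through the field of view tends to produce abrupt swings.
    Global feature: slow linear drift across the whole sample, since an
    environmental source (e.g. sunlight) warms the scene gradually.
    """
    signal = np.asarray(signal, dtype=float)
    t = np.arange(len(signal))
    # Local feature: largest absolute change between adjacent samples.
    local = np.max(np.abs(np.diff(signal)))
    # Global feature: magnitude of the linear trend over the whole window.
    slope = np.polyfit(t, signal, 1)[0]
    drift = abs(slope) * len(signal)
    # Score: abruptness relative to drift; high scores suggest a human.
    return local / (drift + 1e-9)

def is_human_motion(signal, threshold=1.0):
    """Assumed decision rule: score above a fixed threshold."""
    return human_motion_score(signal) > threshold
```

With this sketch, a short pulse (abrupt rise and fall) scores high, while a slow ramp of the same amplitude scores low, matching the intuition of the local/global split.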
Abstract:
A process generates a lookup table to estimate spatial depth in a visual scene. The process identifies subsets of illuminators of a camera system with image sensors and illuminators. The image sensors are associated with multiple pixels. For each pixel, and for each of multiple depths from the pixel, the process simulates a virtual surface at the depth. For each subset of the subsets of illuminators, the process simulates illumination of the virtual surface from the subset and determines an expected light intensity at the pixel from light reflected from the virtual surface due to the simulated illumination. The process forms intensity information from the expected light intensity determined for the pixel for each of the depths and each of the subsets. The process constructs a lookup table comprising the intensity information. The lookup table associates the intensity information for each pixel with the respective depth and the respective subset.
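The table-construction loop described above (per pixel, per depth, per illuminator subset) can be sketched as follows. The geometry here is an assumption: point illuminators, a virtual surface placed along each pixel's viewing ray, and inverse-square falloff on both legs of the light path; the abstract does not specify the illumination model.

```python
import math

def expected_intensity(pixel_dir, illuminator_pos, depth, power=1.0):
    """Expected intensity at a pixel from a simulated virtual surface.

    pixel_dir: unit viewing direction of the pixel (assumed).
    illuminator_pos: 3-D position of one point illuminator (assumed).
    Falloff model (inverse-square on each leg) is an assumption.
    """
    # Virtual surface point along the pixel's viewing ray at this depth.
    surface = tuple(depth * c for c in pixel_dir)
    # Distance from the illuminator to the simulated surface point.
    d_illum = math.dist(illuminator_pos, surface)
    return power / ((d_illum ** 2) * (depth ** 2) + 1e-12)

def build_lookup_table(pixel_dirs, illuminator_subsets, depths):
    """table[pixel][depth_index][subset_index] -> expected intensity."""
    table = {}
    for p, pdir in enumerate(pixel_dirs):
        table[p] = []
        for depth in depths:
            # One simulated illumination per subset at this depth.
            row = [sum(expected_intensity(pdir, ipos, depth)
                       for ipos in subset)
                   for subset in illuminator_subsets]
            table[p].append(row)
    return table
```

The resulting structure associates, for each pixel, the expected intensities with the depth and subset that produced them, as the abstract describes.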
Abstract:
A process reduces false-positive security alerts. The process is performed at a computing device having one or more processors and memory storing one or more programs configured for execution by the one or more processors. The process computes a depth map for a scene monitored by a video camera using a plurality of IR images captured by the video camera and uses the depth map to identify a first region within the scene that historically has above-average false-positive motion detections. In some instances, the first region is a ceiling, a window, or a television. The process monitors a video stream provided by the video camera to identify motion events, excluding the first region, and generates a motion alert when there is detected motion in the scene outside of the first region and the detected motion satisfies threshold criteria.
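The exclusion logic above (ignore motion inside a false-positive-prone region, alert only on sufficient motion elsewhere) can be sketched with simple frame differencing. The differencing approach, the thresholds, and the boolean mask interface are assumptions for illustration; the abstract only specifies that the region comes from a depth map and is excluded from alerting.

```python
import numpy as np

def motion_alert(prev_frame, frame, exclusion_mask,
                 diff_thresh=30, area_thresh=50):
    """Illustrative alert decision with a depth-derived exclusion region.

    exclusion_mask: boolean array, True where the depth map flagged a
    historically false-positive-prone region (e.g. ceiling, window, TV).
    """
    # Per-pixel change between consecutive frames.
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    moving = diff > diff_thresh
    # Ignore any motion that falls inside the excluded region.
    moving &= ~exclusion_mask
    # Alert only when enough pixels outside the region changed
    # (a stand-in for the abstract's "threshold criteria").
    return int(moving.sum()) >= area_thresh
```

A television flickering inside the mask then produces no alert, while the same amount of motion outside the mask does.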
Abstract:
A process creates a depth map of a scene. The process is performed at a computing device having one or more processors, and memory storing one or more programs configured for execution by the one or more processors. For each of a plurality of distinct subsets of illuminators of a camera system, the process receives a captured image of a first scene taken by a 2-dimensional array of image sensors of the camera system while the respective subset of illuminators are emitting light and the illuminators not in the respective subset are not emitting light. The image sensors are partitioned into a plurality of pixels. For each pixel, the process uses the captured images to form a respective vector of light intensity at the pixel and estimates a depth in the first scene at the respective pixel by looking up the respective vector in a respective lookup table.
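The per-pixel lookup step above (form an intensity vector from the captured images, then find its depth in the pixel's table) can be sketched as a nearest-neighbour match. Normalizing both vectors before matching, and using Euclidean distance as the metric, are assumptions; the abstract does not specify how the lookup compares vectors.

```python
import numpy as np

def estimate_depth(observed_vector, lookup_vectors, depths):
    """Estimate depth at one pixel by matching its observed intensity
    vector (one entry per illuminator subset) against the pixel's
    lookup table. The matching metric is an assumption.
    """
    obs = np.asarray(observed_vector, dtype=float)
    obs = obs / (np.linalg.norm(obs) + 1e-12)   # compare shapes, not scale
    table = np.asarray(lookup_vectors, dtype=float)
    table = table / (np.linalg.norm(table, axis=1, keepdims=True) + 1e-12)
    # Nearest table entry wins; its associated depth is the estimate.
    best = int(np.argmin(np.linalg.norm(table - obs, axis=1)))
    return depths[best]
```

Normalization makes the match insensitive to overall surface reflectance, so only the *pattern* of intensities across subsets determines the depth.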
Abstract:
A process generates lookup tables for estimating spatial depth in a scene. The process identifies subsets of illuminators of a camera system that has a 2-dimensional array of image sensors and illuminators in fixed locations relative to the array, and partitions the image sensors into a plurality of pixels. For each pixel, and for each of m distinct depths from the respective pixel, the process simulates a virtual surface at the respective depth. For each of the subsets of illuminators, the process determines an expected light intensity at the pixel based on the respective depth. The process forms an intensity vector using the expected light intensities for each of the distinct subsets and normalizes the intensity vector. For each pixel, the process constructs a lookup table comprising the normalized vectors corresponding to the pixel. The lookup table associates each normalized vector with the depth of the corresponding simulated surface.
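The normalization step described above can be sketched per pixel as follows. The `intensity_fn(subset, depth)` callable stands in for the simulation of the virtual surface and is an assumed interface, not part of the abstract.

```python
import numpy as np

def normalize(vec):
    """Scale a vector to unit length (zero vectors pass through)."""
    vec = np.asarray(vec, dtype=float)
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def build_pixel_table(intensity_fn, subsets, depths):
    """For one pixel, build (normalized_vector, depth) pairs.

    intensity_fn(subset, depth) -> expected intensity at the pixel; a
    stand-in for the simulation step described in the abstract.
    """
    return [(normalize([intensity_fn(s, d) for s in subsets]), d)
            for d in depths]
```

Normalizing each vector discards overall brightness (which depends on surface reflectance) and keeps only the relative pattern across subsets, which is what varies with depth.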
Abstract:
A process classifies objects in a scene. The process receives a captured IR image of a scene taken by a 2-dimensional image sensor array of a camera system while one or more IR illuminators of the camera system are emitting IR light, thereby forming an IR intensity map of the scene with a respective intensity value determined for each pixel of the IR image. The process uses the IR intensity map to identify a plurality of pixels whose corresponding intensity values are within a predefined intensity range, and clusters the identified plurality of pixels into one or more regions that are substantially contiguous. The process determines that a first region of the one or more regions corresponds to a specific material based, at least in part, on the intensity values of the pixels in the first region. The process then stores information in memory that identifies the first region.
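The thresholding-and-clustering step above can be sketched as connected-component labeling over the pixels whose intensity falls in the predefined range. The choice of 4-connectivity and the breadth-first flood fill are assumptions; the abstract only requires that the regions be "substantially contiguous".

```python
import numpy as np
from collections import deque

def contiguous_regions(intensity_map, lo, hi, min_size=1):
    """Cluster pixels with intensity in [lo, hi] into 4-connected
    regions (connectivity choice is an assumption). Returns a list of
    regions, each a list of (row, col) pixel coordinates."""
    img = np.asarray(intensity_map)
    in_range = (img >= lo) & (img <= hi)
    seen = np.zeros(img.shape, dtype=bool)
    regions = []
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            if in_range[r, c] and not seen[r, c]:
                # Flood-fill one contiguous region from this seed pixel.
                region, queue = [], deque([(r, c)])
                seen[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                                and in_range[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(region) >= min_size:
                    regions.append(region)
    return regions
```

A material classifier could then examine the intensity statistics of each returned region, as the abstract describes for the first region.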
Abstract:
A process generates depth maps of a scene in the field of vision of a camera. The camera has a plurality of illuminators, a lens assembly, an image sensing element, a processor, and memory. In a first mode, all of the illuminators provide illumination together. The lens assembly focuses incident light on the image sensing element. The memory stores image data from the image sensing element. The processor executes programs to control operation of the camera. The process operates in a second mode, where each of a plurality of subsets of the illuminators provides illumination separately from the other subsets. The process sequentially activates each of the subsets to illuminate a scene and receives reflected illumination from the scene incident on the lens assembly and focused onto the image sensing element. The process measures light intensity values at the image sensing element and stores those measured light intensity values.
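The second-mode capture loop above (activate one subset, capture, repeat) can be sketched with two stand-in callables for the camera hardware. Both `activate_subset` and `capture_frame` are assumed interfaces for illustration; the abstract does not specify a control API.

```python
def capture_depth_frames(activate_subset, capture_frame, subsets):
    """Second-mode capture: one frame per illuminator subset.

    activate_subset(subset): turn on only these illuminators (assumed).
    capture_frame(): read measured intensities off the sensor (assumed).
    """
    frames = []
    for subset in subsets:
        activate_subset(subset)          # only this subset emits light
        frames.append(capture_frame())   # store the measured intensities
    activate_subset([])                  # restore: all illuminators off
    return frames
```

The stored per-subset frames are exactly the measured light intensity values the abstract says are retained for later depth-map computation.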
Abstract:
A camera system includes memory, a lens assembly arranged to direct light from a scene onto an image sensing element, an image sensing element configured to receive light from the scene via the lens assembly, one or more infrared illuminators configured to transmit infrared light, and a processor coupled to the image sensing element and the illuminators. The processor is configured to operate the illuminators and the image sensing element in a first mode whereby infrared light transmitted by the illuminators and reflected from the scene is used to generate a two-dimensional image of the scene. The processor is also configured to operate the illuminators and the image sensing element in a second mode whereby infrared light transmitted by the illuminators and reflected from the scene is used to detect a first region, located within the scene, having a specific material.
Abstract:
Systems and methods of detecting human movement with a sensor are provided, including generating a motion event signal in response to movement detected by the sensor, and generating a parameterized curve to represent the detected motion. The parameterized curve is fit to a predetermined window of sensor data captured by the sensor to filter the motion event signal. A noise magnitude estimate and a curve fit error are determined based on the parameterized curve fitted to the predetermined window. A detection threshold value is determined based on the curve fit error, a noise source signal estimate of known noise, and zero or more noise magnitudes from other sources. Human motion is determined by correlating a true motion event signal with human motion based on a comparison between a value of a point on the parameterized curve and the detection threshold value.
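The pipeline above (fit a parameterized curve, estimate noise from the fit error, derive a threshold, compare a point on the curve against it) can be sketched as follows. The polynomial family, the RMS-residual noise estimate, and the k-sigma threshold rule are all assumptions standing in for the unspecified details of the abstract.

```python
import numpy as np

def detect_human_motion(window, degree=3, known_noise=0.0, k=3.0):
    """Illustrative sketch of the described detection pipeline.

    1. Fit a parameterized curve (here: a polynomial) to the window.
    2. Use the RMS fit residual as the noise-magnitude / fit-error estimate.
    3. Build a detection threshold from the fit error plus known noise.
    4. Compare the curve's peak excursion against the threshold.
    """
    y = np.asarray(window, dtype=float)
    t = np.arange(len(y))
    coeffs = np.polyfit(t, y, degree)
    fitted = np.polyval(coeffs, t)
    # Fit error doubles as a noise magnitude estimate (assumption).
    fit_error = np.sqrt(np.mean((y - fitted) ** 2))
    threshold = k * fit_error + known_noise
    # Value of a point on the curve: its peak deviation from the mean.
    peak = np.max(np.abs(fitted - np.mean(fitted)))
    return peak > threshold, peak, threshold
```

A smooth pulse is well explained by the curve (small residual, low threshold, large peak) and is flagged as true motion, while high-frequency chatter leaves a large residual and a small fitted peak, so it is filtered out.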