Abstract:
Certain aspects of the present disclosure relate to a system. In some aspects, the system may include an emitter configured to emit a light. The system may include a light guide configured to direct the light toward a pupil of an eye. The light guide may include a directing portion configured to propagate the light. The light guide may include a turning portion configured to receive light from the directing portion and direct the light toward the pupil of the eye and configured to receive reflected light directed from the pupil of the eye and direct the reflected light toward the directing portion. The system may include a receiver configured to receive the reflected light from the light guide. The system may include a processor configured to determine a gaze direction of the pupil of the eye based at least in part on the reflected light received by the receiver.
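The abstract does not specify how the processor maps the received reflected light to a gaze direction, so the following is only a minimal, hypothetical sketch: it assumes the receiver reports one reflected-light intensity per emitter/turning-portion channel and estimates gaze as a calibration-weighted centroid of those readings. All names (Channel, estimate_gaze, the per-channel directions) are illustrative assumptions, not the disclosed method.

```python
# Illustrative sketch only: assumes one reflected-light intensity per
# light-guide channel and a calibrated probe direction for each channel.
from dataclasses import dataclass
from typing import Sequence, Tuple

@dataclass
class Channel:
    direction: Tuple[float, float]   # (horizontal, vertical) angle this channel probes, degrees
    intensity: float                 # reflected-light reading from the receiver

def estimate_gaze(channels: Sequence[Channel]) -> Tuple[float, float]:
    """Return a (horizontal, vertical) gaze-angle estimate in degrees."""
    total = sum(c.intensity for c in channels)
    if total == 0:
        return (0.0, 0.0)            # no reflection detected; assume straight ahead
    h = sum(c.direction[0] * c.intensity for c in channels) / total
    v = sum(c.direction[1] * c.intensity for c in channels) / total
    return (h, v)

# Example: strongest reflection on the channel probing 10 degrees to the right.
print(estimate_gaze([Channel((-10.0, 0.0), 0.1),
                     Channel((0.0, 0.0), 0.3),
                     Channel((10.0, 0.0), 0.9)]))
```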
Abstract:
Methods, systems, computer-readable media, and apparatuses for image processing and utilization are presented. In some embodiments, an image containing at least a face of a user may be obtained using a mobile device. An orientation of the face of the user within the image may be determined using the mobile device. The orientation of the face of the user may be determined using multiple stages: (a) a rotation stage for controlling a rotation applied to a portion of the image, to generate a rotated image portion, and (b) an orientation stage for controlling an orientation applied to orientation-specific feature detection performed on the rotated image portion. The determined orientation of the face of the user may be utilized as a control input to modify a display rotation of the mobile device.
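A minimal sketch of the two-stage search described above, under stated assumptions: the candidate rotation steps, the placeholder 90-degree rotation helper, and the scoring callback are illustrative, not the patented implementation.

```python
# Stage (a) rotates the image portion; stage (b) runs orientation-specific
# face detection on the rotated portion; the best-scoring pair is returned.
import numpy as np
from typing import Callable, Sequence, Tuple

def rotate(img: np.ndarray, deg: float) -> np.ndarray:
    # Placeholder: 90-degree steps only, for brevity.
    return np.rot90(img, k=int(round(deg / 90.0)) % 4)

def detect_face_orientation(
    image_portion: np.ndarray,
    rotations_deg: Sequence[float],                    # rotation-stage candidates
    detector_orientations_deg: Sequence[float],        # orientation-stage candidates
    score_face: Callable[[np.ndarray, float], float],  # orientation-specific detector
) -> Tuple[float, float]:
    """Return (rotation, detector orientation) maximizing the face score."""
    best, best_score = (0.0, 0.0), float("-inf")
    for rot in rotations_deg:                          # stage (a)
        rotated = rotate(image_portion, rot)
        for orient in detector_orientations_deg:       # stage (b)
            score = score_face(rotated, orient)
            if score > best_score:
                best_score, best = score, (rot, orient)
    return best

# Hypothetical usage with a trivial scorer that prefers upright detection.
img = np.zeros((8, 8))
print(detect_face_orientation(img, [0, 90, 180, 270], [0, 90],
                              lambda im, o: -abs(o)))
```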
Abstract:
Techniques disclosed herein utilize a vision sensor that integrates a special-purpose camera with dedicated computer vision (CV) computation hardware and a dedicated low-power microprocessor for the purposes of detecting, tracking, recognizing, and/or analyzing subjects, objects, and scenes in the view of the camera. The vision sensor processes the information retrieved from the camera using the included low-power microprocessor and sends “events” (or indications that one or more reference occurrences have occurred, and, possibly, associated data) to the main processor only when needed or as defined and configured by the application. This allows the general-purpose microprocessor (which is typically relatively high-speed and high-power to support a variety of applications) to stay in a low-power mode (e.g., a sleep mode) most of the time, becoming active only when events are received from the vision sensor.
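A hedged illustration of the event-driven flow described above: the dedicated low-power microprocessor screens camera frames and posts an "event" only when a configured reference occurrence is detected, so the general-purpose processor can remain asleep otherwise. The frame format, the face_present flag, and the function names are illustrative assumptions.

```python
# Simulated vision sensor (low-power side) and main processor (high-power side).
import queue

event_queue: "queue.Queue[dict]" = queue.Queue()

def vision_sensor_step(frame: dict) -> None:
    """Runs on the dedicated low-power microprocessor."""
    if frame.get("face_present"):                    # dedicated CV computation hardware
        event_queue.put({"event": "face_detected",   # indication of a reference occurrence
                         "frame_id": frame["id"]})   # optional associated data

def main_processor() -> None:
    """The high-power processor wakes only when an event has been queued."""
    while not event_queue.empty():
        print("woke up for:", event_queue.get())

# Simulated camera frames: only frame 2 triggers an event.
for f in [{"id": 0}, {"id": 1}, {"id": 2, "face_present": True}]:
    vision_sensor_step(f)
main_processor()
```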
Abstract:
An apparatus includes a hardware sensor array including a plurality of pixels arranged along at least a first dimension and a second dimension of the array, each of the pixels capable of generating a sensor reading. A hardware scanning window array includes a plurality of storage elements arranged along at least a first dimension and a second dimension of the hardware scanning window array, each of the storage elements capable of storing a pixel value based on one or more sensor readings. Peripheral circuitry systematically transfers pixel values, based on sensor readings, into the hardware scanning window array, to cause different windows of pixel values to be stored in the hardware scanning window array at different times. Control logic coupled to the hardware sensor array, the hardware scanning window array, and the peripheral circuitry provides control signals to the peripheral circuitry to control the transfer of pixel values.
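A sketch of the scanning-window behavior under stated assumptions: a NumPy array stands in for the hardware sensor array, and successive windows of pixel values are "transferred" into a smaller buffer at different times, as the peripheral circuitry would do under control-logic signals. Window size and stride are illustrative.

```python
# Software model of transferring different windows of pixel values, one at a time.
import numpy as np

def scan_windows(sensor_array: np.ndarray, win_h: int, win_w: int, stride: int = 1):
    """Yield each window of pixel values that would occupy the scanning window array."""
    rows, cols = sensor_array.shape
    for r in range(0, rows - win_h + 1, stride):
        for c in range(0, cols - win_w + 1, stride):
            # One "transfer": copy a window of pixel values into the buffer.
            yield sensor_array[r:r + win_h, c:c + win_w].copy()

sensor = np.arange(36).reshape(6, 6)      # 6x6 stand-in for the hardware sensor array
for window in scan_windows(sensor, 3, 3, stride=3):
    print(window)
```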
Abstract:
Multi-level cell (MLC) static random access memory (SRAM) (MLC SRAM) cells configured to perform multiplication operations are disclosed. In one aspect, an MLC SRAM cell includes SRAM bit cells, wherein the data values stored in the SRAM bit cells correspond to a multiple-bit value stored in the MLC SRAM cell that serves as a first operand in a multiplication operation. A voltage applied to a read bit line is applied to each SRAM bit cell, wherein the voltage is an analog representation of a multiple-bit value that serves as a second operand in the multiplication operation. For each SRAM bit cell, if a particular binary data value is stored, a current correlating to the voltage of the read bit line is added to a current sum line. A magnitude of the current on the current sum line is an analog representation of a multiple-bit product of the first operand multiplied by the second operand.
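A behavioral sketch of the multiply-by-summed-currents idea, with assumptions flagged: the abstract says each storing bit cell adds a current correlating to the read-bit-line voltage, and this sketch additionally assumes each bit cell's contribution is weighted by its binary significance so that the summed current is proportional to the multi-bit product. The unit conductance and function name are illustrative.

```python
# Behavioral model: current_sum approximates first_operand * second_operand.
def mlc_sram_multiply(first_operand_bits, read_bit_line_voltage, unit_conductance=1.0):
    """first_operand_bits: LSB-first list of stored bits (the first operand).
    read_bit_line_voltage: analog representation of the second operand."""
    current_sum = 0.0
    for position, bit in enumerate(first_operand_bits):
        if bit == 1:                                   # only storing cells contribute current
            # Assumed binary weighting of each bit cell's current contribution.
            current_sum += (2 ** position) * unit_conductance * read_bit_line_voltage
    return current_sum                                 # analog representation of the product

# Example: stored value 0b101 (= 5) multiplied by a voltage encoding 3.0.
print(mlc_sram_multiply([1, 0, 1], 3.0))               # -> 15.0
```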
Abstract:
Techniques are described for computing computer vision (CV) features based on sensor readings from a sensor and detecting macro-features based on the CV features. The sensor may include a sensor element array that includes a plurality of sensor elements. The sensor may also include in-pixel circuitry coupled to the sensor elements, peripheral circuitry and/or a dedicated microprocessor coupled to the sensor element array. The in-pixel circuitry, the peripheral circuitry or the dedicated microprocessor may include computation structures configured to perform analog or digital operations representative of a multi-pixel computation for a sensor element (or block of sensor elements), based on sensor readings generated by neighboring sensor elements in proximity to the sensor element, and to generate CV features. The dedicated microprocessor may process the CV features and detect macro-features. Furthermore, in certain embodiments, the dedicated microprocessor may be coupled to a second microprocessor through a wired or wireless interface.
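The abstract does not name a specific CV feature, so the following sketch uses a local binary pattern (LBP), a common multi-pixel computation over a sensor element's eight neighbors, purely as an illustrative stand-in for the in-pixel or peripheral computation structures described above.

```python
# Digital model of a per-element CV feature computed from neighboring readings.
import numpy as np

def lbp_feature(pixels: np.ndarray, r: int, c: int) -> int:
    """Compute an 8-bit LBP code for the sensor element at (r, c)."""
    center = pixels[r, c]
    # Clockwise neighbor offsets starting at the top-left neighbor.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if pixels[r + dr, c + dc] >= center:           # compare neighbor against center
            code |= 1 << bit
    return code

readings = np.array([[5, 9, 1],
                     [4, 6, 7],
                     [2, 8, 3]])
print(lbp_feature(readings, 1, 1))                     # CV feature for the center element
```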
Abstract:
Certain aspects of the present disclosure support a technique for efficient implementation of neural population diversity in neural systems. A set of parameters for each class of artificial neurons of a plurality of classes can be stored in a storage medium. A generator can be configured to obtain noise parameters for each class of artificial neurons in the neural system. After that, the noise parameters can be combined with the set of parameters for each class of artificial neurons to obtain a dithered set of parameters for each class of artificial neurons. The dithered set of parameters can be stored for each class of artificial neurons to be used for a neuron model for the artificial neurons that emulates behavior of the artificial neurons in the neural system.
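A minimal sketch of the parameter-dithering idea, assuming Gaussian noise parameters per class; the storage medium, generator, parameter names, and neuron classes are represented by ordinary Python objects purely for illustration.

```python
# Combine stored class parameters with generated noise to obtain dithered
# per-neuron parameters for each class of artificial neurons.
import numpy as np

rng = np.random.default_rng(seed=0)                    # the "generator"

# Stored set of parameters for each class of artificial neurons (illustrative).
class_params = {
    "excitatory": {"threshold": 1.0, "time_constant": 20.0},
    "inhibitory": {"threshold": 0.8, "time_constant": 10.0},
}

# Noise parameters obtained for each class (standard deviations, assumed Gaussian).
noise_params = {
    "excitatory": {"threshold": 0.05, "time_constant": 2.0},
    "inhibitory": {"threshold": 0.02, "time_constant": 1.0},
}

def dither(params, noise, n_neurons):
    """Combine class parameters with noise to get a dithered set per neuron."""
    return {name: value + rng.normal(0.0, noise[name], size=n_neurons)
            for name, value in params.items()}

# Dithered parameters stored per class, ready for use by the neuron model.
dithered = {cls: dither(p, noise_params[cls], n_neurons=4)
            for cls, p in class_params.items()}
print(dithered["excitatory"]["threshold"])
```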