Abstract:
A neural network device may generate an input feature list based on an input feature map, where the input feature list includes an input feature index and an input feature value; generate an output feature index based on the input feature index corresponding to an input feature included in the input feature list and a weight index corresponding to a weight included in a weight list; and generate an output feature value corresponding to the output feature index based on the input feature value corresponding to the input feature and a weight value corresponding to the weight.
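The index-pair scheme described above can be illustrated with a small sketch. This is a hypothetical model, not the patented circuit: nonzero input features and weights are kept as (index, value) lists, and each output index is derived here by adding an input index and a weight index, as in a one-dimensional convolution written in "scatter" form. All function and variable names are invented for illustration.

```python
def sparse_convolve(input_list, weight_list):
    # input_list / weight_list: lists of (index, value) pairs for
    # nonzero entries only; zero features are never visited.
    outputs = {}
    for in_idx, in_val in input_list:
        for w_idx, w_val in weight_list:
            out_idx = in_idx + w_idx  # output feature index from the two indices
            # accumulate the product of input feature value and weight value
            outputs[out_idx] = outputs.get(out_idx, 0) + in_val * w_val
    return outputs

# A sparse input [0, 3, 0, 0, 2] becomes [(1, 3), (4, 2)];
# a kernel [1, -1] becomes [(0, 1), (1, -1)].
features = [(1, 3), (4, 2)]
weights = [(0, 1), (1, -1)]
print(sparse_convolve(features, weights))  # {1: 3, 2: -3, 4: 2, 5: -2}
```

Because only nonzero (index, value) pairs are iterated, work scales with the number of nonzero features rather than with the full feature-map size, which is the benefit a list-based representation targets.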
Abstract:
A projection apparatus, in which an image projection unit and a screen unit are integrally mounted on one mounting unit, is provided. The screen unit includes a screen configured to display an image, and a screen attaching part to which the screen is attached. The screen attaching part includes a front plate to which the screen is attached, a rear plate, and a strength reinforcing member interposed between the front plate and the rear plate.
Abstract:
A projection apparatus includes an image projection unit configured to form an image and project the formed image, a screen unit configured to display the image projected from the image projection unit, and a mounting unit on which the image projection unit and the screen unit are integrally mounted. The screen unit is formed to have a curvature.
Abstract:
A screen device and a projection apparatus having the same are provided. The screen device includes a screen on which an image is projected, and a support panel which includes a pair of plates and a strength reinforcement member arranged between the pair of plates, wherein the screen is attached to any one of the pair of plates.
Abstract:
An integrated circuit included in a device for performing a neural network operation includes a buffer configured to store feature map data in units of cells each including at least one feature, wherein the feature map data is for use in the neural network operation; and a multiplexing circuit configured to receive the feature map data from the buffer, and output extracted data by extracting feature data of one of the features included within a plurality of cells in the received feature map data, the features each corresponding to an identical coordinate value.
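A minimal software model of the extraction step may help: the buffer holds cells (small groups of features), and the multiplexing circuit selects, from several cells, the feature whose intra-cell coordinate matches a requested value. The dictionary-based representation below is an assumption made purely for illustration; the actual circuit is hardware multiplexing.

```python
def extract(cells, coord):
    # cells: list of dicts mapping an intra-cell coordinate to a feature value.
    # Returns the feature at the identical coordinate from each cell that has it,
    # mimicking the multiplexer picking one feature per cell.
    return [cell[coord] for cell in cells if coord in cell]

cells = [{(0, 0): 1.0, (0, 1): 2.0},
         {(0, 0): 3.0, (0, 1): 4.0}]
print(extract(cells, (0, 1)))  # [2.0, 4.0]
```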
Abstract:
Some example embodiments may involve performing a convolution operation of a neural network based on a Winograd transform. Some example embodiments may involve a device including neural network processing circuitry configured to generate a transformed input feature map by performing a Winograd transform on an input feature map, the transformed input feature map having a matrix form and including a plurality of channels; to perform element-wise multiplications between a feature vector of the transformed input feature map and a weight vector of a transformed weight kernel obtained based on the Winograd transform; and to add the element-wise multiplication results, the element-wise multiplications being performed channel-by-channel with respect to the feature vector, which includes feature values at a position across the plurality of channels of the transformed input feature map.
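The Winograd scheme above can be sketched with the standard one-dimensional F(2,3) transform (two outputs, three-tap kernel): transform the input tile, transform the kernel, multiply element-wise, then apply the inverse transform. The tile size and function names below are illustrative choices, not details taken from the abstract.

```python
def winograd_f23(d, g):
    # d: 4-element input tile, g: 3-tap kernel (single channel).
    td = [d[0] - d[2], d[1] + d[2], d[2] - d[1], d[1] - d[3]]  # B^T d
    tg = [g[0], (g[0] + g[1] + g[2]) / 2,
          (g[0] - g[1] + g[2]) / 2, g[2]]                      # G g
    m = [td[i] * tg[i] for i in range(4)]                      # element-wise products
    return [m[0] + m[1] + m[2], m[1] - m[2] - m[3]]            # A^T m (inverse transform)

# Matches the direct 3-tap convolution on the same tile:
#   y0 = d0*g0 + d1*g1 + d2*g2,  y1 = d1*g0 + d2*g1 + d3*g2
print(winograd_f23([1, 2, 3, 4], [1, 0, -1]))  # [-2.0, -2.0]
```

For multiple channels, as the abstract describes, the element-wise products `m` would be computed per channel and summed in the transform domain before the single inverse transform, which is what makes the per-position feature vector useful.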
Abstract:
A method of operating a neural network device configured to perform a neural network operation on successively input image frames includes generating, by a processing circuit, a second delta feature map by performing a linear operation on a first delta feature map generated based on a difference between a current image frame and a previous image frame; loading feature values as a second previous feature map onto the processing circuit from at least one memory, the loaded feature values being feature values corresponding to a first partial region to be updated in a first feature map stored in the at least one memory; generating, at the processing circuit, a second current feature map based on the second delta feature map and the second previous feature map; and updating the first feature map by storing the second current feature map in the at least one memory.
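A simplified sketch of the delta-based update may clarify the data flow: only the partial region of the stored feature map touched by the delta is loaded, the current values are formed from the previous values plus the delta, and the result is written back. The flat-list representation and all names are assumptions for illustration; the abstract's linear operation on the delta is taken as already applied.

```python
def update_feature_map(stored, delta2, region):
    # stored: full feature map kept in memory (flat list of values)
    # delta2: second delta feature map (linear operation already applied)
    # region: (start, end) slice of `stored` that the delta touches
    start, end = region
    prev = stored[start:end]                      # load only the partial region
    curr = [p + d for p, d in zip(prev, delta2)]  # current = previous + delta
    stored[start:end] = curr                      # store the update back
    return stored

fmap = [1.0, 1.0, 1.0, 1.0]
print(update_feature_map(fmap, [0.5, -0.5], (1, 3)))  # [1.0, 1.5, 0.5, 1.0]
```

The point of the scheme is that for successive video frames, most of the feature map is unchanged, so memory traffic is limited to the region the frame difference actually affects.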
Abstract:
A neural network processing unit may be configured to perform an approximate multiplication operation and a system on chip may include the neural network processing unit. The neural network processing unit may include a plurality of neural processing units and may perform a computation based on one or more instances of input data and a plurality of weights. At least one neural processing unit is configured to receive a first value and a second value and perform an approximate multiplication operation based on the first value and the second value and is further configured to perform a stochastic rounding operation based on an output value of the approximate multiplication operation.
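The stochastic rounding step can be illustrated in isolation. In this hedged sketch, the fractional part of a (possibly approximate) multiplication result sets the probability of rounding up, which keeps rounding unbiased in expectation; the approximate multiplier itself is device-specific and is not modeled here. Names are invented for illustration.

```python
import math
import random

def stochastic_round(x, rng=random.random):
    # Round down with probability (1 - frac), up with probability frac,
    # where frac is the fractional part of x. E[result] == x.
    lo = math.floor(x)
    return lo + (1 if rng() < x - lo else 0)

random.seed(0)
samples = [stochastic_round(2.25) for _ in range(10000)]
print(sum(samples) / len(samples))  # close to 2.25 on average
```

Compared with round-to-nearest, which always maps 2.25 to 2 and so accumulates bias over many multiply-accumulate steps, stochastic rounding preserves small values in expectation, which is why it pairs well with low-precision approximate arithmetic.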
Abstract:
An electronic device is provided. The electronic device includes a controller configured to transition the electronic device into a sleep mode, and a communication interface configured to establish a network session with an external computing device when the electronic device transitions into the sleep mode. The communication interface is configured to maintain the network session with the external computing device while the electronic device is in the sleep mode, and to transition the electronic device into a non-sleep mode in response to receiving, through the maintained network session, a signal requesting that the electronic device exit the sleep mode.
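The behavior above amounts to a small state machine, sketched below under the assumption that the session and wake signal can be modeled as a flag and a message string; the class and method names are invented, and real firmware would keep an actual socket alive through sleep.

```python
class Device:
    # Toy model: the session persists through sleep mode, and a wake
    # request received on that session returns the device to non-sleep mode.
    def __init__(self):
        self.mode = "awake"
        self.session_open = False

    def enter_sleep(self):
        self.session_open = True   # session established/maintained for sleep
        self.mode = "sleep"

    def on_message(self, msg):
        # Only a wake request arriving over the maintained session wakes the device.
        if self.mode == "sleep" and self.session_open and msg == "wake":
            self.mode = "awake"

d = Device()
d.enter_sleep()
d.on_message("wake")
print(d.mode)  # awake
```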