Abstract:
A cache structure for implementing reconfigurable system configuration information storage comprises: layered configuration information cache units, for caching the configuration information that one or more reconfigurable arrays may use within a period of time; an off-chip memory interface module, for establishing communication with off-chip memory; and a configuration management unit, for managing the reconfiguration process of the reconfigurable arrays and mapping each subtask of an algorithm application to a particular reconfigurable array, so that the array loads the configuration information corresponding to the mapped subtask and completes its functional reconfiguration. This increases the utilization efficiency of the configuration information caches. Also provided is a method for managing the reconfigurable system configuration information caches that employs a mixed-priority cache update policy, changing how configuration information caches are managed in a conventional reconfigurable system and thereby increasing the dynamic reconfiguration efficiency of a complex reconfigurable system.
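As an illustration of the kind of policy the abstract describes, the following Python sketch models one configuration cache layer with a mixed-priority update rule that weighs reuse frequency against recency; the class names, priority weights, and the OffChipMemory stub are assumptions made for the sketch, not details taken from the patent.

```python
# Minimal behavioral sketch of a layered configuration cache with a
# mixed-priority update policy. Names and weights are illustrative.

from dataclasses import dataclass, field
import time


@dataclass
class CacheLine:
    config_id: int          # identifies one piece of configuration information
    data: bytes             # the configuration words themselves
    hits: int = 0           # reuse-count component of the priority
    last_used: float = field(default_factory=time.monotonic)


class OffChipMemory:
    """Stand-in for the off-chip memory interface module."""
    def __init__(self, images: dict[int, bytes]):
        self.images = images

    def load(self, config_id: int) -> bytes:
        return self.images[config_id]


class LayeredConfigCache:
    """One layer of the configuration cache; layers can be chained."""

    def __init__(self, capacity: int, next_level, w_recency=0.5, w_reuse=0.5):
        self.capacity = capacity
        self.lines: dict[int, CacheLine] = {}
        self.next_level = next_level      # lower cache layer or off-chip memory
        self.w_recency = w_recency        # weights of the mixed priority
        self.w_reuse = w_reuse

    def _priority(self, line: CacheLine) -> float:
        # Mixed priority: recently used and frequently reused lines rank higher.
        age = time.monotonic() - line.last_used
        return self.w_reuse * line.hits - self.w_recency * age

    def load(self, config_id: int) -> bytes:
        line = self.lines.get(config_id)
        if line is not None:              # hit: refresh the priority inputs
            line.hits += 1
            line.last_used = time.monotonic()
            return line.data
        data = self.next_level.load(config_id)    # miss: fetch from the level below
        if len(self.lines) >= self.capacity:      # evict the lowest-priority line
            victim = min(self.lines.values(), key=self._priority)
            del self.lines[victim.config_id]
        self.lines[config_id] = CacheLine(config_id, data)
        return data
```

Layers can be chained by passing one cache instance as the `next_level` of another, with the off-chip memory stub at the bottom, which loosely mirrors the layered cache units described above.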
Abstract:
Disclosed is a pre-decoding analysis-based configuration information cache management system, comprising a streaming media processing module, a configuration information prefetch FIFO module, a configuration information storage unit, and a cache controller module. Also disclosed is a management method for the pre-decoding analysis-based configuration information cache management system. The present invention increases the dynamic reconfiguration efficiency of a large-scale coarse-grained reconfigurable system.
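The abstract only names the modules; as a hedged sketch of how they might interact, the Python model below has a cache controller that pre-decodes an upcoming stream of configuration references and fills a small prefetch FIFO from the configuration storage unit, so later fetches can be served without waiting on storage. The class names, FIFO depth, and method signatures are assumptions made for illustration, not the patented design.

```python
# Illustrative sketch: a cache controller that pre-decodes upcoming
# configuration references and keeps them in a prefetch FIFO.

from collections import deque


class ConfigStore:
    """Stand-in for the configuration information storage unit."""
    def __init__(self, configs: dict[int, bytes]):
        self.configs = configs

    def read(self, config_id: int) -> bytes:
        return self.configs[config_id]


class PrefetchingCacheController:
    def __init__(self, store: ConfigStore, fifo_depth: int = 4):
        self.store = store
        self.fifo: deque[tuple[int, bytes]] = deque(maxlen=fifo_depth)

    def predecode(self, upcoming_ids: list[int]) -> None:
        """Scan upcoming references and prefetch them into the FIFO."""
        for cid in upcoming_ids:
            if len(self.fifo) == self.fifo.maxlen:
                break
            if all(cid != qid for qid, _ in self.fifo):
                self.fifo.append((cid, self.store.read(cid)))

    def fetch(self, config_id: int) -> bytes:
        """Serve a request, preferring a prefetched entry over a storage read."""
        for qid, data in self.fifo:
            if qid == config_id:
                return data
        return self.store.read(config_id)
```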
Abstract:
The present invention relates to the field of analog integrated circuits and provides a multiply-accumulate calculation method and circuit suitable for a neural network, realizing large-scale multiply-accumulate calculation for the neural network at low power consumption and high speed. The multiply-accumulate calculation circuit comprises a multiplication calculation circuit array and an accumulation calculation circuit. The multiplication calculation circuit array is composed of M groups of multiplication calculation circuits, each group consisting of one multiplication array unit and eight selection-shift units. The order of the multiplication array unit is quantized in real time through on-chip training to provide a shared input for the selection-shift units, increasing the operating rate and reducing power consumption. The accumulation calculation circuit is composed of a delay accumulation circuit, a TDC conversion circuit, and a shift-addition circuit connected in series. The delay accumulation circuit comprises eight controllable delay chains that dynamically control the number of iterations and accumulate data multiple times in the time domain, so as to accommodate the differing calculation scales of different network layers, save hardware storage space, reduce calculation complexity, and reduce data scheduling.
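The circuit itself operates in the analog/time domain (controllable delay chains and a TDC), but the arithmetic it implements can be mirrored by a short behavioral model: the multiplication array unit pre-multiplies a shared operand once, each of the eight selection-shift units picks the partial product for its digit of the weight and shifts it into place, and the results are accumulated iteratively. The Python sketch below is such a model under assumed parameters (4-bit digit slices, eight lanes); it is not the patented circuit.

```python
# Behavioral sketch of select-and-shift multiply-accumulate. The real
# circuit works in the time domain (delay chains, TDC conversion); this
# model only mirrors the arithmetic, and the 4-bit digit width and
# eight-lane layout are assumptions.

def precompute_products(x: int, digit_bits: int = 4) -> list[int]:
    """Multiplication-array-unit stand-in: x times every possible digit,
    computed once and shared by all selection-shift lanes."""
    return [x * d for d in range(1 << digit_bits)]


def select_shift_multiply(shared: list[int], w: int, digit_bits: int = 4,
                          lanes: int = 8) -> int:
    """Each lane selects the precomputed product for its digit of w and
    shifts it into place; summing the lanes yields x * w."""
    acc = 0
    mask = (1 << digit_bits) - 1
    for lane in range(lanes):
        digit = (w >> (lane * digit_bits)) & mask
        acc += shared[digit] << (lane * digit_bits)
    return acc


def mac(xs: list[int], ws: list[int]) -> int:
    """Accumulate x_i * w_i over multiple passes, loosely analogous to the
    iterative time-domain accumulation in the delay chains."""
    total = 0
    for x, w in zip(xs, ws):
        total += select_shift_multiply(precompute_products(x), w)
    return total


# Quick check of the arithmetic model.
assert mac([3, 5], [7, 11]) == 3 * 7 + 5 * 11
```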