Inference focus for offline training of SRAM inference engine in binary neural network
Abstract:
A Static Random Access Memory (SRAM) device in a binary neural network is provided. The SRAM device includes an SRAM inference engine having an SRAM computation architecture with a forward path that includes multiple SRAM cells forming a chain of SRAM cells such that an output of a given one of the multiple SRAM cells is an input to a following one of the multiple SRAM cells. The SRAM computation architecture is configured to compute a prediction from an input. The SRAM computation architecture is configured to store ternary data and perform local computations on the ternary data.
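The chained-cell forward path described in the abstract can be modeled conceptually in software. The Python sketch below is illustrative only: the class and function names, the ternary quantization rule, and the sign activation passed between cells are assumptions for the sake of the example and are not taken from the patent.

```python
import numpy as np

def ternarize(w, threshold=0.05):
    """Quantize real-valued weights to {-1, 0, +1} (assumed ternary rule)."""
    t = np.zeros_like(w, dtype=np.int8)
    t[w > threshold] = 1
    t[w < -threshold] = -1
    return t

class TernarySRAMCell:
    """Conceptual stand-in for one SRAM cell in the chain: it stores ternary
    data locally and performs the local computation on that data."""
    def __init__(self, real_weights, threshold=0.05):
        self.weights = ternarize(np.asarray(real_weights, dtype=np.float32), threshold)

    def forward(self, x):
        # Local multiply-accumulate on the locally stored ternary weights,
        # followed by a binary (sign) activation that feeds the next cell.
        pre_activation = self.weights @ x
        return np.sign(pre_activation).astype(np.int8)

class SRAMInferenceEngine:
    """Forward path: a chain of cells in which each cell's output is the
    next cell's input; the final cell's output is the prediction."""
    def __init__(self, cells):
        self.cells = cells

    def predict(self, x):
        for cell in self.cells:
            x = cell.forward(x)
        return x

# Usage sketch: two chained cells mapping a 4-dim binary input to a 2-class output.
rng = np.random.default_rng(0)
engine = SRAMInferenceEngine([
    TernarySRAMCell(rng.standard_normal((3, 4))),
    TernarySRAMCell(rng.standard_normal((2, 3))),
])
print(engine.predict(np.array([1, -1, 1, 1], dtype=np.int8)))
```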