-
Publication No.: US20230244746A1
Publication Date: 2023-08-03
Application No.: US17674470
Filing Date: 2022-02-17
Applicant: THE REGENTS OF THE UNIVERSITY OF MICHIGAN
Inventor: Dennis SYLVESTER , David BLAAUW , Yu CHEN , Pierre ABILLAMA , Hun-Seok KIM
CPC classification number: G06F17/145 , G06F17/16 , G06N3/08
Abstract: A computer-implemented method is presented for performing a computation with a neural network. The method includes: receiving a first input patch of data; applying a Walsh-Hadamard transform to the input patch to yield a transformed input patch in a transformed domain; computing an element-wise product of the transformed input patch and a kernel of the neural network; applying an inverse Walsh-Hadamard transform to the element-wise product to yield an intermediate matrix; and creating a first output patch from the intermediate matrix, where the size of the first output patch is smaller than the intermediate matrix.
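The transform/multiply/inverse-transform pipeline in the abstract can be sketched in NumPy. This is a minimal illustration, not the patented implementation: the function name, patch sizes, and the all-ones kernel are assumptions chosen for demonstration, and it uses the standard Sylvester-construction Hadamard matrix (which is its own inverse up to a scale factor).

```python
import numpy as np

def hadamard(n):
    # Sylvester-construction Hadamard matrix; n must be a power of two.
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def transform_domain_patch(x, kernel_t, out_size):
    """Apply a 2-D Walsh-Hadamard transform to input patch x, take the
    element-wise product with a transform-domain kernel, apply the inverse
    WHT, and crop the intermediate matrix to a smaller output patch."""
    n = x.shape[0]
    H = hadamard(n)
    X = H @ x @ H.T                  # WHT: transformed input patch
    Y = X * kernel_t                 # element-wise product with the kernel
    y = (H @ Y @ H.T) / (n * n)      # inverse WHT (H @ H = n * I)
    return y[:out_size, :out_size]   # output patch smaller than intermediate

# Usage: 4x4 input patch, illustrative all-ones kernel, 2x2 output patch.
x = np.arange(16, dtype=float).reshape(4, 4)
k = np.ones((4, 4))
out = transform_domain_patch(x, k, 2)
```

With the all-ones kernel the round trip recovers the input patch, so the crop simply returns its top-left 2x2 corner; a learned transform-domain kernel would reweight each WHT coefficient instead.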
-
Publication No.: US20240028900A1
Publication Date: 2024-01-25
Application No.: US17872715
Filing Date: 2022-07-25
Applicant: THE REGENTS OF THE UNIVERSITY OF MICHIGAN
Inventor: Hun-Seok KIM , David BLAAUW , Dennis SYLVESTER , Yu CHEN , Pierre ABILLAMA , Hyochan AN
CPC classification number: G06N3/082 , G06F17/16 , G06F7/5443
Abstract: Recent advances in model pruning have enabled sparsity-aware deep neural network accelerators that improve the energy efficiency and performance of inference tasks. SONA, a novel transform-domain neural network accelerator, is introduced in which convolution operations are replaced by element-wise multiplications and weights are orthogonally structured to be sparse. SONA employs an output-stationary dataflow coupled with an energy-efficient memory organization to reduce the overhead of sparse-orthogonal transform-domain kernels, which are processed concurrently while maintaining full multiply-and-accumulate (MAC) array utilization without any conflicts. Weights in SONA are non-uniformly quantized with bit-sparse canonical-signed-digit (BS-CSD) representations to reduce multiplications to simpler additions.
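The BS-CSD idea in the abstract rests on a standard property of canonical-signed-digit encoding: a weight with few nonzero signed digits turns multiplication into a handful of shift-and-add operations. The sketch below shows plain CSD recoding in software; the function names are illustrative, and the patent's specific bit-sparse quantization scheme is not reproduced here.

```python
def to_csd(w):
    """Canonical-signed-digit digits of a non-negative integer w, LSB first.
    Each digit is in {-1, 0, 1} and no two adjacent digits are nonzero."""
    digits = []
    while w != 0:
        if w & 1:
            d = 2 - (w & 3)  # 1 if w % 4 == 1, else -1
            w -= d           # subtracting -1 carries a +1 upward
        else:
            d = 0
        digits.append(d)
        w >>= 1
    return digits

def csd_multiply(x, w):
    """Multiply x by w using only shifts, additions, and subtractions,
    mirroring how a hardware MAC can exploit a sparse CSD weight."""
    acc = 0
    for shift, d in enumerate(to_csd(w)):
        if d == 1:
            acc += x << shift
        elif d == -1:
            acc -= x << shift
    return acc

# Usage: 7 = -1 + 8 in CSD, so x * 7 costs one shift and one subtraction.
digits = to_csd(7)       # [-1, 0, 0, 1], LSB first
product = csd_multiply(5, 7)
```

In hardware, each nonzero CSD digit corresponds to one add/subtract of a shifted operand, so bit-sparse weights (few nonzero digits) directly shrink the adder work per multiplication.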
-