-
Publication No.: US11494087B2
Publication Date: 2022-11-08
Application No.: US16175926
Filing Date: 2018-10-31
Applicant: Advanced Micro Devices, Inc.
Inventor: Georgios Mappouras , Amin Farmahini Farahani , Michael Ignatowski
IPC: G06F3/06
Abstract: Memory management circuitry and processes operate to improve reliability of a group of memory stacks, providing that if a memory stack or a portion thereof fails during the product's lifetime, the system may still recover with no errors or data loss. A front-end controller receives a block of data requested to be written to memory, divides the block into sub-blocks, and creates a new redundant reliability sub-block. The sub-blocks are then written to different memory stacks. When reading data from the memory stacks, the front-end controller detects errors indicating a failure within one of the memory stacks, and recovers corrected data using the reliability sub-block. The front-end controller may monitor errors for signs of a stack failure and disable the failed stack.
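The abstract describes a RAID-like scheme: a block is split into sub-blocks, an extra redundant reliability sub-block is generated, and each sub-block is written to a different memory stack so that a single failed stack can be reconstructed. The sketch below illustrates only that general XOR-parity idea; the function names (split_block, write_with_redundancy, read_with_recovery) are hypothetical and are not taken from the patent.

```python
# Illustrative sketch only: a front-end-controller-style write/read path
# that stripes each block across data stacks plus one XOR reliability
# sub-block, allowing recovery when a single stack fails.
from functools import reduce

def split_block(block: bytes, n_sub: int) -> list[bytes]:
    """Divide a data block into n_sub equally sized sub-blocks."""
    size = len(block) // n_sub
    return [block[i * size:(i + 1) * size] for i in range(n_sub)]

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def write_with_redundancy(block: bytes, stacks: list[list[bytes]]) -> None:
    """Write sub-blocks to N-1 data stacks and a parity sub-block to the last stack."""
    subs = split_block(block, len(stacks) - 1)
    parity = reduce(xor_bytes, subs)                 # redundant reliability sub-block
    for stack, sub in zip(stacks, subs + [parity]):
        stack.append(sub)

def read_with_recovery(stacks: list[list[bytes]], index: int,
                       failed_stack: int | None = None) -> bytes:
    """Read one block; if a stack is marked failed, rebuild its sub-block via XOR."""
    subs = [stack[index] for stack in stacks]
    if failed_stack is not None:
        others = [s for i, s in enumerate(subs) if i != failed_stack]
        subs[failed_stack] = reduce(xor_bytes, others)
    return b"".join(subs[:-1])                       # drop the reliability sub-block

# Usage: four data stacks plus one reliability stack survive one stack failure.
stacks = [[] for _ in range(5)]
write_with_redundancy(b"ABCD" * 16, stacks)
assert read_with_recovery(stacks, 0, failed_stack=2) == b"ABCD" * 16
```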
-
Publication No.: US20200042859A1
Publication Date: 2020-02-06
Application No.: US16194958
Filing Date: 2018-11-19
Applicant: Advanced Micro Devices, Inc.
Inventor: Georgios Mappouras , Amin Farmahini-Farahani , Sudhanva Gurumurthi , Abhinav Vishnu , Gabriel H. Loh
Abstract: Systems, apparatuses, and methods for managing buffers in a neural network implementation with heterogeneous memory are disclosed. A system includes a neural network coupled to a first memory and a second memory. The first memory is a relatively low-capacity, high-bandwidth memory while the second memory is a relatively high-capacity, low-bandwidth memory. During a forward propagation pass of the neural network, a run-time manager monitors the usage of the buffers for the various layers of the neural network. During a backward propagation pass of the neural network, the run-time manager determines how to move the buffers between the first and second memories based on the monitored buffer usage during the forward propagation pass. As a result, the run-time manager is able to reduce memory access latency for the layers of the neural network during the backward propagation pass.
-
Publication No.: US11775799B2
Publication Date: 2023-10-03
Application No.: US16194958
Filing Date: 2018-11-19
Applicant: Advanced Micro Devices, Inc.
Inventor: Georgios Mappouras , Amin Farmahini-Farahani , Sudhanva Gurumurthi , Abhinav Vishnu , Gabriel H. Loh
CPC classification number: G06N3/04 , G06F9/44505 , G06F9/544 , G06N3/084
Abstract: Systems, apparatuses, and methods for managing buffers in a neural network implementation with heterogeneous memory are disclosed. A system includes a neural network coupled to a first memory and a second memory. The first memory is a relatively low-capacity, high-bandwidth memory while the second memory is a relatively high-capacity, low-bandwidth memory. During a forward propagation pass of the neural network, a run-time manager monitors the usage of the buffers for the various layers of the neural network. During a backward propagation pass of the neural network, the run-time manager determines how to move the buffers between the first and second memories based on the monitored buffer usage during the forward propagation pass. As a result, the run-time manager is able to reduce memory access latency for the layers of the neural network during the backward propagation pass.
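The two preceding records, the published application US20200042859A1 and the granted patent US11775799B2, share this abstract. As a rough illustration of the idea, the sketch below models the two memory tiers as Python sets and shows a run-time manager that records buffer usage during the forward pass and pulls buffers into the fast tier for the reverse-order backward pass; all class and method names are hypothetical assumptions, not taken from the patent.

```python
# A minimal sketch of a run-time buffer manager for a two-tier memory
# (small fast high-bandwidth memory, large slow high-capacity memory).
class RunTimeManager:
    def __init__(self, fast_capacity: int):
        self.fast_capacity = fast_capacity     # bytes available in the fast tier
        self.fast, self.slow = set(), set()    # which buffers live in which tier
        self.sizes = {}                        # buffer name -> size in bytes
        self.forward_order = []                # usage observed during the forward pass

    def record_forward_use(self, buffer: str, size: int) -> None:
        """Forward pass: note when each layer's buffer is produced/used."""
        self.sizes[buffer] = size
        self.forward_order.append(buffer)
        self.slow.add(buffer)                  # spill to capacity memory by default

    def prepare_backward(self) -> list[str]:
        """Backward pass visits layers in reverse order of the forward pass."""
        return list(reversed(self.forward_order))

    def fetch_for_layer(self, buffer: str) -> None:
        """Move a layer's buffer into fast memory before its backward step."""
        need = self.sizes[buffer]
        used = sum(self.sizes[b] for b in self.fast)
        while self.fast and used + need > self.fast_capacity:
            victim = self.fast.pop()           # naive eviction policy, for illustration
            self.slow.add(victim)
            used -= self.sizes[victim]
        self.slow.discard(buffer)
        self.fast.add(buffer)

# Usage: record forward-pass buffers, then walk them in reverse for backprop.
mgr = RunTimeManager(fast_capacity=2 * 1024**2)
for layer, size in [("conv1", 1_000_000), ("conv2", 1_500_000), ("fc", 500_000)]:
    mgr.record_forward_use(layer, size)
for layer in mgr.prepare_backward():
    mgr.fetch_for_layer(layer)
```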
-
Publication No.: US10802977B2
Publication Date: 2020-10-13
Application No.: US16218389
Filing Date: 2018-12-12
Applicant: ADVANCED MICRO DEVICES, INC.
Inventor: Georgios Mappouras , Amin Farmahini Farahani , Nuwan Jayasena
IPC: G06F12/08 , G06F12/0882 , G06F3/06
Abstract: A processing system tracks counts of accesses to memory pages using a set of counters located at the memory module that stores the pages, wherein the counts are adjusted at least in part based on refreshes of the memory pages. This approach allows a processing system to efficiently maintain the counts with relatively small counters and with relatively low overhead. Furthermore, the rate at which the counters are adjusted, relative to the page refreshes, is adjustable, so that the access counts are useful for a wide variety of application types.
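As a sketch of the abstract's refresh-coupled counting, the code below keeps a small saturating counter per page and halves all counters every N refreshes, with N adjustable relative to the refresh rate; the class and method names are illustrative assumptions, not the patented circuitry.

```python
# Sketch of per-page access counters kept at the memory module and
# decayed in step with DRAM refresh, so small counters still track
# "hot" pages over time.
class PageAccessCounters:
    def __init__(self, counter_bits: int = 8, decay_every_n_refreshes: int = 4):
        self.max_count = (1 << counter_bits) - 1      # small hardware counter width
        self.decay_every = decay_every_n_refreshes    # adjustable relative to refresh rate
        self.refreshes_seen = 0
        self.counts = {}                              # page address -> access count

    def on_access(self, page: int) -> None:
        """Increment (saturating) the counter for an accessed page."""
        c = self.counts.get(page, 0)
        self.counts[page] = min(c + 1, self.max_count)

    def on_refresh(self) -> None:
        """Called each refresh cycle; periodically halve all counters."""
        self.refreshes_seen += 1
        if self.refreshes_seen % self.decay_every == 0:
            for page in list(self.counts):
                self.counts[page] >>= 1               # decay keeps counters small

    def hot_pages(self, threshold: int) -> list[int]:
        """Pages whose recent access count meets a threshold."""
        return [p for p, c in self.counts.items() if c >= threshold]

# Usage: accesses raise counts; refresh-driven decay keeps them bounded.
ctrs = PageAccessCounters(counter_bits=8, decay_every_n_refreshes=4)
for _ in range(300):
    ctrs.on_access(page=0x1000)           # saturates at 255
for _ in range(4):
    ctrs.on_refresh()                     # triggers one halving pass
print(ctrs.hot_pages(threshold=64))       # -> [4096]
```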