Noise and signal management for RPU array

    Publication Number: US11361218B2

    Publication Date: 2022-06-14

    Application Number: US16427559

    Filing Date: 2019-05-31

    Abstract: Advanced noise and signal management techniques for RPU arrays during ANN training are provided. In one aspect of the invention, a method for ANN training includes: providing an array of RPU devices with pre-normalizers and post-normalizers; computing the mean μ and standard deviation σ of all elements of an input vector x to the array that belong to the group set for each of the pre-normalizers, and pre-normalizing those elements; and computing the mean μ and standard deviation σ of all elements of an output vector y that belong to the group set for each of the post-normalizers, and post-normalizing those elements.
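    A minimal NumPy sketch of the group-wise pre-/post-normalization idea, assuming each normalizer simply maps its group of elements to zero mean and unit standard deviation around an idealized matrix-vector product; the group assignments, sizes, and function names below are illustrative, not taken from the patent.

        # Minimal sketch: group-wise normalization before and after an idealized
        # analog vector-matrix multiply. All names and sizes are assumptions.
        import numpy as np

        def normalize_groups(v, groups, eps=1e-6):
            """Normalize each group of elements of v to zero mean, unit std."""
            out = np.empty_like(v, dtype=float)
            for g in np.unique(groups):
                idx = groups == g
                mu, sigma = v[idx].mean(), v[idx].std()
                out[idx] = (v[idx] - mu) / (sigma + eps)
            return out

        rng = np.random.default_rng(0)
        W = rng.normal(size=(8, 16))           # stands in for RPU conductances
        x = rng.normal(loc=3.0, size=16)       # raw input vector
        groups_in = np.repeat([0, 1], 8)       # each pre-normalizer owns 8 inputs
        groups_out = np.repeat([0, 1], 4)      # each post-normalizer owns 4 outputs

        x_n = normalize_groups(x, groups_in)   # pre-normalize before the array
        y = W @ x_n                            # idealized analog vector-matrix multiply
        y_n = normalize_groups(y, groups_out)  # post-normalize the array output
        print(y_n.mean(), y_n.std())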

    Architecture for enabling zero value shifting

    Publication Number: US10832773B1

    Publication Date: 2020-11-10

    Application Number: US16458806

    Filing Date: 2019-07-01

    Abstract: A system includes an analog memory architecture for performing differential reading. The analog memory architecture includes a weight array including first cross-point devices located at intersections of a first set of conductive column wires and a first set of conductive row wires, and a reference array operatively coupled to the weight array and including second cross-point devices located at intersections of a second set of conductive column wires and a second set of conductive row wires. The second cross-point devices include differential unipolar switching memory devices configured to enable zero-value shifting of the outputs of the first cross-point devices.
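    The zero-value shifting can be pictured as subtracting the reference array's output from the weight array's output, so that matched conductances read as zero and the unipolar devices can represent signed weights. A rough sketch under that assumption, with idealized conductances and arbitrary sizes (none of the names below come from the patent):

        # Rough sketch of differential reading against a reference array.
        # Conductance values and array sizes are arbitrary illustrations.
        import numpy as np

        g_weight = np.array([[0.9, 0.4],
                             [0.6, 0.7]])       # unipolar weight-array conductances
        g_ref = np.full_like(g_weight, 0.5)     # reference array defines the zero point

        x = np.array([1.0, -1.0])               # input voltages on the row wires
        i_weight = g_weight @ x                 # column currents from the weight array
        i_ref = g_ref @ x                       # column currents from the reference array

        # Subtracting the reference output shifts zero, so the result is the same
        # as reading a signed array (g_weight - g_ref) directly.
        i_out = i_weight - i_ref
        print(i_out)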

    Deep neural network training with native devices

    Publication Number: US10748064B2

    Publication Date: 2020-08-18

    Application Number: US14837798

    Filing Date: 2015-08-27

    Abstract: An artificial neural network, and methods for performing computations on it, include multiple neurons: a layer of input neurons, one or more layers of hidden neurons, and a layer of output neurons. Arrays of weights are configured to accept voltage pulses from a first layer of neurons and to output current to a second layer of neurons during a feed-forward operation. Each array of weights includes multiple resistive processing units having respective settable resistances.
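    As a rough picture of the feed-forward operation, each array of resistive processing units acts like a real-valued weight matrix turning input voltage pulses into output currents for the next layer. A short, idealized sketch, with the layer sizes and tanh nonlinearity chosen only for illustration:

        # Idealized feed-forward through stacked weight arrays, each modeled
        # as a plain matrix. Sizes and the activation are assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        sizes = [4, 8, 3]                       # input, hidden, output neuron counts
        weights = [rng.normal(scale=0.5, size=(n_out, n_in))
                   for n_in, n_out in zip(sizes[:-1], sizes[1:])]

        def feed_forward(x, weights):
            """Voltage pulses in, currents out: here just matrix-vector products."""
            for W in weights:
                x = np.tanh(W @ x)              # array output followed by neuron nonlinearity
            return x

        print(feed_forward(rng.normal(size=4), weights))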

    Convolutional neural networks using resistive processing unit array

    Publication Number: US10740671B2

    Publication Date: 2020-08-11

    Application Number: US15480597

    Filing Date: 2017-04-06

    Inventor: Tayfun Gokmen

    Abstract: Technical solutions are described for implementing a convolutional neural network (CNN) using a resistive processing unit (RPU) array. An example method includes configuring an RPU array corresponding to a convolution layer in the CNN based on the convolution kernels of that layer. The method further includes performing forward pass computations via the RPU array by transmitting voltage pulses corresponding to input data to the RPU array, and storing values corresponding to output currents from the RPU array as output maps. The method further includes performing backward pass computations via the RPU array by transmitting voltage pulses corresponding to the error of the output maps, and storing the output currents from the RPU array as backward error maps. The method further includes performing update pass computations via the RPU array by transmitting voltage pulses corresponding to the input data of the convolution layer and the error of the output maps to the RPU array.
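    The three passes can be sketched on a single array once the convolution is unrolled into a matrix-vector product (the unrolling itself is omitted here); the sizes, learning rate, and dummy target below are assumptions made only for illustration.

        # Forward, backward, and update passes on one array standing in for a
        # convolution layer whose kernels are flattened into the matrix K.
        import numpy as np

        rng = np.random.default_rng(2)
        K = rng.normal(scale=0.1, size=(16, 27))   # flattened convolution kernels
        x = rng.normal(size=27)                    # one unrolled input patch
        lr = 0.01                                  # illustrative learning rate

        y = K @ x                                  # forward pass: input pulses -> output map
        err = y - rng.normal(size=16)              # error of the output map (dummy target)
        dx = K.T @ err                             # backward pass: error map for the previous layer
        K -= lr * np.outer(err, x)                 # update pass: outer product of error and input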

    In-cell differential read-out circuitry for reading signed weight values in resistive processing unit architecture

    Publication Number: US10468098B2

    Publication Date: 2019-11-05

    Application Number: US16353111

    Filing Date: 2019-03-14

    Abstract: A resistive processing unit (RPU) device includes a weight storage device to store a weight voltage which corresponds to a weight value of the RPU device, and a read transistor having a gate connected to the weight storage device, and first and second source/drain terminals connected to first and second control ports, respectively. A current source connected to the second source/drain terminal generates a fixed reference current. The read transistor generates a weight current in response to the weight voltage. A read current output from the second control port represents a signed weight value of the RPU device. A magnitude of the read current is equal to a difference between the weight current and the fixed reference current. The sign of the read current is positive when the weight current is greater than the fixed reference current, and negative when the weight current is less than the fixed reference current.
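    A toy numeric illustration of the signed read-out, assuming arbitrary current values in microamperes: the fixed reference current is subtracted from the weight current, and the sign of the difference encodes the sign of the stored weight.

        # Toy numbers only; the reference current value is an assumption.
        def read_current(i_weight_uA, i_ref_uA=5.0):
            return i_weight_uA - i_ref_uA      # positive above the reference, negative below

        print(read_current(7.5))               # +2.5 uA -> positive weight
        print(read_current(3.0))               # -2.0 uA -> negative weight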

    In-cell differential read-out circuitry for reading signed weight values in resistive processing unit architecture

    Publication Number: US20190304539A1

    Publication Date: 2019-10-03

    Application Number: US16353111

    Filing Date: 2019-03-14

    Abstract: A resistive processing unit (RPU) device includes a weight storage device to store a weight voltage which corresponds to a weight value of the RPU device, and a read transistor having a gate connected to the weight storage device, and first and second source/drain terminals connected to first and second control ports, respectively. A current source connected to the second source/drain terminal generates a fixed reference current. The read transistor generates a weight current in response to the weight voltage. A read current output from the second control port represents a signed weight value of the RPU device. A magnitude of the read current is equal to a difference between the weight current and the fixed reference current. The sign of the read current is positive when the weight current is greater than the fixed reference current, and negative when the weight current is less than the fixed reference current.
