AREA AND POWER EFFICIENT IMPLEMENTATIONS OF MODIFIED BACKPROPAGATION ALGORITHM FOR ASYMMETRIC RPU DEVICES

    Publication number: US20210279556A1

    Publication date: 2021-09-09

    Application number: US16808811

    Application date: 2020-03-04

    Abstract: Aspects of the invention include a first matrix resistive processing unit ("RPU") array that receives a first input vector along its rows. A second matrix RPU array receives a second input vector along its rows. A reference matrix RPU array receives an inverse of the first input vector and an inverse of the second input vector along its rows. A plurality of analog-to-digital converters are coupled to respective outputs of a plurality of summing junctions that receive the respective column outputs of the first matrix RPU array, the second matrix RPU array, and the reference RPU array, and provide a digital value of the output of the plurality of summing junctions.
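    The differential readout described above can be sketched numerically as follows. This is a minimal illustration, not the patented circuit: the function name, the ADC parameters, and the uniform-quantization model are all assumptions added for the example; only the three-array sum (with the reference array driven by the inverted inputs) follows the abstract.

    ```python
    import numpy as np

    def summed_readout(first_rpu, second_rpu, reference_rpu, x1, x2,
                       adc_bits=8, adc_range=4.0):
        """Sketch of the summed column readout across three RPU arrays.

        Each array performs an analog vector-matrix multiply along its
        rows; a summing junction adds the three column outputs, and an
        ADC digitizes the sum. ADC parameters are illustrative.
        """
        # Analog column outputs: row inputs times stored weights
        col_first = x1 @ first_rpu
        col_second = x2 @ second_rpu
        # The reference array receives the inverses of both input vectors
        col_ref = (-x1 - x2) @ reference_rpu
        analog_sum = col_first + col_second + col_ref
        # Uniform ADC model: quantize and clip to 2**adc_bits levels
        levels = 2 ** adc_bits
        step = 2 * adc_range / levels
        codes = np.clip(np.round(analog_sum / step),
                        -levels // 2, levels // 2 - 1)
        return codes.astype(int)
    ```

    Note that when the reference array stores the mean of the other two arrays, the inverted inputs cancel the common-mode component of the column currents, which is one way such a reference array compensates for device asymmetry.
    
    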

    ACTIVATION FUNCTION COMPUTATION FOR NEURAL NETWORKS

    Publication number: US20210264247A1

    Publication date: 2021-08-26

    Application number: US16797587

    Application date: 2020-02-21

    Abstract: A computer-implemented method for improving the efficiency of computing an activation function in a neural network system includes initializing, by a controller, weights in a weight vector associated with the neural network system. Further, the method includes receiving, by the controller, an input vector of input values for computing a dot product with the weight vector for the activation function, which determines an output value of a node in the neural network system. The method further includes predicting, by a rectifier linear unit (ReLU), which computes the activation function, that the output value of the node will be negative based on computing an intermediate value for computing the dot product, and based on a magnitude of the intermediate value exceeding a precomputed threshold value. Further, the method includes, in response to the prediction, terminating, by the ReLU, the computation of the dot product, and outputting a 0 as the output value.
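    The early-termination idea in this abstract can be sketched as a short loop. The function name and the scalar accumulation order are assumptions for illustration; how the threshold is precomputed (e.g., from a bound on the magnitude of the remaining terms) is not specified here and is taken as given.

    ```python
    def relu_dot_early_exit(weights, inputs, threshold):
        """Sketch of ReLU computation with early termination.

        Accumulate the dot product term by term. If the running
        intermediate value is negative and its magnitude exceeds the
        precomputed threshold, predict that the final output will be
        negative, stop, and output 0 (the ReLU of a negative value).
        """
        acc = 0.0
        for w, x in zip(weights, inputs):
            acc += w * x
            # Prediction step from the abstract: negative intermediate
            # value whose magnitude exceeds the threshold -> output 0
            if acc < 0 and abs(acc) > threshold:
                return 0.0
        return max(acc, 0.0)
    ```

    The efficiency gain comes from skipping the remaining multiply-accumulate steps whenever the prediction fires, at the cost of storing one threshold per node.
    
    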

    AGGREGATE ADJUSTMENTS IN A CROSS BAR NEURAL NETWORK

    Publication number: US20200050929A1

    Publication date: 2020-02-13

    Application number: US16100673

    Application date: 2018-08-10

    Abstract: Methods, systems, and crosspoint arrays for tuning a neural network. A crosspoint array includes: a set of conductive rows; a set of conductive columns intersecting the set of conductive rows to form a plurality of crosspoints; a circuit element, coupled to each of the plurality of crosspoints, configured to store a weight of the neural network; a voltage source associated with each conductive row; a first integrator attached at the end of at least one of the conductive columns; and a first variable resistor attached to the first integrator and the end of the at least one conductive column.
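    The column readout of such a crossbar can be sketched as a simple physical model. This is an idealized illustration, assuming Ohm's-law crosspoint currents, an ideal integrator, and the variable resistor acting as an adjustable gain on the integrated column current; the function and parameter names are not from the patent.

    ```python
    import numpy as np

    def column_readout(conductances, row_voltages, gain):
        """Idealized sketch of one crossbar column.

        Each crosspoint contributes current I = G * V; the column wire
        sums the currents (Kirchhoff's current law), the integrator
        accumulates that column current, and the variable resistor is
        modeled as a tunable gain applied to the integrated result.
        """
        column_current = float(np.dot(conductances, row_voltages))
        return gain * column_current
    ```

    Adjusting the variable resistor rescales the whole column's output at once, which is the kind of aggregate adjustment the title refers to, as opposed to reprogramming individual crosspoint weights.
    
    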
