Multi-memory on-chip computational network

    Publication No.: US11741345B2

    Publication Date: 2023-08-29

    Application No.: US17033573

    Application Date: 2020-09-25

    Abstract: Provided are systems, methods, and integrated circuits for a neural network processing system. In various implementations, the system can include a first array of processing engines coupled to a first set of memory banks and a second array of processing engines coupled to a second set of memory banks. The first and second sets of memory banks can store all the weight values for a neural network, where the weight values are stored before any input data is received. Upon receiving input data, the system performs a task defined for the neural network. Performing the task can include computing an intermediate result using the first array of processing engines, copying the intermediate result to the second set of memory banks, and computing a final result using the second array of processing engines, where the final result corresponds to an outcome of performing the task.
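
    The two-array dataflow in the abstract can be sketched as follows. This is a hypothetical software model, not the patented circuit: weights for both stages are preloaded into two sets of "banks" before any input arrives, the first array computes an intermediate result, that result is copied into the second bank set, and the second array computes the final result.

```python
# Hypothetical sketch of the two-array dataflow: all weights are preloaded
# before any input data is received, so inference needs no weight fetches.

def matmul(a, b):
    """Dense matrix multiply over plain Python lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

class TwoArrayAccelerator:
    def __init__(self, layer1_weights, layer2_weights):
        # Preload: bank set 1 feeds PE array 1, bank set 2 feeds PE array 2.
        self.bank_set_1 = {"weights": layer1_weights}
        self.bank_set_2 = {"weights": layer2_weights, "intermediate": None}

    def run(self, input_data):
        # PE array 1 computes the intermediate result from bank set 1 weights.
        intermediate = matmul(input_data, self.bank_set_1["weights"])
        # The intermediate result is copied into the second set of banks...
        self.bank_set_2["intermediate"] = intermediate
        # ...so PE array 2 can compute the final result locally.
        return matmul(self.bank_set_2["intermediate"], self.bank_set_2["weights"])

accel = TwoArrayAccelerator(
    layer1_weights=[[1, 0], [0, 1]],   # identity, for a checkable example
    layer2_weights=[[2, 0], [0, 2]],   # doubles each element
)
result = accel.run([[3, 4]])           # [[6, 8]]
```

    The point of the copy step is locality: the second array never reads from the first array's banks, mirroring the on-chip partitioning the abstract describes.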

    Circuit architecture with biased randomization

    Publication No.: US11250319B1

    Publication Date: 2022-02-15

    Application No.: US15714924

    Application Date: 2017-09-25

    Abstract: Disclosed herein are techniques for classifying data with a data processing circuit. In one embodiment, the data processing circuit includes a probabilistic circuit configurable to generate a decision at a pre-determined probability, and an output generation circuit including an output node and configured to receive input data and a weight, and generate output data at the output node for approximating a product of the input data and the weight. The generation of the output data includes propagating the weight to the output node according to a first decision of the probabilistic circuit. The probabilistic circuit is configured to generate the first decision at a probability determined based on the input data.
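
    The biased-randomization idea is a form of stochastic computing, and a minimal software model (assumed semantics, with the input normalized to [0, 1]) looks like this: the weight is propagated to the output with probability equal to the input value, so the average output over many decisions approximates input × weight without a hardware multiplier.

```python
# Hedged sketch of biased randomization: propagate the weight at a
# probability derived from the input, average to approximate the product.
import random

def stochastic_multiply(input_value, weight, trials=10000, rng=None):
    """Approximate input_value * weight, with input_value in [0, 1]."""
    rng = rng or random.Random(0)   # fixed seed for reproducibility
    total = 0.0
    for _ in range(trials):
        # The probabilistic circuit "decides" at probability input_value;
        # on a yes-decision the weight propagates to the output node.
        if rng.random() < input_value:
            total += weight
    return total / trials

approx = stochastic_multiply(0.25, 0.8)   # close to 0.25 * 0.8 = 0.2
exact = 0.25 * 0.8
```

    The trade-off is accuracy versus trial count: the error shrinks roughly with the square root of the number of decisions, which is why such circuits suit approximate workloads like neural network inference.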

    PROCESSING FOR MULTIPLE INPUT DATA SETS
    Invention Application

    Publication No.: US20190294968A1

    Publication Date: 2019-09-26

    Application No.: US15933201

    Application Date: 2018-03-22

    Abstract: Disclosed herein are techniques for performing multi-layer neural network processing for multiple contexts. In one embodiment, a computing engine is set in a first configuration to implement a second layer of a neural network and to process first data related to a first context to generate first context second layer output. The computing engine can be switched from the first configuration to a second configuration to implement a first layer of the neural network. The computing engine can be used to process second data related to a second context to generate second context first layer output. The computing engine can be set to a third configuration to implement a third layer of the neural network to process the first context second layer output and the second context first layer output to generate a first processing result of the first context and a second processing result of the second context.
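
    The reconfiguration sequence in the abstract can be traced with a small scheduling sketch (hypothetical names, stand-in layer math): one compute engine is switched between layer configurations so two contexts share it, rather than one context running to completion before the next starts.

```python
# Minimal model of interleaving two contexts on one reconfigurable engine.

class ComputeEngine:
    def __init__(self):
        self.layer = None  # current configuration

    def configure(self, layer):
        self.layer = layer

    def process(self, data):
        # Stand-in for real layer math: tag the data with the layer that ran.
        return data + [self.layer]

engine = ComputeEngine()
trace = {}

# Step 1: engine configured for layer 2, processes context 1.
engine.configure("layer2")
trace["ctx1"] = engine.process(["ctx1_layer1_out"])

# Step 2: engine switched to layer 1, processes context 2.
engine.configure("layer1")
trace["ctx2"] = engine.process(["ctx2_input"])

# Step 3: engine switched to layer 3, finishes both contexts.
engine.configure("layer3")
result1 = engine.process(trace["ctx1"])
result2 = engine.process(trace["ctx2"])
```

    Note how context 1 runs layer 2 while context 2 is still on layer 1: the interleaving keeps the single engine busy across contexts at the cost of per-step reconfiguration.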

    MULTI-MEMORY ON-CHIP COMPUTATIONAL NETWORK
    Invention Publication

    Publication No.: US20230334294A1

    Publication Date: 2023-10-19

    Application No.: US18339954

    Application Date: 2023-06-22

    Abstract: Provided are systems, methods, and integrated circuits for neural network processing. In various implementations, an integrated circuit for neural network processing can include a plurality of memory banks storing weight values for a neural network. The memory banks can be on the same chip as an array of processing engines. Upon receiving input data, the circuit can be configured to use the set of weight values to perform a task defined for the neural network. Performing the task can include reading weight values from the memory banks, inputting the weight values into the array of processing engines, and computing a result using the array of processing engines, where the result corresponds to an outcome of performing the task.

    Processing for multiple input data sets

    Publication No.: US11475306B2

    Publication Date: 2022-10-18

    Application No.: US15933201

    Application Date: 2018-03-22

    Abstract: Disclosed herein are techniques for performing multi-layer neural network processing for multiple contexts. In one embodiment, a computing engine is set in a first configuration to implement a second layer of a neural network and to process first data related to a first context to generate first context second layer output. The computing engine can be switched from the first configuration to a second configuration to implement a first layer of the neural network. The computing engine can be used to process second data related to a second context to generate second context first layer output. The computing engine can be set to a third configuration to implement a third layer of the neural network to process the first context second layer output and the second context first layer output to generate a first processing result of the first context and a second processing result of the second context.

    Restructuring a multi-dimensional array

    Publication No.: US10445638B1

    Publication Date: 2019-10-15

    Application No.: US15908236

    Application Date: 2018-02-28

    Abstract: Disclosed herein are techniques for performing neural network computations. In one embodiment, an apparatus may include an array of processing elements, the array having a configurable first effective dimension and a configurable second effective dimension. The apparatus may also include a controller configured to determine at least one of: a first number of input data sets to be provided to the array at a first time or a second number of output data sets to be generated by the array at a second time, and to configure, based on at least one of the first number or the second number, at least one of the first effective dimension or the second effective dimension of the array.
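
    A toy version of the controller decision can make the "effective dimension" idea concrete. The mapping below (input sets to rows, output sets to columns, clamped to the physical array) is an assumption for illustration, not the claimed configuration rule.

```python
# Hedged sketch: the physical array is fixed, but a controller picks
# "effective" dimensions from the number of input and output data sets.

PHYSICAL_ROWS, PHYSICAL_COLS = 128, 64

def configure_array(num_input_sets, num_output_sets):
    """Return (effective_rows, effective_cols), clamped to the physical array."""
    effective_rows = min(num_input_sets, PHYSICAL_ROWS)
    effective_cols = min(num_output_sets, PHYSICAL_COLS)
    return effective_rows, effective_cols

# Four input data sets and eight output data sets fit inside the array...
dims_small = configure_array(4, 8)       # (4, 8)
# ...while oversized requests are clamped to the physical dimensions.
dims_large = configure_array(500, 500)   # (128, 64)
```

    Restructuring the effective shape this way lets a small workload avoid streaming data through processing elements it does not need.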

    Processor with control flow
    Invention Grant

    Publication No.: US12008466B1

    Publication Date: 2024-06-11

    Application No.: US15934469

    Application Date: 2018-03-23

    CPC classification number: G06N3/08

    Abstract: In various implementations, provided are systems and methods for operating a neural network that includes conditional structures. In some implementations, an integrated circuit can compute a result using a set of intermediate results, where the intermediate results are computed from the outputs of a hidden layer of the neural network. The integrated circuit can further test the result against a condition. The outcome of the test can determine a next layer that the integrated circuit is to execute, or can be used to determine that further execution of the neural network can be terminated.
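
    The conditional structure described here is essentially an early-exit network, and can be sketched as below. The max-confidence test and threshold semantics are assumptions for illustration; the abstract only says a result is tested against a condition whose outcome selects the next layer or terminates execution.

```python
# Sketch of neural-network control flow: test an intermediate result,
# then either terminate early or select and run the next layer.

def run_with_early_exit(hidden_outputs, threshold, layers):
    # Intermediate result computed from the hidden-layer outputs.
    confidence = max(hidden_outputs)
    # Test the result against the condition.
    if confidence >= threshold:
        # Outcome 1: further execution of the network is terminated.
        return ("terminated_early", confidence)
    # Outcome 2: the test determines the next layer to execute.
    next_layer = layers["refine"]
    return ("continued", next_layer(hidden_outputs))

layers = {"refine": lambda xs: [x * 2 for x in xs]}

early = run_with_early_exit([0.1, 0.95], threshold=0.9, layers=layers)
full = run_with_early_exit([0.1, 0.4], threshold=0.9, layers=layers)
```

    Skipping later layers when an intermediate result is already decisive is what makes such conditional structures attractive on fixed-function accelerators, where every executed layer costs cycles and energy.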

    PROCESSING FOR MULTIPLE INPUT DATA SETS

    Publication No.: US20230014783A1

    Publication Date: 2023-01-19

    Application No.: US17951084

    Application Date: 2022-09-22

    Abstract: Disclosed herein are techniques for performing multi-layer neural network processing for multiple contexts. In one embodiment, a computing engine is set in a first configuration to implement a second layer of a neural network and to process first data related to a first context to generate first context second layer output. The computing engine can be switched from the first configuration to a second configuration to implement a first layer of the neural network. The computing engine can be used to process second data related to a second context to generate second context first layer output. The computing engine can be set to a third configuration to implement a third layer of the neural network to process the first context second layer output and the second context first layer output to generate a first processing result of the first context and a second processing result of the second context.
