Processing data of a neural network

    Publication No.: US12159223B2

    Publication Date: 2024-12-03

    Application No.: US17084249

    Application Date: 2020-10-29

    Applicant: Arm Limited

    Abstract: A method of processing image data of a neural network is performed by a data processing apparatus and comprises writing a first tensor to first storage of the data processing apparatus using a row stride, wherein the first tensor comprises at least one data group, the at least one data group comprising a plurality of data samples and having height, width, and depth dimensions [h, w, c]. The method further comprises transforming the first tensor into a second tensor using a first stride such that the second tensor is a column tensor comprising a plurality of rows, and writing the second tensor to second storage using a second stride that is related to a multiple of the first stride, δn, such that the second stride covers a first set of memory elements in the second storage into which data samples of a first row of the second tensor are stored and a second set of memory elements into which no data samples from the second tensor are stored.
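    To make the stride arithmetic concrete, the following is a minimal NumPy sketch of the idea described in the abstract; the choice of the depth c as the first stride, the function name, and the stride multiple are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def write_column_tensor_strided(first_tensor, stride_multiple=2):
    """Illustrative sketch, not the patented method: reshape an [h, w, c]
    tensor into a column tensor and write each of its rows into second
    storage using a second stride that is a multiple of the first stride,
    leaving unused memory elements between consecutive rows."""
    h, w, c = first_tensor.shape

    # First stride: number of data samples per row of the column tensor
    # (chosen here as the depth c purely for illustration).
    first_stride = c
    second_tensor = first_tensor.reshape(h * w, first_stride)  # column tensor

    # Second stride: a multiple of the first stride, covering both the
    # memory elements that receive data samples and a trailing gap that
    # receives no samples from the second tensor.
    second_stride = first_stride * stride_multiple
    num_rows = second_tensor.shape[0]
    second_storage = np.zeros(num_rows * second_stride, dtype=first_tensor.dtype)

    for row in range(num_rows):
        base = row * second_stride
        second_storage[base:base + first_stride] = second_tensor[row]
        # Elements [base + first_stride, base + second_stride) stay unused.
    return second_storage

# Example: a 2x2x3 tensor; each row of 3 samples is followed by 3 unused elements.
x = np.arange(12).reshape(2, 2, 3)
print(write_column_tensor_strided(x))
```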

    Apparatus and method for supporting multiple cache features

    Publication No.: US10691606B2

    Publication Date: 2020-06-23

    Application No.: US15392190

    Application Date: 2016-12-28

    Applicant: ARM Limited

    Abstract: An apparatus and method are provided for supporting multiple cache features. The apparatus provides cache storage comprising a plurality of cache ways and organised as a plurality of way groups, where each way group comprises multiple cache ways from the plurality of cache ways. First cache feature circuitry is provided to implement a first cache feature that is applied to the way groups, and second cache feature circuitry is provided to implement a second cache feature that is applied to the way groups. Way group control circuitry is then arranged to provide a first mapping defining which cache ways belong to each way group when the first cache feature is applied to the way groups, and a second mapping defining which cache ways belong to each way group when the second cache feature is applied to the way groups. The first mapping and the second mapping are selected so as to prevent application of a cache feature to the way groups by one of the cache feature circuits from interfering with the ability of the other cache feature circuit to access at least one cache way in each of the way groups. Such an approach alleviates the risk that actions taken by one of the cache features interfere with the ability of the other cache feature to operate as intended.
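    As a rough way to picture the two mappings, the sketch below models a small cache in Python: 8 ways, 4 way groups, and one mapping per cache feature, chosen so that reserving any single group under either mapping still leaves every group of the other mapping with at least one accessible way. The sizes and mapping rules are assumptions for illustration, not the mappings defined in the patent.

```python
# Illustrative model of the two way-group mappings described in the abstract.
# Concrete sizes and mapping rules here are assumptions, not the patent's.
NUM_WAYS = 8
NUM_GROUPS = 4

# First mapping: consecutive ways per group, e.g. group 0 -> ways {0, 1}.
first_mapping = {g: {2 * g, 2 * g + 1} for g in range(NUM_GROUPS)}

# Second mapping: interleaved ways per group, e.g. group 0 -> ways {0, 4}.
second_mapping = {g: {g, g + NUM_GROUPS} for g in range(NUM_GROUPS)}

def other_feature_keeps_access(applied_mapping, other_mapping):
    """Check that even if the applied feature takes exclusive use of any one
    of its way groups, every group of the other mapping still contains at
    least one cache way outside the reserved group."""
    for reserved_group in applied_mapping.values():
        for other_group in other_mapping.values():
            if not (other_group - reserved_group):
                return False
    return True

print(other_feature_keeps_access(first_mapping, second_mapping))   # True
print(other_feature_keeps_access(second_mapping, first_mapping))   # True
```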

    Data processing method and system for performing convolutions

    Publication No.: US11423117B2

    Publication Date: 2022-08-23

    Application No.: US16552548

    Application Date: 2019-08-27

    Abstract: A computer-implemented method for performing convolutions between subsets of an input data array and a kernel, resulting in subsets of an output data array. The method may include receiving an input data array and using positional data indicating the position of elements of the input data array to determine subsets of the input data array which contain at least one non-zero value data element; performing convolutions between the subsets of the input data array containing at least one non-zero value data element and a kernel to produce output data array subsets; and combining the output data subsets with the positional data to generate output data indicative of a completed output data array.
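    The sketch below illustrates the general idea in plain NumPy: the input is split into tiles, a per-tile non-zero test stands in for the positional data, only non-empty tiles are processed, and each tile's partial result is scattered back into the full output array. The tile size, function name, and cross-correlation convention (as commonly used for neural-network convolutions) are assumptions for illustration only.

```python
import numpy as np

def sparse_tiled_convolution(input_array, kernel, tile=4):
    """Illustrative sketch (not the patented method): only tiles of the
    input that contain a non-zero element contribute; their contributions
    are scattered into the output using the tile positions."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2                 # 'same'-size output offsets
    h, w = input_array.shape
    out = np.zeros((h, w), dtype=float)

    for i in range(0, h, tile):
        for j in range(0, w, tile):
            block = input_array[i:i + tile, j:j + tile]
            if not block.any():               # positional data: skip all-zero tiles
                continue
            bh, bw = block.shape
            # Scatter this tile's contribution into the full output array.
            for a in range(bh):
                for b in range(bw):
                    val = block[a, b]
                    if val == 0:
                        continue
                    for u in range(kh):
                        for v in range(kw):
                            r = i + a - u + ph
                            c = j + b - v + pw
                            if 0 <= r < h and 0 <= c < w:
                                out[r, c] += kernel[u, v] * val
    return out

# Example: a mostly-zero 8x8 input with a single non-zero element.
x = np.zeros((8, 8)); x[2, 3] = 1.0
k = np.ones((3, 3))
print(sparse_tiled_convolution(x, k))         # 3x3 patch of ones around (2, 3)
```

    Because each tile's contribution is accumulated over its full kernel footprint, the result of this sketch matches a dense 'same'-size convolution while all-zero tiles are skipped entirely.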
