LINEFILL DELEGATION IN A CACHE HIERARCHY

    Publication No.: US20250021480A1

    Publication Date: 2025-01-16

    Application No.: US18350217

    Application Date: 2023-07-11

    Applicant: Arm Limited

    Abstract: Apparatuses, methods, systems, and chip-containing products are disclosed, which relate to an arrangement comprising a level N cache level and a level M cache level, where M is greater than N. The level N cache level comprises a plurality of linefill slots and performs a slot allocation procedure in response to a lookup miss in dependence on a linefill slot occupancy criterion. The slot allocation procedure comprises allocation of an available slot of the plurality of slots to a pending linefill request generated in response to the lookup miss. The level N cache level effects a modification of the slot allocation procedure in dependence on the linefill slot occupancy criterion and is responsive to the linefill slot occupancy criterion being fulfilled to cause a linefill delegation action to be instructed to the level M cache level.
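
    The abstract describes a slot allocation procedure that switches to delegation once the level N linefill slots become sufficiently occupied. Below is a minimal software sketch of that behaviour, assuming an eight-slot fill buffer and an illustrative 75% occupancy threshold; the class and method names are invented for the example and are not taken from the patent.

```cpp
#include <array>
#include <cstdint>
#include <iostream>

// Hypothetical model of the level-N linefill slots ("fill buffers"). All names
// and the occupancy threshold are illustrative assumptions, not the claimed design.
struct LinefillSlot {
    bool     busy    = false;
    uint64_t address = 0;
};

class LevelNCache {
public:
    // Called on a lookup miss for `address`. Returns true if the linefill request
    // was accepted locally, false if it was delegated to the level-M cache.
    bool handle_miss(uint64_t address) {
        if (occupancy_criterion_met()) {
            // Modified slot allocation procedure: instead of holding the request
            // locally, instruct the deeper cache level to perform the linefill.
            delegate_linefill_to_level_m(address);
            return false;
        }
        // Normal slot allocation procedure: claim an available slot.
        for (auto& slot : slots_) {
            if (!slot.busy) {
                slot.busy    = true;
                slot.address = address;
                return true;
            }
        }
        // No slot free at all: fall back to delegation as well.
        delegate_linefill_to_level_m(address);
        return false;
    }

private:
    // Illustrative criterion: delegate once at least 75% of the slots are occupied.
    bool occupancy_criterion_met() const {
        int busy = 0;
        for (const auto& slot : slots_) busy += slot.busy ? 1 : 0;
        return busy * 4 >= static_cast<int>(slots_.size()) * 3;
    }

    void delegate_linefill_to_level_m(uint64_t address) {
        std::cout << "delegating linefill for 0x" << std::hex << address
                  << std::dec << " to level M\n";
    }

    std::array<LinefillSlot, 8> slots_;
};
```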

    DATA PROCESSING APPARATUS AND METHOD FOR PREFETCH GENERATION

    Publication No.: US20220308880A1

    Publication Date: 2022-09-29

    Application No.: US17209515

    Application Date: 2021-03-23

    Applicant: Arm Limited

    Abstract: The invention provides a data processing apparatus and a data processing method for generating prefetches of data for use during execution of instructions by processing circuitry. The prefetches that are generated are based on a nested prefetch pattern. The nested prefetch pattern comprises a first pattern and a second pattern. The first pattern is defined by a first address offset between sequentially accessed addresses and a first observed number of the sequentially accessed addresses separated by the first address offset. The second pattern is defined by a second address offset between sequential iterations of the first pattern and a second observed number of the sequential iterations of the first pattern separated by the second address offset.
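
    The nested prefetch pattern in this abstract amounts to a two-level stride: an inner offset repeated a first observed number of times, and an outer offset applied between iterations of that inner run. The sketch below shows how such a pattern could be expanded into a list of prefetch addresses; the struct and field names are illustrative assumptions.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical descriptor of a nested prefetch pattern as the abstract describes it.
struct NestedPattern {
    int64_t first_offset;   // stride between sequentially accessed addresses
    int     first_count;    // observed number of addresses separated by first_offset
    int64_t second_offset;  // stride between sequential iterations of the first pattern
    int     second_count;   // observed number of iterations of the first pattern
};

// Expand the pattern, starting at `base`, into the addresses a prefetcher
// following this pattern would issue.
std::vector<uint64_t> generate_prefetches(uint64_t base, const NestedPattern& p) {
    std::vector<uint64_t> out;
    for (int outer = 0; outer < p.second_count; ++outer) {
        uint64_t row = base + static_cast<uint64_t>(outer * p.second_offset);
        for (int inner = 0; inner < p.first_count; ++inner) {
            out.push_back(row + static_cast<uint64_t>(inner * p.first_offset));
        }
    }
    return out;
}

// Example: 8 consecutive 64-byte lines per row, 4 rows spaced 4 KiB apart,
// i.e. the access shape of a small tile of a larger row-major array.
// auto addrs = generate_prefetches(0x1000, {64, 8, 4096, 4});
```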

    APPARATUS AND METHOD FOR PERFORMING ADDRESS TRANSLATION

    Publication No.: US20200065257A1

    Publication Date: 2020-02-27

    Application No.: US16451384

    Application Date: 2019-06-25

    Applicant: Arm Limited

    Abstract: Examples of the present disclosure relate to an apparatus comprising processing circuitry to perform data processing operations, storage circuitry to store data for access by the processing circuitry, address translation circuitry to maintain address translation data for translating virtual memory addresses into corresponding physical memory addresses, and prefetch circuitry. The prefetch circuitry is arranged to prefetch first data into the storage circuitry in anticipation of the first data being required for performing the data processing operations. The prefetching comprises, based on a prediction scheme, predicting a first virtual memory address associated with the first data, accessing the address translation circuitry to determine a first physical memory address corresponding to the first virtual memory address, and retrieving the first data based on the first physical memory address corresponding to the first virtual memory address. The prefetch circuitry is further arranged, based on the prediction scheme, to predict a second virtual memory address associated with second data in anticipation of the second data being prefetched, and to provide the predicted second virtual memory address to the address translation circuitry to enable the address translation circuitry to obtain the address translation data for the second virtual memory address.
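
    The abstract separates two uses of the address translation circuitry: translating the first predicted address so its data can be fetched now, and merely obtaining the translation for a second, further-ahead predicted address. The sketch below models that split in software, assuming a simple stride predictor, a 4 KiB page size, and invented class names; it is a sketch of the idea, not the patented hardware design.

```cpp
#include <cstdint>
#include <iostream>
#include <unordered_map>

// Software stand-in for the address translation circuitry (a TLB-like map).
class TranslationCircuitry {
public:
    // Look up (and, on a miss, fetch) the translation for a virtual address.
    uint64_t translate(uint64_t vaddr) {
        uint64_t vpage = vaddr >> 12;
        auto it = tlb_.find(vpage);
        if (it == tlb_.end()) {
            it = tlb_.emplace(vpage, walk_page_tables(vpage)).first;
        }
        return (it->second << 12) | (vaddr & 0xFFF);
    }

    // Obtain translation data ahead of time, without fetching any data.
    void prewarm(uint64_t vaddr) { (void)translate(vaddr); }

private:
    uint64_t walk_page_tables(uint64_t vpage) { return vpage ^ 0x5A5A; }  // dummy mapping
    std::unordered_map<uint64_t, uint64_t> tlb_;
};

class Prefetcher {
public:
    explicit Prefetcher(TranslationCircuitry& xlat) : xlat_(xlat) {}

    // On each demand access: predict the next address (first data), translate it and
    // fetch the data, then push the prediction one step further (second data) so that
    // only its translation is obtained now.
    void on_access(uint64_t vaddr, int64_t stride) {
        uint64_t first_va  = vaddr + stride;      // predicted first virtual address
        uint64_t second_va = vaddr + 2 * stride;  // predicted second virtual address
        uint64_t first_pa  = xlat_.translate(first_va);
        fetch_into_storage(first_pa);
        xlat_.prewarm(second_va);                 // translation only, no data fetch yet
    }

private:
    void fetch_into_storage(uint64_t paddr) {
        std::cout << "prefetching data at PA 0x" << std::hex << paddr << std::dec << "\n";
    }
    TranslationCircuitry& xlat_;
};
```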

    CIRCUITRY AND METHOD

    Publication No.: US20230244606A1

    Publication Date: 2023-08-03

    Application No.: US17592022

    Application Date: 2022-02-03

    Applicant: Arm Limited

    CPC classification number: G06F12/0862 G06F2212/602

    Abstract: Circuitry comprises a memory system to store data items; cache memory storage to store a copy of one or more data items, the cache memory storage comprising a hierarchy of two or more cache levels; detector circuitry to detect at least a property of data items for storage by the cache memory storage; and control circuitry to control eviction, from a given cache level, of a data item stored by the given cache level, the control circuitry being configured to select a destination to store a data item evicted from the given cache level in response to a detection by the detector circuitry.
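
    As a rough illustration of choosing an eviction destination from a detected property, the sketch below assumes the detector records whether a line was re-referenced after insertion and routes unreused clean lines past the next cache level; both the property and the policy are assumptions for the example, not the claimed detector or control circuitry.

```cpp
#include <cstdint>
#include <iostream>

enum class EvictionDestination { NextCacheLevel, MemorySystemOnly };

struct CacheLine {
    uint64_t address;
    bool     dirty;
    bool     reused;  // property recorded by the detector circuitry (assumed)
};

EvictionDestination select_destination(const CacheLine& line) {
    // Lines that showed reuse (or hold dirty data) are worth keeping closer; unreused
    // clean lines can bypass the next level and go straight back to the memory system.
    if (line.reused || line.dirty) return EvictionDestination::NextCacheLevel;
    return EvictionDestination::MemorySystemOnly;
}

int main() {
    CacheLine victim{0x8000, /*dirty=*/false, /*reused=*/false};
    auto dest = select_destination(victim);
    std::cout << (dest == EvictionDestination::NextCacheLevel
                      ? "evict to next cache level\n"
                      : "write back / drop to memory system\n");
}
```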

    PREFETCHER TRAINING

    Publication No.: US20230121686A1

    Publication Date: 2023-04-20

    Application No.: US17501272

    Application Date: 2021-10-14

    Applicant: Arm Limited

    Abstract: An apparatus comprises a cache to store information, items of information in the cache being associated with addresses; cache lookup circuitry to perform lookups in the cache; and a prefetcher to prefetch items of information into the cache in advance of an access request being received for said items of information. The prefetcher selects addresses to train the prefetcher. In response to determining that a cache lookup specifying a given address has resulted in a hit and determining that a cache lookup previously performed in response to a prefetch request issued by the prefetcher for the given address resulted in a hit, the prefetcher selects the given address as an address to be used to train the prefetcher.
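
    The training condition in this abstract is a conjunction of two hits: the demand lookup for an address hits, and the lookup previously performed for the prefetch request to the same address also hit. A minimal sketch of that selection logic is shown below, with invented names and a simple set used to remember which prefetch lookups hit.

```cpp
#include <cstdint>
#include <unordered_set>

// Minimal sketch, assuming the prefetcher is told the outcome of both prefetch-initiated
// and demand lookups. Names and structures are illustrative, not the patented design.
class Prefetcher {
public:
    // Called when the cache lookup for a previously issued prefetch request completes.
    void on_prefetch_lookup(uint64_t address, bool hit) {
        if (hit) prefetch_hit_addresses_.insert(address);
    }

    // Called on a demand lookup. The address is selected for training only if this
    // lookup hit and the earlier prefetch-initiated lookup for the same address also hit.
    void on_demand_lookup(uint64_t address, bool hit) {
        if (hit && prefetch_hit_addresses_.count(address)) {
            train(address);
            prefetch_hit_addresses_.erase(address);
        }
    }

private:
    void train(uint64_t address) {
        // Update the pattern-detection state (e.g. a stride table) with `address`.
        (void)address;
    }
    std::unordered_set<uint64_t> prefetch_hit_addresses_;
};
```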

    IDENTIFYING READ-SET INFORMATION BASED ON AN ENCODING OF REPLACEABLE-INFORMATION VALUES

    Publication No.: US20200057692A1

    Publication Date: 2020-02-20

    Application No.: US16105129

    Application Date: 2018-08-20

    Applicant: Arm Limited

    Abstract: An apparatus comprises processing circuitry, transactional memory support circuitry and a cache. The processing circuitry processes threads of data processing, and the transactional memory support circuitry supports execution of a transaction within a thread, including tracking a read set of addresses, comprising addresses accessed by read instructions within the transaction. A transaction comprises instructions for which the processing circuitry is configured to prevent commitment of the results of speculatively executed instructions until the transaction has completed. The cache has a plurality of entries, each associated with an address and specifying a replaceable-information value for that address that comprises information for which, outside of the transaction, processing would be functionally correct even if the information was incorrect. While the transaction is pending, the transactional memory support circuitry identifies, based on an encoding of the replaceable-information values, read-set information identifying addresses in the read set tracked for the transaction.
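
    One way to read the abstract is that read-set membership is encoded into per-entry metadata that is otherwise harmless to get wrong, such as replacement state. The sketch below models that idea with an assumed reserved encoding of a small replacement-age field; the field width, the reserved value, and all names are illustrative, not the patented encoding.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Assumed reserved encoding of the replaceable-information field that marks an entry
// as part of the pending transaction's read set.
constexpr uint8_t kReadSetEncoding = 0xF;

struct CacheEntry {
    uint64_t address;
    bool     valid;
    uint8_t  replaceable_info;  // e.g. replacement age; safe to be wrong outside a transaction
};

class TransactionalCache {
public:
    explicit TransactionalCache(std::size_t entries) : entries_(entries) {}

    // Record a transactional read by tagging the entry's replaceable-information value.
    void on_transactional_read(std::size_t index, uint64_t address) {
        entries_[index] = {address, true, kReadSetEncoding};
    }

    // Identify the read set while the transaction is pending, purely from the
    // encoding of the replaceable-information values.
    std::vector<uint64_t> read_set() const {
        std::vector<uint64_t> addrs;
        for (const auto& e : entries_) {
            if (e.valid && e.replaceable_info == kReadSetEncoding) addrs.push_back(e.address);
        }
        return addrs;
    }

private:
    std::vector<CacheEntry> entries_;
};
```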
