Memory address translation using stored key entries

    Publication Number: US10853262B2

    Publication Date: 2020-12-01

    Application Number: US16342644

    Application Date: 2017-11-29

    Applicant: ARM LIMITED

    Abstract: Memory address translation apparatus comprises page table access circuitry to access a page table to retrieve translation data defining an address translation between an initial memory address and a corresponding output memory address; a translation data buffer to store one or more instances of the translation data, the translation data buffer comprising: an array of storage locations arranged in rows and columns; a row buffer comprising a plurality of entries, at least one of which is a key entry with an associated value entry for storing at least a representation of the corresponding output memory address; comparison circuitry, responsive to a key value dependent upon at least the initial memory address, to compare the key value with information stored in each key entry and to identify which key entry, if any, is a matching key entry storing information matching the key value; and output circuitry to output, when there is a matching key entry, at least the representation of the output memory address stored in the associated value entry.
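
    A minimal behavioural sketch of the key-matching lookup described above is given below, assuming the active row has already been decoded into hypothetical key/value entry pairs; the class and field names are illustrative, not taken from the patent.

```python
# Behavioural sketch of a row-buffer lookup keyed on the initial address.
# RowBufferEntry / TranslationRowBuffer are illustrative names, not from the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RowBufferEntry:
    key: int      # derived from the initial (e.g. virtual) memory address
    value: int    # representation of the corresponding output memory address

class TranslationRowBuffer:
    def __init__(self, entries):
        self.entries = entries          # key/value entries loaded from the active row

    def lookup(self, key_value: int) -> Optional[int]:
        # Compare the key value against every key entry of the active row;
        # if one matches, output the associated value entry.
        for entry in self.entries:
            if entry.key == key_value:
                return entry.value
        return None                     # no matching key entry: fall back to a page table walk

row = TranslationRowBuffer([RowBufferEntry(key=0x12345, value=0xABC00),
                            RowBufferEntry(key=0x12346, value=0xDEF00)])
print(hex(row.lookup(0x12345) or 0))    # 0xabc00
```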

    Methods and apparatus of cache access to a data array with locality-dependent latency characteristics

    Publication Number: US10761988B2

    Publication Date: 2020-09-01

    Application Number: US16142330

    Application Date: 2018-09-26

    Applicant: Arm Limited

    Abstract: Aspects of the present disclosure relate to an apparatus comprising a data array having locality-dependent latency characteristics such that an access to an open unit of the data array has a lower latency than an access to a closed unit of the data array. Set associative cache indexing circuitry determines, in response to a request for data associated with a target address, a cache set index. Mapping circuitry identifies, in response to the index, a set of data array locations corresponding to the index, according to a mapping in which a given unit of the data array comprises locations corresponding to a plurality of consecutive indices, and at least two locations of the set of locations corresponding to the same index are in different units of the data array. Cache access circuitry accesses said data from one of the set of data array locations.
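
    The index-to-location mapping can be pictured with a small behavioural model; the unit size, way count and the particular unit/offset arithmetic below are assumptions chosen only to exhibit the stated property that consecutive indices share a unit while the ways of one set fall in different units.

```python
# Illustrative mapping only: the unit size, way count and unit/offset arithmetic
# below are assumptions chosen to exhibit the property described in the abstract.
NUM_SETS = 1024          # number of cache sets (indices)
NUM_WAYS = 4             # set associativity
SETS_PER_UNIT = 64       # consecutive indices packed into one open/closed unit (e.g. a DRAM row)

def data_array_locations(set_index: int):
    """Return the (unit, offset) location of every way of a set.

    Consecutive indices land in the same units, while the ways of a single
    set are spread across different units."""
    group = set_index // SETS_PER_UNIT     # which block of consecutive indices
    offset = set_index % SETS_PER_UNIT     # position within that block
    return [(group * NUM_WAYS + way, offset) for way in range(NUM_WAYS)]

print(data_array_locations(5))   # ways of set 5: different units, same offset
print(data_array_locations(6))   # adjacent set 6: same units, next offset
```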

    Apparatus and method for transferring data between address ranges in memory

    Publication Number: US10712965B2

    Publication Date: 2020-07-14

    Application Number: US15806580

    Application Date: 2017-11-08

    Applicant: ARM LIMITED

    Abstract: An apparatus and method are provided for transferring data between address ranges in memory. The apparatus comprises a data transfer controller, that is responsive to a data transfer request received by the apparatus from a processing element, to perform a transfer operation to transfer data from at least one source address range in memory to at least one destination address range in the memory. A redirect controller is then arranged, whilst the transfer operation is being performed, to intercept an access request that specifies a target address within a target address range, and to perform a memory redirection operation so as to cause the access request to be processed without awaiting completion of the transfer operation. Via such an approach, the apparatus can effectively hide from the source of the access request the fact that the transfer operation is in progress, and hence the transfer operation can be arranged to occur in the background, and in a manner that is transparent to the software executing on the source that has issued the access request.
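
    One way to picture the redirect controller is as a progress pointer over the copy, so that accesses to not-yet-copied destination addresses are redirected to the source; the pointer-based scheme below is an illustrative assumption, not the claimed hardware.

```python
# Sketch of redirecting accesses while a background copy is in flight.
# The progress-pointer scheme is an assumption used for illustration.
class BackgroundCopy:
    def __init__(self, memory, src, dst, length):
        self.memory, self.src, self.dst, self.length = memory, src, dst, length
        self.copied = 0                           # bytes already moved to the destination

    def step(self, n=1):
        # Advance the background transfer by up to n bytes.
        for i in range(self.copied, min(self.copied + n, self.length)):
            self.memory[self.dst + i] = self.memory[self.src + i]
        self.copied = min(self.copied + n, self.length)

    def redirect_read(self, addr):
        """Service a read targeting the destination range without waiting for
        the transfer to complete: not-yet-copied bytes are read from the source."""
        if self.dst <= addr < self.dst + self.length:
            offset = addr - self.dst
            if offset >= self.copied:             # data has not reached the destination yet
                return self.memory[self.src + offset]
        return self.memory[addr]

mem = {i: i for i in range(32)}
copy = BackgroundCopy(mem, src=0, dst=16, length=8)
copy.step(3)
print(copy.redirect_read(18), copy.redirect_read(21))   # 2 (already copied), 5 (redirected to source)
```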

    Delay masking action for memory access requests

    Publication Number: US10860215B2

    Publication Date: 2020-12-08

    Application Number: US16152485

    Application Date: 2018-10-05

    Applicant: Arm Limited

    Abstract: An apparatus comprises control circuitry to control access to a memory implemented using a memory technology providing variable access latency. The control circuitry has request handling circuitry to identify an execution context switch comprising a transition from servicing memory access requests associated with a first execution context to servicing memory access requests associated with a second execution context. At least when the execution context switch meets a predetermined condition, a delay masking action is triggered to control subsequent memory access requests associated with the second execution context, for which the required data is already stored in the memory, to be serviced with a response delay which is independent of which addresses were accessed by the memory access requests associated with the first execution context. This can help guard against attacks which aim to exploit variation in response latency to gain insight into the addresses accessed by a victim execution context.
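
    A toy model of one possible delay-masking policy is sketched below; the fixed latency floor and the "mask on every context switch" rule are assumptions made for illustration, not the predetermined condition defined in the patent.

```python
# Sketch of one possible delay-masking policy: after a context switch, responses
# are padded to a fixed floor so that row hits left behind by the previous context
# are not observably faster. The latency values are arbitrary illustrative units.
ROW_HIT_LATENCY = 10
ROW_MISS_LATENCY = 30
MASKED_LATENCY = 30        # response floor applied after a context switch (assumed policy)

class MemoryController:
    def __init__(self):
        self.open_row = None
        self.current_context = None

    def access(self, context, row):
        switched = context != self.current_context
        self.current_context = context
        raw = ROW_HIT_LATENCY if row == self.open_row else ROW_MISS_LATENCY
        self.open_row = row
        # After a switch, mask the latency so it does not depend on which rows
        # (addresses) the previous context accessed.
        return max(raw, MASKED_LATENCY) if switched else raw

mc = MemoryController()
print(mc.access("victim", row=7))      # 30 (first access, treated as a switch)
print(mc.access("victim", row=7))      # 10 (row hit within the same context)
print(mc.access("attacker", row=7))    # 30 (masked: no timing leak from the victim's row)
```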

    Memory address translation

    Publication Number: US10831673B2

    Publication Date: 2020-11-10

    Application Number: US16181474

    Application Date: 2018-11-06

    Applicant: Arm Limited

    Abstract: Memory address translation apparatus comprises page table access circuitry to access page table data to retrieve translation data defining an address translation between an initial memory address in an initial memory address space, and a corresponding output memory address in an output address space; a translation data buffer to store, for a subset of the virtual address space, one or more instances of the translation data; and control circuitry, responsive to an input initial memory address to be translated, to request retrieval of translation data for the input initial memory address from the translation data buffer and, before completion of processing of the request for retrieval from the translation data buffer, to initiate retrieval of translation data for the input initial memory address by the page table access circuitry.
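
    The overlap between the translation data buffer lookup and the early page table walk can be sketched with two concurrent tasks; the timings, function names and thread-based overlap below are illustrative only.

```python
# Sketch of overlapping the translation-buffer lookup with an early page table walk.
# The point is only that the walk is started before the buffer lookup has completed.
import concurrent.futures
import time

def buffer_lookup(vaddr):
    time.sleep(0.05)                   # slow (e.g. DRAM-backed) translation data buffer
    return 0xABC000 if vaddr == 0x1000 else None

def page_table_walk(vaddr):
    time.sleep(0.2)                    # multi-level walk in memory
    return 0xABC000

def translate(vaddr):
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        walk = pool.submit(page_table_walk, vaddr)   # initiated before the buffer lookup completes
        hit = buffer_lookup(vaddr)
        if hit is not None:
            walk.cancel()              # best-effort cancel; the walk result is ignored on a hit
            return hit
        return walk.result()           # miss: the walk already has a head start

print(hex(translate(0x1000)))
```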

    Cache control in presence of speculative read operations

    Publication Number: US11263133B2

    Publication Date: 2022-03-01

    Application Number: US16979624

    Application Date: 2019-03-12

    Applicant: Arm Limited

    Abstract: Coherency control circuitry (10) supports processing of a safe-speculative-read transaction received from a requesting master device (4). The safe-speculative-read transaction is of a type requesting that target data is returned to a requesting cache (11) of the requesting master device (4) while prohibiting any change in coherency state associated with the target data in other caches (12) in response to the safe-speculative-read transaction. In response, at least when the target data is cached in a second cache associated with a second master device, at least one of the coherency control circuitry (10) and the second cache (12) is configured to return a safe-speculative-read response while maintaining the target data in the same coherency state within the second cache. This helps to mitigate against speculative side-channel attacks.
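
    A rough sketch of how the transaction differs from an ordinary coherent read is given below; the MESI-like state names and the requester-side "Speculative" marking are assumptions, since the abstract only specifies that the second cache's state is left unchanged.

```python
# Sketch contrasting an ordinary coherent read with a safe-speculative-read.
# State names and the requester's 'Speculative' marking are illustrative assumptions.
caches = {
    "requester": {},                        # cache line -> coherency state
    "second":    {0x80: "Modified"},        # second master's cache holds the target line
}

def read_shared(line):
    # Conventional read: the second cache is downgraded (Modified -> Shared),
    # an observable state change that speculative side-channel attacks can exploit.
    if caches["second"].get(line) == "Modified":
        caches["second"][line] = "Shared"
    caches["requester"][line] = "Shared"

def safe_speculative_read(line):
    # Target data is returned to the requesting cache, but the second cache keeps
    # exactly the coherency state it already had.
    caches["requester"][line] = "Speculative"   # placeholder: re-checked once speculation resolves

safe_speculative_read(0x80)
print(caches["second"][0x80])                   # still 'Modified': no observable state change
```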

    Storage circuitry responsive to a tag-matching command

    Publication Number: US10860495B2

    Publication Date: 2020-12-08

    Application Number: US16464019

    Application Date: 2017-09-15

    Applicant: ARM LIMITED

    Abstract: Storage circuitry comprises an array of storage locations arranged in rows and columns, a row buffer comprising a plurality of entries each to store information from a storage location at a corresponding column of an active row of the array, and comparison circuitry responsive to a tag-matching command specifying a tag value to compare the tag value with information stored in each of a subset of two or more entries of the row buffer. The comparison circuitry identifies which of the subset of entries, if any, is a matching entry storing information matching the tag value. This allows memory technologies such as DRAM to be used more efficiently as a set-associative cache.
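
    A behavioural sketch of the tag-matching command is shown below; the entry layout and the size of the compared subset are assumptions for illustration.

```python
# Sketch of a tag-matching command against a subset of row buffer entries.
# The entry layout and subset size are assumptions, not taken from the patent.
class RowBuffer:
    def __init__(self, active_row):
        self.entries = list(active_row)    # one entry per column of the active row

    def tag_match(self, tag_value, subset):
        """Compare tag_value against the entries at the given column positions
        (e.g. the tag entries of one cache set held in this row) and report
        which entry, if any, matches."""
        for col in subset:
            if self.entries[col] == tag_value:
                return col                  # matching entry: the cached line is present
        return None                         # no entry in the subset matches

row = RowBuffer([0x11, 0x22, 0x33, 0x44, 0xAA, 0xBB, 0xCC, 0xDD])
print(row.tag_match(0x33, subset=range(0, 4)))   # 2: hit in way 2 of this set
print(row.tag_match(0x99, subset=range(0, 4)))   # None: set-associative miss
```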

    Cache apparatus and method that facilitates a reduction in energy consumption through use of first and second data arrays

    Publication Number: US11036639B2

    Publication Date: 2021-06-15

    Application Number: US15864062

    Application Date: 2018-01-08

    Applicant: ARM Limited

    Abstract: A cache apparatus is provided comprising a data storage structure providing N cache ways that each store data as a plurality of cache blocks. The data storage structure is organised as a plurality of sets, where each set comprises a cache block from each way, and further the data storage structure comprises a first data array and a second data array, where at least the second data array is set associative. A set associative tag storage structure stores a tag value for each cache block, with that set associative tag storage structure being shared by the first and second data arrays. Control circuitry applies an access likelihood policy to determine, for each set, a subset of the cache blocks of that set to be stored within the first data array. Access circuitry is then responsive to an access request to perform a lookup operation within an identified set of the set associative tag storage structure overlapped with an access operation to access within the first data array the subset of the cache blocks for the identified set. In the event of a hit condition being detected that identifies a cache block present in the first data array, that access request is then processed using the cache block accessed within the first data array. If instead a hit condition is detected that identifies a cache block absent in the first data array, then a further access operation is performed to access the identified cache block within a selected way of the second data array. Such a cache structure provides a high performance and energy efficient mechanism for storing cached data.
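
    The two-array lookup flow can be sketched as below; keeping exactly one "most likely" way of each set in the first data array is an assumed access-likelihood policy used for illustration, not necessarily the policy the patent describes.

```python
# Sketch of the two-array lookup flow. Holding one "most likely" way of each set
# in the first data array is an assumed access-likelihood policy for illustration.
class TwoArrayCache:
    def __init__(self, num_sets, num_ways):
        self.tags = [[None] * num_ways for _ in range(num_sets)]        # shared set-associative tag store
        self.first_array = [None] * num_sets        # one cache block per set (the likely way)
        self.first_way = [0] * num_sets             # which way the first array currently holds
        self.second_array = [[None] * num_ways for _ in range(num_sets)]

    def access(self, set_index, tag):
        # The tag lookup is overlapped with reading the first data array.
        ways = self.tags[set_index]
        fast_block = self.first_array[set_index]
        if tag in ways:
            way = ways.index(tag)
            if way == self.first_way[set_index]:
                return fast_block                     # hit served by the first (low-energy) array
            return self.second_array[set_index][way]  # hit, but needs a further second-array access
        return None                                   # miss: fetch from the next level (not shown)

cache = TwoArrayCache(num_sets=4, num_ways=2)
cache.tags[1] = [0xAA, 0xBB]
cache.first_array[1], cache.first_way[1] = "block-AA", 0
cache.second_array[1][1] = "block-BB"
print(cache.access(1, 0xAA))   # 'block-AA' from the first array
print(cache.access(1, 0xBB))   # 'block-BB' via a further access to the second array
```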
