Block copy
    1.
    Invention Grant

    Publication Number: US12086441B2

    Publication Date: 2024-09-10

    Application Number: US17461105

    Application Date: 2021-08-30

    Applicant: Rambus Inc.

    Abstract: An interconnected stack of one or more Dynamic Random Access Memory (DRAM) die also has one or more custom logic, controller, or processor die. The custom die(s) of the stack include direct channel interfaces that allow direct access to memory regions on one or more DRAMs in the stack. The direct channels are time-division multiplexed such that each DRAM die is associated with a time slot on a direct channel. The custom die configures a first DRAM die to read a block of data and transmit it via the direct channel using a time slot that is assigned to a second DRAM die. The custom die also configures the second memory device to receive the first block of data in its assigned time slot and write the block of data.
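
    The abstract above describes a protocol-level idea: a block read on one DRAM die is transmitted in a time slot that nominally belongs to another die, which captures the data in that slot and writes it locally. The Python sketch below only illustrates that time-slot reassignment under assumed names (DramDie, block_copy, tx_plan, rx_plan are all hypothetical); it is not the patented implementation.

```python
# Hypothetical sketch, not the patented implementation: a time-division
# multiplexed direct channel in which each DRAM die nominally owns one slot.
# For a block copy, the custom die configures die A to transmit in die B's
# slot and die B to capture whatever appears in its own slot and write it.

class DramDie:
    def __init__(self, die_id, rows=4):
        self.die_id = die_id
        self.rows = {r: f"die{die_id}-row{r}-data" for r in range(rows)}
        self.tx_plan = {}   # slot -> row whose data this die drives
        self.rx_plan = {}   # slot -> row into which captured data is written

    def drive(self, slot):
        """Return data for this slot if configured to transmit in it."""
        row = self.tx_plan.get(slot)
        return self.rows[row] if row is not None else None

    def capture(self, slot, data):
        """Write captured channel data if configured to listen on this slot."""
        row = self.rx_plan.get(slot)
        if row is not None and data is not None:
            self.rows[row] = data


def block_copy(src, dst, src_row, dst_row, slot):
    """Custom-die configuration step: src transmits in dst's slot; dst writes."""
    src.tx_plan[slot] = src_row
    dst.rx_plan[slot] = dst_row


dies = [DramDie(i) for i in range(4)]
block_copy(dies[0], dies[2], src_row=1, dst_row=3, slot=2)  # slot 2 is die 2's slot

for slot in range(len(dies)):          # one TDM frame on the direct channel
    data = None
    for die in dies:                   # at most one die drives a given slot
        payload = die.drive(slot)
        if payload is not None:
            data = payload
    for die in dies:
        die.capture(slot, data)

print(dies[2].rows[3])                 # -> "die0-row1-data"
```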

    TAGS AND DATA FOR CACHES
    3.
    Invention Application

    Publication Number: US20220398198A1

    Publication Date: 2022-12-15

    Application Number: US17853735

    Application Date: 2022-06-29

    Applicant: Rambus Inc.

    Abstract: A device includes a memory controller and a cache memory coupled to the memory controller. The cache memory has a first set of cache lines associated with a first memory block and comprising a first plurality of cache storage locations, as well as a second set of cache lines associated with a second memory block and comprising a second plurality of cache storage locations. A first location of the second plurality of cache storage locations comprises cache tag data for both the first set of cache lines and the second set of cache lines.
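
    As a rough illustration of the storage arrangement in this abstract, the Python model below keeps two sets of cache lines and reserves one storage location in the second set to hold the tag data for both sets. The class name, location count, and field names are assumptions made for the sketch, not the claimed layout.

```python
# Illustrative model only: one storage location in the second set of cache
# lines is reserved for the tag metadata of BOTH sets, so tag data and line
# data share the same cache storage rather than a separate tag array.

NUM_LOCATIONS = 8           # storage locations per set (assumed)

class TwoSetCache:
    def __init__(self):
        self.set1 = [None] * NUM_LOCATIONS            # data locations, set 1
        self.set2 = [None] * NUM_LOCATIONS            # data locations, set 2
        # Location 0 of set 2 holds the tags for both sets of cache lines.
        self.set2[0] = {"set1": [None] * NUM_LOCATIONS,
                        "set2": [None] * (NUM_LOCATIONS - 1)}

    def _tags(self, set_id):
        return self.set2[0][f"set{set_id}"]

    def fill(self, set_id, line, tag, data):
        """Install a line: data goes to its location, its tag to set2[0]."""
        lines = self.set1 if set_id == 1 else self.set2
        offset = 0 if set_id == 1 else 1   # skip the shared tag location
        lines[line + offset] = data
        self._tags(set_id)[line] = tag

    def lookup(self, set_id, line, tag):
        """Hit only if the tag stored in the shared location matches."""
        if self._tags(set_id)[line] != tag:
            return None
        lines = self.set1 if set_id == 1 else self.set2
        offset = 0 if set_id == 1 else 1
        return lines[line + offset]


cache = TwoSetCache()
cache.fill(set_id=1, line=3, tag=0x2A, data="line from first memory block")
print(cache.lookup(set_id=1, line=3, tag=0x2A))   # hit
print(cache.lookup(set_id=1, line=3, tag=0x2B))   # miss -> None
```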

    Tags and data for caches
    4.
    Invention Grant

    Publication Number: US11409659B2

    Publication Date: 2022-08-09

    Application Number: US17221639

    Application Date: 2021-04-02

    Applicant: Rambus Inc.

    Abstract: A device includes a memory controller and a cache memory coupled to the memory controller. The cache memory has a first set of cache lines associated with a first memory block and comprising a first plurality of cache storage locations, as well as a second set of cache lines associated with a second memory block and comprising a second plurality of cache storage locations. A first location of the second plurality of cache storage locations comprises cache tag data for both the first set of cache lines and the second set of cache lines.

    Tags and data for caches
    6.
    Invention Grant

    Publication Number: US12093180B2

    Publication Date: 2024-09-17

    Application Number: US17853735

    Application Date: 2022-06-29

    Applicant: Rambus Inc.

    CPC classification number: G06F12/0868 G06F3/0604 G06F3/0658 G06F3/0673

    Abstract: A device includes a memory controller and a cache memory coupled to the memory controller. The cache memory has a first set of cache lines associated with a first memory block and comprising a first plurality of cache storage locations, as well as a second set of cache lines associated with a second memory block and comprising a second plurality of cache storage locations. A first location of the second plurality of cache storage locations comprises cache tag data for both the first set of cache lines and the second set of cache lines.

    SYSTEM APPLICATION OF DRAM COMPONENT WITH CACHE MODE

    Publication Number: US20220165326A1

    Publication Date: 2022-05-26

    Application Number: US17439215

    Application Date: 2020-03-16

    Applicant: RAMBUS INC.

    Abstract: Disclosed is a memory system that has a memory controller and may have a memory component. The memory component may be a dynamic random access memory (DRAM). The memory controller is connectable to the memory component. The memory component has at least one data row and at least one tag row different from and associated with the at least one data row. The memory system is to implement a cache having multiple ways to hold a data group. The memory controller is operable in each of a plurality of operating modes. The operating modes include a first operating mode and a second operating mode. The first operating mode and the second operating mode have differing addressing and timing for accessing the data group. The memory controller has cache read logic that sends a cache read command, cache results logic that receives a response from the memory component, and cache fetch logic.
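
    The abstract sketches a controller-side flow: cache read logic issues a command, cache results logic inspects the memory component's response, and cache fetch logic handles a miss, with addressing and timing that depend on the operating mode. The Python sketch below mimics that flow under assumed parameters; ModeParams, MODE_1/MODE_2, FakeDram, and every field name are illustrative, not the claimed commands or timing.

```python
# Hedged sketch of the controller-side flow described above.  All names and
# numeric values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ModeParams:
    tag_row_offset: int     # where tag rows sit relative to data rows (assumed)
    access_cycles: int      # nominal access latency in controller clocks

MODE_1 = ModeParams(tag_row_offset=0x8000, access_cycles=18)
MODE_2 = ModeParams(tag_row_offset=0x0100, access_cycles=24)

class FakeDram:
    """Stand-in DRAM: tag rows and data rows modeled as dictionaries."""
    def __init__(self):
        self.tags, self.rows = {}, {}

    def handle(self, cmd):
        hit = self.tags.get(cmd["data_row"]) == cmd["tag"]
        return {"hit": hit, "data": self.rows.get(cmd["data_row"])}

    def install(self, row, tag, data):
        self.tags[row], self.rows[row] = tag, data

class CacheModeController:
    def __init__(self, dram, mode=MODE_1, ways=4):
        self.dram, self.mode, self.ways = dram, mode, ways

    def cache_read(self, group_addr, tag):
        """Cache read logic: build a mode-dependent command and send it."""
        cmd = {"data_row": group_addr,
               "tag_row": group_addr + self.mode.tag_row_offset,
               "tag": tag,
               "latency": self.mode.access_cycles}
        response = self.dram.handle(cmd)           # cache results logic
        if response["hit"]:
            return response["data"]
        return self.fetch(group_addr, tag)         # cache fetch logic on a miss

    def fetch(self, group_addr, tag):
        """Bring the data group in from backing memory and install it."""
        data = f"data group @ {group_addr:#x}"
        self.dram.install(group_addr, tag, data)
        return data


ctrl = CacheModeController(FakeDram(), mode=MODE_2)
print(ctrl.cache_read(0x0400, tag=7))   # miss: fetched and installed
print(ctrl.cache_read(0x0400, tag=7))   # hit on the second access
```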

    Methods and Circuits for Streaming Data to Processing Elements in Stacked Processor-Plus-Memory Architecture

    Publication Number: US20220076714A1

    Publication Date: 2022-03-10

    Application Number: US17410786

    Application Date: 2021-08-24

    Applicant: Rambus Inc.

    Abstract: A stacked processor-plus-memory device includes a processing die with an array of processing elements of an artificial neural network. Each processing element multiplies a first operand (e.g., a weight) by a second operand to produce a partial result that is passed to a subsequent processing element. To prepare for these computations, a sequencer loads the weights into the processing elements as a sequence of operands that step through the processing elements, each operand stored in the corresponding processing element. The operands can be sequenced directly from memory to the processing elements or can be stored first in a cache. The processing elements include streaming logic that disregards interruptions in the stream of operands.
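
    To make the weight-streaming idea concrete, the sketch below shifts a sequence of operands through a short chain of processing elements, skips bubbles in the stream, and then runs one multiply-accumulate pass. The class names and the shift-register behavior are assumptions chosen for illustration; the patent does not publish this code.

```python
# Minimal sketch of the weight-streaming idea: weights step through a chain
# of processing elements like a shift register, each element keeping whatever
# weight lands in its position, and the streaming logic simply skips bubbles
# (None entries) in the stream.  Everything here is an illustrative assumption.

class ProcessingElement:
    def __init__(self):
        self.weight = None

    def multiply_accumulate(self, activation, partial_in):
        """Multiply the stored weight and pass the partial result onward."""
        return partial_in + self.weight * activation

def stream_weights(elements, stream):
    """Sequencer: shift valid operands through the chain, ignoring gaps."""
    for weight in (w for w in stream if w is not None):  # disregard bubbles
        for i in reversed(range(1, len(elements))):      # push earlier weights on
            elements[i].weight = elements[i - 1].weight
        elements[0].weight = weight

pes = [ProcessingElement() for _ in range(3)]
# The stream contains interruptions (None) that the loading must tolerate.
stream_weights(pes, [0.5, None, 1.5, None, None, 2.5])

partial = 0.0
for pe, activation in zip(pes, [1.0, 2.0, 3.0]):
    partial = pe.multiply_accumulate(activation, partial)
print(partial)   # 7.0 = 2.5*1.0 + 1.5*2.0 + 0.5*3.0
```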

    TAGS AND DATA FOR CACHES
    9.
    Invention Application

    Publication Number: US20210326265A1

    Publication Date: 2021-10-21

    Application Number: US17221639

    Application Date: 2021-04-02

    Applicant: Rambus Inc.

    Abstract: A device includes a memory controller and a cache memory coupled to the memory controller. The cache memory has a first set of cache lines associated with a first memory block and comprising a first plurality of cache storage locations, as well as a second set of cache lines associated with a second memory block and comprising a second plurality of cache storage locations. A first location of the second plurality of cache storage locations comprises cache tag data for both the first set of cache lines and the second set of cache lines.
