BACK-END MEMORY CHANNEL THAT RESIDES BETWEEN FIRST AND SECOND DIMM SLOTS AND APPLICATIONS THEREOF

    Publication No.: US20190042162A1

    Publication Date: 2019-02-07

    Application No.: US16104040

    Filing Date: 2018-08-16

    Abstract: A computing system is described. The computing system includes a memory controller having a double data rate memory interface. The double data rate memory interface has a first memory channel interface and a second memory channel interface. The computing system also includes a first DIMM slot and a second DIMM slot. The computing system also includes a first memory channel coupled to the first memory channel interface and the first DIMM slot, wherein the first memory channel's CA and DQ wires are not coupled to the second DIMM slot. The computing system also includes a second memory channel coupled to the second memory channel interface and the second DIMM slot, wherein the second memory channel's CA and DQ wires are not coupled to the first DIMM slot. The computing system also includes a back end memory channel that is coupled to the first and second DIMM slots.
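    As a rough illustration of the wiring described above, the following sketch models the topology: each DIMM slot has its own dedicated front-end CA/DQ channel from the memory controller, and a shared back-end channel couples the two slots to each other. The class and channel names (DimmSlot, CH1, CH2, BACK_END_CH) are illustrative placeholders, not terms from the patent.

```python
# Minimal topology sketch of the channel arrangement in the abstract.
# All identifiers are illustrative assumptions, not patent terminology.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DimmSlot:
    name: str
    front_end_channel: Optional[str] = None            # dedicated CA/DQ channel from the controller
    back_end_channels: List[str] = field(default_factory=list)  # DIMM-to-DIMM channel(s)

def build_topology():
    slot1 = DimmSlot("DIMM_SLOT_1", front_end_channel="CH1")  # first memory channel -> slot 1 only
    slot2 = DimmSlot("DIMM_SLOT_2", front_end_channel="CH2")  # second memory channel -> slot 2 only
    for slot in (slot1, slot2):
        slot.back_end_channels.append("BACK_END_CH")          # back-end channel couples both slots
    return slot1, slot2

if __name__ == "__main__":
    for slot in build_topology():
        print(slot)
```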

    STACKED MEMORY CHIP DEVICE WITH ENHANCED DATA PROTECTION CAPABILITY

    Publication No.: US20190004909A1

    Publication Date: 2019-01-03

    Application No.: US15640182

    Filing Date: 2017-06-30

    Abstract: A stacked memory chip device is described. The stacked memory chip device includes a plurality of stacked memory chips. The stacked memory chip device includes read/write logic circuitry to service read/write requests for cache lines kept within the plurality of stacked memory chips. The stacked memory chip device includes data protection circuitry to store information to protect substantive data of cache lines in the plurality of stacked memory chips, wherein the information is kept in more than one of the plurality of stacked memory chips, and wherein any subset of the information that protects the respective substantive information of a particular one of the cache lines is not stored in the same memory chip as the respective substantive information.
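    The placement constraint in the abstract (protection information is never co-resident with the data it protects) can be sketched as a simple placement policy. The four-chip stack, the round-robin data placement, and spreading protection bits over all remaining chips are assumptions made only for illustration; the abstract does not specify the protection code or the placement algorithm.

```python
# Illustrative placement sketch; chip count and policy are assumptions.
NUM_CHIPS = 4

def place_cache_line(line_id: int) -> dict:
    data_chip = line_id % NUM_CHIPS
    # Spread the protection information over the other chips, never the data chip.
    protection_chips = [c for c in range(NUM_CHIPS) if c != data_chip]
    return {"data_chip": data_chip, "protection_chips": protection_chips}

if __name__ == "__main__":
    for line in range(8):
        placement = place_cache_line(line)
        # The constraint from the abstract: protection never shares a chip with its data.
        assert placement["data_chip"] not in placement["protection_chips"]
        print(line, placement)
```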

    APPARATUS AND METHOD TO REDUCE BANDWIDTH AND LATENCY OVERHEADS OF PROBABILISTIC CACHES

    Publication No.: US20220308998A1

    Publication Date: 2022-09-29

    Application No.: US17214356

    Filing Date: 2021-03-26

    Abstract: An apparatus and method to reduce bandwidth and latency associated with probabilistic caches. For example, one embodiment of a processor comprises: a plurality of cores to execute instructions and process data, one or more of the cores to generate a request for a first cache line; a cache controller comprising cache lookup logic to determine a first way of a cache in which to search for the first cache line based on a first set of tag bits comprising one or more bits associated with the first cache line; the cache lookup logic to compare a second set of tag bits of the first cache line with a third set of tag bits of an existing cache line stored in the first way, wherein if the second set of tag bits and the third set of tag bits do not match, then the cache lookup logic is to determine that the first cache line is not in the first way and to compare a fourth set of tag bits of the first cache line with a fifth set of tag bits of the existing cache line, wherein responsive to a match between the fourth set of tag bits and the fifth set of tag bits, the cache lookup logic is to determine that the first cache line is stored in a second way and to responsively read the first cache line from the second way.
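    The multi-step lookup can be sketched roughly as follows. The specific bit-field positions, the way-prediction function, and the treatment of a partial-tag match as a hit are assumptions for illustration; the abstract only specifies that distinct subsets of tag bits are compared in sequence to rule the first way in or out and to identify a second way.

```python
# Hedged sketch of the staged tag comparison; bit-field widths/positions
# and the way-prediction function are illustrative assumptions.

def predicted_way(request_tag: int, num_ways: int) -> int:
    # "First set of tag bits": picks the way to probe first.
    return request_tag & (num_ways - 1)

def lookup(request_tag: int, stored_tag: int, num_ways: int):
    way = predicted_way(request_tag, num_ways)
    # "Second" vs. "third" set of tag bits: partial compare against the line in the predicted way.
    if (request_tag >> 4) & 0xF == (stored_tag >> 4) & 0xF:
        return ("hit", way)
    # Mismatch: the requested line is not in the first way. "Fourth" vs. "fifth"
    # set of tag bits decides whether it is stored in a second way instead.
    if (request_tag >> 8) & 0xF == (stored_tag >> 8) & 0xF:
        return ("hit", (way + 1) % num_ways)
    return ("miss", None)

if __name__ == "__main__":
    print(lookup(request_tag=0x123, stored_tag=0x123, num_ways=8))  # -> ('hit', 3)
```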

    DYNAMIC MULTILEVEL MEMORY SYSTEM
    Invention Application

    Publication No.: US20220229575A1

    Publication Date: 2022-07-21

    Application No.: US17710796

    Filing Date: 2022-03-31

    Abstract: A system can dynamically migrate memory pages from near memory to far memory during runtime. A system basic input/output system (BIOS) can program a first memory address space of size P and a second memory address space of size P to a near memory (NM) space of size N and a far memory (FM) space of size M, where P equals N+M. For the first memory address space, the operating system (OS) can manage the NM space and the FM space as a flat memory space with an address space of size P available. For the second memory address space, the OS can manage the NM space as an NM cache for FM, with an address space of size M available.
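    The two address-space presentations described above can be summarized with a small capacity calculation. The gibibyte figures below are placeholders, and only the OS-visible capacity is modeled, not the page-migration mechanism itself.

```python
# Sketch of the two address-space presentations in the abstract.
# Sizes are placeholder assumptions; migration policy is not modeled.

def visible_capacity(near_mem_gib: int, far_mem_gib: int, mode: str) -> int:
    if mode == "flat":
        # First configuration: NM and FM both appear in the OS address space.
        return near_mem_gib + far_mem_gib   # P = N + M
    if mode == "nm_cache":
        # Second configuration: NM serves as a cache for FM, so only FM is addressable.
        return far_mem_gib                  # M
    raise ValueError(mode)

assert visible_capacity(16, 48, "flat") == 64
assert visible_capacity(16, 48, "nm_cache") == 48
```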

    DISTRIBUTION OF ERROR CHECKING AND CORRECTION (ECC) BITS TO ALLOCATE ECC BITS FOR METADATA

    Publication No.: US20210141692A1

    Publication Date: 2021-05-13

    Application No.: US17156399

    Filing Date: 2021-01-22

    Abstract: A memory subsystem includes multiple memory resources connected in parallel, including a first memory resource and a second memory resource. The memory subsystem can split a portion of data into multiple sub-portions. When the data is split into smaller portions, the system needs fewer ECC (error checking and correction) bits to provide the same level of ECC protection. The portion of data can include N ECC bits for error correction, and the sub-portions can each include a sub-portion of (N−M) ECC bits for error correction. The system can then use M bits of data for non-ECC purposes, such as metadata.
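    The bit accounting implied by the abstract can be sketched as follows. The concrete numbers (a 512-bit data portion, N = 64 ECC bits, M = 8 metadata bits, two sub-portions) are assumptions chosen only to make the arithmetic concrete; the abstract does not fix them.

```python
# Bit-accounting sketch of the scheme in the abstract; all values are assumptions.
LINE_BITS = 512   # data portion
N = 64            # ECC bits budgeted for the unsplit portion
M = 8             # bits repurposed for metadata after the split

def split_layout(num_sub_portions: int = 2) -> dict:
    # After splitting, the sub-portions together use only (N - M) ECC bits,
    # leaving M bits of the original ECC budget free for metadata.
    ecc_per_sub = (N - M) // num_sub_portions
    return {
        "data_bits_per_sub": LINE_BITS // num_sub_portions,
        "ecc_bits_per_sub": ecc_per_sub,
        "metadata_bits_freed": M,
    }

if __name__ == "__main__":
    print(split_layout())
```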

    APPARATUS AND METHOD FOR IMPLEMENTING A MULTI-LEVEL MEMORY HIERARCHY HAVING DIFFERENT OPERATING MODES

    Publication No.: US20180341588A1

    Publication Date: 2018-11-29

    Application No.: US16052581

    Filing Date: 2018-08-01

    Abstract: A system and method are described for integrating a memory and storage hierarchy including a non-volatile memory tier within a computer system. In one embodiment, PCMS memory devices are used as one tier in the hierarchy, sometimes referred to as "far memory." Higher performance memory devices such as DRAM are placed in front of the far memory and are used to mask some of the performance limitations of the far memory. These higher performance memory devices are referred to as "near memory." In one embodiment, the "near memory" is configured to operate in a plurality of different modes of operation including (but not limited to) a first mode in which the near memory operates as a memory cache for the far memory and a second mode in which the near memory is allocated a first address range of a system address space with the far memory being allocated a second address range of the system address space, wherein the first range and the second range represent the entire system address space.
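    The second operating mode described above (near and far memory mapped to disjoint ranges that together form the system address space) can be sketched as a simple address router. The capacities and the contiguous, near-memory-first layout are assumptions for illustration only.

```python
# Address-routing sketch for the second mode in the abstract; sizes are assumptions.
NEAR_MEM_BYTES = 16 << 30    # e.g. 16 GiB of DRAM "near memory"
FAR_MEM_BYTES = 256 << 30    # e.g. 256 GiB of PCMS "far memory"

def route(addr: int) -> str:
    if addr < NEAR_MEM_BYTES:
        return "near_memory"                       # first address range
    if addr < NEAR_MEM_BYTES + FAR_MEM_BYTES:
        return "far_memory"                        # second address range
    raise ValueError("address outside system address space")

assert route(0x1000) == "near_memory"
assert route(NEAR_MEM_BYTES + 0x1000) == "far_memory"
```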
