Abstract:
A method for selecting a line to replace in an inclusive set-associative cache memory system which is based on a least recently used replacement policy but is enhanced to detect and give special treatment to the reloading of a line that has been recently cast out. A line which has been reloaded after having been recently cast out is assigned a special encoding which temporarily gives it priority in the cache so that it will not be selected for replacement in the usual least recently used replacement process. This method of line selection for replacement improves system performance by providing better hit rates in the cache hierarchy levels above, ensuring that heavily used lines in those levels are not aged out of the levels below due to lack of use.
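As a rough sketch of the replacement policy described above (not the patent's implementation), the Python model below tracks one cache set with an LRU order, remembers recently cast-out tags, and temporarily protects a line that is reloaded soon after being cast out; all class and member names are illustrative.

```python
from collections import deque

class CacheSet:
    """Toy model of one set in a set-associative cache with the enhanced LRU policy."""

    def __init__(self, ways=4, history=8):
        self.ways = ways
        self.lines = []                                 # tags ordered from LRU (front) to MRU (back)
        self.protected = set()                          # tags temporarily exempt from replacement
        self.cast_out_history = deque(maxlen=history)   # recently evicted tags

    def access(self, tag):
        if tag in self.lines:                           # hit: move the line to the MRU position
            self.lines.remove(tag)
            self.lines.append(tag)
            return "hit"
        if len(self.lines) == self.ways:                # miss with a full set: pick a victim
            victim = next((t for t in self.lines if t not in self.protected),
                          self.lines[0])                # plain LRU fallback if all lines are protected
            self.lines.remove(victim)
            self.protected.discard(victim)
            self.cast_out_history.append(victim)        # remember the cast-out tag
        if tag in self.cast_out_history:                # reloaded soon after being cast out:
            self.protected.add(tag)                     #   give it temporary priority
        self.lines.append(tag)
        return "miss"
```

If every line in the set ends up protected, this sketch simply falls back to plain LRU so a victim can always be found; the patent's actual encoding and aging of the priority state is not modeled here.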
Abstract:
A solution for cooperative data prefetching that enables software control of a memory-side data prefetch and/or a processor-side data prefetch is provided. In one embodiment, the invention provides a solution for generating an application, in which access to application data for the application is improved (e.g., optimized) in program code for the application. In particular, a push request, for performing a memory-side data prefetch of the application data, and a prefetch request, for performing a processor-side data prefetch, are added to the program code. The memory-side data prefetch results in the application data being copied from a first data store to a second data store that is faster than the first data store while the processor-side data prefetch results in the application data being copied from the second data store to a third data store that is faster than the second data store.
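A minimal sketch of the two cooperating copies described in the abstract, modeling the three progressively faster data stores as Python dictionaries; the names main_memory, shared_cache, cpu_cache, push, and prefetch are invented for illustration and are not the patent's terminology.

```python
# Three progressively faster data stores, modeled as dictionaries keyed by address.
main_memory = {addr: addr * 2 for addr in range(16)}    # first (slowest) data store
shared_cache = {}                                        # second data store (memory-side)
cpu_cache = {}                                           # third (fastest) data store

def push(addr):
    """Memory-side data prefetch: copy from the first data store to the second."""
    shared_cache[addr] = main_memory[addr]

def prefetch(addr):
    """Processor-side data prefetch: copy from the second data store to the third."""
    if addr in shared_cache:
        cpu_cache[addr] = shared_cache[addr]

# Program code with both kinds of requests added ahead of the actual use of the data.
for addr in range(4):
    push(addr)        # push request inserted into the program code
for addr in range(4):
    prefetch(addr)    # prefetch request inserted into the program code
values = [cpu_cache[addr] for addr in range(4)]          # data now sits in the fastest store
```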
Abstract:
A method and system for prefetching in a computer system are provided. The method in one aspect includes using a prefetch engine to perform prefetch instructions and to translate unmapped data. Misses to address translations during the prefetch are handled and resolved. The method also includes storing the resolved translations in a respective cache translation table. A system for prefetching in one aspect includes a prefetch engine operable to receive instructions to prefetch data from the main memory. The prefetch engine is also operable to search cache address translations for the prefetch data and to perform address mapping translation if the prefetch data is unmapped. The prefetch engine is further operable to prefetch the data and store the address mapping in one or more cache memories if the data is unmapped.
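The sketch below illustrates, under assumed names (PrefetchEngine, translation_cache, page_table), how a prefetch engine might resolve a missing address translation itself, store the result in its translation table, and then complete the prefetch; it is a toy model, not the described hardware.

```python
class PrefetchEngine:
    """Toy prefetch engine that resolves missing address translations on its own."""

    def __init__(self, page_table, memory):
        self.page_table = page_table          # virtual page -> physical page (authoritative mapping)
        self.memory = memory                  # physical address -> data
        self.translation_cache = {}           # cached translations (the cache translation table)
        self.prefetch_buffer = {}             # data brought in by prefetches

    def prefetch(self, virtual_addr, page_size=4096):
        vpage, offset = divmod(virtual_addr, page_size)
        if vpage not in self.translation_cache:       # translation miss during the prefetch
            ppage = self.page_table[vpage]            # resolve the mapping (walk the page table)
            self.translation_cache[vpage] = ppage     # store the resolved translation
        phys = self.translation_cache[vpage] * page_size + offset
        self.prefetch_buffer[virtual_addr] = self.memory.get(phys)
        return self.prefetch_buffer[virtual_addr]
```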
Abstract:
In response to receipt of an input string, an attempt is made to identify, in a template store, a closely matching template for use as a compression template. In response to identification of a closely matching template that can be used as a compression template, the input string is compressed into a compressed string by reference to a longest common subsequence compression template. Compressing the input string includes encoding, in a compressed string, an identifier of the compression template, encoding substrings of the input string not having commonality with the compression template of at least a predetermined length as literals, and encoding substrings of the input string having commonality with the compression template of at least the predetermined length as a jump distance without reference to a base location in the compression template. The compressed string is then output.
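One way to picture the encoding is the toy compressor below: runs shared with the template, of at least a predetermined minimum length, become (jump distance, length) tokens relative to the current template position rather than an absolute base location, while everything else becomes a literal. Encoding of the template identifier is omitted, and the greedy forward-only matching is an assumption made purely for illustration.

```python
def compress(input_str, template, min_len=3):
    """Greedy sketch: shared runs become (jump-from-current-position, length); the rest are literals."""
    out, i, cursor = [], 0, 0
    while i < len(input_str):
        best_len, best_pos = 0, -1
        for pos in range(cursor, len(template)):       # look forward only, so jumps stay relative
            length = 0
            while (i + length < len(input_str) and pos + length < len(template)
                   and input_str[i + length] == template[pos + length]):
                length += 1
            if length > best_len:
                best_len, best_pos = length, pos
        if best_len >= min_len:
            out.append(("copy", best_pos - cursor, best_len))   # jump distance, not absolute offset
            cursor = best_pos + best_len
            i += best_len
        else:
            out.append(("lit", input_str[i]))
            i += 1
    return out

def decompress(tokens, template):
    result, cursor = [], 0
    for token in tokens:
        if token[0] == "lit":
            result.append(token[1])
        else:
            _, jump, length = token
            start = cursor + jump
            result.append(template[start:start + length])
            cursor = start + length
    return "".join(result)
```

For example, compress("hello there world", "hello world") produces one copy token for "hello ", six literals, and one copy token for "world", and decompress recovers the input exactly.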
Abstract:
A field programmable gate array (FPGA) includes configuration RAM (CRAM) including at least one non-hardened portion and at least one hardened portion having a soft error rate (SER) resilience greater than that of the non-hardened portion.
Abstract:
Memory page management in a tiered memory system, including a system that includes at least one page table for storing a plurality of entries, each entry associated with a page of memory and including an address of the page and a memory tier of the page. The system also includes a control program configured for allocating pages associated with the entries to a software module, the allocated pages coming from at least two different memory tiers. The system further includes an agent of the control program capable of operating independently of the control program, the agent configured for receiving an authorization key to the allocated pages and for migrating the allocated pages between the different memory tiers responsive to the authorization key.
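A small sketch of the structures involved, with invented names (PageEntry, Agent, migrate); the authorization check and the tier labels are illustrative stand-ins for whatever mechanism the system actually uses.

```python
from dataclasses import dataclass

@dataclass
class PageEntry:
    address: int
    tier: str                                   # e.g. "fast_dram" or "slow_nvm"

class Agent:
    """Toy migration agent that may move only pages covered by its authorization key."""

    def __init__(self, page_table, authorization_key, allocations):
        self.page_table = page_table            # page id -> PageEntry
        self.key = authorization_key
        self.allocations = allocations          # key -> set of page ids allocated to a module

    def migrate(self, page_id, target_tier):
        if page_id not in self.allocations.get(self.key, set()):
            raise PermissionError("agent is not authorized for this page")
        entry = self.page_table[page_id]
        entry.tier = target_tier                # move the page to the other memory tier
        return entry

# A control program would build the table, allocate pages to a software module,
# and hand the agent an authorization key covering those pages.
table = {1: PageEntry(0x1000, "fast_dram"), 2: PageEntry(0x2000, "slow_nvm")}
agent = Agent(table, authorization_key="module-A", allocations={"module-A": {1, 2}})
agent.migrate(2, "fast_dram")
```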
Abstract:
A computer system having a combined memory. A first logical partition of the combined memory is a main memory region in a storage memory. A second logical partition of the combined memory is a direct memory region in a main memory. A memory controller comprising a storage controller is configured to receive a memory access request including a real address from a processor and to determine whether the real address is for the first logical partition or for the second logical partition. If the address is for the first logical partition, the storage controller communicates with an IO controller in the storage memory to service the memory access request. If the address is for the direct memory region, the memory controller services the memory access request in a conventional manner.
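The routing decision can be pictured as below; partitioning the real address space by a single boundary and exposing only a read path are simplifying assumptions, and storage_io merely stands in for the IO controller in the storage memory.

```python
class MemoryController:
    """Toy routing of a real address to either the storage-backed or the direct memory region."""

    def __init__(self, partition_boundary, storage_io, dram):
        self.boundary = partition_boundary      # real addresses below this belong to the first partition
        self.storage_io = storage_io            # stands in for the IO controller in the storage memory
        self.dram = dram                        # stands in for conventional main memory

    def read(self, real_addr):
        if real_addr < self.boundary:           # first partition: main memory region in storage memory
            return self.storage_io.get(real_addr)
        return self.dram.get(real_addr)         # second partition: direct memory region

mc = MemoryController(partition_boundary=0x1000,
                      storage_io={0x0500: "from storage"},
                      dram={0x2000: "from DRAM"})
mc.read(0x0500)    # serviced via the IO controller in the storage memory
mc.read(0x2000)    # serviced conventionally from main memory
```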
Abstract:
A memory system and methods for memory management are presented. The memory system includes a volatile memory electrically connected to a high-density memory; a memory controller that expects data to be written to or read from the memory system at a bandwidth and a latency associated with the volatile memory; a directory within the volatile memory that associates a volatile memory address with data stored in the high-density memory; and redundant storage in the high-density memory that stores a copy of the association between the volatile memory address and the data stored in the high-density memory. The methods for memory management allow writing to and reading from the memory system using a first memory read/write interface (e.g., DRAM interface, etc.), though data is stored in a device of a different memory type (e.g., FLASH, etc.).
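A toy model of the directory arrangement, with invented names (HybridMemory, flash, directory, directory_copy); the block-allocation policy is purely illustrative.

```python
class HybridMemory:
    """Toy DRAM-style front end whose data actually lives in a slower high-density device."""

    def __init__(self):
        self.flash = {}              # high-density memory: block id -> data
        self.directory = {}          # in volatile memory: volatile address -> block id
        self.directory_copy = {}     # redundant copy of the directory in the high-density memory

    def write(self, volatile_addr, data):
        block = len(self.flash)                      # pick the next free block (illustrative policy)
        self.flash[block] = data
        self.directory[volatile_addr] = block
        self.directory_copy[volatile_addr] = block   # mirror the association redundantly

    def read(self, volatile_addr):
        return self.flash[self.directory[volatile_addr]]
```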
Abstract:
An electronic system having a power-efficient differential signal between a first and a second electronic unit. A controller uses information, such as compliance with a data transmission rate requirement and the bit error rate (BER) versus a BER threshold, to control power modes such that a minimal amount of power is required. Transmission amplitude and single-ended or differential transmission of data are examples of the power modes. The controller also factors in a failing phase of the differential signal when selecting a minimal power mode that satisfies the transmission rate requirement and the BER threshold.
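One plausible reading of the mode-selection step is sketched below; the mode attributes (power, rate, ber, differential) and the rule that a failing phase rules out differential modes are assumptions made only for illustration.

```python
def select_power_mode(modes, required_rate, ber_threshold, failing_phase=False):
    """Pick the lowest-power mode that still meets the rate requirement and the BER threshold."""
    candidates = [
        m for m in modes
        if m["rate"] >= required_rate
        and m["ber"] <= ber_threshold
        and (not failing_phase or not m["differential"])   # a failing phase excludes differential modes
    ]
    return min(candidates, key=lambda m: m["power"]) if candidates else None

modes = [
    {"name": "low-swing differential", "power": 1.0, "rate": 10, "ber": 1e-12, "differential": True},
    {"name": "single-ended",           "power": 0.6, "rate": 5,  "ber": 1e-9,  "differential": False},
]
select_power_mode(modes, required_rate=5, ber_threshold=1e-9)   # -> the single-ended mode
```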