Abstract:
The present invention is generally directed to a system and method for fetching data from a system memory to an ATM card. The method includes the steps of receiving a request (via a PCI bus) to fetch data from memory, and identifying the request as an ATM request. The method then determines, based on the start address, the number of cache lines that will be implicated by the fetch. The method then automatically fetches the appropriate number of cache lines into the cache and passes the data to the ATM card via the PCI bus. In accordance with another aspect of the present invention, a system is provided for fetching data from memory for an ATM card. Broadly, the system includes a system memory for data storage and a cache memory for providing high-speed, temporary storage of data, the cache memory being disposed in communication with the system memory via a high-speed system bus. The system further includes a PCI bus in communication with the cache memory via an input/output (I/O) bus. A first mechanism is configured to identify a fetch of data from memory to the PCI bus by an ATM card. A second mechanism is configured to determine the number of lines of the cache memory that will be implicated by the identified fetch. Finally, a third mechanism is configured to automatically fetch the appropriate number of lines from the cache memory and to pass the data to the PCI bus.
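A minimal C sketch of the cache-line determination described above. The 64-byte line size, the 48-byte ATM payload constant, and the function name are illustrative assumptions; the abstract does not specify them.

    #include <stdint.h>

    /* Assumed values for illustration only. */
    #define CACHE_LINE_SIZE 64u
    #define ATM_PAYLOAD_LEN 48u

    /* Given the start address of an ATM fetch, compute how many cache
     * lines the 48-byte payload will implicate. */
    static unsigned lines_implicated(uint64_t start_addr, unsigned len)
    {
        uint64_t first_line = start_addr / CACHE_LINE_SIZE;
        uint64_t last_line  = (start_addr + len - 1) / CACHE_LINE_SIZE;
        return (unsigned)(last_line - first_line + 1);
    }

For example, a payload starting at the beginning of a line implicates one line, while a payload that straddles a line boundary implicates two.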
Abstract:
The present invention is generally directed to a system and method for fetching data from system memory to a device in communication with the system over a PCI bus, via an I/O cache. Broadly, the present invention may be viewed as a novel way to communicate certain fetching hints; namely, hints that specify certain qualities about the data that is to be fetched from the system memory. In operation, the I/O cache may use such hints to more effectively manage the data that passes through it. As one simple example, if, based upon the hints, the controller for the I/O cache knew (or assumed) that the data being fetched was ATM data, then it would also know (based upon the nature of ATM data) that precisely a forty-eight-byte data payload was to be sent to the requesting device, and the I/O cache could pre-fetch precisely this amount of data (typically one or two cache lines). In accordance with one aspect of the invention, such a system includes an input/output (I/O) cache memory interposed between the system memory and the PCI bus, wherein the cache memory has internal memory space in the form of a plurality of data lines. The system further includes a plurality of registers for each PCI master that are configured to define fetching criteria. Finally, the system includes a register selector that is configured to select an active register among the plurality of registers, wherein fetching criteria for the device is specified by the active register.
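A minimal C sketch of the per-master fetch-criteria registers and register selector described above. The structure fields, the master count, and the hint semantics are illustrative assumptions rather than details taken from the abstract.

    #include <stdint.h>
    #include <stdbool.h>

    #define NUM_PCI_MASTERS 8          /* assumed number of PCI masters   */
    #define ATM_PAYLOAD_LEN 48u        /* ATM cell payload size in bytes  */

    struct fetch_criteria {
        bool     atm_hint;             /* traffic for this master is ATM  */
        unsigned default_prefetch;     /* fallback prefetch size in bytes */
    };

    static struct fetch_criteria criteria[NUM_PCI_MASTERS];

    /* Register selector: choose the active register for the requester. */
    static const struct fetch_criteria *select_register(unsigned master_id)
    {
        return &criteria[master_id % NUM_PCI_MASTERS];
    }

    /* Use the active register's hints to size the pre-fetch: exactly the
     * 48-byte ATM payload for ATM traffic, otherwise the master's default. */
    static unsigned prefetch_bytes(unsigned master_id)
    {
        const struct fetch_criteria *r = select_register(master_id);
        return r->atm_hint ? ATM_PAYLOAD_LEN : r->default_prefetch;
    }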
Abstract:
The present invention is generally directed to a system and method for providing improved memory management in an asynchronous I/O cache memory. The method includes the steps of identifying a request for data from the system memory by a requesting device that is in communication with the system memory via an I/O bus. The method then controls the communication of data from the system memory into the cache memory. The method further includes the step of communicating the data from the cache memory to the requesting device and, immediately after communicating the data to the requesting device, discarding the data from the cache memory. In accordance with the preferred embodiment, the method flushes data from the I/O cache one line at a time. Therefore, a given cache line of data is flushed from the cache after the last data byte of the cache line is communicated out to the requesting device.
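A minimal C sketch of the flush-after-last-byte policy described above, assuming a byte-granular read path and a 64-byte line; the structure and names are simplifications for illustration only.

    #include <stdint.h>
    #include <stdbool.h>

    #define CACHE_LINE_SIZE 64u        /* assumed line size */

    struct cache_line {
        uint8_t  data[CACHE_LINE_SIZE];
        unsigned bytes_read;           /* bytes already delivered */
        bool     valid;
    };

    /* Deliver one byte to the requesting device; once the last byte of the
     * line has been communicated out, the line is discarded immediately. */
    static uint8_t read_byte(struct cache_line *line)
    {
        uint8_t b = line->data[line->bytes_read++];
        if (line->bytes_read == CACHE_LINE_SIZE) {
            line->valid = false;       /* flush the line from the cache */
            line->bytes_read = 0;
        }
        return b;
    }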