Abstract:
In a first aspect, a first method of transmitting a data packet is provided. The first method includes the steps of (1) for each connection from which a data packet may be transmitted, storing header data corresponding to the connection; (2) employing a user application to form header and payload data of a packet, wherein the user application is associated with a connection from which the packet is to be transmitted; and (3) while transmitting the packet, comparing one or more portions of the packet header data with the header data corresponding to the connection with which the user application is associated. Numerous other aspects are provided.
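The following is a minimal C sketch of the transmit-time check described above, under stated assumptions: the per-connection header data is kept in a simple array indexed by a hypothetical connection ID, and only the IP addresses and ports are compared. Names such as conn_table and tx_header_matches are illustrative and not taken from the abstract.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

/* Step 1: header data stored for each connection when it is set up. */
struct conn_hdr {
    uint32_t src_ip, dst_ip;
    uint16_t src_port, dst_port;
};

#define MAX_CONNS 16
static struct conn_hdr conn_table[MAX_CONNS];

/* Step 3: while the packet is transmitted, compare selected header fields
 * built by the user application against the stored header data for the
 * connection the application is associated with. */
static bool tx_header_matches(int conn_id, const struct conn_hdr *pkt_hdr)
{
    const struct conn_hdr *stored = &conn_table[conn_id];
    return pkt_hdr->src_ip   == stored->src_ip  &&
           pkt_hdr->dst_ip   == stored->dst_ip  &&
           pkt_hdr->src_port == stored->src_port &&
           pkt_hdr->dst_port == stored->dst_port;
}

int main(void)
{
    conn_table[0] = (struct conn_hdr){ 0x0a000001, 0x0a000002, 1234, 80 };
    struct conn_hdr app_hdr = { 0x0a000001, 0x0a000002, 1234, 80 };  /* step 2 */
    printf("header valid: %s\n", tx_header_matches(0, &app_hdr) ? "yes" : "no");
    return 0;
}
```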
Abstract:
A method and system for receiving packets in a computer network are disclosed. The method and system include providing at least one receive port, a buffer, a scheduler, and a wrap port. The buffer has an input coupled with the at least one receive port and an output. The scheduler has a first input coupled to the output of the buffer, a second input coupled to the wrap port, and an output.
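A small C sketch of the described coupling follows: the receive port feeds the buffer, and the scheduler takes one input from the buffer output and one from the wrap port. The abstract does not state a scheduling policy; preferring the buffered packet over the wrapped one is an assumption made only to give the sketch a concrete output.

```c
#include <stdio.h>

/* Data-flow model of the described coupling:
 *   receive port -> buffer -> scheduler input 0
 *                wrap port -> scheduler input 1
 * The scheduler selects one of its two inputs for its output. */
struct packet    { int id; };
struct buffer    { struct packet slot; int full; };
struct scheduler { struct buffer *from_buffer; struct packet *from_wrap; };

static void buffer_push(struct buffer *b, struct packet p)
{
    b->slot = p;
    b->full = 1;
}

/* Assumed policy: serve the buffered (received) packet first,
 * otherwise take the packet presented on the wrap port. */
static struct packet *scheduler_output(struct scheduler *s)
{
    if (s->from_buffer->full) {
        s->from_buffer->full = 0;
        return &s->from_buffer->slot;
    }
    return s->from_wrap;
}

int main(void)
{
    struct buffer buf = {0};
    struct packet wrapped = { 99 };
    struct scheduler sched = { &buf, &wrapped };

    buffer_push(&buf, (struct packet){ 1 });                  /* from receive port */
    printf("scheduled: %d\n", scheduler_output(&sched)->id);  /* 1, from buffer */
    printf("scheduled: %d\n", scheduler_output(&sched)->id);  /* 99, from wrap port */
    return 0;
}
```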
Abstract:
Systems and methods for distributing thread instructions in the pipeline of a multi-threading digital processor are disclosed. More particularly, hardware and software are disclosed for successively selecting threads in an ordered sequence for execution in the processor pipeline. If a thread to be selected cannot execute, then a complementary thread is selected for execution.
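The selection rule can be illustrated with a short C sketch. The ordered sequence is modeled as a round-robin over the thread IDs, and the "complementary" thread is modeled as the thread whose index mirrors the primary one; both modeling choices are assumptions, since the abstract does not define the sequence or the pairing.

```c
#include <stdbool.h>
#include <stdio.h>

#define NUM_THREADS 4

/* Stand-in for the hardware's "can this thread issue?" check
 * (stalls, dependencies, cache misses, and so on). */
static bool can_execute(int thread, const bool *ready)
{
    return ready[thread];
}

/* Successively select threads in an ordered sequence; if the thread due
 * for selection cannot execute, select its complementary thread instead. */
static int select_thread(int cycle, const bool *ready)
{
    int primary = cycle % NUM_THREADS;
    if (can_execute(primary, ready))
        return primary;
    int complement = (NUM_THREADS - 1) - primary;  /* assumed pairing */
    if (can_execute(complement, ready))
        return complement;
    return -1;                                     /* no thread issues this cycle */
}

int main(void)
{
    bool ready[NUM_THREADS] = { true, false, true, true };
    for (int cycle = 0; cycle < 4; cycle++)
        printf("cycle %d -> thread %d\n", cycle, select_thread(cycle, ready));
    return 0;
}
```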
Abstract:
A system and method in accordance with the present invention allow an adapter to be utilized in a server environment that can accommodate both a 10 G and a 1 G source utilizing the same pins. This is accomplished through the use of a high-speed serializer/deserializer (high-speed serdes) which can accommodate both data sources. The high-speed serdes allows the use of a relatively low reference clock speed on the NIC to provide the proper clocking of the data sources and also allows different modes to be set to accommodate the different data sources. Finally, the system allows the adapter to use the same pins for multiple data sources.
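A minimal C sketch of the mode selection follows, assuming a single shared reference clock and a per-mode PLL multiplier; the clock frequency and multiplier values are placeholders, not figures from the abstract.

```c
#include <stdio.h>
#include <stdint.h>

/* Illustrative mode selection for a shared serializer/deserializer. */
enum serdes_mode { SERDES_MODE_1G, SERDES_MODE_10G };

struct serdes_cfg {
    enum serdes_mode mode;
    uint32_t ref_clk_khz;     /* single, relatively low reference clock */
    uint32_t pll_multiplier;  /* derives the line rate from that reference */
};

static struct serdes_cfg serdes_configure(enum serdes_mode mode)
{
    struct serdes_cfg cfg = { .mode = mode, .ref_clk_khz = 125000 };
    /* Same pins and same reference clock; only the mode (multiplier)
     * changes to match the attached data source. Placeholder values. */
    cfg.pll_multiplier = (mode == SERDES_MODE_10G) ? 82 : 10;
    return cfg;
}

int main(void)
{
    struct serdes_cfg ten = serdes_configure(SERDES_MODE_10G);
    struct serdes_cfg one = serdes_configure(SERDES_MODE_1G);
    printf("10G mode: ref %u kHz, mult %u\n", ten.ref_clk_khz, ten.pll_multiplier);
    printf(" 1G mode: ref %u kHz, mult %u\n", one.ref_clk_khz, one.pll_multiplier);
    return 0;
}
```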
Abstract:
An Ethernet adapter is disclosed. The Ethernet adapter comprises a plurality of layers for allowing the adapter to receive and transmit packets from and to a processor. The plurality of layers includes a demultiplexing mechanism to allow for partitioning of the processor. A Host Ethernet Adapter (HEA) is an integrated Ethernet adapter providing a new approach to Ethernet and TCP acceleration. A set of TCP/IP acceleration features has been introduced in a toolkit approach: server TCP/IP stacks use these accelerators when and as required. The interface between the server and the network interface controller has been streamlined by bypassing the PCI bus. The HEA supports network virtualization. The HEA can be shared by multiple operating systems, providing the essential isolation and protection without affecting performance.
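The demultiplexing step can be pictured with a short C sketch. Steering received frames to a per-partition owner by destination MAC address is an assumption made for illustration; the abstract only states that a demultiplexing mechanism allows partitioning of the processor.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define NUM_PARTITIONS 4

struct frame { uint8_t dst_mac[6]; };

/* Hypothetical table mapping a destination MAC to its owning partition. */
static const uint8_t partition_mac[NUM_PARTITIONS][6] = {
    { 0x02, 0, 0, 0, 0, 0x01 },
    { 0x02, 0, 0, 0, 0, 0x02 },
    { 0x02, 0, 0, 0, 0, 0x03 },
    { 0x02, 0, 0, 0, 0, 0x04 },
};

/* Steer a received frame to the partition that owns its destination MAC;
 * frames owned by no partition return -1. */
static int demux_partition(const struct frame *f)
{
    for (int p = 0; p < NUM_PARTITIONS; p++)
        if (memcmp(f->dst_mac, partition_mac[p], 6) == 0)
            return p;
    return -1;
}

int main(void)
{
    struct frame f = { { 0x02, 0, 0, 0, 0, 0x03 } };
    printf("frame steered to partition %d\n", demux_partition(&f));
    return 0;
}
```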
Abstract:
Method and apparatus for implementing use of a network connection table. In one aspect, searching for network connections includes receiving a packet, and zeroing particular fields of connection information from the packet if a new connection is to be established. The connection information is converted to an address for a location in a direct table using a table access process. The direct table stores patterns and reference information for new and existing connections. The connection information is compared with at least one pattern stored in the direct table at the address to find reference information for the received packet.
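The steps above can be illustrated with a small C sketch: zero the remote-side fields for a new connection, convert the connection information to a direct-table address (a simple folding hash is assumed here; the abstract only requires some table access process), and compare the stored pattern at that address to recover the reference information. Table size, field choice, and hash are assumptions.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define TABLE_SIZE 256

struct conn_key {
    uint32_t remote_ip;
    uint16_t remote_port;
    uint16_t local_port;
};

struct dt_entry {
    struct conn_key pattern;  /* pattern stored for the connection */
    int             ref;      /* reference information, e.g. a context index */
    int             valid;
};

static struct dt_entry direct_table[TABLE_SIZE];

/* Convert connection information to a direct-table address (assumed hash). */
static unsigned dt_address(const struct conn_key *k)
{
    uint32_t h = k->remote_ip ^ ((uint32_t)k->remote_port << 16) ^ k->local_port;
    return (h ^ (h >> 8) ^ (h >> 16)) % TABLE_SIZE;
}

/* For a new connection, particular fields (here the remote side, which a
 * listening endpoint does not yet know) are zeroed before the lookup. */
static int dt_lookup(struct conn_key k, int new_connection)
{
    if (new_connection) {
        k.remote_ip = 0;
        k.remote_port = 0;
    }
    struct dt_entry *e = &direct_table[dt_address(&k)];
    if (e->valid && memcmp(&e->pattern, &k, sizeof k) == 0)
        return e->ref;       /* reference information for this packet */
    return -1;               /* no matching pattern at that address */
}

int main(void)
{
    struct conn_key listen = { 0, 0, 80 };        /* fields already zeroed */
    direct_table[dt_address(&listen)] = (struct dt_entry){ listen, 7, 1 };

    struct conn_key pkt = { 0x0a000001, 40000, 80 };
    printf("new-connection lookup -> ref %d\n", dt_lookup(pkt, 1));  /* 7 */
    printf("existing lookup       -> ref %d\n", dt_lookup(pkt, 0));  /* -1 */
    return 0;
}
```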
Abstract:
System and method for maintenance and examination of timers for a computer system having connections in a networking system. Timer values in a connection table each indicate a timeout for a timer for a connection, where each connection has multiple timers, and one of the timer values is written to a global timer array for each connection such that the global timer array can be scanned to determine when timeouts occur for active connections. Sparse restart of a timer includes restarting the timer if data is communicated with a connected computer before the timeout occurs and after a predetermined time interval after timer start, and not restarting the timer if data is communicated before the timeout occurs and within the predetermined interval after timer start.
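The sparse-restart rule is shown in the short C sketch below: on activity before the timeout, the timer is restarted only if the predetermined interval has already elapsed since it was last started, which bounds how often the global timer array entry must be rewritten. The tick units and interval values are placeholders.

```c
#include <stdio.h>

#define TIMEOUT          100  /* placeholder timeout, in ticks */
#define RESTART_INTERVAL  25  /* predetermined interval; placeholder value */

struct conn_timer {
    long started_at;          /* tick at which the timer was (re)started */
};

/* Called when data is communicated on the connection before the timeout. */
static void sparse_restart(struct conn_timer *t, long now)
{
    if (now - t->started_at >= RESTART_INTERVAL)
        t->started_at = now;  /* restart: rewrite the global timer array entry */
    /* else: still within the predetermined interval, leave the timer alone */
}

static int timed_out(const struct conn_timer *t, long now)
{
    return now - t->started_at >= TIMEOUT;
}

int main(void)
{
    struct conn_timer t = { .started_at = 0 };
    sparse_restart(&t, 10);   /* within the interval: not restarted */
    sparse_restart(&t, 40);   /* past the interval: restarted at tick 40 */
    printf("timed out at tick 120? %s\n", timed_out(&t, 120) ? "yes" : "no"); /* no  */
    printf("timed out at tick 150? %s\n", timed_out(&t, 150) ? "yes" : "no"); /* yes */
    return 0;
}
```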
Abstract:
A graphical user interface, method, and apparatus for configuring a logical partition (LPAR) comprise one or more screens for configuring an LPAR having allocated resources residing on a server computer, the LPAR being uniquely identified by a partition ID. The one or more screens comprise an SNA selection element configured for user selection of a shared network adapter (SNA) ID from one or more available SNA IDs, wherein each selectable SNA ID uniquely identifies a respective SNA installed on the server computer; a physical port selection element configured for user selection of a physical port ID from one or more physical port IDs each corresponding to a respective physical port, wherein the one or more physical port IDs uniquely identify all physical ports residing on the respective SNA for the selected SNA ID; and an active configure button which, when selected by a user, causes the display of one or more screens for configuring a logical shared adapter (LSA) associated with the respective SNA.
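The selections those screens collect can be summarized in a small C data-model sketch; the structure and field names are illustrative assumptions, not taken from the abstract.

```c
#include <stdio.h>
#include <stdint.h>

/* Illustrative record of the user's selections on the configuration screens. */
struct lpar_config {
    uint16_t partition_id;      /* uniquely identifies the LPAR */
    uint16_t sna_id;            /* selected shared network adapter */
    uint16_t physical_port_id;  /* physical port residing on the selected SNA */
};

/* Pressing the active configure button would hand this selection to the
 * screens that configure the logical shared adapter (LSA) on that SNA. */
static void open_lsa_configuration(const struct lpar_config *cfg)
{
    printf("configure LSA: LPAR %u, SNA %u, port %u\n",
           cfg->partition_id, cfg->sna_id, cfg->physical_port_id);
}

int main(void)
{
    struct lpar_config cfg = { .partition_id = 3, .sna_id = 1, .physical_port_id = 0 };
    open_lsa_configuration(&cfg);
    return 0;
}
```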
Abstract:
A method and system for performing a lookup for a packet in a computer network are disclosed. The packet includes a header. The method and system include providing a parser, providing a lookup engine coupled with the parser, and providing a processor coupled with the lookup engine. The parser parses the packet for the header before receipt of the packet is complete. The lookup engine performs a lookup for the header and returns a resultant. In one aspect, the lookup includes performing a local lookup of a cache that includes resultants of previous lookups. The processor processes the resultant.
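The local lookup of cached resultants can be sketched in a few lines of C. The direct-mapped cache organization and the stand-in for the full lookup are assumptions; the abstract states only that the cache holds resultants of previous lookups.

```c
#include <stdint.h>
#include <stdio.h>

#define CACHE_SIZE 64

struct cache_entry {
    uint32_t key;        /* header-derived key */
    int      resultant;  /* resultant of a previous lookup */
    int      valid;
};

static struct cache_entry lookup_cache[CACHE_SIZE];

/* Stand-in for the full lookup the engine would otherwise perform. */
static int full_lookup(uint32_t key)
{
    return (int)(key % 1000);
}

static int lookup(uint32_t key)
{
    struct cache_entry *e = &lookup_cache[key % CACHE_SIZE];
    if (e->valid && e->key == key)
        return e->resultant;           /* hit: reuse the previous resultant */
    int resultant = full_lookup(key);  /* miss: perform the lookup, then cache it */
    *e = (struct cache_entry){ key, resultant, 1 };
    return resultant;
}

int main(void)
{
    printf("first lookup : %d\n", lookup(123456));  /* fills the cache */
    printf("second lookup: %d\n", lookup(123456));  /* served from the cache */
    return 0;
}
```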
Abstract:
A system and method for reducing latency in a host Ethernet adapter (HEA) include the following. First, the HEA receives a packet with an Internet Protocol (IP) header and data. The HEA parses a connection identifier from the IP header and accesses a negative cache in the HEA to determine if the connection identifier is not in a memory external to the HEA. The HEA applies a default treatment to the packet if the connection identifier is not in the memory, thereby reducing latency by decreasing access to the memory.
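A minimal C sketch of the fast-path decision follows, with the negative cache modeled as an on-chip bit vector indexed by a hashed connection identifier; the sizes and hash are assumptions. Note that in a real design the structure must never report "absent" for a connection that is actually present in external memory (for example, by clearing colliding bits when connections are installed); the sketch shows only the lookup-and-default-treatment step.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdbool.h>

#define NEG_CACHE_BITS 1024

/* Bit set => connection identifier is known NOT to be in external memory. */
static uint8_t neg_cache[NEG_CACHE_BITS / 8];

static unsigned neg_index(uint32_t conn_id)
{
    return (conn_id ^ (conn_id >> 10)) % NEG_CACHE_BITS;
}

static void neg_cache_mark_absent(uint32_t conn_id)
{
    neg_cache[neg_index(conn_id) / 8] |= 1u << (neg_index(conn_id) % 8);
}

static bool neg_cache_says_absent(uint32_t conn_id)
{
    return neg_cache[neg_index(conn_id) / 8] & (1u << (neg_index(conn_id) % 8));
}

static void handle_packet(uint32_t conn_id)
{
    if (neg_cache_says_absent(conn_id))
        printf("conn %u: default treatment, external memory not accessed\n", conn_id);
    else
        printf("conn %u: look up connection context in external memory\n", conn_id);
}

int main(void)
{
    neg_cache_mark_absent(42);  /* 42 is known not to be in external memory */
    handle_packet(42);
    handle_packet(7);
    return 0;
}
```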