DETECTING FAILURE OF LAYER 2 SERVICE USING BROADCAST MESSAGES

    Publication Number: US20240015086A1

    Publication Date: 2024-01-11

    Application Number: US18370006

    Application Date: 2023-09-19

    Applicant: Nicira, Inc.

    CPC classification number: H04L43/0805 H04L41/0668 H04L43/10

    Abstract: Some embodiments provide a method for detecting a failure of a layer 2 (L2) bump-in-the-wire service at a device. In some embodiments, the device sends heartbeat signals to a second device connected to L2 service nodes in order to detect failure of the L2 service (e.g., a failure of all the service nodes). In some embodiments, the heartbeat signals are unidirectional heartbeat signals (e.g., a unidirectional bidirectional-forwarding-detection (BFD) session) sent from each device to the other. The heartbeat signals, in some embodiments, use a broadcast MAC address in order to reach the current active L2 service node in the case of a failover (i.e., an active service node failing and a standby service node becoming the new active service node). The unidirectional heartbeat signals are also used, in some embodiments, to decrease the time between a failover and data messages being forwarded to the new active service node.
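
    A minimal sketch of the broadcast-heartbeat idea described in this abstract: a sender builds Ethernet frames addressed to the broadcast MAC (so whichever service node is currently active receives them), and a monitor on the peer device declares the L2 service failed after several missed intervals, mirroring BFD's detect-multiplier behavior. The EtherType, frame layout, interval, and multiplier below are illustrative assumptions, not the claimed implementation.

        # Illustrative sketch only; framing and timer values are assumptions.
        import struct
        import time

        BROADCAST_MAC = b"\xff" * 6           # reaches whichever service node is active
        HEARTBEAT_ETHERTYPE = 0x88B5          # experimental EtherType, assumed here

        def build_heartbeat_frame(src_mac: bytes, seq: int) -> bytes:
            """Ethernet frame whose payload is just a sequence number.
            In practice the frame would be padded to the Ethernet minimum
            and sent on the interface facing the service nodes."""
            header = BROADCAST_MAC + src_mac + struct.pack("!H", HEARTBEAT_ETHERTYPE)
            return header + struct.pack("!I", seq)

        class HeartbeatMonitor:
            """Declares the L2 service failed if no heartbeat arrives in time."""

            def __init__(self, interval_s: float = 1.0, detect_multiplier: int = 3):
                self.timeout = interval_s * detect_multiplier
                self.last_seen = time.monotonic()

            def record_frame(self, frame: bytes) -> None:
                # Only frames carrying the heartbeat EtherType refresh liveness.
                if frame[12:14] == struct.pack("!H", HEARTBEAT_ETHERTYPE):
                    self.last_seen = time.monotonic()

            def service_failed(self) -> bool:
                return (time.monotonic() - self.last_seen) > self.timeout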

    Inline load balancing
    Invention Grant

    Publication Number: US11075842B2

    Publication Date: 2021-07-27

    Application Number: US16427294

    Application Date: 2019-05-30

    Applicant: Nicira, Inc.

    Abstract: Some embodiments provide a novel method for load balancing data messages that are sent by a source compute node (SCN) to one or more different groups of destination compute nodes (DCNs). In some embodiments, the method deploys a load balancer in the source compute node's egress datapath. This load balancer receives each data message sent from the source compute node and determines whether the data message is addressed to one of the DCN groups across which the load balancer spreads data traffic to balance the load. When the received data message is not addressed to one of the load-balanced DCN groups, the load balancer forwards the received data message to its addressed destination. On the other hand, when the received data message is addressed to one of the load balancer's DCN groups, the load balancer identifies a DCN in the addressed DCN group that should receive the data message and directs the data message to that DCN. To do so, the load balancer in some embodiments changes the destination address (e.g., the destination IP address, destination port, destination MAC address, etc.) in the data message from the address of the identified DCN group to the address (e.g., the destination IP address) of the identified DCN.
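
    A minimal sketch of the egress-datapath decision this abstract describes, assuming a hypothetical table mapping a group's virtual IP to its member DCN addresses: traffic not aimed at a balanced group passes through unchanged, while traffic aimed at a group has its destination rewritten to one member chosen consistently per flow. The group table, hash choice, and rewritten field are assumptions, not the patented design.

        # Illustrative sketch; addresses and selection policy are assumptions.
        import hashlib
        from dataclasses import dataclass, replace

        @dataclass(frozen=True)
        class Flow:
            src_ip: str
            dst_ip: str
            dst_port: int

        # Hypothetical mapping: group virtual IP -> member DCN addresses.
        DCN_GROUPS = {
            "10.0.0.100": ["10.0.1.1", "10.0.1.2", "10.0.1.3"],
        }

        def load_balance(flow: Flow) -> Flow:
            """Rewrite the destination of group-addressed flows; pass others through."""
            members = DCN_GROUPS.get(flow.dst_ip)
            if not members:
                return flow                                   # not a balanced group
            key = f"{flow.src_ip}:{flow.dst_port}".encode()
            digest = hashlib.sha256(key).digest()
            chosen = members[digest[0] % len(members)]        # consistent per flow key
            return replace(flow, dst_ip=chosen)

    For example, load_balance(Flow("10.0.2.5", "10.0.0.100", 443)) returns the same member DCN for every message of that flow, while a flow addressed to a non-group IP is returned unchanged.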

    Context based firewall services for data message flows for multiple concurrent users on one machine

    Publication Number: US11032246B2

    Publication Date: 2021-06-08

    Application Number: US15836888

    Application Date: 2017-12-10

    Applicant: Nicira, Inc.

    Abstract: Some embodiments of the invention provide a novel architecture for capturing contextual attributes on host computers that execute one or more machines, and for consuming the captured contextual attributes to perform services on the host computers. The machines are virtual machines (VMs) in some embodiments, containers in other embodiments, or a mix of VMs and containers in still other embodiments. Some embodiments execute a guest-introspection (GI) agent on each machine from which contextual attributes need to be captured. In addition to executing one or more machines on each host computer, these embodiments also execute a context engine and one or more attribute-based service engines on each host computer. One of these service engines is a firewall engine. Through the GI agents of the machines on a host, the context engine of that host in some embodiments collects contextual attributes associated with network events and/or process events on the machines. The context engine then provides the contextual attributes to the firewall engine, which, in turn, uses these contextual attributes to identify firewall rules to enforce.
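
    A minimal sketch of context-attribute-based rule matching in the spirit of this abstract, assuming hypothetical attribute names (user, application, destination port) reported by a GI agent and a first-match, default-deny rule set; the rule shape and evaluation order are illustrative, not the claimed architecture.

        # Illustrative sketch; attribute names and rule shape are assumptions.
        from dataclasses import dataclass
        from typing import List, Optional

        @dataclass
        class ContextAttributes:
            user: str            # e.g. the logged-in user reported by the GI agent
            app: str             # process that generated the network/process event
            dst_port: int

        @dataclass
        class FirewallRule:
            action: str                      # "allow" or "drop"
            user: Optional[str] = None       # None means "match any"
            app: Optional[str] = None
            dst_port: Optional[int] = None

            def matches(self, ctx: ContextAttributes) -> bool:
                return all(
                    want is None or want == got
                    for want, got in (
                        (self.user, ctx.user),
                        (self.app, ctx.app),
                        (self.dst_port, ctx.dst_port),
                    )
                )

        def evaluate(rules: List[FirewallRule], ctx: ContextAttributes) -> str:
            """First matching rule wins; default-deny if none match."""
            for rule in rules:
                if rule.matches(ctx):
                    return rule.action
            return "drop"

    For instance, with a single rule FirewallRule("allow", user="alice", dst_port=443), a connection attributed to alice on port 443 is allowed, while the same flow attributed to a different concurrent user on the same machine falls through to the default drop.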
