Full Attention with Sparse Computation Cost

    Publication Number: US20230022151A1

    Publication Date: 2023-01-26

    Application Number: US17860691

    Filing Date: 2022-07-08

    Applicant: Google LLC

    Abstract: The present disclosure is directed to machine learning model architectures which provide full attention capability in each attention head while maintaining low computation and memory complexity. Specifically, according to one aspect of the present disclosure, example attention models provided herein can treat the self-attention mechanism as a conditional expectation over embeddings at each location and approximate the conditional distribution with a structured factorization. Each location can attend to all other locations, either via direct attention, or through indirect attention to group representations, which are again conditional expectations of embeddings from corresponding local regions.
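The abstract's mechanism can be illustrated with a minimal sketch: each query attends directly to keys in its own local region and indirectly to other regions through group representations. Here the group representations are simple mean-pooled summaries standing in for the conditional expectations described above; the function name, block size, and pooling choice are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def combiner_attention(q, k, v, block=4):
    """Sketch of full attention at sparse cost: each query attends
    directly to the keys in its own block and indirectly to every
    other block via a pooled group representation (an assumption
    standing in for the conditional expectations in the abstract)."""
    n, d = k.shape
    nb = n // block  # assumes n is divisible by block for simplicity
    # Group (region) summaries: mean of keys/values per block
    k_grp = k.reshape(nb, block, d).mean(axis=1)  # (nb, d)
    v_grp = v.reshape(nb, block, d).mean(axis=1)
    out = np.zeros((n, d))
    for i in range(n):
        b = i // block
        lo, hi = b * block, (b + 1) * block
        other = [g for g in range(nb) if g != b]
        # Direct keys: own block; indirect keys: other blocks' summaries.
        keys = np.vstack([k[lo:hi], k_grp[other]])
        vals = np.vstack([v[lo:hi], v_grp[other]])
        w = softmax(keys @ q[i])
        out[i] = w @ vals
    return out
```

With sequence length n and block size b, each query touches b direct keys plus n/b group keys, giving roughly O(n·(b + n/b)) work instead of O(n²), while every location still influences every output.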

    Knowledge Graph Completion and Multi-Hop Reasoning in Knowledge Graphs at Scale

    Publication Number: US20230289626A1

    Publication Date: 2023-09-14

    Application Number: US18183410

    Filing Date: 2023-03-14

    Applicant: Google LLC

    CPC classification number: G06N5/022 G06F16/2453

    Abstract: Provided are computing systems, methods, and platforms for negative sampling in knowledge graphs with improved efficiency. A knowledge graph comprising entities and links between the entities can be obtained. A query computation graph comprising nodes and edges can be generated based on the knowledge graph. The nodes of the query computation graph can include anchor nodes, a root node, and intermediate nodes positioned in paths between the anchor nodes and the root node. A node cut of a query of the query computation graph can be determined and can include at least one node that cuts at least one path between each anchor node and the root node of the query computation graph. Negative samples can be identified by bidirectionally traversing the query computation graph in a first direction from the anchor nodes to the node cut and in a second direction from the root node to the node cut.
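The bidirectional traversal described above can be sketched for the simplest non-trivial case: a two-hop query `anchor -rel1-> ? -rel2-> root`, where the node cut is the single intermediate variable. The forward pass collects candidates reachable from the anchor; the backward pass collects bindings consistent with the root; candidates reachable forward but not backward are negatives. The function name and the triple-set graph encoding are illustrative assumptions, not the patented system.

```python
def negative_samples_at_cut(edges, anchor, rel1, rel2, root):
    """Sketch of bidirectional negative sampling at a node cut for the
    two-hop query  anchor -rel1-> ? -rel2-> root.

    edges: set of (head, relation, tail) triples.
    Returns entities reachable from the anchor (forward direction) that
    do not lead to the root (backward direction), i.e. negatives at the cut.
    """
    # Forward: traverse from the anchor node toward the cut.
    forward = {t for (h, r, t) in edges if h == anchor and r == rel1}
    # Backward: traverse from the root node toward the cut.
    backward = {h for (h, r, t) in edges if t == root and r == rel2}
    return forward - backward
```

For example, with triples `(a, r1, x)`, `(a, r1, y)`, `(x, r2, root)`, `(y, r2, other)`, the forward set is `{x, y}`, the backward set is `{x}`, and `y` is returned as a negative sample. Meeting in the middle this way avoids enumerating every entity in the graph as a candidate negative.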
