-
Publication No.: US20250111210A1
Publication Date: 2025-04-03
Application No.: US18900531
Application Date: 2024-09-27
Applicant: Google LLC
Inventor: Chong You, Guru Guruganesh, Joshua Timothy Ainslie, Manzil Zaheer, Sanjiv Kumar, Santiago Ontañón, Shanda Li, Venkata Sesha Pavana Srinadh Bhojanapalli, Sumit Sanghai
IPC: G06N3/0475
Abstract: Systems and methods for processing inputs using attention neural networks. In particular, one or more of the attention layers within the attention neural network compute relative position biases using functional interpolation.
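The abstract describes computing relative position biases with a learned function rather than a fixed lookup table. A minimal sketch of the idea, using a small random-weight MLP over normalized relative distances (all names and the normalization choice here are illustrative assumptions, not the patented method):

```python
import numpy as np

def functional_relative_bias(seq_len, num_heads, hidden=16, seed=0):
    """Sketch: a small MLP maps normalized relative distances to
    per-head attention biases, so biases for unseen (longer)
    distances come from interpolating the learned function rather
    than indexing a fixed bias table."""
    rng = np.random.default_rng(seed)
    w1 = rng.normal(size=(1, hidden))
    w2 = rng.normal(size=(hidden, num_heads))
    pos = np.arange(seq_len, dtype=float)
    rel = pos[None, :] - pos[:, None]            # (L, L) signed relative distances
    rel = rel / max(seq_len - 1, 1)              # normalize into a fixed range for interpolation
    h = np.maximum(rel[..., None] @ w1, 0.0)     # (L, L, hidden) ReLU features
    bias = h @ w2                                # (L, L, H) per-head biases
    return np.transpose(bias, (2, 0, 1))         # (H, L, L): added to attention logits

bias = functional_relative_bias(seq_len=16, num_heads=8)
print(bias.shape)  # (8, 16, 16)
```

Because the bias is a function of distance rather than a table indexed by distance, it can be evaluated at sequence lengths longer than those seen during training.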
-
Publication No.: US20250028556A1
Publication Date: 2025-01-23
Application No.: US18354897
Application Date: 2023-07-19
Applicant: Google LLC
Inventor: Joshua Ruizhi Wang, Brian Mulford, Qiaoran Li, Michael John de Ridder, Pawel Opalinski, Tresa Johnson, Paul David Duetting, Guru Guruganesh, Jonathan Schneider
IPC: G06F9/50
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for distributing resources to client devices are described. A computer-implemented system receives a request for a resource distribution constraint for a given resource of a first type, identifies a second type of resource that has a sufficient amount of historical distribution data to generate a distribution constraint, and determines the resource distribution constraint using the historical distribution data for the second type of resource.
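The abstract describes falling back to a second resource type's history when the requested type lacks sufficient data. A minimal sketch of that fallback logic (function and variable names, the sample threshold, and the mean-based constraint are all illustrative assumptions):

```python
def distribution_constraint(history_by_type, requested_type, min_samples=30):
    """Sketch: if the requested resource type lacks sufficient
    historical distribution data, borrow the history of another
    type that has enough, and derive the constraint (here, simply
    the mean historical distribution) from that history."""
    history = history_by_type.get(requested_type, [])
    if len(history) < min_samples:
        # find second types with sufficient historical data
        candidates = [t for t, h in history_by_type.items()
                      if t != requested_type and len(h) >= min_samples]
        if not candidates:
            raise ValueError("no resource type has enough history")
        # prefer the candidate with the most history
        best = max(candidates, key=lambda t: len(history_by_type[t]))
        history = history_by_type[best]
    return sum(history) / len(history)

caps = {"type_a": [2, 3], "type_b": [4] * 40}
print(distribution_constraint(caps, "type_a"))  # 4.0
```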
-
Publication No.: US11238332B2
Publication Date: 2022-02-01
Application No.: US17341193
Application Date: 2021-06-07
Applicant: Google LLC
Inventor: Joshua Timothy Ainslie, Santiago Ontañón, Philip Pham, Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Amr Ahmed
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing network inputs using an attention neural network that has one or more sparse attention sub-layers. Each sparse attention sub-layer is configured to apply a sparse attention mechanism that attends differently for input positions that are in a first proper subset of the input positions in the input to the sub-layer than for positions that are not in the first proper subset.
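The abstract describes a sparse attention mechanism that treats a proper subset of input positions differently from the rest. One common way to realize this is a mask in which the subset's positions attend globally while all other positions attend only within a local window; the sketch below assumes that pattern (the window size and the "global positions" framing are illustrative, not taken from the patent):

```python
import numpy as np

def sparse_attention_mask(seq_len, global_positions, window=2):
    """Sketch: boolean mask where positions in a designated proper
    subset attend to, and are attended by, every position, while
    remaining positions attend only within a local window."""
    mask = np.zeros((seq_len, seq_len), dtype=bool)
    for i in range(seq_len):
        lo, hi = max(0, i - window), min(seq_len, i + window + 1)
        mask[i, lo:hi] = True                  # local window for every position
    mask[list(global_positions), :] = True     # subset positions attend everywhere
    mask[:, list(global_positions)] = True     # everyone attends to the subset
    return mask

mask = sparse_attention_mask(8, global_positions=[0])
print(mask.shape, int(mask.sum()))  # (8, 8) 44
```

Because most rows are bounded by the window size, the number of allowed attention pairs grows roughly linearly in sequence length instead of quadratically.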
-
Publication No.: US20220156553A1
Publication Date: 2022-05-19
Application No.: US17589542
Application Date: 2022-01-31
Applicant: Google LLC
Inventor: Joshua Timothy Ainslie, Santiago Ontañón, Philip Pham, Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Amr Ahmed
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing network inputs using an attention neural network that has one or more sparse attention sub-layers. Each sparse attention sub-layer is configured to apply a sparse attention mechanism that attends differently for input positions that are in a first proper subset of the input positions in the input to the sub-layer than for positions that are not in the first proper subset.
-
Publication No.: US20210383191A1
Publication Date: 2021-12-09
Application No.: US17341193
Application Date: 2021-06-07
Applicant: Google LLC
Inventor: Joshua Timothy Ainslie, Santiago Ontañón, Philip Pham, Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Amr Ahmed
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing network inputs using an attention neural network that has one or more sparse attention sub-layers. Each sparse attention sub-layer is configured to apply a sparse attention mechanism that attends differently for input positions that are in a first proper subset of the input positions in the input to the sub-layer than for positions that are not in the first proper subset.