Invention Grant
- Patent Title: Mixture of experts neural networks
- Application No.: US16879187
- Application Date: 2020-05-20
- Publication No.: US11790214B2
- Publication Date: 2023-10-17
- Inventors: Noam M. Shazeer, Azalia Mirhoseini, Krzysztof Stanislaw Maziarz
- Applicant: Google LLC
- Applicant Address: Mountain View, CA, US
- Assignee: Google LLC
- Current Assignee: Google LLC
- Current Assignee Address: Mountain View, CA, US
- Agency: Fish & Richardson P.C.
- Main IPC: G06N3/045
- IPC: G06N3/045; G06N3/08

Abstract:
A system includes a neural network that includes a Mixture of Experts (MoE) subnetwork between a first neural network layer and a second neural network layer. The MoE subnetwork includes multiple expert neural networks, each configured to process a first layer output generated by the first neural network layer to generate a respective expert output. The MoE subnetwork further includes a gating subsystem that selects, based on the first layer output, one or more of the expert neural networks and determines a respective weight for each selected expert neural network; provides the first layer output as input to each of the selected expert neural networks; combines the expert outputs generated by the selected expert neural networks in accordance with their weights to generate an MoE output; and provides the MoE output as input to the second neural network layer.
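The following is a minimal illustrative sketch of the architecture the abstract describes (top-k gating over expert subnetworks, weighted combination of their outputs), not the patented implementation; the expert count, layer sizes, linear experts, and NumPy usage are assumptions made for brevity.

```python
# Illustrative sketch of an MoE subnetwork with a gating subsystem.
# Experts are simplified to single linear maps; real experts would be
# full feed-forward subnetworks. All dimensions are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MoELayer:
    def __init__(self, d_in, d_out, num_experts=4, k=2):
        self.k = k
        # One weight matrix per expert (stand-in for an expert network).
        self.experts = [rng.normal(0, 0.1, (d_in, d_out)) for _ in range(num_experts)]
        # Gating weights: map the first layer's output to one score per expert.
        self.w_gate = rng.normal(0, 0.1, (d_in, num_experts))

    def __call__(self, first_layer_output):
        # 1. Gating subsystem scores experts based on the first layer output.
        scores = first_layer_output @ self.w_gate
        # 2. Select the top-k experts and compute a weight for each.
        top = np.argsort(scores)[-self.k:]
        weights = softmax(scores[top])
        # 3. Run only the selected experts on the same input and combine
        #    their outputs by the gating weights to form the MoE output.
        moe_output = sum(w * (first_layer_output @ self.experts[i])
                         for w, i in zip(weights, top))
        return moe_output  # would be fed to the second neural network layer

x = rng.normal(size=16)           # stand-in for the first layer's output
print(MoELayer(16, 8)(x).shape)   # (8,)
```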
Public/Granted literature
- US20200279150A1 MIXTURE OF EXPERTS NEURAL NETWORKS, Publication Date: 2020-09-03