-
Publication No.: US12242947B2
Publication Date: 2025-03-04
Application No.: US16759561
Filing Date: 2018-10-29
Applicant: DeepMind Technologies Limited
Inventor: Pablo Sprechmann , Siddhant Jayakumar , Jack William Rae , Alexander Pritzel , Adrià Puigdomènech Badia , Oriol Vinyals , Razvan Pascanu , Charles Blundell
Abstract: There is described herein a computer-implemented method of processing an input data item. The method comprises processing the input data item using a parametric model to generate output data, wherein the parametric model comprises a first sub-model and a second sub-model. The processing comprises processing, by the first sub-model, the input data item to generate a query data item, retrieving, from a memory storing data point-value pairs, at least one data point-value pair based upon the query data item, and modifying weights of the second sub-model based upon the retrieved at least one data point-value pair. The output data is then generated based upon the modified second sub-model.
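The query-retrieve-adapt loop described in this abstract can be sketched as follows. This is a toy illustration, not the patented implementation: the linear sub-models, the dimensions, the nearest-neighbour retrieval rule, and the squared-error adaptation objective are all assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and toy linear sub-models.
D_IN, D_KEY, D_OUT = 8, 4, 3
W1 = rng.normal(size=(D_KEY, D_IN))   # first sub-model: input -> query embedding
W2 = rng.normal(size=(D_OUT, D_KEY))  # second sub-model: embedding -> output

# Memory of (data point embedding, value) pairs.
mem_keys = rng.normal(size=(100, D_KEY))
mem_vals = rng.normal(size=(100, D_OUT))

def process(x, k=5, lr=0.1, steps=3):
    query = W1 @ x                                    # first sub-model generates the query
    dists = np.linalg.norm(mem_keys - query, axis=1)  # retrieve the k nearest pairs
    idx = np.argsort(dists)[:k]
    keys, vals = mem_keys[idx], mem_vals[idx]
    W = W2.copy()                                     # temporarily adapted second sub-model
    for _ in range(steps):                            # a few gradient steps on the retrieved pairs
        pred = keys @ W.T
        grad = 2 * (pred - vals).T @ keys / k         # d/dW of mean squared error
        W -= lr * grad
    return W @ query                                  # output from the modified sub-model

out = process(rng.normal(size=D_IN))
print(out.shape)  # (3,)
```

The key idea the sketch captures is that the second sub-model's weights are modified per input, based on what was retrieved, before the output is produced.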
-
Publication No.: US20200380372A1
Publication Date: 2020-12-03
Application No.: US16995655
Filing Date: 2020-08-17
Applicant: DeepMind Technologies Limited
Inventor: Daniel Pieter Wierstra , Chrisantha Thomas Fernando , Alexander Pritzel , Dylan Sunil Banarse , Charles Blundell , Andrei-Alexandru Rusu , Yori Zwols , David Ha
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
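The path-selection mechanism this abstract describes can be sketched as follows. This is a minimal stand-in, not the patented system: the linear modules, the hard-coded per-task paths, the summation of active-module outputs, and the per-task output heads are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 6
N_LAYERS, N_MODULES = 3, 4

# Each layer of the super network holds several modular sub-networks
# (toy linear modules here).
modules = rng.normal(size=(N_LAYERS, N_MODULES, D, D)) / np.sqrt(D)

# A path designates, for each layer, a proper subset of modules as active
# for a given task (these index lists are illustrative, not from the patent).
paths = {
    "task_a": [[0, 2], [1], [0, 3]],
    "task_b": [[1, 3], [0, 2], [2]],
}
output_heads = {t: rng.normal(size=(2, D)) for t in paths}  # per-task output layer

def run(task, x):
    for layer, active in enumerate(paths[task]):
        # Only the modules designated active by the path process the input;
        # summing their outputs is one common way to combine them.
        x = np.tanh(sum(modules[layer, m] @ x for m in active))
    return output_heads[task] @ x  # output layer for the identified task

y = run("task_a", rng.normal(size=D))
print(y.shape)  # (2,)
```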
-
Publication No.: US10824946B2
Publication Date: 2020-11-03
Application No.: US16511496
Filing Date: 2019-07-15
Applicant: DeepMind Technologies Limited
Inventor: Meire Fortunato , Charles Blundell , Oriol Vinyals
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a neural network. In one aspect, a method includes maintaining data specifying, for each of the network parameters, current values of a respective set of distribution parameters that define a posterior distribution over possible values for the network parameter. A respective current training value for each of the network parameters is determined from a respective temporary gradient value for the network parameter. The current values of the respective sets of distribution parameters for the network parameters are updated in accordance with the respective current training values for the network parameters. The trained values of the network parameters are determined based on the updated current values of the respective sets of distribution parameters.
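The training scheme in this abstract, in which each network parameter is represented by distribution parameters defining a posterior, can be sketched as follows. The Gaussian posterior with a softplus-parameterised standard deviation, the toy loss, and the learning rate are assumptions for illustration, not the patented method.

```python
import numpy as np

rng = np.random.default_rng(0)

# For each network parameter, maintain distribution parameters (mu, rho)
# defining a Gaussian posterior with sigma = log(1 + exp(rho)) (softplus).
n_params = 10
mu = np.zeros(n_params)
rho = np.full(n_params, -3.0)

def loss_grad(w):
    # Stand-in for the gradient of the training loss w.r.t. sampled weights.
    return 2 * (w - 1.0)  # toy loss: ||w - 1||^2

lr = 0.05
for _ in range(500):
    eps = rng.normal(size=n_params)
    sigma = np.log1p(np.exp(rho))
    w = mu + sigma * eps   # current training value sampled from the posterior
    g = loss_grad(w)       # temporary gradient value per parameter
    # Update the distribution parameters from g (reparameterization trick);
    # d(sigma)/d(rho) is the sigmoid of rho for the softplus parameterisation.
    mu -= lr * g
    rho -= lr * g * eps * (1.0 / (1.0 + np.exp(-rho)))

print(np.round(mu.mean(), 2))
```

After training, the trained parameter values are read off from the updated distribution parameters (here, the posterior means converge toward the toy loss minimum at 1).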
-
Publication No.: US10438114B1
Publication Date: 2019-10-08
Application No.: US14821463
Filing Date: 2015-08-07
Applicant: DeepMind Technologies Limited
Inventor: Charles Blundell , Julien Robert Michel Cornebise
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for content recommendation using neural networks. One of the methods includes receiving context information for an action recommendation; processing the context information using a neural network that comprises one or more Bayesian neural network layers to generate, for each of a plurality of possible actions, one or more parameters of a distribution over possible action scores for the action; and selecting an action from the plurality of possible actions using the parameters of the distributions over the possible action scores for the actions.
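Selecting an action via per-action score distributions can be sketched as below. The single linear "Bayesian layer", the Gaussian score distributions, and the Thompson-style sample-then-argmax selection rule are assumptions standing in for whatever the patent actually claims.

```python
import numpy as np

rng = np.random.default_rng(0)
D_CTX, N_ACTIONS = 5, 4

# Toy stand-in for a Bayesian layer: maps the context to a mean and a
# standard deviation of the score distribution for each action
# (the weights here are random placeholders, not trained values).
W_mu = rng.normal(size=(N_ACTIONS, D_CTX))
W_rho = rng.normal(size=(N_ACTIONS, D_CTX))

def recommend(context):
    mu = W_mu @ context                        # per-action score mean
    sigma = np.log1p(np.exp(W_rho @ context))  # per-action score std-dev (softplus)
    sampled = rng.normal(mu, sigma)            # sample a score per action
    return int(np.argmax(sampled))             # select the highest sampled score

action = recommend(rng.normal(size=D_CTX))
print(0 <= action < N_ACTIONS)  # True
```

Sampling from the score distributions, rather than taking the mean, is what lets the uncertainty in each action's score influence which action gets recommended.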
-
Publication No.: US11836630B2
Publication Date: 2023-12-05
Application No.: US17024217
Filing Date: 2020-09-17
Applicant: DeepMind Technologies Limited
Inventor: Meire Fortunato , Charles Blundell , Oriol Vinyals
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a neural network. In one aspect, a method includes maintaining data specifying, for each of the network parameters, current values of a respective set of distribution parameters that define a posterior distribution over possible values for the network parameter. A respective current training value for each of the network parameters is determined from a respective temporary gradient value for the network parameter. The current values of the respective sets of distribution parameters for the network parameters are updated in accordance with the respective current training values for the network parameters. The trained values of the network parameters are determined based on the updated current values of the respective sets of distribution parameters.
-
Publication No.: US20220383074A1
Publication Date: 2022-12-01
Application No.: US17829204
Filing Date: 2022-05-31
Applicant: DeepMind Technologies Limited
Inventor: Heiko Strathmann , Mohammadamin Barekatain , Charles Blundell , Petar Velickovic
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing persistent message passing using graph neural networks.
-
Publication No.: US10748065B2
Publication Date: 2020-08-18
Application No.: US16526240
Filing Date: 2019-07-30
Applicant: DeepMind Technologies Limited
Inventor: Daniel Pieter Wierstra , Chrisantha Thomas Fernando , Alexander Pritzel , Dylan Sunil Banarse , Charles Blundell , Andrei-Alexandru Rusu , Yori Zwols , David Ha
Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using multi-task neural networks. One of the methods includes receiving a first network input and data identifying a first machine learning task to be performed on the first network input; selecting a path through the plurality of layers in a super neural network that is specific to the first machine learning task, the path specifying, for each of the layers, a proper subset of the modular neural networks in the layer that are designated as active when performing the first machine learning task; and causing the super neural network to process the first network input using (i) for each layer, the modular neural networks in the layer that are designated as active by the selected path and (ii) the set of one or more output layers corresponding to the identified first machine learning task.
-
Publication No.: US10664753B2
Publication Date: 2020-05-26
Application No.: US16445523
Filing Date: 2019-06-19
Applicant: DeepMind Technologies Limited
Abstract: A method includes maintaining respective episodic memory data for each of multiple actions; receiving a current observation characterizing a current state of an environment being interacted with by an agent; processing the current observation using an embedding neural network in accordance with current values of parameters of the embedding neural network to generate a current key embedding for the current observation; for each action of the plurality of actions: determining the p nearest key embeddings in the episodic memory data for the action to the current key embedding according to a distance measure, and determining a Q value for the action from the return estimates mapped to by the p nearest key embeddings in the episodic memory data for the action; and selecting, using the Q values for the actions, an action from the multiple actions as the action to be performed by the agent.
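The episodic-memory Q-value computation this abstract walks through can be sketched as follows. The random memory contents, the Euclidean distance measure, the inverse-distance weighting of return estimates, and the trivial embedding function are illustrative assumptions, not the patented details.

```python
import numpy as np

rng = np.random.default_rng(0)
D_KEY, N_ACTIONS, MEM = 4, 3, 50

# Per-action episodic memory: key embeddings mapped to return estimates
# (filled with random placeholders here).
memory = {a: (rng.normal(size=(MEM, D_KEY)), rng.normal(size=MEM))
          for a in range(N_ACTIONS)}

def q_value(key, action, p=10):
    keys, returns = memory[action]
    d = np.linalg.norm(keys - key, axis=1)  # distance measure
    idx = np.argsort(d)[:p]                 # the p nearest key embeddings
    w = 1.0 / (d[idx] + 1e-3)               # inverse-distance weights
    return float(np.sum(w * returns[idx]) / np.sum(w))

def act(observation, embed):
    key = embed(observation)                # embedding network stand-in
    qs = [q_value(key, a) for a in range(N_ACTIONS)]
    return int(np.argmax(qs))               # select the action using the Q values

a = act(rng.normal(size=8), embed=lambda o: o[:D_KEY])
print(0 <= a < N_ACTIONS)  # True
```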
-
Publication No.: US20200005152A1
Publication Date: 2020-01-02
Application No.: US16511496
Filing Date: 2019-07-15
Applicant: DeepMind Technologies Limited
Inventor: Charles Blundell , Meire Fortunato , Oriol Vinyals
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for training a neural network. In one aspect, a method includes maintaining data specifying, for each of the network parameters, current values of a respective set of distribution parameters that define a posterior distribution over possible values for the network parameter. A respective current training value for each of the network parameters is determined from a respective temporary gradient value for the network parameter. The current values of the respective sets of distribution parameters for the network parameters are updated in accordance with the respective current training values for the network parameters. The trained values of the network parameters are determined based on the updated current values of the respective sets of distribution parameters.
-
Publication No.: US20190303764A1
Publication Date: 2019-10-03
Application No.: US16445523
Filing Date: 2019-06-19
Applicant: DeepMind Technologies Limited
IPC: G06N3/08
Abstract: A method includes maintaining respective episodic memory data for each of multiple actions; receiving a current observation characterizing a current state of an environment being interacted with by an agent; processing the current observation using an embedding neural network in accordance with current values of parameters of the embedding neural network to generate a current key embedding for the current observation; for each action of the plurality of actions: determining the p nearest key embeddings in the episodic memory data for the action to the current key embedding according to a distance measure, and determining a Q value for the action from the return estimates mapped to by the p nearest key embeddings in the episodic memory data for the action; and selecting, using the Q values for the actions, an action from the multiple actions as the action to be performed by the agent.