-
Publication No.: US11017309B2
Publication Date: 2021-05-25
Application No.: US16032737
Filing Date: 2018-07-11
Applicant: Massachusetts Institute of Technology
Inventor: Charles Roques-Carmes, Yichen Shen, Li Jing, Tena Dubcek, Scott A. Skirlo, Hengameh Bagherianlemraski, Marin Soljacic
Abstract: A photonic parallel network can be used to sample combinatorially hard distributions of Ising problems. The photonic parallel network, also called a photonic processor, finds the ground state of a general Ising problem and can probe critical behaviors of universality classes and their critical exponents. In addition to the attractive features of photonic networks—passivity, parallelization, high-speed and low-power—the photonic processor exploits dynamic noise that occurs during the detection process to find ground states more efficiently.
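The abstract's core idea, exploiting noise in the sampling loop to find Ising ground states, can be illustrated with a minimal classical sketch. This is not the photonic implementation described in the patent; it is a toy simulation under assumed conventions (energy E(s) = -1/2 s^T J s, spins in {-1, +1}), with all function names illustrative:

```python
import numpy as np

def ising_energy(J, s):
    # Ising energy E(s) = -1/2 * s^T J s for spins s in {-1, +1}
    return -0.5 * s @ J @ s

def noisy_ground_state_search(J, n_samples=2000, noise=0.5, seed=0):
    """Toy classical analogue of noise-assisted Ising sampling:
    perturb a running best estimate with random spin flips
    (standing in for detection noise) and keep lower-energy states."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    best = rng.choice([-1, 1], size=n)
    best_e = ising_energy(J, best)
    for _ in range(n_samples):
        # flip each spin independently with probability `noise`
        flips = rng.random(n) < noise
        cand = np.where(flips, -best, best)
        e = ising_energy(J, cand)
        if e < best_e:
            best, best_e = cand, e
    return best, best_e

# Ferromagnetic couplings: the ground states are all-up and all-down
n = 8
J = np.ones((n, n)) - np.eye(n)
s, e = noisy_ground_state_search(J)
```

For this fully connected ferromagnetic instance the ground-state energy is -(n^2 - n)/2 = -28, reached when all spins align; the random flips play the role that dynamic detection noise plays in the photonic processor, helping the search escape local minima.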
-
Publication No.: US12287842B2
Publication Date: 2025-04-29
Application No.: US17239830
Filing Date: 2021-04-26
Applicant: Massachusetts Institute of Technology
Inventor: Charles Roques-Carmes, Yichen Shen, Li Jing, Tena Dubcek, Scott A. Skirlo, Hengameh Bagherianlemraski, Marin Soljacic
IPC: G06E3/00, G06F17/16, G06N3/044, G06N3/045, G06N3/047, G06N3/067, G06N3/084, G06N7/01, G06F17/18, G06N3/04, G06N3/08, G06N7/00, G06N10/00
Abstract: A photonic parallel network can be used to sample combinatorially hard distributions of Ising problems. The photonic parallel network, also called a photonic processor, finds the ground state of a general Ising problem and can probe critical behaviors of universality classes and their critical exponents. In addition to the attractive features of photonic networks—passivity, parallelization, high-speed and low-power—the photonic processor exploits dynamic noise that occurs during the detection process to find ground states more efficiently.
-
Publication No.: US20180260703A1
Publication Date: 2018-09-13
Application No.: US15820906
Filing Date: 2017-11-22
Applicant: Massachusetts Institute of Technology
Inventor: Marin Soljacic, Yichen Shen, Li Jing, Tena Dubcek, Scott Skirlo, John E. Peurifoy, Max Erik Tegmark
CPC classification number: G06N3/08, G06F17/142, G06F17/16, G06N3/0445, G06N3/084, G06N20/00
Abstract: A system for training a neural network model, the neural network model comprising a plurality of layers including a first hidden layer associated with a first set of weights, the system comprising at least one computer hardware processor programmed to perform: obtaining training data; selecting a unitary rotational representation for representing a matrix of the first set of weights, the selected unitary rotational representation comprising a plurality of parameters; training the neural network model using the training data with an iterative neural network training algorithm to obtain a trained neural network model, each iteration of the iterative neural network training algorithm comprising: updating values of the plurality of parameters in the selected unitary rotational representation for representing the matrix of the first set of weights for the first hidden layer; and saving the trained neural network model.
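The key property of the unitary rotational representation in this abstract is that the weight matrix is parametrized by rotation angles, so updating the angles during training keeps the matrix exactly unitary with no re-projection step. A minimal real-valued (orthogonal) sketch of such a representation, built from Givens plane rotations, is shown below; this is an illustrative construction, not the specific parametrization claimed in the patent, and all names are hypothetical:

```python
import numpy as np

def givens(n, i, j, theta):
    """Plane (Givens) rotation acting on coordinates i and j of R^n."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c
    G[j, j] = c
    G[i, j] = -s
    G[j, i] = s
    return G

def rotation_from_angles(n, angles):
    """Compose an n x n orthogonal matrix from n(n-1)/2 rotation angles.
    A training step that updates the angles (e.g. by gradient descent)
    always yields a valid orthogonal weight matrix."""
    W = np.eye(n)
    k = 0
    for i in range(n):
        for j in range(i + 1, n):
            W = W @ givens(n, i, j, angles[k])
            k += 1
    return W

n = 4
rng = np.random.default_rng(1)
angles = rng.uniform(-np.pi, np.pi, size=n * (n - 1) // 2)
W = rotation_from_angles(n, angles)
```

Since each Givens factor is orthogonal, W^T W = I holds for any angle values, which is why gradient updates on the angles cannot drive the weight matrix away from the orthogonal (more generally, unitary) manifold.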