-
Publication No.: US20250165765A1
Publication Date: 2025-05-22
Application No.: US18791687
Filing Date: 2024-08-01
Applicant: Intel Corporation
Inventor: Philipp Stratmann , Alessandro Pierro , Gabriel Andres Fonseca Guerra , Sumedh Risbud , Andreas Wild , Ashish Rao Mangalore
Abstract: A neuromorphic network may solve combinatorial optimization problems, such as quadratic unconstrained binary optimization (QUBO) problems. The neuromorphic network may include variable neurons, a solution monitoring neuron, and one or more readout neurons. Each variable neuron may represent one binary variable of the problem. The internal state of a variable neuron, stored in the neuron's memory, may change when its variable flips, and the neuron may spike when its internal state changes. One or more other variable neurons receiving the spike may determine whether to change their own internal states based on the spike. The variable neurons may send their internal states to the solution monitoring neuron, which computes the cost of the QUBO problem and determines whether a solution has been found. A readout neuron may receive the variable assignments that produce the solution from at least some variable neurons and integrate those assignments into one message.
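The abstract is high-level and does not disclose the update rule. As a rough illustration only (not the patented implementation), the flip-and-propagate behavior of the variable neurons can be mimicked by an annealing-style bit-flip loop over a QUBO cost x^T Q x; the matrix `Q`, the temperature, and the acceptance rule below are all assumptions for the sketch:

```python
import numpy as np

def qubo_cost(Q, x):
    """Cost of binary assignment x for QUBO matrix Q: x^T Q x."""
    return x @ Q @ x

def solve_qubo(Q, steps=1000, temperature=1.0, seed=0):
    """Toy flip-based solver: each 'variable neuron' holds one bit and
    flips it when the flip lowers the cost (always accepted) or, with a
    small probability, even when it raises the cost (annealing-style),
    loosely mimicking spike-driven state updates."""
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    x = rng.integers(0, 2, n)
    best_x, best_cost = x.copy(), qubo_cost(Q, x)
    for _ in range(steps):
        i = rng.integers(n)          # pick one variable neuron
        x_new = x.copy()
        x_new[i] ^= 1                # flip its bit (the neuron "spikes")
        delta = qubo_cost(Q, x_new) - qubo_cost(Q, x)
        if delta < 0 or rng.random() < np.exp(-delta / temperature):
            x = x_new
            if qubo_cost(Q, x) < best_cost:
                best_x, best_cost = x.copy(), qubo_cost(Q, x)
    return best_x, best_cost
```

For a small example matrix such as `Q = [[-1, 2], [0, -1]]`, the minimum cost of -1 is reached at assignment [1, 0] or [0, 1], which the loop finds within a few flips.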
-
Publication No.: US20180174042A1
Publication Date: 2018-06-21
Application No.: US15385334
Filing Date: 2016-12-20
Applicant: Intel Corporation
Inventor: Narayan Srinivasa , Yongqiang Cao , Andreas Wild
CPC classification number: G06N3/08 , G06N3/0454 , G06N3/049
Abstract: Systems and methods for supervised learning and cascaded training of a neural network are described. In an example, connections to classifier neurons are strengthened through a supervised learning process: a classifier neuron receives a first spike from a processing neuron in response to training data, and an out-of-band communication delivers a second, desired (artificial) spike at the classifier neuron corresponding to the classification of the training data. As a result of spike-timing-dependent plasticity, connections to the classifier neuron are strengthened. In another example, a cascaded technique is disclosed to generate a plurality of trained neural networks that are separately initialized and trained on different types or forms of training data, and that may be operated in cascade or in parallel.
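The abstract relies on spike-timing-dependent plasticity (STDP): because the desired "teacher" spike arrives at the classifier neuron shortly after the processing neuron's spike, the connection is potentiated. As a minimal sketch, assuming a conventional pair-based exponential STDP rule (the parameters `a_plus`, `a_minus`, and `tau` are illustrative, not taken from the patent):

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP: if the presynaptic spike precedes the postsynaptic
    spike (t_pre < t_post), the weight is potentiated; otherwise it is
    depressed. The magnitude decays exponentially with the spike-time gap."""
    dt = t_post - t_pre
    if dt > 0:
        return w + a_plus * math.exp(-dt / tau)    # potentiation
    return w - a_minus * math.exp(dt / tau)        # depression
```

With a teacher spike injected a few milliseconds after the presynaptic spike (e.g. `t_pre=0, t_post=5`), the weight increases, which is the strengthening effect the abstract describes; reversing the order depresses the weight.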
-