Abstract:
A parallel convolutional neural network is provided. The CNN is implemented by a plurality of convolutional neural networks each on a respective processing node. Each CNN has a plurality of layers. A subset of the layers is interconnected between processing nodes such that activations are fed forward across nodes. The remaining subset is not so interconnected.
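The following is a minimal sketch of the described architecture, not the patented implementation: two CNN "towers" stand in for the per-node networks, one layer is interconnected (it consumes the concatenated activations of both towers), and the remaining layers stay tower-local. Layer names and sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TwoTowerCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Tower-local layer: never sees the other tower's activations.
        self.conv1 = nn.ModuleList([nn.Conv2d(3, 16, 3, padding=1) for _ in range(2)])
        # Interconnected layer: input channels doubled because each tower
        # receives the concatenated activations of both towers.
        self.conv2 = nn.ModuleList([nn.Conv2d(32, 32, 3, padding=1) for _ in range(2)])
        # Tower-local again (not interconnected).
        self.conv3 = nn.ModuleList([nn.Conv2d(32, 32, 3, padding=1) for _ in range(2)])

    def forward(self, x):
        # Each tower starts from the same input batch.
        a = [torch.relu(conv(x)) for conv in self.conv1]
        # Cross-node step: both towers' activations are fed forward together.
        shared = torch.cat(a, dim=1)
        b = [torch.relu(conv(shared)) for conv in self.conv2]
        # Back to per-tower processing.
        c = [torch.relu(conv(b[i])) for i, conv in enumerate(self.conv3)]
        return torch.cat(c, dim=1)

out = TwoTowerCNN()(torch.randn(1, 3, 32, 32))  # -> (1, 64, 32, 32)
```

In a real multi-node deployment the two towers would live on separate devices and the concatenation step would be the only point of inter-node communication; here both run in one process purely for illustration.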
Abstract:
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicting likelihoods of conditions being satisfied using recurrent neural networks. One of the systems is configured to process a temporal sequence comprising a respective input at each of a plurality of time steps and comprises: one or more recurrent neural network layers; one or more logistic regression nodes, wherein each of the logistic regression nodes corresponds to a respective condition from a predetermined set of conditions, and wherein each of the logistic regression nodes is configured to, for each of the plurality of time steps: receive the network internal state for the time step; and process the network internal state for the time step in accordance with current values of a set of parameters of the logistic regression node to generate a future condition score for the corresponding condition for the time step.
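A hedged sketch of this system follows, assuming an LSTM as the recurrent layer and treating each "logistic regression node" as a per-condition linear unit followed by a sigmoid; all names and sizes are hypothetical, not taken from the patent.

```python
import torch
import torch.nn as nn

class ConditionScorer(nn.Module):
    def __init__(self, input_size=8, hidden_size=32, num_conditions=4):
        super().__init__()
        self.rnn = nn.LSTM(input_size, hidden_size, batch_first=True)
        # One weight vector and bias per condition, i.e. one logistic
        # regression node for each condition in the predetermined set.
        self.logistic_nodes = nn.Linear(hidden_size, num_conditions)

    def forward(self, inputs):  # inputs: (batch, time_steps, input_size)
        # Network internal state at every time step.
        states, _ = self.rnn(inputs)                # (batch, time_steps, hidden_size)
        # Future condition score for each condition at each time step.
        return torch.sigmoid(self.logistic_nodes(states))

scores = ConditionScorer()(torch.randn(2, 5, 8))    # (2, 5, 4), values in (0, 1)
```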
Abstract:
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using recurrent neural networks to analyze health events. One of the methods includes obtaining a first temporal sequence of health events, wherein the first temporal sequence comprises respective health-related data associated with a particular patient at each of a plurality of time steps; processing the first temporal sequence of health events using a recurrent neural network to generate a neural network output for the first temporal sequence; and generating, from the neural network output for the first temporal sequence, health analysis data that characterizes future health events that may occur after a last time step in the temporal sequence.
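Below is an illustrative sketch of the described pipeline: per-time-step health-related data is processed by a recurrent network, and the network output is mapped to scores over a set of candidate future health events. The event vocabulary, feature sizes, and scoring head are assumptions, not details from the patent.

```python
import torch
import torch.nn as nn

class HealthEventAnalyzer(nn.Module):
    def __init__(self, feature_size=16, hidden_size=64, num_future_events=10):
        super().__init__()
        self.rnn = nn.GRU(feature_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_future_events)

    def forward(self, sequence):  # sequence: (batch, time_steps, feature_size)
        # Neural network output: the state after the last time step.
        _, last_state = self.rnn(sequence)
        # Health analysis data: a score for each candidate future health event.
        return torch.softmax(self.head(last_state[-1]), dim=-1)

analysis = HealthEventAnalyzer()(torch.randn(1, 12, 16))  # -> (1, 10)
```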
Abstract:
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating representations of input sequences. One of the methods includes obtaining an input sequence, the input sequence comprising a plurality of inputs arranged according to an input order; processing the input sequence using a first long short term memory (LSTM) neural network to convert the input sequence into an alternative representation for the input sequence; and processing the alternative representation for the input sequence using a second LSTM neural network to generate a target sequence for the input sequence, the target sequence comprising a plurality of outputs arranged according to an output order.
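A minimal encoder-decoder sketch of this approach is shown below: a first LSTM encodes the input sequence into an alternative (fixed-size state) representation, and a second LSTM generates the target sequence from that representation. The vocabulary size, greedy decoding loop, and start token are illustrative assumptions rather than the patented method.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, vocab=100, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, input_ids, max_len=10, start_token=1):
        # Alternative representation: the encoder's final hidden and cell states.
        _, state = self.encoder(self.embed(input_ids))
        token = torch.full((input_ids.size(0), 1), start_token, dtype=torch.long)
        outputs = []
        for _ in range(max_len):
            # Generate the target sequence one output at a time.
            dec_out, state = self.decoder(self.embed(token), state)
            token = self.out(dec_out).argmax(dim=-1)    # greedy choice of next output
            outputs.append(token)
        return torch.cat(outputs, dim=1)                # (batch, max_len)

target = Seq2Seq()(torch.randint(0, 100, (2, 7)))        # -> (2, 10)
```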