Abstract:
An improved Artificial Neural Network (ANN) is disclosed that comprises a conventional ANN, a database block, and a compare and update circuit. The conventional ANN is formed by a plurality of neurons, each neuron having a prototype memory dedicated to storing a prototype and a distance evaluator to evaluate the distance between the input pattern presented to the ANN and the prototype stored therein. The database block holds: all the prototypes arranged in slices, each slice being capable of storing up to a maximum number of prototypes; the input patterns or queries to be presented to the ANN; and the distances resulting from the evaluation performed during the recognition/classification phase. The compare and update circuit compares each distance with the distance previously found for the same input pattern and updates (or not) the previously stored distance accordingly.
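A minimal sketch of the compare and update behavior, assuming a simple Manhattan distance and illustrative names (recognize, queries, slices) that are not taken from the disclosure: for each query, the circuit keeps only the smallest distance found so far across successive slices of prototypes.

    # Hedged sketch: distance evaluation over prototype slices with a
    # compare-and-update of the best distance stored per input pattern.
    def manhattan(a, b):
        # Distance evaluator: L1 distance between input pattern and prototype.
        return sum(abs(x - y) for x, y in zip(a, b))

    def recognize(queries, slices):
        # queries: {query_id: input_pattern}; slices: list of prototype slices,
        # each slice holding up to a maximum number of prototype vectors.
        best = {qid: float("inf") for qid in queries}          # stored distances
        for slice_prototypes in slices:                        # one slice at a time
            for qid, pattern in queries.items():
                for proto in slice_prototypes:
                    d = manhattan(pattern, proto)
                    # Update the stored distance only if the new one is smaller.
                    if d < best[qid]:
                        best[qid] = d
        return best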
Abstract:
The improved neural network of the present invention results from the combination of a dedicated logic block with a conventional neural network based upon a mapping of the input space, usually employed to classify input data by computing the distance between said input data and prototypes memorized therein. The improved neural network is able to classify input data, for instance represented by a vector A, even when some of its components are noisy or unknown during either the learning or the recognition phase. To that end, influence fields of various and different shapes are created for each neuron of the conventional neural network. The logic block transforms at least some of the n components (A1, . . . , An) of the input vector A into the m components (V1, . . . , Vm) of a network input vector V according to a linear or non-linear transform function F. In turn, vector V is applied as the input data to said conventional neural network. The transform function F is such that certain components of vector V are not modified, e.g. Vk=Aj, while other components are transformed as mentioned above, e.g. Vi=Fi(A1, . . . , An). In addition, one (or more) component of vector V can be used to compensate for an offset that is present in the distance evaluation of vector V. Because the logic block is placed in front of said conventional neural network, any modification thereof is avoided.
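The sketch below illustrates, under stated assumptions, how such a transform function F might be realized in front of the network: one component is copied unchanged, one is computed from all input components, and one compensates for an offset in the distance evaluation. The particular choices of Fi and of the offset handling are assumptions made for the example, not the disclosed function.

    # Hedged sketch of the front-end logic block computing V = F(A).
    def transform(a, offset=0.0):
        # a = (A1, ..., An): original input vector, possibly with noisy components.
        v = []
        v.append(a[0])                   # unchanged component, e.g. Vk = Aj
        v.append(sum(a) / len(a))        # transformed component, e.g. Vi = Fi(A1, ..., An)
        v.append(-offset)                # extra component compensating an offset
                                         # present in the distance evaluation of V
        return v                         # V = (V1, ..., Vm), here m = 3

    # V, not A, is presented to the conventional ANN, which therefore
    # requires no modification.
    network_input = transform([0.2, 0.5, 0.9], offset=0.1)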
Abstract:
Consider a plurality of input patterns having an essential characteristic in common but differing in at least one parameter (this parameter modifies the input pattern to some extent but not this essential characteristic for a specific application). During the learning phase, each input pattern is normalized in a normalizer before it is presented to a classifier. If it is not recognized, it is learned, i.e. the normalized pattern is stored in the classifier as a prototype with its category associated thereto. From a predetermined reference value of that parameter, the normalizer computes an element related to said parameter which allows the normalized pattern to be derived from the input pattern and, conversely, the input pattern to be retrieved from the normalized pattern. As a result, all these input patterns are represented by the same normalized pattern. The above method and circuits reduce the number of prototypes required in the classifier, thereby improving its response quality.
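As a sketch only, the normalize-then-learn flow could look as follows, assuming the varying parameter is a simple gain with reference value 1.0 and assuming a minimal nearest-prototype classifier; both are illustrative stand-ins, not the disclosed circuits.

    # Hedged sketch of the normalizer and learning phase described above.
    REFERENCE_GAIN = 1.0

    class Classifier:
        # Minimal stand-in for the classifier storing normalized prototypes.
        def __init__(self):
            self.prototypes = []                      # (normalized pattern, category)

        def classify(self, pattern):
            if not self.prototypes:
                return None
            dist = lambda p: sum(abs(a - b) for a, b in zip(pattern, p[0]))
            return min(self.prototypes, key=dist)[1]  # category of closest prototype

        def store_prototype(self, pattern, category):
            self.prototypes.append((pattern, category))

    def normalize(pattern):
        # Element related to the parameter (here the gain relative to the
        # reference); it maps the input pattern to the normalized pattern
        # and allows the input pattern to be retrieved again.
        gain = max(pattern) / REFERENCE_GAIN if max(pattern) else 1.0
        return [x / gain for x in pattern], gain

    def learn(classifier, patterns_with_categories):
        for pattern, category in patterns_with_categories:
            normalized, _gain = normalize(pattern)
            if classifier.classify(normalized) != category:       # not recognized
                classifier.store_prototype(normalized, category)  # learn it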
Abstract:
An artificial neural network (ANN) based system that is adapted to process an input pattern to generate an output pattern related thereto having a different number of components than the input pattern. The system (26) comprises an ANN (27) and a memory (28), such as a DRAM memory, that are serially connected. The input pattern (23) is applied to a processor (22), where it may or may not be processed (the most general case), before it is applied to the ANN and stored therein as a prototype (if learned). A category is associated with each stored prototype. The processor computes the coefficients that allow the determination of the estimated values of the output pattern; these coefficients are the components of a so-called intermediate pattern (24). Assuming the ANN has already learned a number of input patterns, when a new input pattern is presented to the ANN in the recognition phase, the category of the closest prototype is output therefrom and is used as a pointer to the memory. In turn, the memory outputs the corresponding intermediate pattern. The input pattern and the intermediate pattern are applied to the processor to construct the output pattern (25) using the coefficients. Typically, in the field of image scaling, the input pattern is a block of pixels.
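A minimal sketch of the recognition path, under the assumption that each output component is estimated as a weighted sum of the input components using the retrieved coefficients; the names and the weighted-sum reconstruction are illustrative, not the disclosed processing.

    # Hedged sketch: the category of the closest prototype points into a memory
    # of intermediate patterns (coefficients) used to build the output pattern.
    def recognize_and_expand(input_pattern, ann_prototypes, coefficient_memory):
        # ann_prototypes: list of (prototype, category) pairs learned by the ANN.
        # coefficient_memory: maps a category to its intermediate pattern, here a
        # list of coefficient rows, one row per output component.
        dist = lambda item: sum(abs(a - b) for a, b in zip(input_pattern, item[0]))
        _, category = min(ann_prototypes, key=dist)      # closest prototype's category
        coefficients = coefficient_memory[category]      # memory lookup (e.g. DRAM)
        # Output pattern with a different number of components than the input,
        # each component estimated from the input pattern via the coefficients.
        return [sum(w * x for w, x in zip(row, input_pattern)) for row in coefficients]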
Abstract:
A method and system for processing electronic documents. A temporary computer object is created. An address of a first electronic document is obtained. A first tag, a second tag, and the address of the first electronic document are copied into a header of the created temporary computer object. Selected text from the first electronic document is obtained. The first and second tags respectively mark the beginning and the end of the header. The address of the first electronic document is disposed between the first and second tags. The selected text and a third tag are copied into the created temporary computer object. The third tag marks the end of the created temporary computer object. The selected text is disposed between the header of the created temporary computer object and the third tag. The created temporary computer object is stored in a second electronic document.
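The construction of the temporary computer object can be sketched as follows; the tag strings and function names are assumptions made for illustration, the abstract merely requiring that the first and second tags delimit the header holding the source address and that a third tag closes the object.

    # Hedged sketch of building and storing the temporary computer object.
    FIRST_TAG, SECOND_TAG, THIRD_TAG = "<src>", "</src>", "</copy>"

    def make_temporary_object(source_address, selected_text):
        header = FIRST_TAG + source_address + SECOND_TAG   # address between tags 1 and 2
        return header + selected_text + THIRD_TAG          # selected text between header and tag 3

    def store_in_document(second_document, source_address, selected_text):
        # Store the created temporary object in the second electronic document.
        return second_document + make_temporary_object(source_address, selected_text)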
Abstract:
A method and systems for copying textual objects from source documents into an object document, and for tagging, linking and processing said copied textual portions, including the disclosure of a new type of hyperlinking mechanism enabling the identification and tracing of the sources and authorship of said copied textual portions, or of all textual sub-portions or fragments of text that could be generated from said copied textual portions by editing the object document. The invention can be implemented by means of software, implementing the disclosed system and method, running on word processors and web browsers.