Cascade-Correlation algorithm
Cascade-Correlation is an architecture and supervised learning algorithm for artificial neural networks, developed by Scott Fahlman at Carnegie Mellon University in 1990.

Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no backpropagation of error signals through the connections of the network.
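Training alternates between two phases: an output phase, which retrains the output-side weights of the current network, and a candidate phase, which trains a prospective unit to maximize the correlation score S = |Σ_p (V_p - mean(V)) (E_p - mean(E))| between its activation V and the residual output error E, then freezes its input-side weights and installs it. The following is a minimal NumPy sketch of this loop for a single-output regression task; the function names, the single tanh candidate unit, the plain gradient-descent training, and the toy XOR data are illustrative assumptions rather than Fahlman and Lebiere's implementation (which trains a pool of candidates using Quickprop).

```python
import numpy as np

def train_outputs(H, y, lr=0.05, epochs=2000):
    # Output phase: train linear output weights by gradient
    # descent on mean squared error. H holds the inputs plus the
    # activations of every frozen hidden unit.
    w = np.zeros(H.shape[1])
    for _ in range(epochs):
        err = H @ w - y
        w -= lr * H.T @ err / len(y)
    return w

def train_candidate(H, residual, lr=0.5, epochs=2000, seed=0):
    # Candidate phase: train one tanh unit by gradient ascent on
    # the score S, the magnitude of the covariance between the
    # unit's activation and the residual output error.
    rng = np.random.default_rng(seed)
    v = rng.normal(scale=0.5, size=H.shape[1])
    e = residual - residual.mean()
    for _ in range(epochs):
        a = np.tanh(H @ v)
        s = np.sum((a - a.mean()) * e)              # correlation score S
        grad = np.sign(s) * H.T @ (e * (1 - a**2))  # dS/dv, treating the
        v += lr * grad / len(e)                     # means as constants
    return v  # input-side weights, frozen once the unit is installed

def cascade_correlation(X, y, max_hidden=5, tol=1e-3):
    H = np.hstack([X, np.ones((len(X), 1))])  # inputs plus a bias column
    frozen = []                               # input weights of added units
    while True:
        w = train_outputs(H, y)               # output phase
        residual = H @ w - y
        if np.mean(residual**2) < tol or len(frozen) >= max_hidden:
            return w, frozen
        v = train_candidate(H, residual)      # candidate phase: the new unit
        frozen.append(v)                      # sees the inputs and all
        H = np.hstack([H, np.tanh(H @ v)[:, None]])  # earlier units

# Toy usage: XOR, which a linear output alone cannot solve.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
w, units = cascade_correlation(X, y)
print(len(units), "hidden units added")
```

Note that each installed unit receives connections from the inputs and from every previously installed unit; this cascaded connectivity is what lets later units act as more complex feature detectors built on earlier ones.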
External links
- The Cascade-Correlation Learning Architecture, Scott E. Fahlman and Christian Lebiere, August 29, 1991. Article created for the National Science Foundation under Contract Number EET-8716324, and the Defense Advanced Research Projects Agency (DOD), ARPA Order No. 4976, under Contract F33615-87-C-1499.