Alopex: A Correlation-Based Learning Algorithm for Feed-Forward and Recurrent Neural Networks (1994)
- read the abstract! rather than estimating the error gradient as in backpropagation, it uses the correlation between changes in network weights and changes in the error, plus gaussian noise (see the sketch after these notes).
- backpropagation requires calculating the derivatives of the transfer functions along the path from each neuron to the output. This is very non-local information.
- one alternative is somewhat empirical: estimate the derivatives wrt the weights via perturbations.
- all these algorithms are solutions to the optimization problem: minimize an error measure, E, wrt the network weights.
- all network weights are updated synchronously.
- can be used to train both feedforward and recurrent networks.
- the algorithm apparently has a long history, especially in visual neuroscience (it was originally used for mapping visual receptive fields).
- the algorithm is quite simple! easy to understand.
- uses stochastic weight changes with an annealing schedule.
- this is a pre-pub copy: tables and figures are at the end.
- looks like it has comparable or faster convergence than backpropagation.
- not sure how it will scale to problems with hundreds of neurons, though they did look at an encoding task with 32 outputs.
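
a minimal sketch of the update rule as I understand it (my paraphrase, not the paper's code): every weight simultaneously takes a fixed step ±delta, and the probability of reversing direction grows with the correlation between the previous weight change and the previous error change, scaled by a temperature T that is periodically re-set from recent correlations (the annealing schedule). The toy quadratic error, the step size delta=0.01, and the annealing window N=50 below are placeholder choices, not the paper's experiments.

```python
import numpy as np

def alopex_step(w, w_prev, E, E_prev, delta, T, rng):
    """One synchronous Alopex update applied to every weight at once."""
    # correlation between each weight's last change and the last error change
    C = (w - w_prev) * (E - E_prev)
    # probability of stepping -delta: high when the last move raised the error,
    # so moves that lowered the error tend to be repeated
    p = 1.0 / (1.0 + np.exp(np.clip(-C / T, -50.0, 50.0)))
    return w + np.where(rng.random(w.shape) < p, -delta, delta)

# toy demo: a quadratic "error" stands in for a real network loss
rng = np.random.default_rng(0)
target = np.array([1.0, -2.0, 0.5])
error_fn = lambda w: float(np.sum((w - target) ** 2))

delta, N = 0.01, 50                  # step size and annealing window (assumed)
w_prev = rng.normal(size=3)
w = w_prev + delta * rng.choice([-1.0, 1.0], size=3)
E_prev, E = error_fn(w_prev), error_fn(w)
T, C_hist = 1.0, []                  # start "hot": early steps are a random walk

for n in range(20_000):
    w_next = alopex_step(w, w_prev, E, E_prev, delta, T, rng)
    C_hist.append(np.mean(np.abs((w - w_prev) * (E - E_prev))))
    if (n + 1) % N == 0:
        # annealing: re-set T to the recent mean |correlation| so the noise
        # shrinks as the search settles (one common choice of schedule)
        T = max(np.mean(C_hist[-N:]), 1e-8)
    w_prev, w, E_prev, E = w, w_next, E, error_fn(w_next)

print(E)  # ends near the delta-limited floor of the error
```

on a real network, error_fn would be the total error over the training set after a forward pass with weights w; the loop needs nothing from the network but that one scalar, which is why the same rule trains both feedforward and recurrent nets.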