m8ta

{1455}  
Conducting credit assignment by aligning local distributed representations
Lit review.
 
{1453}  
PMID22325196 Backpropagation through time and the brain
 
{1426}  
Training neural networks with local error signals
 
{1423}  
PMID27824044 Random synaptic feedback weights support error backpropagation for deep learning.
Our proof says that the weights W0 and W evolve to equilibrium manifolds, but simulations (Fig. 4) and analytic results (Supplementary Proof 2) hint at something more specific: that when the weights begin near 0, feedback alignment encourages W to act like a local pseudoinverse of B around the error manifold. This fact is important because if B were exactly W⁺ (the Moore-Penrose pseudoinverse of W), then the network would be performing Gauss-Newton optimization (Supplementary Proof 3). We call this update rule for the hidden units pseudobackprop and denote it by ∆h_PBP = W⁺e. Experiments with the linear network show that the angle ∆h_FA ∠ ∆h_PBP quickly becomes smaller than ∆h_FA ∠ ∆h_BP (Fig. 4b, c; see Methods). In other words, feedback alignment, despite its simplicity, displays elements of second-order learning.
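The excerpt above can be sketched numerically. Below is a minimal two-layer linear-network demo of feedback alignment in the spirit of PMID27824044: the hidden-layer update uses a fixed random feedback matrix B instead of W.T, and after training we compare it against the backprop update W.T e and the pseudobackprop update pinv(W) e. All dimensions, the learning rate, and the step count are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of feedback alignment in a two-layer linear network.
# Dimensions, learning rate, and step count are arbitrary choices.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 5, 8, 3
T = rng.standard_normal((n_out, n_in))           # target linear map, y* = T x
W0 = 0.01 * rng.standard_normal((n_hid, n_in))   # forward weights start near 0
W = 0.01 * rng.standard_normal((n_out, n_hid))
B = rng.standard_normal((n_hid, n_out))          # fixed random feedback matrix
lr = 0.002

for _ in range(10000):
    x = rng.standard_normal((n_in, 1))
    h = W0 @ x
    e = T @ x - W @ h            # output error
    W += lr * e @ h.T            # delta rule at the output layer
    W0 += lr * (B @ e) @ x.T     # hidden update uses B e, not W.T e

def angle_deg(u, v):
    """Angle between two update directions, in degrees."""
    c = (u.ravel() @ v.ravel()) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Compare the feedback-alignment hidden update with backprop (W.T e)
# and pseudobackprop (pinv(W) e) on a fresh input.
x = rng.standard_normal((n_in, 1))
e = T @ x - W @ (W0 @ x)
dh_FA, dh_BP, dh_PBP = B @ e, W.T @ e, np.linalg.pinv(W) @ e
fa_vs_bp = angle_deg(dh_FA, dh_BP)
fa_vs_pbp = angle_deg(dh_FA, dh_PBP)
```

After training, both angles should fall well below 90°, i.e. the random-feedback update comes to point in a direction useful for descent; per the paper's claim, the FA-vs-pseudobackprop angle tends to shrink below the FA-vs-backprop angle.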
{1422}  
PMID29205151 Towards deep learning with segregated dendrites https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5716677/
 
{699} 
ref: Harris2008.03
tags: retroaxonal retrosynaptic Harris learning cortex backprop
date: 12072011 02:34 gmt
revision:2


PMID18255165[0] Stability of the fittest: organizing learning through retroaxonal signals
 
{862} 
ref: 0
tags: backpropagation cascade correlation neural networks
date: 12202010 06:28 gmt
revision:1


The Cascade-Correlation Learning Architecture
 
{634} 
ref: Rozsa2008.01
tags: nAChR nicotinic acetylcholine receptor interneurons backpropagating LTP hippocampus
date: 10082008 17:37 gmt
revision:0


PMID18215234[0] Dendritic nicotinic receptors modulate backpropagating action potentials and long-term plasticity of interneurons.