ref: -0 tags: computational neuroscience opinion tony zador konrad kording lillicrap date: 07-30-2019 21:04 gmt revision:0 [head]

Two papers out recently on arXiv and bioRxiv:

  • A critique of pure learning: what artificial neural networks can learn from animal brains
    • Animals learn rapidly and robustly, without the need for labeled sensory data, largely through innate mechanisms arrived at by evolution and encoded genetically.
    • Still, the genome cannot fully specify the connectivity of the human brain, which carries far too much information; instead, there are canonical circuits and patterns of intra-area connectivity which act as the 'innate' learning biases.
    • Mice and men are not so far apart evolutionarily. (I've heard this also from people FIB-SEM imaging cortex.) Hence, understanding one should appreciably lead us to understand the other. (I agree with this sentiment, but for the fact that lab mice are dumb and have pretty stereotyped behaviors.)
    • References "Long short-term memory and learning to learn in networks of spiking neurons", which claims that a hybrid algorithm (BPTT with neuronal rewiring) with realistic neuronal dynamics markedly increases the computational power of spiking neural networks.
  • What does it mean to understand a neural network?
    • As has been the intuition of many neuroscientists for a long time, the paper posits that we have to investigate the developmental rules (wiring and connectivity, same as above) plus the local-ish learning rules (synaptic, dendritic, other... astrocytic).
      • The weights themselves, in either biological neural networks or ANNs, are not at all informative! (Duh.)
    • Emphasizes the concept of compressibility: how much information can be discarded without impacting performance? With some modern ANNs, 30-50x compression is possible. The authors argue that little compression is possible in the human brain -- the wealth of all those details about the world is needed! In other words, no compact description is possible.
    • Hence, you need to understand how the network learns those details, and how it's structured so that important things are learned rapidly and robustly, as seen in animals (very similar to the above).
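The compressibility point can be made concrete with a toy magnitude-pruning sketch (my own illustration, not from the paper; the Laplace-distributed weights and the 30x target are assumptions for demonstration): keep only the largest-magnitude weights and see how much the network's output changes.

```python
import numpy as np

def prune_by_magnitude(w, keep_fraction):
    """Zero out all but the largest-magnitude weights in w."""
    k = max(1, int(round(keep_fraction * w.size)))
    threshold = np.sort(np.abs(w), axis=None)[-k]   # k-th largest magnitude
    mask = np.abs(w) >= threshold
    return w * mask, mask

rng = np.random.default_rng(0)
w = rng.laplace(size=(256, 256))   # heavy-tailed weights, roughly like trained nets
x = rng.normal(size=256)

# ~30x compression: keep 1/30 of the weights
w_pruned, mask = prune_by_magnitude(w, keep_fraction=1 / 30)
rel_err = np.linalg.norm(w @ x - w_pruned @ x) / np.linalg.norm(w @ x)
print(f"kept {mask.mean():.1%} of weights, relative output error {rel_err:.2f}")
```

In a trained ANN the pruned network is usually fine-tuned afterward, which is what makes 30-50x compression survivable; the brain, per the authors' argument, would tolerate far less of this.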

ref: Stevenson-2011.02 tags: Kording neural recording doubling northwestern chicago date: 01-28-2013 00:12 gmt revision:1 [0] [head]

PMID-21270781[0] How advances in neural recording affect data analysis.

  • Number of recorded channels doubles about every 7 years (slowish).
  • "Emerging data analysis techniques should consider both the computational costs and the potential for more accurate models associated with this exponential growth of the number of recorded neurons."
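The doubling claim is just exponential growth; a quick extrapolation sketch (the starting year and initial channel count are illustrative assumptions, not values from the paper):

```python
# Simultaneously recorded channels under a ~7-year doubling time
# (growth rate from Stevenson & Kording 2011; n0 and year0 are made up here).
def channels(year, n0=1, year0=1960, doubling_time=7.0):
    """Extrapolated channel count: n0 * 2^((year - year0) / doubling_time)."""
    return n0 * 2 ** ((year - year0) / doubling_time)

for year in (1960, 1981, 2002, 2023):
    print(year, round(channels(year)))
```

Even this "slowish" doubling compounds quickly over decades, which is the point of the quote above: analysis methods have to scale with it.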


[0] Stevenson IH, Kording KP. How advances in neural recording affect data analysis. Nat Neurosci 14(2):139-42 (2011 Feb)

ref: -0 tags: neural recording doubling Stevenson Kording date: 02-08-2012 04:28 gmt revision:0 [head]

PMID-21270781 How advances in neural recording affect data analysis

  • Number of channels recorded doubles every 7 years.
  • This is extrapolated from the past 50 years of growth.