{696}
ref: Jarosiewicz-2008.12 tags: Schwartz BMI learning perturbation date: 03-07-2012 17:11 gmt revision:2

PMID-19047633[0] Functional network reorganization during learning in a brain-computer interface paradigm.

  • quote: For example, the tuning functions of neurons in the motor cortex can change when monkeys adapt to perturbations that interfere with the execution (5–7) or visual feedback (8–10) of their movements. Check these refs - have to be good!
  • point out that only a BMI lets you see directly how tuning changes translate into changes in behavior.
  • BMI also allows perturbations to target a subset of neurons. apparently, they had the same idea as me.
  • used the population vector (PV) algorithm. yeck.
  • perturbed a select subset of neurons by rotating their decoded tuning 90° about the z-axis. pre / perturb / washout series of experiments.
  • 3D BMI, center-out task, 8 targets at the corners of a cube.
  • looked for the following strategies for compensating to the perturbation:
    • re-aiming: to compensate for the deflected trajectory, aim at a rotated target.
    • re-weighting: decrease the strength (modulation depth) of the rotated neurons.
    • re-mapping: use the new units based on their rotated tuning.
  • modulation depths for the rotated neurons did in fact decrease.
  • PD for the neurons that were perturbed rotated more than the control neurons.
  • rotated neurons contributed to error parallel to the perturbation; unrotated neurons compensated for this, contributing 'errors' in the opposite direction.
  • typical recording sessions of ~3 hours - thus, the adaptation had to proceed quickly and entirely online. pre, perturb, and washout blocks each had about 8 × 20 trials.
  • interesting conjecture: "Another possibility is that these neurons solve the “credit-assignment problem” described in the artificial intelligence literature (25–26). By using a form of Hebbian learning (27), each neuron could reduce its contribution to error independently of other neurons via noise-driven synaptic updating rules (28–30). "
    • ref 25: Minsky, 1961;
    • ref 26: Cohen PR, Feigenbaum EA (1982) The Handbook of Artificial Intelligence;
    • ref 27: references Hebb directly, 1949;
    • ref 28: ALOPEX {695};
    • ref 29: PMID-1903542[1] A more biologically plausible learning rule for neural networks.
    • ref 30: PMID-17652414[2] Model of birdsong learning based on gradient estimation by dynamic perturbation of neural conductances. Fiete IR, Fee MS, Seung HS.
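
The perturbation geometry is easy to sketch. A minimal toy model (my own sketch, not their code or parameters) of cosine-tuned units decoded with a population vector, with a quarter of the units' decoding directions rotated 90° about the z-axis, shows the deflection parallel to the perturbation described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch: cosine-tuned units decoded by a population vector (PV),
# with a 25% subset's decoding directions rotated 90 deg about z.
# Unit count, tuning model, and constants are illustrative only.
n_units = 120
pds = rng.normal(size=(n_units, 3))
pds /= np.linalg.norm(pds, axis=1, keepdims=True)  # true preferred directions

def rates(intended):
    """Cosine tuning: rate = baseline + modulation * (PD . intended)."""
    return 1.0 + pds @ intended

def pv_decode(r, decode_dirs):
    """Population vector: rate-weighted sum of decoding directions."""
    v = (r - 1.0) @ decode_dirs
    return v / np.linalg.norm(v)

# rotate the decoding directions of the first quarter of units
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])  # 90 deg rotation about z
decode_dirs = pds.copy()
decode_dirs[: n_units // 4] = decode_dirs[: n_units // 4] @ Rz.T

intended = np.array([1.0, 0.0, 0.0])
decoded = pv_decode(rates(intended), decode_dirs)
# decoded output is deflected from +x toward +y, i.e. an error
# parallel to the applied rotation
```

Without any re-aiming, re-weighting, or re-mapping, the decoded vector is pulled toward the rotated frame in proportion to the perturbed fraction of the population.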
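
The credit-assignment conjecture can also be sketched. This is my own minimal illustration of the weight-perturbation idea behind refs 28-30, not anything from the paper: each parameter correlates its own noise with the change in a single global error signal, so every "neuron" reduces its contribution to error independently. The quadratic reward, target, and constants are all made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noise-driven updating: no element ever sees a gradient or the other
# elements' activity, only a shared scalar reward.
w = np.zeros(3)
target = np.array([0.5, -0.3, 0.8])  # arbitrary optimum, for illustration

def reward(w):
    return -np.sum((w - target) ** 2)  # global scalar feedback only

lr, sigma = 0.05, 0.05
for _ in range(2000):
    noise = rng.normal(scale=sigma, size=w.shape)
    delta = reward(w + noise) - reward(w)  # did the noise help?
    # each weight updates using only its own noise and the global delta,
    # giving an unbiased estimate of the reward gradient on average
    w += lr * delta * noise / sigma**2

# w drifts toward 'target' using only the shared scalar reward
```

This is the same stochastic-gradient-by-perturbation scheme that Fiete, Fee & Seung apply to birdsong (ref 30), reduced to three parameters.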

____References____

[0] Jarosiewicz B, Chase SM, Fraser GW, Velliste M, Kass RE, Schwartz AB, Functional network reorganization during learning in a brain-computer interface paradigm. Proc Natl Acad Sci U S A 105:49, 19486-91 (2008 Dec 9)
[1] Mazzoni P, Andersen RA, Jordan MI, A more biologically plausible learning rule for neural networks. Proc Natl Acad Sci U S A 88:10, 4433-7 (1991 May 15)
[2] Fiete IR, Fee MS, Seung HS, Model of birdsong learning based on gradient estimation by dynamic perturbation of neural conductances. J Neurophysiol 98:4, 2038-57 (2007 Oct)