m8ta
[0] Porada I, Bondar I, Spatz WB, Kruger J, Rabbit and monkey visual cortex: more than a year of recording with up to 64 microelectrodes. J Neurosci Methods 95:1, 13-28 (2000 Jan 31)

[0] Aflalo TN, Graziano MS, Relationship between unconstrained arm movements and single-neuron firing in the macaque motor cortex. J Neurosci 27:11, 2760-80 (2007 Mar 14)

[0] Karni A, Meyer G, Rey-Hipolito C, Jezzard P, Adams MM, Turner R, Ungerleider LG, The acquisition of skilled motor performance: fast and slow experience-driven changes in primary motor cortex. Proc Natl Acad Sci U S A 95:3, 861-8 (1998 Feb 3)

[0] Pastalkova E, Itskov V, Amarasingham A, Buzsáki G, Internally generated cell assembly sequences in the rat hippocampus. Science 321:5894, 1322-7 (2008 Sep 5)

[0] Clancy EA, Xia H, Christie A, Kamen G, Equalization filters for multiple-channel electromyogram arrays. J Neurosci Methods 165:1, 64-71 (2007 Sep 15)

{1389}
ref: -0 tags: photoacoustic tomography mouse imaging q-switched laser date: 05-11-2017 05:23 gmt revision:1 [0] [head]

Single-impulse panoramic photoacoustic computed tomography of small-animal whole-body dynamics at high spatiotemporal resolution

  • Used Q-switched Nd:YAG and Ti:Sapphire lasers to illuminate mice axially (from the top, through a diffuser and conical lens), exciting the photoacoustic effect, from which they were able to image a full slice of the mouse at 125um resolution.
    • I'm surprised at their mode of illumination -- how do they eliminate the out-of-plane photoacoustic effect?
  • Images look low contrast, but structures, e.g. cortical vasculature, are visible.
  • Can image at the rep rate of the laser (50 Hz), and thereby record cardiac and pulmonary rhythms.
  • Suggest that the photoacoustic effect can be used to image brain activity, but spatial and temporal resolution are limited.

{1378}
hide / edit[0] / print
ref: -0 tags: carbon fiber thread spinning Pasquali Kemere nanotube stimulation date: 02-09-2017 01:09 gmt revision:0 [head]

PMID-25803728 Neural stimulation and recording with bidirectional, soft carbon nanotube fiber microelectrodes.

  • Poulin et al. demonstrated that microelectrodes made solely of CNT fibers [22] show remarkable electrochemical activity, sensitivity, and resistance to biofouling compared to conventional carbon fibers when used for bioanalyte detection in vitro [23-25].
  • Fibers were insulated with 3 um of block copolymer polystyrene-polybutadiene (PS-b-PBD) (polybutadiene is synthetic rubber)
    • Selected for biocompatibility, flexibility, and resistance to flexural fatigue.
    • Available from Sigma-Aldrich.
    • Custom continuous dip-coating process.
  • 18um diameter, 15 - 20x lower impedance than equivalently sized PtIr.
    • 2.5 - 6x lower than W.
    • In practice, 43um dia (1450 um^2 area), impedance of 11.2 kΩ; 12.6um dia, 151 kΩ.
  • Charge storage capacity 327 mC / cm^2; PtIr = 1.2 mC/cm^2
  • Wide water window of -1.5V - 1.5V, consistent with noble electrochemical properties of C.
  • Lasts for over 97e6 pulsing cycles beyond the water window, vs 43e6 for PEDOT.
  • Tested via the 6-OHDA model of PD vs. standard PtIr stimulating electrodes, implanted via a 100um PI shuttle attached with PEG.
  • Yes, debatable...
  • Tested out to 3 weeks durability. Appear to function as well or better than metal electrodes.
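
A quick sanity check of the figures above (my own arithmetic, treating the 43um tip as a flat disc; the charge-capacity ratio ignores real surface area, which is exactly what the CNT microstructure increases):

```python
import math

# geometric disc area of a 43 um diameter fiber tip
d_um = 43.0
area_um2 = math.pi * (d_um / 2.0) ** 2      # close to the quoted 1450 um^2
area_cm2 = area_um2 * 1e-8                  # 1 um^2 = 1e-8 cm^2

# charge storage capacity: CNT fiber 327 mC/cm^2 vs PtIr 1.2 mC/cm^2
q_cnt = 327e-3 * area_cm2                   # coulombs (per CV sweep, roughly)
q_ptir = 1.2e-3 * area_cm2

print(f"disc area: {area_um2:.0f} um^2")
print(f"CSC ratio CNT/PtIr: {q_cnt / q_ptir:.0f}x")
```

The quoted 1450 um^2 is consistent with a bare geometric disc; the ~270x charge-capacity advantage comes entirely from the material, not the geometry.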

PMID-23307737 Strong, light, multifunctional fibers of carbon nanotubes with ultrahigh conductivity.

  • Full process:
    1. Dissolve high-quality, 5um-long CNTs in chlorosulfonic acid (the only known solvent for CNTs)
    2. Filter to remove particles
    3. Extrude liquid crystal dope through a spinneret, 65 or 130um orifice
    4. Into a coagulant, acetone or water
    5. Onto a rotating drum to put tension on the thread & align the CNTs.
    6. Wash in water and dry at 115C.
  • Properties:
    • Tensile strength 1 GPa +- 0.2 GPa.
    • Tensile modulus 120 GPa +- 50, best value 200 GPa
      • Pt: 168 GPa ; Au: 79 GPa.
    • Elongation to break 1.4 %
    • Conductivity: 0.3 MS/m, Iodine doped 5 +- 0.5 MS/m (22 +- 4 microhm cm)
      • Cu: 59.6 MS/m ; Pt: 9.4 MS/m ; Au: 41 MS/m
      • Electrical conductivity drops after annealing @ 600C
      • But does not drop after kinking and repeated mechanical cycling.
  • Theoretical modulus of MWCNT ~ 350 GPa.
  • Fibers well-aligned at ~ 90% the density (measure 1.3 g/cc) of close-packed CNT.
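
To make the conductivity numbers concrete, here is the resistance per meter of a hypothetical 20um-diameter wire of each material, computed from the values quoted above (my own arithmetic):

```python
import math

# conductivities quoted above, in MS/m
conductivity_MS_per_m = {
    "CNT (undoped)":      0.3,
    "CNT (iodine-doped)": 5.0,
    "Cu":                 59.6,
    "Pt":                 9.4,
    "Au":                 41.0,
}

def r_per_m(sigma_MS, d_m=20e-6):
    """Resistance per meter (Ohm/m) of a round wire of diameter d_m."""
    area = math.pi * (d_m / 2.0) ** 2          # cross-section, m^2
    return 1.0 / (sigma_MS * 1e6 * area)

for name, sigma in conductivity_MS_per_m.items():
    print(f"{name:>20s}: {r_per_m(sigma):9.0f} Ohm/m")
```

Even doped, the CNT fiber is ~12x more resistive than copper per unit length; the trade is mechanical (strength, flexural-fatigue tolerance) rather than electrical.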

{1364}
ref: -0 tags: polyimide aqueous degradation kapton date: 01-22-2017 05:51 gmt revision:0 [head]

Aqueous degradation of polyimides

  • Above pH 2, Kapton (PMDA-ODA) test specimens exposed to water decreased in both tensile strength and elongation to break, at a rate that increased with temperature.
  • No samples completely degraded, however; tensile strength decreased by about 2x, and elongation from 30% to 5%.
  • The authors suspect that ortho (off-molecular axis) amide bonding, at about 0.6% of the total number of imide bonds, is responsible for this (otherwise the film would completely fall apart.)
  • Imide bonds themselves are robust to all but strong bases and acids.
  • See also {1253}.

{1338}
ref: -0 tags: ZeroMQ messaging sockets multithreading date: 05-03-2016 06:10 gmt revision:0 [head]

ZeroMQ -- much better sockets framework than native TCP/UDP sockets.

  • Bindings for Ocaml, too.
  • Supports Erlang-like concurrency.
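
A minimal request/reply sketch with pyzmq (the Python binding; assumes `pip install pyzmq`), using the in-process transport so both ends fit in one script:

```python
import threading
import zmq

ctx = zmq.Context.instance()

# bind the REP socket before the client connects (inproc requires
# bind-before-connect on older libzmq versions)
server_sock = ctx.socket(zmq.REP)
server_sock.bind("inproc://echo")

def server():
    msg = server_sock.recv()            # blocks until the client sends
    server_sock.send(b"echo: " + msg)
    server_sock.close()

srv = threading.Thread(target=server)
srv.start()

client = ctx.socket(zmq.REQ)
client.connect("inproc://echo")
client.send(b"hello")
reply = client.recv()
print(reply)                            # b'echo: hello'

client.close()
srv.join()
```

With a TCP transport the same pattern works across machines; PUB/SUB and PUSH/PULL sockets give the Erlang-style message-passing concurrency mentioned above.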

{1273}
ref: -0 tags: spectroscopy frequency domain PMT avalanche diode laser Tufts date: 02-25-2014 19:02 gmt revision:0 [head]

Frequency-domain techniques for tissue spectroscopy and imaging

  • 52 pages, book chapter
  • Good detail on bandwidth, tissue absorption, various technologies.

{823}
ref: Kruger-2010.05 tags: microelectrode array nichrome 7 years rhesus electrophysiology MEA Kruger oblique inverted date: 01-29-2013 07:54 gmt revision:7 [6] [5] [4] [3] [2] [1] [head]

PMID-20577628[0] Seven years of recording from monkey cortex with a chronically implanted multiple electrode.

  • Seven years!! good recordings the whole time, too. As they say, this is a clinically realistic time period. Have they solved the problem?
  • Used 12.5um Ni-Cr-Al wire insulated with 3um of polyimide.
    • Wires were then glued to an 8x8 connector block using conductive epoxy.
    • Glued the bundle together with a solution of plexiglas in dichloroethane.
    • Then introduced the 0.3mm bundle into a j-shaped cannula. This allowed them to approach the gray matter inverted, from below (the white matter).
    • Implanted 64 ch array into ventral premotor cortex (arm representation?).
  • No apparent degradation of recording quality over that time.
  • Had some serious problems with the quality of their connector.
    • They recommend: "Rather, the contacts on the head should be made from noble metals and be flat or shallowly hollow, so that they can be easily cleaned, and no male contacts can break."
    • Really need to amplify and multiplex prior connector (imho).
  • Claim that they managed to record from two neurons on one channel for nearly 7 years (ch 54).
  • They cite us, but only to indicate that we recommend slow penetration of the brain. They agree with our results that lowering of individual electrodes is better than all at once.

____References____

[0] Kruger J, Caruana F, Volta RD, Rizzolatti G, Seven years of recording from monkey cortex with a chronically implanted multiple microelectrode. Front Neuroeng 3:6 (2010 May 28)

{1052}
ref: Chestek-2009.09 tags: BMI problems address critique spike sorting Shenoy date: 01-23-2013 02:23 gmt revision:3 [2] [1] [0] [head]

IEEE-5332822 (pdf) Neural prosthetic systems: Current problems and future directions

  • Where there is unlikely to be improvements: spike sorting and spiking models.
  • Where there are likely to be dramatic improvements: non-stationarity of recorded waveforms, limitations of linear mappings between neural activity and movement kinematics, and the low signal to noise ratio of the neural data.
  • Compare different sorting methods: threshold, single unit, multiunit, relative to decoding.
  • Plot waveform changes over an hour -- this contrasts with earlier work (?) {1032}
  • Figure 5: there is no obvious linear transform between neural activity and the kinematic parameters.
  • Suggest that linear models need to be replaced by the literature of how primates actually make reaches.
  • Discuss that offline performance is not at all the same as online; in the latter the user can learn and adapt on the fly!
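
The threshold-crossing alternative to full spike sorting can be sketched in a few lines; the -4.5x RMS threshold and all signal parameters below are my own illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 30000                                   # sample rate, Hz
sig = rng.normal(0.0, 10.0, fs)              # 1 s of ~10 uV RMS noise
spike_times = np.arange(1000, fs, 3000)      # one injected spike per 100 ms
sig[spike_times] -= 80.0                     # 80 uV negative-going spikes

# threshold at a multiple of the signal RMS (a common convention)
thresh = -4.5 * np.sqrt(np.mean(sig ** 2))

# negative-going threshold crossings = "unsorted" spike events
crossings = np.flatnonzero((sig[1:] < thresh) & (sig[:-1] >= thresh))
print(f"threshold {thresh:.1f} uV, detected {len(crossings)} events")
```

No clustering, no templates: every unit on the channel is pooled, which is exactly the trade-off the paper evaluates against sorted decoding.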

____References____

Chestek CA, Cunningham JP, Gilja V, Nuyujukian P, Ryu SI, Shenoy KV, Neural prosthetic systems: current problems and future directions. Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2009), 3369-3375 (2009)

{1179}
ref: -0 tags: optical coherence tomography neural recording squid voltage sensitive dyes review date: 12-23-2012 21:00 gmt revision:4 [3] [2] [1] [0] [head]

PMID-20844600 Detection of Neural Action Potentials Using Optical Coherence Tomography: Intensity and Phase Measurements with and without Dyes.

  • Optical methods of recording have been investigated since the 1940's:
    • During action potential (AP) propagation in neural tissue light scattering, absorption, birefringence, fluorescence, and volume changes have been reported (Cohen, 1973).
  • OCT is reflection-based, not transmission: illuminate and measure from the same side.
    • Here they use spectral domain OCT, where the mirror is not scanned; rather SD-OCT uses a spectrometer to record interference of back-scattered light from all depth points simultaneously (Fercher et al., 1995).
    • Use of a spectrometer allows imaging of an axial line within 10-50us, sufficient for imaging action potentials.
    • SD-OCT, due to some underlying mathematics which I can't quite grok atm, can resolve/annul common-mode phase noise, for high temporal resolution and Δphase sensitivity.
      • This equates to "microsecond temporal resolution and sub-nanometer optical path length resolution".
  • OCT is generally (initially?) used for in-vivo imaging of retinas, in humans and other animals.
  • They present new data for depth-localization of neural activity in squid giant axons (SGA) stained with a voltage-sensitive near-infrared dye.
    • Note: averaged over 250 sweeps.
  • ΔPhase>>ΔIntensity -- figure 4 in the paper.
  • Use of voltage-sensitive dyes improves the resolution of ΔI, but not dramatically --
    • And Δphase is still a bit delayed.
    • Electrical recording is the control.
      • It will take significant technology development before optical methods exceed electrical methods...
  • Looks pretty preliminary. However, OCT can image 1-2mm deep in transparent tissue, which is exceptional.
  • Will have to read their explanation of OCT.
  • Used in a squid giant axon prep. 2010, wonder if anything new has been done (in vivo?).
  • Claim that progress is hampered by limited understanding of how these Δphase signals arise.
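
A toy illustration of the SD-OCT idea: the FFT of the spectral interferogram gives depth, and the phase at the depth peak tracks sub-nanometer path-length changes. All parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

k = np.linspace(7.0e6, 8.0e6, 2048)        # wavenumber samples, rad/m (~800 nm)
z = 200e-6                                 # reflector depth, 200 um (one-way)

def a_line(delta_z):
    """Spectral interferogram for a single reflector at depth z + delta_z."""
    return 1.0 + 0.5 * np.cos(2.0 * k * (z + delta_z))

def depth_and_phase(spectrum):
    f = np.fft.rfft(spectrum - spectrum.mean())
    peak = int(np.argmax(np.abs(f)))       # bin index ~ reflector depth
    return peak, np.angle(f[peak])

p0, phi0 = depth_and_phase(a_line(0.0))
p1, phi1 = depth_and_phase(a_line(1e-9))   # move the reflector by 1 nm
print("depth bin:", p0, " phase shift (rad):", phi1 - phi0)
```

The depth bin is unchanged by a 1 nm displacement, but the phase at that bin shifts by roughly 2*k0*dz ≈ 0.014 rad -- the sub-nanometer optical-path sensitivity quoted above.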

{1160}
ref: -0 tags: william james quotes date: 04-30-2012 15:40 gmt revision:0 [head]

"And the faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgement, character, and will. No one is compos sui if he have it not. An education which should improve this faculty would be the education par excellence."

{166}
ref: -0 tags: implicit motor sequence learning basal ganglia parkinson's disease date: 03-06-2012 22:47 gmt revision:2 [1] [0] [head]

PMID-19744484 What can man do without basal ganglia motor output? The effect of combined unilateral subthalamotomy and pallidotomy in a patient with Parkinson's disease.

  • Unilateral lesion of both STN and GPi in one patient. Hence, the patient was his own control.
    • Drastically reduced the need for medication, indicating that it had a profound effect on BG output.
  • Arm contralateral to the lesion showed faster reaction times and normal movement speeds; the ipsilateral arm remained parkinsonian.
  • Implicit sequence learning in a task was absent.
  • In a go / no-go task, when the percent of no-go trials increased, the RT superiority of the contralateral hand was lost.
  • "The risk of persistent dyskinesias need not be viewed as a contraindication to subthalamotomy in PD patients since they can be eliminated if necessary by a subsequent pallidotomy without producing deficits that impair daily life."
  • Subthalamotomy incurs persistent hemiballismus / chorea in 8% of patients; in many others chorea spontaneously disappears.
    • This can be treated by a subsequent pallidotomy.
  • Patient had Parkinsonian symptoms largely restricted to right side.
  • Measured TMS ability to stimulate motor cortex -- which appears to be a common technique. STN / GPi lesion appears to have limited effect on motor cortex excitability (other things regulate it?).
  • conclusion: interrupting BG output removes such abnormal signaling and allows the motor system to operate more normally.
    • Bath DA presumably calms hyperactive SNr neurons.
    • You cannot disrupt output of the BG with complete impunity; the associated abnormalities may be too subtle to be detected in normal behaviors, explaining the overall clinical improvement seen in PD patients after surgery and the scarcity of clinical manifestations in people with focal BG lesions (Bhatia and Marsden, 1994; Marsden and Obeso 1994).
      • Our results support the prediction that surgical lesions of the BG in PD would be associated with inflexibility or reduced capability for motor learning. (Marsden and Obeso, 1994).
  • It is better to dispense with BG output altogether than to have a faulty one.

{175}
ref: BMI notes-0 tags: spike filtering rate_estimation BME 265 Henriquez date: 01-06-2012 03:06 gmt revision:1 [0] [head]

http://hardm.ath.cx:88/pdf/BME265_final.pdf

{309}
ref: Porada-2000.01 tags: electrodes recording oblique inverted MEA arrays Kruger date: 01-05-2012 23:07 gmt revision:3 [2] [1] [0] [head]

PMID-10776811[0] More than a year of recording with up to 64 microelectrodes

  • for more than a year action potentials of good quality were obtained from most electrodes!
  • used 60mm-long, 12.5um Ni-Cr-Al (Isaohm) wire, polyimide insulated, soldered to microconnectors. Tips purely ('primitively') cut after bonding them to a piece of photographic film substrate.
  • implanted in the rabbit and marmoset V1 cortex from afar.
  • with the 8 rabbits they used a magnetic release to prevent excessive force from removing the implant.
  • used small sections of thicker wire to individually label the electrodes for x-ray; thus they could reconstruct the electrode positions. electrodes in the white matter were, more or less, silent.
  • the autocorrelation functions of the neurons generally look good; some of them do not have a refractory period though.
  • in GFAP-stained sections a single electrode track appeared as a hole about 28 um wide. The outer diameter of the wire insulation was 18um. electrode tracks were not visible in cresyl violet sections. the neurones near the electrode tips appeared normal.
  • we recorded signals for up to 711 days, during which time the recording quality did not degrade. nice, nice!
  • they think that the large length of free wire, running about 5mm through the brain provides a sufficient degree of friction so that locally the tissue is prevented from moving relative to the electrodes. They did not need to use microstimulation to improve recording quality.

____References____

[0] Porada I, Bondar I, Spatz WB, Kruger J, Rabbit and monkey visual cortex: more than a year of recording with up to 64 microelectrodes. J Neurosci Methods 95:1, 13-28 (2000 Jan 31)

{261}
ref: Aflalo-2007.03 tags: Graziano motor cortex M1 SUA macaque monkey electrophysiology tuning date: 01-03-2012 03:37 gmt revision:1 [0] [head]

PMID-17360898[] Relationship between Unconstrained Arm Movements and Single-Neuron Firing in the Macaque Motor Cortex

  • the best explanation of neuronal firing was the final multijoint configuration of the arm - it accounted for 36% of the SUA variance.
  • the search for the 'correct' motor parameter (that neurons are tuned to) is an ill-posed experimental question because motor parameters are very intercorrelated.
  • they knock experiments in which the animals are overtrained & the movements limited - and they are right!
  • single electrode recording with chronically implanted steel chamber - i.e. it took a damn long time!
    • imaged the central sulcus through the dura.
    • verified location with single unit responses to palpation of the contralateral hand/arm (in S1) & microstimulation-evoked movements in M1.
  • used optotrak to measure the position of the monkey.
  • occasionally, the monkey attempted to scratch the experimenter with fast semi-ballistic arm movement. heh. :)
  • movements were separated based on speed analysis - that is, all the data were analyzed as discrete segments.
  • neurons were inactive during periods of hand stasis between movements.
  • tested the diversity of their training set in a clever way: they simulated neurons tuned to various parameters of the motion, and tested to see if their analysis could recover the tuning. it could.
    • however, they still used unvalidated regression analysis to test their hypothesis. regression analysis estimates how much variance is estimated by the cosine-tuning model - it returns an R^2.
  • either averaged the neuronal tuning over an entire movement or smoothed the firing rate using a 10hz upper cutoff.
  • Moran & Schwartz' old result seems to be as much a consequence of averaging across trials as it is a consequence of actual tuning...
    • without the averaging, only 3% of the variance could be attributed to speed tuning.
  • i think that they have a good point in all of this: when you eliminate sources of variance (e.g. starting position) from the behavior, either by mechanical restraint or simple omission of segments or even better averaging over trials, you will get a higher R^2. but it may be false, a compression of the space along an axis where they are not well correlated!
  • a model in which the final position matters little, but the velocity used to get there does, has been found to account for little of the neuronal variance.
    • instead, neurons are tuned to any of a number of movements that terminate near a preferred direction.
  • observational studies of the normal spontaneous behavior of monkeys indicate that a high proportion of time is spent using the arm as a postural device.
    • therefore, they expect that neurons are tuned to endpoint posture.
    • modeled the neuronal firing as a gaussian surface in the 8-dimensional space of the arm posture.
  • in comparison to other studies, the offset between neural activity and behavior was not significantly different, over the entire population of recorded neurons, from zero. This may be due to the nature of the task, which was spontaneous and ongoing, not cue and reaction based, as in many other studies.
    • quote: This result suggests that the neuronal tuning to posture reflects relatively more an anticipation of the future state of the limb than a feedback signal about a recent state of the limb.
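
The trial-averaging point above can be reproduced in a toy simulation: a weakly cosine-tuned Poisson neuron shows a modest single-trial R^2 that inflates dramatically once rates are averaged across repeated trials. All numbers below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
dirs = np.linspace(0, 2 * np.pi, 16, endpoint=False)  # movement directions
n_trials = 50

def r_squared(theta, rate):
    # regress rate on [1, cos(theta), sin(theta)] -- the cosine tuning model
    X = np.column_stack([np.ones_like(theta), np.cos(theta), np.sin(theta)])
    beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
    resid = rate - X @ beta
    return 1 - resid.var() / rate.var()

# single-trial rates: modest cosine tuning buried in Poisson noise
theta = np.repeat(dirs, n_trials)
rate = rng.poisson(10 + 3 * np.cos(theta)).astype(float)

r2_single = r_squared(theta, rate)
r2_avg = r_squared(dirs, rate.reshape(16, n_trials).mean(axis=1))
print(f"single-trial R^2: {r2_single:.2f}, trial-averaged R^2: {r2_avg:.2f}")
```

The same neuron, the same tuning: only the averaging changed, which is the core of the critique of high R^2 values reported from trial-averaged data.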

____References____

[0] Aflalo TN, Graziano MS, Relationship between unconstrained arm movements and single-neuron firing in the macaque motor cortex. J Neurosci 27:11, 2760-80 (2007 Mar 14)

{1000}
ref: YaoHong-2009.1 tags: wireless transmitter modulator QPSK date: 01-03-2012 00:56 gmt revision:2 [1] [0] [head]

IEEE-5256305 (pdf) A 200-pJ/b MUX-Based RF Transmitter for Implantable Multichannel Neural Recording

  • 400 MHz band
  • 17Mbps
  • 0.18um CMOS process, 1.2-1.8V, 2.9mA, -8dbM output power.
  • 0.2 nJ/bit
  • 1.2mm^2
  • no receiver, though a COTS one could probably be purchased.
  • O-QPSK (offset quadrature phase shift keying)
  • VCO operates at 2x the output frequency.
  • No IF.
  • PSK > FSK.
  • Measured error vector magnitude (EVM), which is approx 8% over the data rate.
    • This due to phase noise in the LO & phase mismatch.
    • Typical O-QPSK requires 23% EVM for a BER of 10^-4
  • Almost as good as UWB.
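
The EVM figure can be illustrated numerically: EVM is the RMS error vector between received and ideal constellation points, relative to the ideal symbol magnitude, so complex noise at 8% of the symbol amplitude yields ~8% EVM. The noise level below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10000

# ideal QPSK constellation points (offset-QPSK staggers I and Q in time,
# but the constellation and the EVM computation are the same)
bits = rng.integers(0, 4, n)
ideal = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))

# received symbols: ideal + complex gaussian noise at 8% of symbol magnitude
noise_rms = 0.08
rx = ideal + (rng.normal(0, noise_rms / np.sqrt(2), n)
              + 1j * rng.normal(0, noise_rms / np.sqrt(2), n))

evm = np.sqrt(np.mean(np.abs(rx - ideal) ** 2) / np.mean(np.abs(ideal) ** 2))
print(f"EVM: {100 * evm:.1f}%")
```

An 8% EVM sits well inside the ~23% the paper cites as sufficient for a 10^-4 BER with O-QPSK.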

____References____

Yao-Hong Liu and Cheng-Lung Li and Tsung-Hsien Lin A 200-pJ/b MUX-Based RF Transmitter for Implantable Multichannel Neural Recording Microwave Theory and Techniques, IEEE Transactions on 57 10 2533 -2541 (2009)

{917}
ref: Doty-1969.01 tags: Doty microstimulation brain behavior macaque conditioned stimulus attention motivation 1969 date: 12-29-2011 23:28 gmt revision:8 [7] [6] [5] [4] [3] [2] [head]

PMID-4888623[0] Electrical stimulation of the brain in behavioral context.

  • Excellent review.
  • Focal stimulation of macaques can induce insect-grabbing responses, after which they will carefully examine their hands to see what was caught!
    • Same thing has been observed in humans -- the patient reported that he wanted to catch 'that butterfly'.
  • Such complicated action must be the effect of downstream / upstream targets of the stimulated site, as the actual stimulation carries no information other than its spatial locality within the brain.
  • Stimulation of the rostral thalamus in the language hemisphere can elicit phrases: "Now one goes home", "Thank you", "I see something".
    • These are muttered involuntarily and without recollection of having been spoken.
  • Doty stimulated macaques at 20ua for 500us as a CS in postcentral gyrus (S1?) for a lever press CR, which should (he says) only activate a few dozen neurons.
  • Can elicit mating behaviors in oposums with electrical stimulation of the hypothalamus, but only if another opossum or furry object is present.
  • Stimulation of the caudate nucleus in humans causes an arrest reaction: they may speak, smile, or laugh inappropriately, but appropriate voluntary responses are brought to a halt.
  • Stimulation of the basolateral amygdala can cause:
    • Hungry cats to immediately stop eating
    • Stop stalking prey
    • Non-hunting animals to stalk prey, and indeed will solve problems to gain access to rats which can be attacked.
  • Prolonged stimulation of almost every place in the brain of a cat at 3-8Hz can put it to sleep, though since lab cats normally sleep 17/24 hours, this result may not be significant.
  • Stimulation at most sites in the limbic system has the still mysterious ability to organize motor activity in any fashion required to produce more of the activity or to avoid it, as the case may be.
  • Rats that are stimulated in the periaqueductal gray will self-administer stimulation, but will squeal and otherwise indicate pain and fright during the stimulation. Increasing the duration of stimulation from 0.5 to 1 second makes self-administration of this apparently fearful stimulation stop in both rats and cats.
  • Certain patterns of activity within systems responsible for fearful or aggressive behavior, rather than being aversive are perversely gratifying. This is clearly recognized in the sociology of man...
  • Rats will self-stimulate with the same stimulus trains that will cause them to eat and drink, and under some conditions the self-stimulation occurs only if food or water is available.
  • On the other hand, rats will choose self-stimulation of the lateral hypothalamus instead of food, even when they are starving.
    • Electrically induced hunger is its own reward.
  • The work of Loucks (124, 125) forms the major point of origin for the concept that motivation is essential to learning. With careful and thorough training, Loucks was unable to form CRs to an auditory CS using stimulation of the motor cortex as the US. With this paradigm, the limb movements elicited by the US never appeared to the CS alone; but movements were readily established when each CS-US combination was immediately followed by the presentation of food.
    • However: Kupalov independently proved that stimulation of the motor cortex could be used as the US, at the same time using stimulation at other loci as the CS.
    • Why the difference? Attention -- failures are commonly obtained with animals that consistently fidget or fight restraint, as most of them do.
    • Cortical stimulation itself is not rewarding or aversive; animals neither seek nor avoid stimulation of most neocortical areas.
  • On classical conditioning: [Bures and colleagues (20, 65) bibtex:Bures-1968 bibtex:Gerbrandt-1968] found that if an antecedent stimulus, which might or might not affect a neuron, were consistently followed by effective intracellular electrical stimulation of that individual neuron, in roughly 10 percent of the cells of the neocortex, hippocampus, thalamus, or mesencephalic reticular formation a change in the response of that cell to the antecedent stimulus could be observed.
  • With the apparent exception of the cerebellum, it is possible to use electrical excitation of any place in the brain as a CS in chickens, rats, rabbits ...
  • Stimulation of group 1 proprioceptive muscle-afferent fibers in cats is ineffective as a CS.
    • Muscle spindles lack clear access to the systems subserving conditioned reflexes. (These instead go to the cerebellum)
  • Macaques can also discriminate between two stimulation sites 1-3 mm apart apparently over the entirety of the cortex, at frequencies between 2 and 100Hz, and over a 4-10fold range of currents.
  • In human cases where electrical stimulation or the cortex elicits specific memories, extirpation of the stimulated area does not effect recall of this memory (156) {973}.

____References____

[0] Doty RW, Electrical stimulation of the brain in behavioral context.Annu Rev Psychol 20no Issue 289-320 (1969)

{919}
ref: Tehovnik-1996.03 tags: ICMS technique Tehovnik MIT 1996 current density microstimulation date: 12-29-2011 05:11 gmt revision:2 [1] [0] [head]

PMID-8815302[0] Electrical stimulation of neural tissue to evoke behavioral responses

  • reference to justify our current levels.
  • radial dispersion of current, inverse square falloff of excitability.
  • low currents (10 ua) can activate 10-1000 neurons in cat M1 (allegedly).
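
The inverse-square falloff above is usually summarized as I = K * r^2, so the radius of directly activated tissue is r = sqrt(I / K). The K value below is a mid-range figure for cortical neurons; Tehovnik tabulates a wide spread across cell types, so treat this as an order-of-magnitude sketch:

```python
import math

def activation_radius_um(current_ua, k_ua_per_mm2=1300.0):
    """Radius (um) of direct activation for a given current (uA),
    from I = K * r^2 with K in uA/mm^2 (illustrative mid-range value)."""
    return 1000.0 * math.sqrt(current_ua / k_ua_per_mm2)

for i_ua in (10, 20, 50, 100):
    print(f"{i_ua:3d} uA -> r ~ {activation_radius_um(i_ua):.0f} um")
```

So a 10 uA pulse reaches roughly a 90 um radius with this K, consistent with the claim that low currents activate only a small local population.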

____References____

[0] Tehovnik EJ, Electrical stimulation of neural tissue to evoke behavioral responses.J Neurosci Methods 65:1, 1-17 (1996 Mar)

{954}
ref: -0 tags: Georgoplous todorov M1 controversy square root bias PV date: 12-22-2011 22:52 gmt revision:2 [1] [0] [head]

PMID-11017158 One motor cortex, two different views

  • ref {950}, {952}, {953}
  • Georgopoulos re-analyzed their data without square-root transformation and without smoothing -- using only binned rates -- and found that it did not substantially change the proportions of tuned cells
  • In return, Todorov {955} responds that classifying cells based on maximal R^2 is stupid -- many cells lie on the decision boundaries in this manifold.

{135}
ref: Vijayakumar-2005.12 tags: schaal motor learning LWPR PLS partial least squares date: 12-07-2011 04:09 gmt revision:1 [0] [head]

PMID-16212764[0] Incremental online learning in high dimensions

ideas:

  • use locally linear models.
  • use a small number of regressions in selected dimensions of input space, in the spirit of partial least squares regression. hence, can operate in very high dimensions.
  • function to be approximated has locally low-dimensional structure, which holds for most real-world data.
  • use: the learning of value functions, policies, and models for learning control in high-dimensional systems (like complex robots or humans).
  • important distinction between function-approximation learning:
    • methods that fit nonlinear functions globally, possibly using input space expansions.
      • gaussian process regression
      • support vector machine regression
        • problem: requires the right kernel choice & basis vector choice.
      • variational bayes for mixture models
        • represents the conditional joint expectation, which is expensive to update. (though this is factored).
      • each of the above was designed for data analysis, not incremental data. (biology is incremental).
    • methods that fit simple models locally and segment the input space automatically.
      • problem: the curse of dimensionality: they require an exponential number of models for accurate approximation.
        • this is not such a problem if the function is locally low-dim, as mentioned above.
  • projection regression (PR) works via decomposing multivariate regressions into a superposition of single-variate regressions along a few axes of input space.
    • projection pursuit regression is a well-known and useful example.
    • sigmoidal neural networks can be viewed as a method of projection regression.
  • they want to use factor analysis, which assumes that the observed data is generated from a low-dimensional distribution with a limited number of latent variables related to the output via a transformation matrix + noise. (PCA/ wiener filter)
    • problem: the factor analysis must represent all high-variance dimensions in the data, even if it is irrelevant for the output.
    • solution: use joint input and output space projection to avoid elimination of regression-important dimensions.
----
  • practical details: they use the LPWR algorithm to model the inverse dynamics of their 7DOF hydraulically-actuated gripper arm. That is, they applied random torques while recording the resulting accelerations, velocities, and angles, then fit a function to predict torques from these variables. The robot was compliant and not very well modeled with a rigid body model, though they tried this. The resulting LPWR generated model was 27 to 7, predicted torques. The control system uses this functional approximation to compute torques from desired trajectories, i think. The desired trajectories are generated using spline-smoothing ?? and the control system is adaptive in addition to the LPWR approximation being adaptive.
  • The core of LWPR is partial least squares regression / projection pursuit, coupled with Gaussian kernels and a distance metric (just a matrix) learned via constrained gradient descent with cross-validation. Partial least squares (PLS) appears to be very popular in many fields, and there are a number of ways of computing it. Distance metrics can expand without limit, and overlap freely. Local models are added based on MSE, I think, and model adding stops when the space is well covered.
  • I think this technique is very powerful - you separate the function evaluation from the error minimization, to avoid the problem of ambiguous causes. Instead, when applying LWPR to the robot, the torques cause the angles and accelerations -> but you invert this relationship: you want to control the torques given a trajectory. Of course, the whole function approximation is stationary in time - the p/v/a is sufficient to describe the state and the required torques. Does the brain work in the same way? Do random things, observe consequences, work in consequence space and invert?? E.g. I contracted my bicep and it caused my hand to move to my face; now I want my hand to move to my face again - what caused that? Need reverse memory... or something. Hmm. Let's go back to conditional learning: if an animal does an action, and subsequently it is rewarded, it will do that action again. If this is conditional on a need, then that action will be performed only when needed; when habitual, the action will be performed no matter what. This is the nature of all animals, I think, and corresponds to reinforcement learning? But how? I suppose it's all about memory, and assigning credit where credit is due - the same problem reinforcement learning deals with. And yet things like motor learning seem so far out of this paradigm - they are goal-directed and minimize some sort of error. Eh, not really. Clementine is operating on the conditioned response now - has little in the way of error. But gradually this will be built; with humans, it is built very quickly by reuse of existing modes. Or consciousness.
  • Back to the beginning: you don't have to regress into output space - you can regress into sensory space, and do as much as possible in that sensory space for control. This is very powerful, and the ISO learning people (Porr et al) have effectively discovered this: you minimize in sensory space.
    • Does this abrogate the need for backprop? We are continually causality-inverting machines; we are predictive.
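The local-model idea above can be sketched in a much-simplified batch form (real LWPR is incremental, projects with PLS, and adapts each distance metric online; centers and widths here are just fixed for illustration):

```python
import numpy as np

def rf_activations(x, centers, widths):
    """Gaussian receptive-field activation of each local model at input x."""
    return np.exp(-0.5 * np.sum(((x - centers) / widths) ** 2, axis=1))

def fit_local_models(X, y, centers, widths):
    """Fit one linear model per receptive field by weighted least squares.
    (Batch stand-in for LWPR's incremental PLS updates.)"""
    Xb = np.hstack([X, np.ones((len(X), 1))])        # append a bias column
    betas = []
    for c, s in zip(centers, widths):
        w = np.exp(-0.5 * np.sum(((X - c) / s) ** 2, axis=1))
        A = Xb.T @ (w[:, None] * Xb)                 # weighted normal equations
        b = Xb.T @ (w * y)
        betas.append(np.linalg.solve(A, b))
    return np.array(betas)                           # (K, dim+1)

def predict(x, centers, widths, betas):
    """Blend the local linear predictions, weighted by receptive-field activity."""
    w = rf_activations(x, centers, widths)
    preds = betas @ np.append(x, 1.0)
    return np.sum(w * preds) / (np.sum(w) + 1e-12)
```

The key property is visible even in this toy: each local model only has to be right inside its receptive field, and the blend handles the rest.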

____References____

[0] Vijayakumar S, D'Souza A, Schaal S, Incremental online learning in high dimensions.Neural Comput 17:12, 2602-34 (2005 Dec)

{714}
hide / edit[2] / print
ref: Maass-2002.11 tags: Maass liquid state machine expansion LSM Markram computation cognition date: 12-06-2011 07:17 gmt revision:2 [1] [0] [head]

PMID-12433288[0] Real-time computing without stable states: a new framework for neural computation based on perturbations.

  • It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks.
    • Stable states, e.g. those of Turing machines and attractor-based networks, are not required!
    • How does this compare to Shenoy's result that neuronal dynamics converge to a 'stable' point just before movement?

____References____

[0] Maass W, Natschläger T, Markram H, Real-time computing without stable states: a new framework for neural computation based on perturbations.Neural Comput 14:11, 2531-60 (2002 Nov)

{811}
hide / edit[2] / print
ref: -0 tags: debian squeeze xorg fglrx ATI date: 02-19-2010 01:41 gmt revision:2 [1] [0] [head]

I recently had to replace a video card on my work computer - the Nvidia 8800 died; to replace it, I bought an ATI card, since they are cheaper and higher performance. Unfortunately, ATI/AMD drivers don't play with Linux as well as Nvidia's do: the ATI Linux drivers v10.2, downloaded Feb 18 2010, are incompatible with Xorg 7.5 as included in Debian squeeze (testing). To fix this problem, I had to downgrade the package xserver-xorg and associated dependencies to those in lenny.

Edit /etc/apt/sources.list & add lenny (we have an apt-cacher in the lab, use your appropriate mirror):

deb http://152.16.229.8/ac/ftp.debian.org/debian/ lenny main
deb-src http://152.16.229.8/ac/ftp.debian.org/debian/ lenny main

deb http://152.16.229.8/ac/ftp.debian.org/debian/ squeeze main
deb-src http://152.16.229.8/ac/ftp.debian.org/debian/ squeeze main

deb http://security.debian.org/ squeeze/updates main
deb-src http://security.debian.org/ squeeze/updates main

# debian multimedia
deb http://crispy/ac/www.debian-multimedia.org squeeze main

sudo apt-get update

sudo apt-get install xserver-xorg/stable xserver-xorg-core/stable x11-xkb-utils/stable x11-common/stable xserver-xorg-video-vesa/stable xserver-xorg-input-mouse/stable xserver-xorg-input-kbd/stable

Then download the installer from AMD's site, and use it to make --buildpkg Ubuntu/intrepid packages. You will need to install one dependency, dkms (Dynamic Kernel Module Support, which rebuilds out-of-tree kernel modules automatically when the kernel is updated).

This will result in the following Debian packages:

fglrx-amdcccle_8.702-0ubuntu1_amd64.deb
fglrx-kernel-source_8.702-0ubuntu1_amd64.deb
fglrx-modaliases_8.702-0ubuntu1_amd64.deb
libamdxvba1_8.702-0ubuntu1_amd64.deb
xorg-driver-fglrx_8.702-0ubuntu1_amd64.deb
xorg-driver-fglrx-dev_8.702-0ubuntu1_amd64.deb

which you can then install, the kernel source one first.

Useful for pinning xorg to come from stable: http://jaqque.sbih.org/kplug/apt-pinning.html
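A minimal pinning stanza looks something like this (a sketch - repeat per package you want held at lenny; a priority above 1000 allows apt to downgrade):

```
# /etc/apt/preferences
Package: xserver-xorg-core
Pin: release a=stable
Pin-Priority: 1001
```

With this in place, a plain apt-get install/upgrade keeps the pinned package at the stable version instead of pulling the squeeze one back in.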

{789}
hide / edit[4] / print
ref: work-0 tags: emergent leabra QT neural networks GUI interface date: 10-21-2009 19:02 gmt revision:4 [3] [2] [1] [0] [head]

I've been reading Computational Explorations in Cognitive Neuroscience, and decided to try the code that comes with / is associated with the book. This used to be called "PDP+", but was re-written, and is now called Emergent. It's a rather large program - links to Qt, GSL, Coin3D, Quarter, Open Dynamics Library, and others. The GUI itself seems obtuse and too heavy; it's not clear why they need to make this so customized / paneled / tabbed. Also, it depends on relatively recent versions of each of these libraries - which made the install on my Debian Lenny system a bit of a chore (kinda like windows).

A really strange thing is that programs are stored in tree lists - woah - a natural folding editor built in! I've never seen a programming language that doesn't rely on simple text files. Not a bad idea, but still foreign to me. (But I guess programs are inherently hierarchical anyway.)

Below, a screenshot of the whole program - note they use a Coin3D window to graph things / interact with the model. The colored boxes in each network layer indicate local activations, and they update as the network is trained. I don't mind this interface, but again it seems a bit too 'heavy' for things that are inherently 2D (like 2D network activations and the output plot). It's good for seeing hierarchies, though, like the network model.

All in all it looks like something that could be more easily accomplished with some python (or ocaml), where the language itself is used for customization rather than a GUI. With that approach, you spend more time learning about how networks work, and less time programming GUIs. On the other hand, if you use this program for teaching, or if other people use it a lot, the GUI is essential for debugging your neural networks - maybe then it is worth it ...

In any case, the book is very good. I've learned about GeneRec, which uses different activation phases to compute local errors for the purposes of error-minimization, as well as the virtues of using both Hebbian and error-based learning (like GeneRec). Specifically, the authors show that error-based learning can be rather 'lazy', purely moving down the error gradient, whereas Hebbian learning can internalize some of the correlational structure of the input space. You can look at this internalization as 'weight constraint' which limits the space that error-based learning has to search. Cool idea! Inhibition also is a constraint - one which constrains the network to be sparse.

To use his/their own words:

"... given the explanation above about the network's poor generalization, it should be clear why both Hebbian learning and kWTA (k winner take all) inhibitory competition can improve generalization performance. At the most general level, they constitute additional biases that place important constraints on the learning and the development of representations. More specifically, Hebbian learning constrains the weights to represent the correlational structure of the inputs to a given unit, producing systematic weight patterns (e.g. cleanly separated clusters of strong correlations).

Inhibitory competition helps in two ways. First, it encourages individual units to specialize in representing a subset of items, thus parcelling up the task in a much cleaner and more systematic way than would occur in an otherwise unconstrained network. Second, inhibition greatly restricts the settling dynamics of the network, greatly constraining the number of states the network can settle into, and thus eliminating a large proportion of the attractors that can hijack generalization."
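The phase-based error idea can be sketched as a toy update rule (a hand-simplified flavor of GeneRec, not O'Reilly's exact algorithm; the rate constants and the Hebbian-decay form are my own choices for illustration). The difference between "plus" (outcome-clamped) and "minus" (expectation) phase activity acts as a local error signal, while a small Hebbian term biases the weights toward the input correlations:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def generec_update(W, x, y_minus, y_plus, lrate=0.5, hebb=0.01):
    """One GeneRec-flavored step: error-driven phase difference plus a
    small decaying-Hebbian term (a sketch, not the book's exact rule)."""
    err = np.outer(y_plus - y_minus, x)               # local error signal
    heb = np.outer(y_plus, x) - y_plus[:, None] * W   # Hebb with weight decay
    return W + lrate * err + hebb * heb

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 8)                   # input-layer activity
y_target = np.array([0.9, 0.1, 0.7])       # 'plus' phase (clamped outcome)
W = rng.normal(scale=0.1, size=(3, 8))

errs = []
for _ in range(200):
    y_minus = sigmoid(W @ x)               # 'minus' phase (expectation)
    errs.append(np.abs(y_target - y_minus).sum())
    W = generec_update(W, x, y_minus, y_target)
```

The error-driven part is local - each synapse only needs the pre-synaptic activity and its own unit's two phases - which is the appeal over backprop's non-local error transport.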

{787}
hide / edit[1] / print
ref: life-0 tags: IQ intelligence Flynn effect genetics facebook social utopia data machine learning date: 10-02-2009 14:19 gmt revision:1 [0] [head]

src

My theory on the Flynn effect - human intelligence IS increasing, and this is NOT stopping. Look at it from an ML perspective: there is more free time to get data, the data (and world) has almost unlimited complexity, the data is much higher quality and much easier to get (the vast internet & world! (travel)), and there is (hopefully) more fuel to process that data (food!). Therefore, we are getting more complex, sophisticated, and intelligent. Also, the idea that less-intelligent people having more kids will somehow 'dilute' our genetic IQ is bullshit - intelligence is mostly a product of environment and education, and is tailored to the tasks we need to do; it is not (or only very weakly, except at the extremes) tied to the wetware. Besides, things are changing far too fast for genetics to follow.

Regarding social media like facebook and others, you could posit that social intelligence is increasing, by arguments similar to the above: social data is seemingly more prevalent, more available, and people spend more time examining it. Yet this feels a weaker argument, as people have always been socializing, talking, etc., and I'm not sure any of these social media have really increased it. Regardless, people enjoy it - that's the important part.

My utopia for today :-)

{764}
hide / edit[2] / print
ref: work-0 tags: ocaml mysql programming functional date: 07-03-2009 19:16 gmt revision:2 [1] [0] [head]

For my work I store a lot of analyzed data in SQL databases. In one of these, I have stored the anatomical target that the data was recorded from - namely, STN or VIM thalamus. After updating the analysis programs, I needed to copy the anatomical target data over to the new SQL tables. Where perl may have been my previous go-to language for this task, I've had enough of its strange quirks, hence decided to try it in Ruby (worked, but was not so elegant, as I don't actually know Ruby!) and then Ocaml.

ocaml
#use "topfind"
#require "mysql"

(* this function takes a query and a function that converts entries 
in a row to Ocaml tuples *)
let read_table db query rowfunc =
	let r = Mysql.exec db query in
	let col = Mysql.column r in
	let rec loop = function
		| None      -> []
		| Some x    -> rowfunc col x :: loop (Mysql.fetch r)
	in
	loop (Mysql.fetch r)
	;;
	

let _ = 
	let db = Mysql.quick_connect ~host:"crispy" ~database:"turner" ~password:"" ~user:"" () in
	let nn = Mysql.not_null in
	(* this function builds a table of files (recording sessions) from a given target, then 
	uses the mysql UPDATE command to propagate to the new SQL database. *)
	let propagate targ = 
		let t = read_table db 
			("SELECT file, COUNT(file) FROM `xcor2` WHERE target='"^targ^"' GROUP BY file")
			(fun col row -> (
				nn Mysql.str2ml (col ~key:"file" ~row), 
				nn Mysql.int2ml (col ~key:"COUNT(file)" ~row) )
			)
		in
		List.iter (fun (fname,_) -> 
			let query = "UPDATE `xcor3` SET `target`='"^targ^
				"' WHERE STRCMP(`file`,'"^fname^"')=0" in
			print_endline query ;
			ignore( Mysql.exec db query )
		) t ;
	in
	propagate "STN" ; 
	propagate "VIM" ; 
	propagate "CTX" ; 
	Mysql.disconnect db ;;

Interacting with MySQL is quite easy with Ocaml - though the type system adds a certain overhead, it's not too bad.

{728}
hide / edit[1] / print
ref: -0 tags: john F kennedy quote opinion thought lie myth date: 04-14-2009 21:13 gmt revision:1 [0] [head]

For the great enemy of truth is very often not the lie—deliberate, contrived, and dishonest—but the myth— persistent, persuasive, and unrealistic. Too often we hold fast to the clichés of our forbears. We subject all facts to a prefabricated set of interpretations. We enjoy the comfort of opinion without the discomfort of thought.

—John F. Kennedy

{707}
hide / edit[1] / print
ref: Maquet-2001.11 tags: sleep learning memory Maquet date: 03-20-2009 18:38 gmt revision:1 [0] [head]

PMID-11691982[0] The Role of Sleep in Learning and Memory

  • 8 years ago; presumably much has changed?
  • NREM = SWS; REM = PS (paradoxical sleep)
  • nice table in there! looks as though he was careful in background research on this one; plenty of references.
  • "indeed, stress can also lead to an increase in REM sleep." -- but this may only be related to the presence of new material.
    • however, there is no increase in REM sleep if there is no material to learn.
  • reminder that theta rhythm is seen in the hippocampus in both exploratory activity and in REM sleep.
    • anticipated the presence of replay in the hippocampus
  • spindles allow the entry of Ca+2, which facilitates LTP (?).
  • I should check up on songbird learning (mentioned in the review!).
    • Young zebra finches have to establish the correspondence between vocal production (motor output) and the resulting auditory feedback (sensory).
    • This cannot be done during waking because the bird song arises from tightly time-coded sequence of activity; during sleep, however, motor output can be compared to sensory feedback (so as to capture an inverse model?)
  • PGO (ponto-geniculo-occipital) waves occur immediately before REM sleep. PGO waves are more common in rats after aversive training.
  • ACh increases cortical plasticity in adult mammals; REM sleep is characterized by a high level of ACh and 5-HT (serotonin).
---
  • sleep may not be necessary for recall-based learning; it just may be a good time for it. Sharp waves and ripples are observed in both quiet waking and SWS.
  • Learning to reach in a force field is consolidated in 5 hours after training. [1]
  • Again mentions the fact that antidepressant drugs, which drastically reduce the amount of REM sleep, do not adversely affect memory.

____References____

[0] Maquet P, The role of sleep in learning and memory.Science 294:5544, 1048-52 (2001 Nov 2)
[1] Shadmehr R, Brashers-Krug T, Functional stages in the formation of human long-term motor memory.J Neurosci 17:1, 409-19 (1997 Jan 1)

{636}
hide / edit[1] / print
ref: Karni-1998.02 tags: motor learning skill acquisition fMRI date: 10-08-2008 21:05 gmt revision:1 [0] [head]

PMID-9448252[0] The acquisition of skilled motor performance: Fast and slow experience-driven changes in primary motor cortex

  • a few minutes of daily practice on a sequential finger opposition task induced large, incremental performance gains over a few weeks of training
  • performance was lateralized
  • limited training experience can be sufficient to trigger performance gains that require time to become evident.
  • learning is characterized by two stages:
    • "fast” learning, an initial, within-session improvement phase, followed by a period of consolidation of several hours duration
      • possibly this is due to synaptic plasticity.
    • and then “slow” learning, consisting of delayed, incremental gains in performance emerging after continued practice
      • In many instances, most gains in performance evolved in a latent manner not during, but rather a minimum of 6–8 hr after training, that is, between sessions
      • this is thought to correspond to the reorganization of M1 & other cortical structures.
  • long-term training results in highly specific skilled motor performance, paralleled by the emergence of a specific, more extensive representation of a trained sequence of movements in the contralateral primary motor cortex. this is seen when imaging for activation using fMRI.
  • Why is there such a marked difference between declarative learning, which often takes only one presentation, and procedural memory, which takes several sessions to learn? Hypothetically, they require different neural substrates.
  • pretty good series of references...

____References____

[0] Karni A, Meyer G, Rey-Hipolito C, Jezzard P, Adams MM, Turner R, Ungerleider LG, The acquisition of skilled motor performance: fast and slow experience-driven changes in primary motor cortex.Proc Natl Acad Sci U S A 95:3, 861-8 (1998 Feb 3)

{604}
hide / edit[1] / print
ref: Pastalkova-2008.09 tags: hippocampus Buzsaki sequences date: 09-22-2008 21:25 gmt revision:1 [0] [head]

PMID-18772431[0] Internally generated cell assembly sequences in the rat hippocampus.

  • The task was unique: the rats had to run in a wheel for 10-20 seconds before choosing the left or right arms of a figure-8 maze. The rats were rewarded with water if they alternated arm choice.
  • Looked at the activity of pyramidal cells - many of them place cells as well as episode-cells - in the hippocampus, and found that the pattern of firing per neuron was predictable, and predictive of which choice the rat would take after running in the wheel.
  • The same pattern of hippocampal firing was not found in a control running task (one that did not require a choice).
  • The pattern of firing was phase locked to the theta oscillations in the hippocampus; this phase relationship gradually advanced during the course of trials.
  • During the wheel running, there seemed to be a series of delayed firing bursts by the hippocampal neurons.

____References____

[0] Pastalkova E, Itskov V, Amarasingham A, Buzsáki G, Internally generated cell assembly sequences in the rat hippocampus.Science 321:5894, 1322-7 (2008 Sep 5)

{478}
hide / edit[2] / print
ref: bookmark-0 tags: ECG wireless nordic quasar date: 12-07-2007 21:13 gmt revision:2 [1] [0] [head]

{493}
hide / edit[0] / print
ref: Clancy-2007.09 tags: EMG channel equalization filter date: 11-11-2007 05:04 gmt revision:0 [head]

PMID-17614134[0] Equalization filters for multiple-channel electromyogram arrays.

  • idea: use digital filtering to equalize each electrode channel (as in communication systems), so that the subsequent digital common-mode rejection across the array actually cancels the common interference.
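The equalization step can be sketched as a least-squares FIR fit - find taps that map one channel's response onto a reference so their common component subtracts out. This is only the generic idea; the channel model, tap count, and signals below are made up, not the paper's:

```python
import numpy as np

def fit_equalizer(x, d, ntaps=16):
    """Least-squares FIR equalizer: find taps h such that conv(x, h) ≈ d."""
    N = len(x)
    X = np.zeros((N, ntaps))          # convolution matrix: X[n, k] = x[n - k]
    for k in range(ntaps):
        X[k:, k] = x[:N - k]
    h, *_ = np.linalg.lstsq(X, d, rcond=None)
    return h

# toy demo: one channel is a filtered, attenuated copy of the reference channel
rng = np.random.default_rng(1)
ref = rng.normal(size=2000)                             # reference channel
ch = 0.5 * np.convolve(ref, [1.0, 0.3, -0.1])[:2000]    # mismatched channel
h = fit_equalizer(ch, ref)
ch_eq = np.convolve(ch, h)[:2000]                       # equalized channel
```

After equalization the two channels match closely, so subtracting them (common-mode rejection) removes shared interference instead of leaving a filtered residue.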

____References____

{463}
hide / edit[1] / print
ref: bookmark-0 tags: quotes Helen Keller teaching education date: 10-09-2007 17:34 gmt revision:1 [0] [head]

http://www.ntlf.com/html/lib/quotes.htm

  • Only some 12% of a national sample of almost 400,000 teachers received less than average ratings from students. John Centra (heh!)

{459}
hide / edit[0] / print
ref: bookmark-0 tags: quotes wisdom economist date: 10-08-2007 03:05 gmt revision:0 [head]

http://www.optimist123.com/optimist/c1_words_of_wisdom/index.html

{441}
hide / edit[0] / print
ref: notes-0 tags: DSP filter quantize lowpass elliptic matlab date: 09-02-2007 15:20 gmt revision:0 [head]

So, in order to measure how quantizing filter coefficients affects filter response, I quantized the coefficients of an 8th-order bandpass filter designed with:

[B1, A1] = ellip(4,0.8,70, [600/31.25e3 6/31.25]);
Here is a function that quantizes & un-quantizes the filter coefficients, then compares the frequency responses:
function [Bq, Aq, Bcoef, Acoef] = filter_quantize(B, A) 
% quantize filter coefficients & un-quantize so as to get some idea of
% the *actual* fixed-point filter performance. 
% assume that everything is broken into biquads. 
base = 10; 
Aroots = roots(A); 
Broots = roots(B); 
order = length(Aroots)/2; % the number of biquads.
scale = B(1).^(1/order); % distribute the gain across the biquads. 
for o = 0:order-1
	Acoef_biquad(o+1, :) = poly(Aroots(o*2+1 : o*2+2));
	Bcoef_biquad(o+1, :) = poly(Broots(o*2+1 : o*2+2))*scale; 
end
Bcoef = round(Bcoef_biquad .* 2^base); 
Acoef = round(Acoef_biquad .* 2^base); 
% now, reverse the process. 
Bq2 = Bcoef ./ 2^base; 
Aq2 = Acoef ./ 2^base; 
for o = 0:order-1
	Arootsq(o*2+1: o*2+2) = roots(Aq2(o+1, :)); 
	Brootsq(o*2+1: o*2+2) = roots(Bq2(o+1, :)); 
end
Aq = poly(Arootsq); 
Bq = poly(Brootsq).*B(1); 
[H, W] = freqz(B, A); 
[Hq, Wq] = freqz(Bq, Aq); 
figure
plot(W, db(abs(H)), 'b')
hold on
plot(W, db(abs(Hq)), 'r')
axis([0 pi -100 0])
The result: high frequency is not much affected

but low frequency is strongly affected.

But this is at a quantization to 10 bits - quantization to 15 bits leads to reasonably good performance. I'm not sure if this conclusively indicates / counterindicates downsampling prior to highpassing for my application, but I would say that it does: if you downsample by 2, the highpass cutoff frequency becomes 2x larger relative to the sample rate, hence the filter will be less sensitive to quantization errors, which mostly affect low frequencies.

{382}
hide / edit[1] / print
ref: notes-0 tags: keyboard frogpad layout qwerty date: 05-31-2007 00:52 gmt revision:1 [0] [head]

a while ago I made a custom keyboard for myself - something like the frogPad chording keyboard, but more suitable for programming. Here is the image i made for myself to learn the layout.

Upon testing, however, it seems that those scribbly marks on the paper had some import - this is the present layout, as re-drawn in inkscape. Presumably this second iteration is better?

{381}
hide / edit[2] / print
ref: notes-0 tags: low-power microprocessor design techniques ieee DSP date: 05-29-2007 03:30 gmt revision:2 [1] [0] [head]

http://hardm.ath.cx:88/pdf/lowpowermicrocontrollers.pdf

also see IBM's eLite DSP project.

{274}
hide / edit[0] / print
ref: bookmark-0 tags: DARPA projects quantum electron spin date: 04-04-2007 20:39 gmt revision:0 [head]

http://www.darpa.mil/DSO/trans/transit.htm

{256}
hide / edit[2] / print
ref: math-0 tags: partial least squares PLS regression thesis italy date: 03-26-2007 16:48 gmt revision:2 [1] [0] [head]

http://www.fedoa.unina.it/593/

  • pdf does not seem to open in linux? no, doesn't open on windows either - the Pdf is screwed up!
  • here is a published version of his work.

{217}
hide / edit[2] / print
ref: notes-0 tags: mysql join date: 02-17-2007 18:13 gmt revision:2 [1] [0] [head]

you can join two tables by matching their entries!!

SELECT * FROM `eval` LEFT JOIN (`infoT` ) ON (infoT.file=eval.file AND infoT.chan=eval.chan AND infoT.unit=eval.unit AND infoT.section=eval.section) WHERE infoT.maxinfo/infoshuf > 20 ORDER BY eval.eval DESC

{98}
hide / edit[0] / print
ref: bookmark-0 tags: unscented kalman filter square-root Merwe date: 0-0-2007 0:0 revision:0 [head]

http://hardm.ath.cx/pdf/unscentedKalmanFilter.pdf -- the square root transform. contains a nice tabulation of the original algorithm, which is what I use.

http://hardm.ath.cx/pdf/unscentedKalmanFilter2000.pdf -- the original, with examples of state, parameter, and dual estimation

http://en.wikipedia.org/wiki/Kalman_filter -- wikipedia page, also has the unscented kalman filter

http://www.cs.unc.edu/~welch/kalman/media/pdf/Julier1997_SPIE_KF.pdf - Julier and Ulhmann's original paper. a bit brief.

http://www.cs.ubc.ca/~murphyk/Papers/Julier_Uhlmann_mar04.pdf -- Julier and Ulhmann's invited paper, quite excellent.
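For reference, the heart of the UKF - Julier & Uhlmann's unscented transform - fits in a few lines (the basic construction, not the square-root variant tabulated in the van der Merwe paper; scaling parameters are the conventional defaults):

```python
import numpy as np

def unscented_transform(mu, P, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate mean mu and covariance P through a nonlinear function f
    using 2n+1 deterministically chosen sigma points."""
    n = len(mu)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)             # columns = sigma offsets
    sigmas = np.vstack([mu, mu + S.T, mu - S.T])      # (2n+1, n) points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))    # mean weights
    wc = wm.copy()                                    # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha ** 2 + beta)
    Y = np.array([f(s) for s in sigmas])              # push points through f
    ymu = wm @ Y
    d = Y - ymu
    yP = (wc[:, None] * d).T @ d
    return ymu, yP
```

For a linear f the transform is exact (same mean and covariance as the analytic propagation), which is a quick sanity check on any implementation.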

{103}
hide / edit[0] / print
ref: bookmark-0 tags: Shadmehr torque forces jacobian date: 0-0-2007 0:0 revision:0 [head]

The Computational Neurobiology of Reaching and Pointing - online notes

{107}
hide / edit[0] / print
ref: notes-0 tags: SQL kinarm count date: 0-0-2007 0:0 revision:0 [head]

SELECT file, COUNT(file) FROM info2 WHERE unit>1 AND maxinfo/infoshuf > 10 AND analog < 5 GROUP BY file ORDER BY COUNT(file) DESC

to count the number of files matching the criteria.. and get aggregate frequentist statistics.

{139}
hide / edit[0] / print
ref: Schaal-1998.11 tags: schaal local learning PLS partial least squares function approximation date: 0-0-2007 0:0 revision:0 [head]

PMID-9804671 Constructive incremental learning from only local information

{18}
hide / edit[0] / print
ref: notes-0 tags: SQL fulltext search example date: 0-0-2006 0:0 revision:0 [head]

SELECT * FROM `base` WHERE MATCH(`From`, `To`) AGAINST('hanson') ORDER BY `Date` DESC Limit 0, 100

  • You need to have a fulltext index on the column set provided as a parameter to the MATCH() keyword. Case does not matter so long as the collation is correct.

{30}
hide / edit[0] / print
ref: notes-0 tags: matlab mysql hack date: 0-0-2006 0:0 revision:0 [head]

LD_PRELOAD=/lib/libgcc_s.so.1 matlab

This allows you to call mysql (or other programs that use Linux's standard libgcc) from the Matlab command line.

{34}
hide / edit[0] / print
ref: bookmark-0 tags: linear_algebra solution simultaneous_equations GPGPU GPU LUdecomposition clever date: 0-0-2006 0:0 revision:0 [head]

http://gamma.cs.unc.edu/LU-GPU/lugpu05.pdf