ref: -0 tags: kernel regression structure discovery fitting gaussian process date: 09-24-2018 22:09 gmt revision:1 [0] [head]

Structure discovery in Nonparametric Regression through Compositional Kernel Search

  • Use Gaussian process kernels (squared exponential, periodic, linear, and rational quadratic) to model a kernel function k(x,x'), which specifies how similar or correlated outputs y and y' are expected to be at two points x and x'.
    • By defining the measure of similarity between inputs, the kernel determines the pattern of inductive generalization.
    • This is different from modeling the mapping y = f(x) directly.
    • It's something more like y' ~ N(m(x'), k(x,x')) -- check the appendix.
    • See also: http://rsta.royalsocietypublishing.org/content/371/1984/20110550
  • Gaussian process models use a kernel to define the covariance between any two function values: Cov(y, y') = k(x, x').
  • This kernel family is closed under addition and multiplication, and provides an interpretable structure.
  • Search for kernel structure greedily & compositionally,
    • then optimize parameters with conjugate gradients with restarts.
    • This seems straightforwardly intuitive...
  • Kernels are scored with the BIC.
  • C.f. {842} -- "Because we learn expressions describing the covariance structure rather than the functions themselves, we are able to capture structure which does not have a simple parametric form."
  • All their figure examples are 1-D time-series, which is kinda boring, but makes sense for creating figures.
    • Tested on multidimensional (d=4) synthetic data too.
    • Not sure how they back out modeling the covariance into actual predictions -- just draw (integrate) from the distribution?
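The pieces above -- the four base kernels, closure under + and ×, prediction from the covariance, and BIC scoring -- can be sketched in numpy. All hyperparameter values, the Per + Lin example, and the parameter count of 5 in the BIC line are illustrative assumptions, not values from the paper. The prediction step is the standard GP conditional: condition the joint Gaussian on the training data, so the posterior mean at test points is Ks @ K^-1 @ y.

```python
import numpy as np

# Base kernels from the paper's search grammar: squared exponential (SE),
# periodic (Per), linear (Lin), rational quadratic (RQ).
# Hyperparameter defaults here are arbitrary illustrative choices.
def k_se(x, y, ell=1.0, sf=1.0):
    return sf**2 * np.exp(-0.5 * (x - y)**2 / ell**2)

def k_per(x, y, ell=1.0, p=2 * np.pi, sf=1.0):
    return sf**2 * np.exp(-2.0 * np.sin(np.pi * np.abs(x - y) / p)**2 / ell**2)

def k_lin(x, y, sf=0.1):
    return sf**2 * x * y

def k_rq(x, y, ell=1.0, alpha=1.0, sf=1.0):
    return sf**2 * (1.0 + (x - y)**2 / (2 * alpha * ell**2))**(-alpha)

# Closure under addition and multiplication: composites are built pointwise,
# which is what makes the greedy compositional search possible.
def k_sum(k1, k2):
    return lambda x, y: k1(x, y) + k2(x, y)

def k_prod(k1, k2):
    return lambda x, y: k1(x, y) * k2(x, y)

def gram(k, xs, ys):
    return np.array([[k(a, b) for b in ys] for a in xs])

# Backing predictions out of the covariance model: condition the joint
# Gaussian on (x_train, y_train); posterior mean = Ks @ K^-1 @ y
# (zero prior mean assumed for simplicity).
def gp_predict(k, x_train, y_train, x_test, noise=1e-2):
    K = gram(k, x_train, x_train) + noise * np.eye(len(x_train))
    Ks = gram(k, x_test, x_train)
    return Ks @ np.linalg.solve(K, y_train)

# Log marginal likelihood, for scoring a candidate structure with the BIC:
# BIC = -2 * log_ml + (#hyperparameters) * log(n).
def log_marginal_likelihood(k, x, y, noise=1e-2):
    K = gram(k, x, x) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ a - np.log(np.diag(L)).sum() - 0.5 * len(x) * np.log(2 * np.pi)

x_train = np.linspace(0.0, 6.0, 30)
y_train = np.sin(x_train) + 0.3 * x_train      # periodic signal + linear trend
k = k_sum(k_per, k_lin)                        # composite hypothesis: Per + Lin
print(gp_predict(k, x_train, y_train, np.array([1.0, 3.0, 5.0])))
print(-2 * log_marginal_likelihood(k, x_train, y_train) + 5 * np.log(len(x_train)))
```

In the actual search, candidate expressions like Per + Lin would be proposed greedily, their hyperparameters optimized by conjugate gradients with restarts, and the BIC used to compare structures of different complexity.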

ref: notes-0 tags: Gladwell talent narcissism management structure business date: 11-19-2009 06:02 gmt revision:1 [0] [head]

http://www.gladwell.com/pdf/talent.pdf -- From 2002. Old but excellent. Structure is required to achieve broad, slow-to-ROI projects. (It's almost common sense when expressed this way!)

ref: Wagner-2004.01 tags: sleep insight mental restructure integration synthesis consolidation date: 03-20-2009 21:31 gmt revision:1 [0] [head]

PMID-14737168[0] Sleep Inspires Insight.

  • Subjects performed a cognitive task requiring the learning of stimulus–response sequences, in which they improved gradually by increasing response speed across task blocks. However, they could also improve abruptly after gaining insight into a hidden abstract rule underlying all sequences.
    • number reduction task -- short sequences of the digits 1, 4, and 9, with a simple comparison rule used to generate a derived number sequence; the task was to determine the last number of that derived sequence; this number was always the same as the second number.
  • This abstract rule was more likely to be learned after 8 hours of sleep as compared to 8 hours of wakefulness.
  • My thoughts: replay during sleep allows synchronous replay of cortical activity seen during the day (presumably from the hippocampus to the neocortex), replay which is critical for linking the second number with the last (response) number. This is a process of integration: merging present memories with existing memories / structure. The difference in time here is not as long as it could be .. presumably it goes back to anything in your cortex that is activated by the hippocampal memories. In this way we build up semi-consistent integrated maps of the world. Possibly these things occur during dreams, and the weird events/thoughts/sensations are your brain trying to smooth and merge/infer things about the world.
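A minimal sketch of the number reduction task's comparison rule as described above ("same digits -> that digit; different digits -> the remaining third digit"). The example sequence is constructed here for illustration and is not one of the paper's actual stimuli.

```python
# Number reduction task (NRT) sketch -- digits drawn from {1, 4, 9}.
DIGITS = {1, 4, 9}

def compare(a, b):
    """Same digits -> that digit; different digits -> the third digit."""
    return a if a == b else (DIGITS - {a, b}).pop()

def responses(seq):
    """Fold the comparison rule left to right over the stimulus sequence."""
    out = [compare(seq[0], seq[1])]
    for d in seq[2:]:
        out.append(compare(out[-1], d))
    return out

r = responses([1, 4, 9, 4, 9, 4, 9, 4])
print(r)                  # -> [9, 9, 1, 4, 4, 1, 9]
print(r[-1] == r[1])      # hidden rule: last response equals the second -> True
```

Subjects report only the last response; noticing that it always equals the second response lets them answer immediately, and that insight is what sleep made more likely.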


[0] Wagner U, Gais S, Haider H, Verleger R, Born J. Sleep inspires insight. Nature 427(6972):352-5 (2004 Jan 22)

ref: bookmark-0 tags: plexon documentation data file structure reading plx date: 0-0-2006 0:0 revision:0 [head]