m8ta
{1577}
ref: -2008 tags: sketch armando solar lezema inductive program synthesis date: 11-11-2023 20:08 gmt revision:5

Sketch - Program synthesis by sketching

  • Sketch consists of a C-like language with additional primitives: integer-valued holes ??, reorder, generators (macro-like constructs whose holes can vary per use), assert, repeat (also a variable-hole parameterized macro), regular-expression generators for selecting among substitute programming primitives, and atomics for concurrency-focused synthesis (fork and compare-and-swap).
    • The other common programming primitives are there, too - if, else, while, procedures, arrays.
      • Loops, arrays, and recursion are limited to a fixed size to constrain the search space.
  • Sketch is inductive - working solutions are derived from the skeleton provided by the programmer + counterexamples provided by a verification function.
    • This contrasts with previous work, which focused on deductive synthesis: programs are derived from a constructive proof of the satisfiability of a problem.
      • This 'proved' challenging in practice -- programmers are much better versed in construction than in abstract mathematical derivation.
    • "By giving up the deductive approach, we lose correctness by construction, and we are forced to rely on an external validation procedure."
  • So they use CEGIS: CounterExample Guided Inductive Synthesis.
    • For most sketches, only a small set of counterexamples are needed to fully constrain the solution.
    • The counterexamples are fed to the inductive synthesizer so that the next candidate will be guaranteed to work correctly for all.
    • Each counterexample typically reduces the search space by a factor of ~2^15 -- a lot of information per example -- so only a handful of validation calls are usually needed, even when the solution space is very large.
      • E.g. 40 iterations when the candidate space is 2^600
        • Illuminating that computers can so readily handle such large spaces with perfect fidelity!
    • Counterexamples are much easier to find than an abstract model (as in the deductive approach)
  • To do the synthesis, the sketch is expanded to a 'control': a set of variables, holes and calling contexts.
  • This control includes parameterized values that track the value of each variable, along with operators that define their relation to other variables, holes, and calling contexts.
    • The denotation function translates any expression to a function from a state to a parameterized value.
  • The program sketch is parsed (partially evaluated) to iteratively update σ, Φ, where σ is a mapping from variable names to parameterized values, and Φ is the set of control functions (holes and calling contexts).
    • With control flow like while and if, the set of parameterized values σ (with calling context) and control functions Φ are unrolled (to a limit) and unioned over the two branches (if & else).
    • Similarly for procedures and generators -- the iterative state σ, Φ is updated with additional calling context τ (and bounded recursion / bounded evaluation! - k is a hyperparameter)
    • Assertions and "ifs" are modeled as intersections on the iterative state.
      • In this way, the semantics can model the behavior of all candidate solutions simultaneously -- e.g. Φ is the set of all possible assignments to the holes.
  • (Chapter 5) In practice: control is represented in an intermediate language, itself represented as a DAG.
    • Several optimizations are applied to make the DAG simpler, e.g. structural hashing to remove common sub-trees, and constant propagation (which may be redundant with internal SAT rules).
    • There seems to be a tension between Chapter 3 (abstract denotational semantics) and Chapter 5 (concrete DAG manipulations); how exactly they relate could have been spelled out better.
      • I.e.: how do the context, parameterized values, and control functions directly lead to the DAG??
    • I infer: the DAG seems to be a representation of the data & code flow graph that makes explicit the relationships between variables, holes, and control structures. (this is never mentioned, but seems to be the meaning of 'denotation'.)
  • The denotation function (compilation or partial evaluation) only needs to be computed once -- subsequent CEGIS loop evaluations only need to update the SAT problem from the counter-examples.
  • Integers are represented via mutually-exclusive & ordered guard bits: if a bit is 1, then the hole or variable evaluates to that integer. Aka one-hot encoding. This permits translation from DAG to CNF.
  • CEGIS starts by finding a solution = integer assignment of holes & permutations that make the SAT problem evaluate to True.
  • Based on this control vector, it then looks for a solution (value of the input variables) to the SAT problem that evaluates as False.
    • Clever re-use of the SAT solver & representation!
  • This counter-example is used to improve the SAT problem = add clauses (the system must evaluate to true on all input-output counter-examples) = remove hypotheses from Φ.
  • Did not delve much into the concurrency-related chapters (7-9) or the Stencil scientific computing chapters (10-11).
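The one-hot guard-bit encoding of integers can be illustrated with a toy CNF clause generator. This is my own sketch of the idea, not the thesis's actual encoder; the DIMACS-style variable numbering is hypothetical:

```python
def one_hot_clauses(var_ids):
    """CNF clauses forcing exactly one guard bit to be true.

    var_ids: positive DIMACS-style variable numbers, one per integer
    value the hole can take.  Returns a list of clauses, each a list
    of literals (negative = negated).
    """
    clauses = [list(var_ids)]  # at-least-one: v1 OR v2 OR ...
    for i in range(len(var_ids)):
        for j in range(i + 1, len(var_ids)):
            # at-most-one: for every pair, NOT vi OR NOT vj
            clauses.append([-var_ids[i], -var_ids[j]])
    return clauses
```

With this encoding, a hole that ranges over k integers costs k boolean variables plus O(k^2) pairwise-exclusion clauses, and "the hole equals value i" is just "guard bit i is true" -- which is what lets the DAG translate directly to CNF.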

The essential algorithm, in words:

Take the sketch, expand it to a set of parameterized variables, holes, and calling contexts. Convert these to a DAG aka (?) data-code flow graph w/ dependencies. Try to simplify the DAG, one-hot encode integers, and convert to either a conjunctive-normal-form (CNF) SAT problem for MiniSat, or to a boolean circuit for the ABC solver.

Apply MiniSat or ABC to the problem to select a set of control values = values for the holes & permutations that satisfy the boolean constraints. Using this solution, use the SAT solver to find an input-variable configuration that does not satisfy the problem. This serves as a counter-example. Run it through the validator function (oracle) to see what it does; use the counter-example (inputs and outputs) to add clauses to the SAT problem. Repeat until either no counter-examples can be found or the problem is `unsat`.
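The CEGIS loop itself fits in a few lines. Below is a toy version with brute-force search standing in for the SAT solver; the sketch/spec pair (a right-shift with one integer hole, specified against integer division) is my own illustration, not an example from the thesis:

```python
def sketch(x, h):
    return x >> h          # candidate program with one integer hole h

def spec(x):
    return x // 8          # oracle: the behavior we want

def synthesize(examples, hole_space):
    # inductive step: brute-force stand-in for the SAT solver --
    # find any hole value consistent with all counterexamples so far
    for h in hole_space:
        if all(sketch(x, h) == spec(x) for x in examples):
            return h
    return None            # unsat: no hole value is consistent

def verify(h, input_space):
    # verification step: return a counterexample input, or None
    for x in input_space:
        if sketch(x, h) != spec(x):
            return x
    return None

def cegis(hole_space, input_space):
    examples = []
    while True:
        h = synthesize(examples, hole_space)
        if h is None:
            return None    # no candidate survives the examples
        cx = verify(h, input_space)
        if cx is None:
            return h       # correct on the whole (bounded) domain
        examples.append(cx)  # each counterexample prunes candidates
```

`cegis(range(8), range(64))` converges to h = 3 after only a few counterexamples (1, 2, 4), never enumerating the full input space inside the synthesizer -- the same economy of counterexamples the thesis reports at much larger scale.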

Though the thesis describes a system that was academic & relatively small back in 2008, Sketch has enjoyed continuous development, and remains in use. I find the work that went into it remarkable and impressive -- even with incremental improvements, you need an accurate expansion of the language & its manipulations to show proof-of-principle. I'm left wondering what limits its application to even larger problems -- the need for a higher-level loop that further subdivides / factorizes the problem, or DFS for filling out elements of the sketch?

Interesting links discovered while reading the dissertation:

  • Lock-free queues using compare_exchange_weak in C++ (useful!)
  • Karatsuba algorithm for multiplying large n-digit numbers with O(n^1.58) complexity. (Trick: divide and conquer & use algebra to re-use intermediate results via addition and subtraction)
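The Karatsuba trick -- three recursive multiplies instead of four, with the cross term recovered by addition and subtraction -- can be sketched in a few lines (a textbook base-10 version, not optimized):

```python
def karatsuba(x, y):
    """Multiply non-negative integers with ~O(n^1.58) digit operations."""
    if x < 10 or y < 10:
        return x * y                    # base case: single digit
    n = max(len(str(x)), len(str(y))) // 2
    p = 10 ** n
    a, b = divmod(x, p)                 # x = a*p + b
    c, d = divmod(y, p)                 # y = c*p + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # (a+b)(c+d) - ac - bd = ad + bc: the cross term from ONE multiply,
    # re-using the two products already computed
    mid = karatsuba(a + b, c + d) - ac - bd
    return ac * p * p + mid * p + bd
```

The exponent 1.58 is log2(3): each level halves the digit count but spawns only three subproblems instead of the naive four.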

{1301}
ref: -0 tags: polybenzoxazole PBO synthesis zylon date: 10-10-2014 22:40 gmt revision:0

Synthesis and thermal properties of polybenzoxazole from soluble precursors with hydroxy-substituted polyenaminonitrile

  • Process:
    1. purified/distilled reagents
    2. made the CCB, an open-ring soluble precursor
    3. eluted the CCB in water / methanol
    4. thermoset the resulting polymer.
  • No control of molecular weight, nor material properties of the cured film.
  • Resultant film was highly temperature resistant, though.

{892}
ref: -0 tags: automatic programming synthesis loops examples date: 10-03-2011 22:28 gmt revision:1

Open letter proposing some ideas on how to automate programming: simulate a human! Written rather from a neuro background, and rather sketchy (as in vague, not as in the present slang usage).

images/892_1.pdf

{809}
ref: work-0 tags: sine wave synthesis integrator date: 02-03-2010 05:52 gmt revision:1

I learned this in college, but have forgotten all the details - Microcontroller provides an alternative to DDS

freq = √(|F|) / (2πτ), where τ is the sampling period. F ranges from -0.2 to 0.
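As I reconstruct it (the microcontroller code itself isn't given here), the oscillator is a pair of coupled integrators: each sample, x += F·y then y += x. The state-update matrix has determinant 1, so the eigenvalues sit on the unit circle and the output oscillates without decay at roughly √|F| radians per sample -- no sin() calls needed:

```python
def integrator_sine(F, n):
    """Two coupled integrators as a sine oscillator (F in (-0.2, 0)).

    Each step: x += F*y; y += x.  The update matrix [[1, F], [1, 1+F]]
    has det = 1, so the state stays bounded and y traces a sinusoid
    at ~sqrt(|F|) radians per sample.
    """
    x, y = 1.0, 0.0
    out = []
    for _ in range(n):
        x += F * y
        y += x
        out.append(y)
    return out
```

For F = -0.01 the angular step is ~0.1 rad/sample, i.e. a period of about 2π/0.1 ≈ 63 samples; scale by the sample rate to get Hz.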

{713}
ref: Graves-2001.04 tags: sleep memory REM protein synthesis review date: 03-25-2009 15:23 gmt revision:1

PMID-11250009[0] Sleep and memory: a molecular perspective.

  • inhibition of protein synthesis is most effective if it occurs at a time post-training when rapid eye movement (REM) sleep is required for memory consolidation
  • The neurochemical changes that occur across sleep/wake states, especially the cholinergic changes that occur in the hippocampus during REM sleep, might provide a mechanism by which sleep modulates specific cellular signaling pathways involved in hippocampus-dependent memory storage.
    • REM sleep could influence the consolidation of hippocampus-dependent long-term memory if it occurs during windows that are sensitive to cholinergic or serotonergic signaling.
    • PKA activation seems important to hippocampal long-term memory
    • NMDA affects PKA through Ca2+ to adenyl cyclase
    • 5-HT_1A receptor negatively coupled to adenyl cyclase (AC)
    • 5-HT concentrations go down in hippocampus during sleep ?

____References____

[0] Graves L, Pack A, Abel T, Sleep and memory: a molecular perspective. Trends Neurosci 24:4, 237-43 (2001 Apr)

{708}
ref: Wagner-2004.01 tags: sleep insight mental restructure integration synthesis consolidation date: 03-20-2009 21:31 gmt revision:1

PMID-14737168[0] Sleep Inspires Insight.

  • Subjects performed a cognitive task requiring the learning of stimulus–response sequences, in which they improved gradually by increasing response speed across task blocks. However, they could also improve abruptly after gaining insight into a hidden abstract rule underlying all sequences.
    • number reduction task - three digits 1, 4, 9 in a short sequence, with a simple comparison rule used to generate a derived number sequence; the task was to determine the last number of that sequence; this number was always the same as the second number.
  • This abstract rule was more likely to be learned after 8 hours of sleep as compared to 8 hours of wakefulness.
  • My thoughts: replay during sleep allows synchronous replay of cortical activity seen during the day (presumably from the hippocampus to the neocortex), replay which is critical for linking the second number with the last (response) number. This is a process of integration: merging present memories with existing memories / structure. The time difference here is not as long as it could be .. presumably it extends to anything in your cortex that is activated by the hippocampal memories. In this way we build up semi-consistent integrated maps of the world. Possibly these things occur during dreams, and the weird events/thoughts/sensations are your brain trying to smooth and merge/infer things about the world.
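The comparison rule of the number reduction task can be sketched as follows. This is my own toy implementation of the rule (same digits yield that digit; different digits yield the remaining one); the hidden regularity -- the final response equaling the second -- came from how the stimulus strings were constructed, which this sketch does not reproduce:

```python
def combine(a, b):
    # same digits -> that digit; different digits -> the remaining one
    return a if a == b else ({1, 4, 9} - {a, b}).pop()

def responses(digits):
    """Apply the comparison rule left to right over a digit string."""
    r = [combine(digits[0], digits[1])]   # first response: digits 1 & 2
    for d in digits[2:]:
        r.append(combine(r[-1], d))       # then: previous response & next digit
    return r
```

E.g. `responses([1, 1, 4])` gives `[1, 9]`: 1 and 1 agree (→ 1), then 1 vs 4 differ (→ 9).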

____References____

[0] Wagner U, Gais S, Haider H, Verleger R, Born J, Sleep inspires insight. Nature 427:6972, 352-5 (2004 Jan 22)

{565}
ref: Walker-2005.12 tags: algae transfection transformation protein synthesis bioreactor date: 03-21-2008 17:22 gmt revision:1

Microalgae as bioreactors PMID-16136314

{243}
ref: Dzirasa-2006.1 tags: DAT-KO Kafui Nicolelis sleep wake dopamine tyrosine synthesis date: 03-12-2007 01:50 gmt revision:1

PMID-17035544 Dopaminergic control of sleep-wake states.

  • dopaminergic activity is high in REM sleep!! perhaps this is involved in learning?
  • they have a good description of the DAT-KO model, and why it is good for modeling both excessive levels of synaptic dopamine as well as depressed/parkinsonian levels...
  • also at http://hardm.ath.cx:88/pdf/Kafui2006.pdf