PMID29123069 A neural algorithm for a fundamental computing problem
 General idea: locality-sensitive hashing, i.e. hashing that preserves the high-dimensional locality of the input space, can be implemented efficiently by a circuit inspired by the insect olfactory system.
 Here, activation of 50 different types of ORNs is mapped to 50 projection neurons, which 'centers the mean', removing concentration dependence.
 This is then projected via a random matrix of sparse binary weights to a much larger set of Kenyon cells, which in turn are inhibited by one APL neuron.
 Conventional locality-sensitive hashing uses dense matrices of Gaussian-distributed random weights, which means higher computational cost...
 ... these projections are governed by the Johnson-Lindenstrauss lemma, which says that projection from high-d to low-d space can preserve locality (distances between points) within an error bound.
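 The Johnson-Lindenstrauss behavior is easy to check numerically. A minimal sketch, with illustrative dimensions chosen here (not from the paper): project 50 random points from 1000-d down to 100-d with a dense Gaussian matrix and compare pairwise distances before and after.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions): 50 points, 1000-d input, 100-d output.
n, d_in, d_out = 50, 1000, 100
X = rng.normal(size=(n, d_in))

# Dense Gaussian projection, scaled so distances are preserved in expectation.
W = rng.normal(size=(d_in, d_out)) / np.sqrt(d_out)
Y = X @ W

def pairwise(A):
    # Full matrix of Euclidean distances between rows of A.
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

mask = ~np.eye(n, dtype=bool)                  # ignore zero self-distances
ratios = pairwise(Y)[mask] / pairwise(X)[mask]
# ratios cluster tightly around 1: pairwise distances survive the projection
```

 The spread of `ratios` around 1 shrinks as `d_out` grows, which is the lemma's distance-distortion bound in action.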
 They show that winner-take-all (WTA) selection of the top 5% of Kenyon cells, combined with the random binary weights, preserves locality as measured by overlap with exact nearest neighbors in the input space, on toy data sets including MNIST and SIFT.
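 The whole pipeline (mean-centering, sparse binary expansion, WTA) fits in a few lines. A minimal sketch under assumed parameters: the 50-input / 2000-Kenyon-cell / top-5% scale follows the notes above, while the 6 inputs sampled per Kenyon cell is an illustrative choice, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d_pn, d_kc = 50, 2000                 # 50 projection neurons, 2000 Kenyon cells
k_active = d_kc // 20                 # APL-style WTA keeps the top 5% (100 cells)

# Sparse binary random projection: each Kenyon cell sums a few random PNs.
n_per_kc = 6                          # inputs per Kenyon cell (assumption)
W = np.zeros((d_pn, d_kc))
for j in range(d_kc):
    W[rng.choice(d_pn, size=n_per_kc, replace=False), j] = 1.0

def fly_hash(x):
    x = x - x.mean()                  # 'center the mean': drop concentration dependence
    y = x @ W                         # expand into the Kenyon-cell layer
    tag = np.zeros(d_kc, dtype=bool)  # WTA: keep only the top 5% of activations
    tag[np.argsort(y)[-k_active:]] = True
    return tag

# Locality check: similar inputs share many active cells, dissimilar ones few.
x = rng.random(d_pn)
x_near = x + 0.05 * rng.normal(size=d_pn)
x_far = rng.random(d_pn)
overlap_near = int((fly_hash(x) & fly_hash(x_near)).sum())
overlap_far = int((fly_hash(x) & fly_hash(x_far)).sum())
```

 For two unrelated inputs the expected tag overlap is only about k_active^2 / d_kc = 5 cells, so a lightly perturbed input stands out by its much larger overlap.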
 Flashy title as much as anything else got this into Science... indeed, it has only been cited 6 times in PubMed.

