LDMNet: Low dimensional manifold regularized neural nets.
 Synopsis of the math:
 Fit a manifold to the concatenated input ‘’and’’ output variables, and use this to set the loss of (and hence train) a deep convolutional neural network.
 Manifold is fit via point integral method.
 This requires alternating between SGD and variational steps: fitting the network parameters, then fitting the manifold.
 Uses a standard deep neural network.
 Measure the dimensionality of this manifold to regularize the network, using an 'elegant trick', whatever that means.
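 As a loose illustration of the idea (not the paper's point integral method), here is a hypothetical sketch: estimate the local intrinsic dimension of the concatenated feature point cloud via PCA on nearest-neighbor patches, and average it into a scalar penalty. The function name, the PCA estimator, and the variance threshold are all my assumptions, not from the paper.

 ```python
 import numpy as np

 def local_dim_penalty(points, k=10, var_threshold=0.95):
     """Average local intrinsic dimension of a point cloud (toy PCA estimate).

     NOT LDMNet's actual regularizer; a stand-in for the idea of
     penalizing manifold dimensionality.
     """
     dims = []
     for i in range(len(points)):
         # k nearest neighbours of point i (brute force, fine for a sketch)
         d = np.linalg.norm(points - points[i], axis=1)
         nbrs = points[np.argsort(d)[1:k + 1]]
         centered = nbrs - nbrs.mean(axis=0)
         # eigenvalues of the local covariance, largest first
         evals = np.linalg.svd(centered, compute_uv=False) ** 2
         ratios = np.cumsum(evals) / evals.sum()
         # number of dimensions needed to explain var_threshold of the variance
         dims.append(int(np.searchsorted(ratios, var_threshold) + 1))
     return float(np.mean(dims))

 rng = np.random.default_rng(0)
 # points on a 1-D curve embedded in 3-D: should score low
 t = rng.uniform(0, 1, 200)
 curve = np.stack([t, np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)], axis=1)
 # isotropic 3-D noise: should score high
 blob = rng.normal(size=(200, 3))
 assert local_dim_penalty(curve) < local_dim_penalty(blob)
 ```

 In the paper's setting this penalty would be added to the classification loss, with the alternation above refitting the manifold between parameter updates.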
 Still, the results, in terms of error, do not seem significantly better than previous work (compared against weight decay, which is weak sauce, and dropout).
 That said, the results in terms of feature projection, figures 1 and 2, ‘’do’’ look clearly better.
 Of course, they apply the regularizer to the usual image recognition / classification problems (MNIST), and it might well be better adapted to something else.
 The analysis is not completely thorough, perhaps due to space constraints and deadlines.
