m8ta
{1416}
ref: -0 tags: cutting plane manifold learning classification date: 10-31-2018 23:49 gmt revision:0 [head]

Learning data manifolds with a Cutting Plane method

  • Looks approximately like SVM: perform binary classification on a high-dimensional manifold (or sets of manifolds in this case).
  • The general idea behind Mcp_simple is to start with a finite number of training examples, find the maximum margin solution for that training set, augment the training set by finding a point on the manifolds that violates the constraints, and iterate until a tolerance criterion is met.
  • The more complicated cutting-plane SVM uses slack variables to allow solutions where the classes are not linearly separable.
    • They propose one slack variable per manifold, plus a manifold center, which strictly obeys the margin (classification) constraint.
  • Much effort is put into proving the convergence properties of these algorithms; admittedly I couldn't be bothered to read the proofs...
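The iterate-until-no-violation loop above can be sketched in a few lines. This is a toy illustration, not the paper's Mcp_simple: the max-margin QP solver is replaced by a plain perceptron, and each "manifold" is just a finite set of candidate points for one class that we scan for the worst margin violator.

```python
# Toy cutting-plane loop (sketch only; perceptron stands in for the QP solver).
def fit_linear(points):
    # points: list of ((x0, x1), label) with label in {+1, -1}
    w, b = [0.0, 0.0], 0.0
    for _ in range(1000):
        updated = False
        for x, y in points:
            if y * (w[0]*x[0] + w[1]*x[1] + b) <= 0:   # misclassified or on boundary
                w = [w[0] + y*x[0], w[1] + y*x[1]]
                b += y
                updated = True
        if not updated:
            break
    return w, b

def cutting_plane(manifolds, tol=0.0, max_iter=50):
    # manifolds: list of (points_on_manifold, label)
    work = [(pts[0], y) for pts, y in manifolds]       # seed: one point per manifold
    for _ in range(max_iter):
        w, b = fit_linear(work)                        # "max margin" on working set
        worst = None
        for pts, y in manifolds:
            # most-violating point on this manifold under the current separator
            x = min(pts, key=lambda p: y*(w[0]*p[0] + w[1]*p[1] + b))
            margin = y*(w[0]*x[0] + w[1]*x[1] + b)
            if margin <= tol and (worst is None or margin < worst[2]):
                worst = (x, y, margin)
        if worst is None:                              # tolerance criterion met
            return w, b
        work.append((worst[0], worst[1]))              # augment training set, re-solve
    return w, b
```

The working set stays small because only constraint-violating points are ever added; the paper's contribution is doing the violator search and convergence analysis on continuous manifolds rather than finite point sets.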

{760}
ref: -0 tags: LDA myopen linear discriminant analysis classification date: 01-03-2012 02:36 gmt revision:2 [1] [0] [head]

How does LDA (Linear discriminant analysis) work?

It works by projecting data points onto a series of planes, one per output class, and then deciding based on which projection is largest.
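That decision rule can be sketched in pure Python: per class, form a weight vector w_i = C⁻¹m_i from the pooled covariance C and class mean m_i, add the bias c_i = -½ m_iᵀC⁻¹m_i + log(prior), and pick the class with the largest score. This is a toy 2-D sketch with made-up data, not the myopen implementation (which follows in matlab below).

```python
# Toy LDA decision rule: shared (pooled) covariance, 2-D features.
def mean(vs):
    n = len(vs)
    return [sum(v[d] for v in vs) / n for d in range(2)]

def pooled_cov(classes, means):
    # average of per-class sample covariances, as in the matlab code below
    C = [[0.0, 0.0], [0.0, 0.0]]
    for vs, m in zip(classes, means):
        n = len(vs)
        for i in range(2):
            for j in range(2):
                C[i][j] += sum((v[i]-m[i])*(v[j]-m[j]) for v in vs) / (n-1)
    K = len(classes)
    return [[C[i][j]/K for j in range(2)] for i in range(2)]

def inv2(C):
    det = C[0][0]*C[1][1] - C[0][1]*C[1][0]
    return [[ C[1][1]/det, -C[0][1]/det],
            [-C[1][0]/det,  C[0][0]/det]]

def lda_classify(x, means, Cinv, log_prior):
    best, best_score = None, None
    for k, m in enumerate(means):
        w = [Cinv[0][0]*m[0] + Cinv[0][1]*m[1],     # slope of the plane
             Cinv[1][0]*m[0] + Cinv[1][1]*m[1]]
        c = -0.5*(m[0]*w[0] + m[1]*w[1]) + log_prior  # origin-intersect
        score = x[0]*w[0] + x[1]*w[1] + c             # just a linear function
        if best_score is None or score > best_score:
            best, best_score = k, score
    return best
```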

Below, to the left is a top view of this projection with 9 different classes of 2D data, each in a different color. Right is a side 3D view of the projection - note the surfaces seem to form a parabola.

Here is the matlab code that computes the LDA (from myopen's ceven):

% TrainData and TrainClass are inputs, column major here.
% (observations on columns)
N = size(TrainData,1);
Ptrain = size(TrainData,2);
Ptest = size(TestData,2);

% add a bit of interpolating noise to the data.
sc = std(TrainData(:)); 
TrainData =  TrainData + sc./1000.*randn(size(TrainData));

K = max(TrainClass); % number of classes.

%%-- Compute the means and the pooled covariance matrix --%%
C = zeros(N,N);
for l = 1:K
	idx = find(TrainClass==l);
		% measure the mean per class
	Mi(:,l) = mean(TrainData(:,idx)')';
		% sum all covariance matrices per class
	C = C + cov((TrainData(:,idx)-Mi(:,l)*ones(1,length(idx)))');
end

C = C./K; % turn sum into average covariance matrix
Pphi = 1/K;
Cinv = inv(C);

%%-- Compute the LDA weights --%%
for i = 1:K
	Wg(:,i) = Cinv*Mi(:,i);
		% this is the slope of the plane
	Cg(:,i) = -1/2*Mi(:,i)'*Cinv*Mi(:,i) + log(Pphi)';
		% and this, the origin-intersect.
end

%%-- Compute the decision functions --%%
Atr = TrainData'*Wg + ones(Ptrain,1)*Cg;
	% see - just a simple linear function! 
Ate = TestData'*Wg + ones(Ptest,1)*Cg;

errtr = 0;
AAtr = compet(Atr');
	% this compet function returns a sparse matrix with a 1
	% in the position of the largest element per row. 
	% convert to indices with vec2ind, below. 
TrainPredict = vec2ind(AAtr);
errtr = errtr + sum(sum(abs(AAtr-ind2vec(TrainClass))))/2;
netr = errtr/Ptrain;
PeTrain = 1-netr;

{724}
ref: Oskoei-2008.08 tags: EMG pattern analysis classification neural network date: 04-07-2009 21:10 gmt revision:2 [1] [0] [head]

  • EMG pattern analysis and classification by Neural Network
    • 1989!
    • short, simple paper. showed that 20 patterns can accurately be decoded with a backprop-trained neural network.
  • PMID-18632358 Support vector machine-based classification scheme for myoelectric control applied to upper limb.
    • myoelectric discrimination with SVM running on features in both the time and frequency domain.
    • a surface MES (myoelectric signal) is formed via the superposition of individual action potentials generated by irregular discharges of active motor units in a muscle fiber. Its amplitude, variance, energy, and frequency vary depending on contraction level.
    • Time domain features:
      • Mean absolute value (MAV)
      • root mean square (RMS)
      • waveform length (WL)
      • variance
      • zero crossings (ZC)
      • slope sign changes (SSC)
      • Willison amplitude (WAMP).
    • Frequency domain features:
      • power spectrum
      • autoregressive coefficients order 2 and 6
      • mean signal frequency
      • median signal frequency
      • good performance with just RMS + AR2 on 50 or 100 ms segments. Used an SVM with an RBF kernel.
      • looks like you can get away with just time-domain metrics!!
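The time-domain features listed above are cheap to compute on a windowed segment. A hedged pure-Python sketch (the zero-crossing and slope-sign-change noise thresholds are assumptions, not values from the paper):

```python
# Time-domain EMG features for one windowed segment x (list of floats).
import math

def td_features(x, zc_thresh=0.01, ssc_thresh=0.01):
    n = len(x)
    mav = sum(abs(v) for v in x) / n                       # mean absolute value
    rms = math.sqrt(sum(v*v for v in x) / n)               # root mean square
    wl  = sum(abs(x[i+1] - x[i]) for i in range(n-1))      # waveform length
    mu  = sum(x) / n
    var = sum((v-mu)**2 for v in x) / (n-1)                # variance
    zc  = sum(1 for i in range(n-1)                        # zero crossings
              if x[i]*x[i+1] < 0 and abs(x[i+1]-x[i]) >= zc_thresh)
    ssc = sum(1 for i in range(1, n-1)                     # slope sign changes
              if (x[i]-x[i-1])*(x[i]-x[i+1]) > 0
              and (abs(x[i]-x[i-1]) >= ssc_thresh
                   or abs(x[i]-x[i+1]) >= ssc_thresh))
    return dict(MAV=mav, RMS=rms, WL=wl, VAR=var, ZC=zc, SSC=ssc)
```

The thresholds exist to keep low-amplitude noise from inflating the ZC and SSC counts; in practice they would be tuned to the sensor's noise floor.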

{147}
ref: Blankertz-2003.06 tags: BMI BCI EEG error classification motor commands Blankertz date: 0-0-2007 0:0 revision:0 [head]

PMID-12899253 Boosting bit rates and error detection for the classification of fast-paced motor commands based on single-trial EEG analysis

  • want to minimize subject training and maximize the major learning load on the computer.
  • task: predict the laterality of imminent left-right hand finger movements in a natural keyboard-typing condition. They got ~15 bits/minute (in one subject, ~50 bits/minute!)
    • used non-oscillatory signals.
  • they were able to detect 85% of error trials, while limiting false positives to ~2%.

{66}
ref: bookmark-0 tags: machine_learning classification entropy information date: 0-0-2006 0:0 revision:0 [head]

http://iridia.ulb.ac.be/~lazy/ -- Lazy Learning.