Tuesday, January 8, 2008

Null Space versus Orthogonal Linear Discriminant Analysis


This article tells me one important thing: LDA, or the Fisher discriminant criterion, is still being studied by a lot of people. Last time I read a paper by Manli Zhu, which proposes a criterion different from the traditional ones for the case where the covariance matrix is rank-deficient.

One of the simplest ideas is to apply a ridge perturbation to the singular within-class covariance matrix Sw, which is usually called regularization. The other is to find an approximation in the principal subspace of Sw. Zhu's idea is rooted in the latter: where the traditional approach keeps all principal directions for the subsequent generalized eigenvalue problem, Zhu rejects those principal directions that are almost perpendicular to the subspace spanned by Sb.
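To make the first idea concrete, here is a minimal sketch of ridge-regularized LDA in NumPy, under the usual scatter-matrix formulation: Sw is perturbed to Sw + ridge·I so the generalized eigenproblem Sb w = μ(Sw + ridge·I)w becomes well-posed. The function name and the `ridge` parameter are my own choices for illustration, not from either paper.

```python
import numpy as np

def regularized_lda(X, y, ridge=1e-3):
    """Ridge-regularized LDA sketch: perturb a (possibly singular) Sw
    with ridge*I, then solve Sb w = mu (Sw + ridge*I) w.
    Returns the top c-1 discriminant directions as columns."""
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Whiten with the Cholesky factor of the regularized Sw, turning the
    # generalized eigenproblem into an ordinary symmetric one.
    L = np.linalg.cholesky(Sw + ridge * np.eye(d))
    Linv = np.linalg.inv(L)
    evals, U = np.linalg.eigh(Linv @ Sb @ Linv.T)
    order = np.argsort(evals)[::-1][:len(np.unique(y)) - 1]
    return Linv.T @ U[:, order]
```

The ridge term trades a little bias for invertibility, which is why it is the simplest fix when Sw is singular.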

In this paper, the authors mention several other strategies for dealing with the singularity of Sw, two of which receive particular attention: Orthogonal LDA (OLDA) and Null Space LDA (NLDA). Their main result is that the two yield the same solution under a certain condition, and that condition is mild for high-dimensional data (where the number of dimensions is much larger than the number of samples).
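For my own reference, here is a sketch of NLDA as I understand the usual formulation: restrict to the null space of Sw, where within-class scatter vanishes, then maximize between-class scatter there. The function name and the `tol` threshold are assumptions of mine, not from the paper.

```python
import numpy as np

def null_space_lda(X, y, tol=1e-10):
    """NLDA sketch: keep directions where within-class scatter is
    (numerically) zero, then maximize between-class scatter among them."""
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Null space of Sw: eigenvectors with numerically zero eigenvalue.
    # It is nonempty exactly in the undersampled case the paper targets.
    wvals, wvecs = np.linalg.eigh(Sw)
    N = wvecs[:, wvals < tol * max(wvals.max(), 1.0)]
    # Maximize between-class scatter inside that null space.
    bvals, bvecs = np.linalg.eigh(N.T @ Sb @ N)
    order = np.argsort(bvals)[::-1][:len(np.unique(y)) - 1]
    return N @ bvecs[:, order]
```

When the dimension far exceeds the sample count, the null space of Sw is large, which is exactly the regime where the equivalence condition is said to be mild.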

I guess I will review this article after I check those LDA variants, whose usability I personally have doubts about. Here is the literature:
  • Chen et al., A new LDA-based face recognition system which can solve the small sample size problem, Pattern Recognition 33 (2000).
  • Huang et al., Solving the small sample size problem of LDA, ICPR 2002.
  • Characterization of a family of algorithms for generalized discriminant analysis on undersampled problems, JMLR 2005.
  • Hastie et al., Penalized discriminant analysis, Annals of Statistics 23 (1995).

2 comments:

Anonymous said...

ignorant!

Unknown said...

Well, I am glad that you know I am ignorant. That's why I am always struggling for new knowledge. I hope you could bother telling me your opinion on this paper. I'd always be ready for your priceless ideas.