Friday, March 28, 2008
Null Space versus Orthogonal Linear Discriminant Analysis
This paper addresses several LDA algorithms. The main result is that, under a certain condition, OLDA and NLDA yield the same transformation.
I seldom look into the details of LDA algorithms, but after reading the journal version of the OLDA work, I think it is worth summarizing their results so far. I am also wondering whether their techniques could be reused in graph-based algorithms, though that is still far beyond what this paper covers.
One insightful observation is the simultaneous diagonalization of Sw, Sb and St. With it, the optimization criterion generalizes from the ``inverse'' of St to its pseudo-inverse. Actually, the different LDA algorithms differ mainly in their constraints: ULDA requires orthogonality w.r.t. St (G^T St G = I), while OLDA simply requires orthogonality w.r.t. the identity matrix (G^T G = I). The so-called NLDA instead maximizes the projected between-class scatter in the null space of Sw.
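To keep the constructions straight for myself, here is a minimal NumPy sketch of OLDA as I understand it. The function name, the 1/n normalization of the scatter matrices, and the tolerance handling are my own choices, not the authors' code; dropping the final QR step leaves the ULDA transformation.

import numpy as np

def olda(X, y, tol=1e-10):
    """Sketch of Orthogonal LDA via simultaneous diagonalization.
    X: (n_samples, n_features); y: class labels."""
    classes = np.unique(y)
    n, d = X.shape
    mu = X.mean(axis=0)
    # "Square roots" of the scatter matrices: St = Ht Ht^T, Sb = Hb Hb^T.
    Ht = (X - mu).T / np.sqrt(n)
    Hb = np.column_stack([
        np.sqrt((y == c).sum() / n) * (X[y == c].mean(axis=0) - mu)
        for c in classes
    ])
    # Restrict to the range of St via the thin SVD of Ht.
    U, s, _ = np.linalg.svd(Ht, full_matrices=False)
    t = (s > tol * s[0]).sum()            # rank of St
    U1, s1 = U[:, :t], s[:t]
    # Diagonalize the between-class scatter inside that subspace.
    B = (U1 / s1).T @ Hb                  # Sigma1^{-1} U1^T Hb
    P, sb, _ = np.linalg.svd(B, full_matrices=False)
    q = (sb > tol).sum()                  # rank of Sb
    Xq = (U1 / s1) @ P[:, :q]             # ULDA transformation
    # OLDA additionally orthonormalizes the columns by QR.
    G, _ = np.linalg.qr(Xq)
    return G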
The condition they find for the equivalence is a rank equality: rank(Sb) + rank(Sw) = rank(St). Since St = Sb + Sw, rank(St) <= rank(Sb) + rank(Sw) always holds, so the condition asks that the two scatter subspaces add up with no overlap. They test their algorithms on several high-dimensional data sets, where such a condition tends to hold. I think their motivation is easy to follow.
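Verifying the condition on a given data set is straightforward; here is a small sketch (the helper name is mine), using the identity St = Sb + Sw:

import numpy as np

def rank_condition_holds(X, y, tol=1e-10):
    """Check rank(Sb) + rank(Sw) == rank(St) for data X, labels y."""
    classes = np.unique(y)
    n, d = X.shape
    mu = X.mean(axis=0)
    Sb = sum(((y == c).sum() / n) *
             np.outer(X[y == c].mean(axis=0) - mu,
                      X[y == c].mean(axis=0) - mu)
             for c in classes)
    Sw = sum(((X[y == c] - X[y == c].mean(axis=0)).T @
              (X[y == c] - X[y == c].mean(axis=0))) / n
             for c in classes)
    St = Sb + Sw
    rk = lambda A: np.linalg.matrix_rank(A, tol)
    return rk(Sb) + rk(Sw) == rk(St)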
Though many graph-based dimensionality reduction algorithms extensively reuse the ideas from LDA, the simultaneous diagonalization is seldom feasible there. Maybe it is worth trying to find a more general solution for the rank-deficient generalized eigenvalue problem, e.g. along the lines of the sketch below.
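As a first attempt, I would whiten the rank-deficient matrix on its range (which is what the pseudo-inverse amounts to here) and solve an ordinary symmetric eigenproblem in that subspace. This is only my own sketch for symmetric positive semidefinite A and B, not anything from the paper:

import numpy as np

def rank_deficient_geig(A, B, tol=1e-10):
    """Solve A v = lam B v with B possibly rank-deficient, by
    restricting to range(B) and whitening B there."""
    w, U = np.linalg.eigh(B)
    keep = w > tol * w.max()
    U1, w1 = U[:, keep], w[keep]
    W = U1 / np.sqrt(w1)                  # W^T B W = I on range(B)
    lam, V = np.linalg.eigh(W.T @ A @ W)  # ordinary symmetric problem
    # Return eigenpairs in descending order, mapped back to the
    # original coordinates.
    return lam[::-1], (W @ V)[:, ::-1]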