Wednesday, January 12, 2011

Learning with Local and Global Consistency


This paper is about label propagation in the semi-supervised learning setting. The basic idea is to treat the data as a weighted graph and spread the known labels along its edges, in the spirit of a random walk. The label is propagated with the following equation
Y(t + 1) = \alpha S Y(t) + (1 - \alpha) Y(0)
where Y(0) is the initial label matrix (in multi-class classification, the row of each labeled sample is a one-hot indicator vector and the rows of unlabeled samples are zero) and S = D^{-1/2} W D^{-1/2} is the symmetrically normalized weight matrix. The iterative procedure converges to the limit
Y^\star = (1 - \alpha) (I - \alpha S)^{-1} Y(0)
and the constant factor (1 - \alpha) can be dropped when we only need the argmax over classes.
This reveals a connection with the graph Laplacian: I - S is exactly the normalized graph Laplacian, so the closed form amounts to inverting a regularized Laplacian.
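To make the scheme concrete, here is a minimal NumPy sketch of both the iteration and its closed form. The Gaussian affinity, the bandwidth sigma, and alpha = 0.99 are illustrative assumptions for the example, not prescriptions from the paper.

import numpy as np

def normalized_weights(W):
    """S = D^{-1/2} W D^{-1/2}, the symmetrically normalized weight matrix."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    return D_inv_sqrt @ W @ D_inv_sqrt

def label_propagation(W, Y0, alpha=0.99, n_iter=100):
    """Iterate Y(t+1) = alpha * S Y(t) + (1 - alpha) * Y(0)."""
    S = normalized_weights(W)
    Y = Y0.copy()
    for _ in range(n_iter):
        Y = alpha * (S @ Y) + (1 - alpha) * Y0
    return Y

def label_propagation_closed_form(W, Y0, alpha=0.99):
    """Limit of the iteration, up to the constant factor (1 - alpha):
    solves (I - alpha * S) Y* = Y(0)."""
    S = normalized_weights(W)
    n = W.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * S, Y0)

# Toy example (hypothetical data): six 1-D points in two clusters,
# one labeled point per cluster, Gaussian affinities with sigma = 0.1.
X = np.array([[0.0], [0.1], [0.2], [1.0], [1.1], [1.2]])
W = np.exp(-(X - X.T) ** 2 / (2 * 0.1 ** 2))
np.fill_diagonal(W, 0.0)      # zero diagonal, as in the paper's construction
Y0 = np.zeros((6, 2))
Y0[0, 0] = 1.0                # labeled: first point of cluster 1
Y0[3, 1] = 1.0                # labeled: first point of cluster 2
pred = label_propagation(W, Y0).argmax(axis=1)

Each unlabeled point is assigned the class with the largest entry in its row of Y*; the iterative and closed-form routines agree up to the dropped constant.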
The interesting part is how one might develop a parallel version of this idea.
