Wednesday, December 24, 2008

Regression on Manifold Using Kernel Dimension Reduction


This paper describes a regression problem: the samples come from a manifold embedded in a high-dimensional space, and the response is a function defined on the manifold. Three techniques are combined to attack it: sufficient dimension reduction (SDR), kernel dimension reduction (KDR), and manifold learning.
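To make the setting concrete, here is a minimal synthetic sketch of such a problem (my own illustration, not from the paper): points on a 2-D "Swiss roll" manifold in R^3, with a response that depends only on the intrinsic coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)

# n samples from a 2-D manifold (a Swiss roll) embedded in R^3.
n = 500
t = rng.uniform(1.5 * np.pi, 4.5 * np.pi, size=n)  # intrinsic coordinate 1
h = rng.uniform(0.0, 10.0, size=n)                 # intrinsic coordinate 2
X = np.column_stack([t * np.cos(t), h, t * np.sin(t)])

# The response is a function on the manifold: it depends only on the
# intrinsic coordinates (t, h), not directly on the ambient coordinates.
y = np.sin(t) + 0.1 * h + 0.05 * rng.standard_normal(n)
```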

The basic idea comes from KDR, in which we seek a subspace B (meant to be the central subspace) such that the projection onto it is sufficient for regressing the response. In the linear case (namely SDR), this idea requires solving the following optimization problem:

$$\min_{B:\,B^\top B = I}\ \operatorname{Tr}\!\left[K_Y^c\,\bigl(K_Z^c + n\varepsilon_n I_n\bigr)^{-1}\right],$$
where $K_Z^c$ is the centered Gram matrix of the projected samples $z_i = B^\top x_i$. Here, with manifold learning, the projected sample $B^\top x_i$ feeding the RKHS is approximated by an affine transformation of the coordinates $u_i$ obtained from a manifold learning algorithm (e.g., Laplacian Eigenmaps). Therefore the optimization problem can be formulated as

$$\min_{W}\ \operatorname{Tr}\!\left[K_Y^c\,\bigl(K_Z^c + n\varepsilon_n I_n\bigr)^{-1}\right],\qquad z_i = W u_i,$$

where the $u_i$ are the low-dimensional coordinates returned by the manifold learning step.
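Since Laplacian Eigenmaps is the manifold learning step named above, here is a minimal dense-solver sketch of it, reusing X from the first snippet (the paper would build a k-nearest-neighbor graph; I use a full heat-kernel graph for brevity, and the bandwidth is my own choice):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import pdist, squareform

def laplacian_eigenmap(X, d=2, sigma=2.0):
    """Embed samples X (n x D) into d intrinsic coordinates u_i."""
    # Heat-kernel affinities between all pairs of samples.
    W = np.exp(-squareform(pdist(X, "sqeuclidean")) / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W  # unnormalized graph Laplacian
    # Generalized eigenproblem L v = lam D v, eigenvalues ascending;
    # drop the first (constant) eigenvector.
    _, V = eigh(L, D)
    return V[:, 1:d + 1]

U = laplacian_eigenmap(X)  # n x 2 manifold coordinates u_i
```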

After solving the problem, the regression can be done in the corresponding subspace. The difficult part is how to solve this nonlinear, nonconvex problem; it is handled with a projected gradient descent method.
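To make that step concrete, here is a rough numerical sketch under my own assumptions (Gaussian RBF kernels, a finite-difference gradient standing in for the paper's analytic one, an orthonormality constraint on W re-imposed by an SVD projection after each step, and hand-picked sigma, eps, and step sizes):

```python
import numpy as np

def centered_gram(Z, sigma=1.0):
    """Centered Gaussian Gram matrix K^c = H K H with H = I - 11^T/n."""
    sq = np.sum(Z**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * Z @ Z.T) / (2 * sigma**2))
    n = len(Z)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kdr_objective(W, U, Kyc, eps=1e-3):
    """Tr[ K_Y^c (K_Z^c + n*eps*I)^(-1) ] with z_i = W u_i."""
    n = len(U)
    Kzc = centered_gram(U @ W.T)
    return np.trace(Kyc @ np.linalg.inv(Kzc + n * eps * np.eye(n)))

def project_orthonormal(W):
    """Project W back onto matrices with orthonormal rows (via SVD)."""
    Uw, _, Vt = np.linalg.svd(W, full_matrices=False)
    return Uw @ Vt

def projected_gradient_descent(U, Kyc, d_out=1, step=0.1, iters=50, fd=1e-5):
    rng = np.random.default_rng(0)
    W = project_orthonormal(rng.standard_normal((d_out, U.shape[1])))
    for _ in range(iters):
        # Finite-difference gradient, a stand-in for the analytic one.
        f0 = kdr_objective(W, U, Kyc)
        G = np.zeros_like(W)
        for i in range(W.shape[0]):
            for j in range(W.shape[1]):
                Wp = W.copy()
                Wp[i, j] += fd
                G[i, j] = (kdr_objective(Wp, U, Kyc) - f0) / fd
        W = project_orthonormal(W - step * G)  # descend, then re-project
    return W

Kyc = centered_gram(y[:, None])         # y from the first sketch
W = projected_gradient_descent(U, Kyc)  # U from the eigenmap sketch
Z = U @ W.T                             # subspace coordinates for regression
```

Once Z is in hand, any standard regressor fit on (Z, y) plays the role of the final regression in the subspace.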

But it seems hard to extend to new data. The authors argue that once the latent B has been computed, the regressor can be extended to new data, but they don't say how. I guess it must be much tougher :-(
