Sunday, July 19, 2009

Regression by Dependence Minimization and its Application to Causal Inference in Additive Noise Models


This paper follows up on the NIPS 08 paper, but with a different regressor. We know HSIC can be used for dependence maximization as well as for testing independence; here we simply use it as a measure of dependence. Basically there are two ways to do dependence-based regression: one minimizes the dependence between the residual and the input, the other maximizes the dependence between the response and the prediction (is that even possible?). A minimal sketch of the estimator is below.
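Just to fix notation for myself, here is a minimal numpy sketch of the biased empirical HSIC estimator with Gaussian RBF kernels. The bandwidths sigma_x, sigma_y are placeholders of my own choosing (the median heuristic is the usual pick), and the 1/(n-1)^2 normalization follows the biased estimator of Gretton et al.:

```python
import numpy as np

def rbf_kernel(x, sigma):
    """Gaussian RBF kernel matrix of a 1-D sample."""
    d2 = (np.asarray(x)[:, None] - np.asarray(x)[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n-1)^2,
    where H = I - (1/n) 1 1^T centres the kernel matrices."""
    n = len(x)
    K = rbf_kernel(x, sigma_x)
    L = rbf_kernel(y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2
```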

With HSIC it is easy to minimize the dependence between the input and the residual. The pro is that we do not need to specify the distribution of the additive noise: in many regression problems we simply assume Gaussian noise, whereas here we can stop worrying about violations of that assumption. The con is that optimizing the model becomes much more difficult, since we lose the closed-form solutions of squared-error regression.
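A toy version of this route, reusing the hsic() sketch above: fit a polynomial by numerically minimizing HSIC between the input and the residual. The polynomial family, the Nelder-Mead optimizer, and the least-squares warm start are my illustration choices, not the paper's regressor:

```python
import numpy as np
from scipy.optimize import minimize

def fit_by_hsic_minimization(x, y, degree=3):
    """Fit a polynomial f by minimizing HSIC(x, y - f(x))."""
    def objective(coef):
        return hsic(x, y - np.polyval(coef, x))
    coef0 = np.polyfit(x, y, degree)            # least-squares warm start
    coef = minimize(objective, coef0, method="Nelder-Mead").x
    # An RBF kernel only sees residual *differences*, so the intercept
    # is unidentified by the objective; re-centre it afterwards.
    coef[-1] += np.mean(y - np.polyval(coef, x))
    return coef
```

Note the non-convexity shows up immediately: the result depends on the warm start, which is exactly the optimization difficulty mentioned above.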

The good thing about HSIC is that it comes with a statistical test, which allows us to judge whether it is statistically reliable to assert independence. I am not sure whether a similar test statistic exists for KDR-like (kernel dimension reduction) criteria.
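The test can be sketched with a simple permutation scheme (the published kernel independence test instead approximates the null distribution analytically; permutation is just the easiest stand-in to write down):

```python
def hsic_independence_test(x, y, n_perm=500, seed=0):
    """Permutation p-value for H0: x and y are independent.
    Shuffling y destroys any dependence, giving draws from the null."""
    rng = np.random.default_rng(seed)
    observed = hsic(x, y)
    null = np.array([hsic(x, rng.permutation(y)) for _ in range(n_perm)])
    p_value = (1 + np.sum(null >= observed)) / (1 + n_perm)
    return observed, p_value
```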
