Sunday, May 10, 2009

Kernel Constrained Covariance for Dependence Measurement


For measuring independence, they propose another kernel-based criterion, COCO (constrained covariance; in a previous paper it is called kernel covariance, KC, though their objective actually comes from the mutual information of Gaussian r.v.s), different from the one in the previously scanned paper of Jordan's (KCC and KGV). It has been mentioned in their earlier work.

This work has quite different concerns, since previous papers did not say how to choose the kernels. They prove COCO is zero only at independence for universal kernels. A so-called universal kernel is a kernel such that, on a compact metric space (\mathcal{X}, d), the RKHS induced by the kernel k(\cdot, \cdot) is dense in the space of continuous functions over \mathcal{X}, namely C(\mathcal{X}), w.r.t. the topology induced by the infinity norm \| f - g \|_\infty. They prove Gaussian kernels and Laplacian kernels are universal.

They also point out a limitation of independence tests based on universal kernels: the proposed COCO has adversarial cases, in which the empirical COCO is small while the r.v.s are still dependent. On the other hand, they show the empirical COCO converges to the population COCO at an exponential rate.

The calculation of COCO is equivalent to
\mathrm{COCO}(z, F, G) = \frac{1}{n}\sqrt{\| \tilde{K}^f \tilde{K}^g \|_2}.
Their later paper actually uses the Frobenius norm. Their experiments show an example of applying COCO to ICA on fMRI data, compared with KMI and a correlation-based method (which uses only correlation as the test of independence).
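The empirical formula above is easy to sketch in code. Below is a minimal numpy version, assuming Gaussian kernels on both variables and centered Gram matrices \tilde{K}^f, \tilde{K}^g; the function names and the bandwidth choice are mine, not from the paper.

```python
import numpy as np

def gaussian_gram(x, sigma=1.0):
    """Gram matrix of the Gaussian kernel on a 1-D sample (an assumed kernel choice)."""
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

def coco(x, y, sigma=1.0):
    """Empirical COCO = (1/n) * sqrt(|| K~ L~ ||_2), spectral norm."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kt = H @ gaussian_gram(x, sigma) @ H     # \tilde{K}^f
    Lt = H @ gaussian_gram(y, sigma) @ H     # \tilde{K}^g
    spec = np.linalg.norm(Kt @ Lt, ord=2)    # largest singular value
    return np.sqrt(spec) / n

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y_indep = rng.normal(size=200)               # independent of x
y_dep = np.sin(3 * x) + 0.1 * rng.normal(size=200)  # nonlinearly dependent

print(coco(x, y_indep))  # small
print(coco(x, y_dep))    # noticeably larger
```

Swapping the spectral norm for `np.linalg.norm(Kt @ Lt, ord='fro')` gives the Frobenius-norm variant mentioned above.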
