This paper presents an older clustering technique called k q-flats. It can be regarded as a generalization of the k-means algorithm: instead of a centroid, each cluster j is represented by a q-dimensional flat F_j onto which its points are projected, and we minimize
\sum_{j = 1}^K \sum_{x \in K_j} \| x - P_{F_j} x \|^2
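As a concrete illustration, here is a minimal sketch of the alternating minimization for this objective (my own toy implementation, not the paper's code; the function name and the SVD-reseeding detail for empty clusters are my choices):

```python
import numpy as np

def k_q_flats(X, K, q, n_iter=20, init_labels=None, seed=0):
    """k q-flats sketch: alternate between (1) refitting each cluster's
    q-dimensional affine flat by SVD and (2) reassigning each point to
    the flat with the smallest squared projection residual."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    labels = init_labels if init_labels is not None else rng.integers(0, K, size=n)
    means, bases = [], []
    for _ in range(n_iter):
        means, bases = [], []
        for j in range(K):
            Xj = X[labels == j]
            if len(Xj) <= q:                 # degenerate cluster: reseed it
                Xj = X[rng.choice(n, q + 1, replace=False)]
            mu = Xj.mean(axis=0)
            # top-q right singular vectors span the best-fit flat through mu
            _, _, Vt = np.linalg.svd(Xj - mu, full_matrices=False)
            means.append(mu)
            bases.append(Vt[:q].T)           # (d, q) orthonormal basis
        resid = np.empty((n, K))
        for j in range(K):
            Xc = X - means[j]
            proj = Xc @ bases[j] @ bases[j].T   # P_{F_j} applied to centered data
            resid[:, j] = np.sum((Xc - proj) ** 2, axis=1)
        labels = resid.argmin(axis=1)
    return labels, means, bases
```

With q = 0 each flat degenerates to a point and this reduces to plain k-means.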
This can be solved with a k-means-like (EM-style) alternating algorithm. To apply it to classification, we could train one k q-flats model per class, but such a model lacks discriminative power. This can be fixed in a natural way by also penalizing points from other classes that lie close to a flat:
\sum_{i = 1}^c\sum_{j = 1}^K \left( \sum_{x \in K_{i, j}}g_1(\| F_{i, j}^\top x \|^2) + \sum_{x \not \in C_i} g_2(\| F_{i, j}^\top x\|^2)\right)
One difference is that this objective has to be optimized by gradient descent. The chosen g_i are hinge-like loss functions, so their derivatives are simple piecewise constant functions.
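To make that concrete, here is a sketch of one subgradient step for a single flat, assuming F has orthonormal columns spanning the flat's orthogonal complement (so \| F^\top x \|^2 is the squared distance of x to the flat). The particular hinges g_1(s) = max(0, s - m_1), g_2(s) = max(0, m_2 - s), the margins m1, m2, the step size lr, and the QR retraction are my assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def discriminative_step(F, X_in, X_out, m1=1.0, m2=2.0, lr=0.01):
    """One subgradient step on a single flat's discriminative loss.
    g1(s) = max(0, s - m1) pulls own-class points within margin m1;
    g2(s) = max(0, m2 - s) pushes other-class points beyond margin m2.
    Both derivatives are piecewise constant (0 or +/-1)."""
    G = np.zeros_like(F)
    for x in X_in:                       # own-class points
        s = np.sum((F.T @ x) ** 2)
        if s > m1:                       # g1' = 1 on the active side
            G += 2.0 * np.outer(x, F.T @ x)   # d/dF ||F^T x||^2 = 2 x x^T F
    for x in X_out:                      # other-class points
        s = np.sum((F.T @ x) ** 2)
        if s < m2:                       # g2' = -1 on the active side
            G -= 2.0 * np.outer(x, F.T @ x)
    F = F - lr * G
    # re-orthonormalize the columns (retraction back to the Stiefel manifold)
    Q, _ = np.linalg.qr(F)
    return Q
```

The QR step keeps F's columns orthonormal after the gradient update, so \| F^\top x \|^2 remains a genuine squared distance throughout training.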