This paper is interesting in that it formulates a clustering problem as a convex optimization problem. The framework is not brand new, but the idea is enlightening for a current project. Recall the maximum-margin method: SVM is modeled as a quadratic program with linear constraints. In clustering, however, we are not interested in the parametrized separation boundary but in the partition of samples that maximizes the margin. The problem then becomes an integer optimization problem, which is in general NP-hard.
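For reference, the standard dual of the soft-margin SVM with kernel K and regularization parameter C is

\max_{\alpha} \; \sum_i \alpha_i - \frac{1}{2} \sum_{i, j} \alpha_i \alpha_j \, y_i y_j \, K_{i j}
\quad \text{s.t.} \quad 0 \leq \alpha_i \leq C, \;\; \sum_i \alpha_i y_i = 0.

Note that the labels enter only through the products y_i y_j, which is exactly what makes the substitution below possible.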
The key idea is to use the outer product of the label vector y \in \{-1, +1\}^N, i.e. the label kernel matrix
M = y y^\top
in the dual formulation of the soft-margin SVM. Another interesting finding is that instead of using the non-convex constraint \mathrm{rank}(M) = 1, they use three families of linear constraints: M encodes equivalence-class information, i.e. it is transitive (M_{i, k} \geq M_{i, j} + M_{j, k} - 1), reflexive (M_{i, i} = 1), and symmetric (M_{i, j} = M_{j, i}); M has at most 2 equivalence classes, i.e. M_{j, k} \geq -M_{i, j} - M_{i, k} - 1; and M has at least 2 equivalence classes, i.e. \sum_i M_{i, j} \leq N - 2.
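As a sanity check, here is a small pure-Python sketch (my own, not from the paper) verifying that M = y y^\top for any \pm 1 label vector satisfies all three families of constraints, while a single-cluster assignment violates the "at least two classes" one:

```python
def label_matrix(y):
    """Outer product M = y y^T of a +/-1 label vector y."""
    return [[yi * yj for yj in y] for yi in y]

def satisfies_constraints(M):
    """Check the three families of linear constraints on M."""
    n = len(M)
    idx = range(n)
    # Reflexive: M_ii = 1; symmetric: M_ij = M_ji.
    if any(M[i][i] != 1 for i in idx):
        return False
    if any(M[i][j] != M[j][i] for i in idx for j in idx):
        return False
    for i in idx:
        for j in idx:
            for k in idx:
                # Transitive: M_ik >= M_ij + M_jk - 1.
                if M[i][k] < M[i][j] + M[j][k] - 1:
                    return False
                # At most two classes: M_jk >= -M_ij - M_ik - 1.
                if M[j][k] < -M[i][j] - M[i][k] - 1:
                    return False
    # At least two classes: sum_i M_ij <= N - 2 for every column j.
    return all(sum(M[i][j] for i in idx) <= n - 2 for j in idx)

y = [1, 1, -1, 1, -1]  # an arbitrary two-cluster assignment
assert satisfies_constraints(label_matrix(y))
# A single-cluster assignment has column sums N > N - 2 and fails.
assert not satisfies_constraints(label_matrix([1, 1, 1]))
```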
After additionally relaxing the rank-one condition to M \succeq 0, the problem becomes an SDP. It might not be easy to extend the idea to more than two clusters.
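A minimal sketch of the relaxation (my paraphrase, not the paper's exact formulation): let \mathcal{M} be the convex set

\mathcal{M} = \{ M \succeq 0 : M_{i, i} = 1,\; M_{i, j} = M_{j, i},\; M_{i, k} \geq M_{i, j} + M_{j, k} - 1,\; M_{j, k} \geq -M_{i, j} - M_{i, k} - 1,\; \textstyle\sum_i M_{i, j} \leq N - 2 \},

and replace y_i y_j by M_{i j} in the SVM dual, giving roughly

\min_{M \in \mathcal{M}} \; \max_{0 \leq \alpha \leq C} \; \alpha^\top e - \frac{1}{2} \alpha^\top (K \circ M) \alpha,

where \circ is the elementwise product. The inner maximum is a pointwise maximum of functions linear in M, hence convex in M, so the whole problem is convex and can be cast as an SDP (the paper handles the details, e.g. rewriting the inner maximization via duality).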