Monday, February 15, 2010

Projected Gradient Methods for Nonnegative Matrix Factorization


This paper proposes a projected gradient optimization technique for NMF problems, which I think could also be extended to more general Bregman divergences. The idea comes from standard bound-constrained optimization. The paper points out that most algorithms can at best converge to stationary points, not local minima, while the multiplicative update rule only guarantees that the objective is non-increasing; it does not even ensure convergence to a stationary point (let alone a local minimum). The proposed algorithm is quite simple: after each gradient step, project back onto the feasible domain by setting negative coordinates to 0. The results look better than those of the multiplicative update rule.
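The alternating projected-gradient idea can be sketched as follows. This is a minimal illustration, not Lin's full algorithm: it uses a fixed step size, whereas the paper chooses the step size by an Armijo-type line search, and the function name and parameters here are my own.

```python
import numpy as np

def projected_gradient_nmf(V, r, steps=200, lr=1e-3, seed=0):
    """Sketch of projected-gradient NMF for 0.5 * ||V - W H||_F^2.

    Alternates plain gradient steps on W and H, then projects each
    factor back onto the nonnegative orthant by clipping negatives
    to zero. (Hypothetical interface; fixed step size for brevity.)
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(steps):
        # Gradient w.r.t. W is (W H - V) H^T; step, then project W >= 0
        W = np.maximum(W - lr * (W @ H - V) @ H.T, 0.0)
        # Gradient w.r.t. H is W^T (W H - V); step, then project H >= 0
        H = np.maximum(H - lr * W.T @ (W @ H - V), 0.0)
    return W, H
```

The projection step is just `np.maximum(..., 0.0)`, which is exactly the "set negative values to 0" rule: for a simple box constraint, Euclidean projection onto the feasible set decouples coordinate-wise.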
