Friday, July 17, 2009
Boosting Products of Base Classifiers
Boosting is a very useful technique for practical tasks, since combining weak learners can yield a robust, high-accuracy classifier or regressor. Among the variants, AdaBoost.MH is the most popular multi-class version (slightly different from the binary one). This paper trains products of base classifiers instead of trees or MoE-like combinations, motivated by Hinton's argument for the advantages of products of experts (PoE) over mixtures of experts (MoE). The algorithm itself is not hard to design. The results on MNIST show it is the second-best algorithm among those compared (the best being DBNs).
Well, implement one :-p
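The paper's actual algorithm is AdaBoost.MH with vector-valued multi-class base learners, which is more machinery than fits in a blog post. As a much-simplified sketch of the core idea, here is binary AdaBoost whose base learner is a product of decision stumps, fitted by the kind of cyclic coordinate optimization the paper describes: each factor is re-trained against "virtual labels" (the true labels times the product of the other factors). All names here are my own, not the paper's.

```python
import numpy as np

def stump_predict(X, f, t, p):
    # +/-1 prediction of a decision stump on feature f with threshold t,
    # polarity p: p * (+1 if x_f <= t else -1).
    return p * np.where(X[:, f] <= t, 1.0, -1.0)

def best_stump(X, y, w):
    # Exhaustive search for the stump minimizing the weighted error.
    best = None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for p in (1.0, -1.0):
                err = np.sum(w[stump_predict(X, f, t, p) != y])
                if best is None or err < best[0]:
                    best = (err, (f, t, p))
    return best[1]

def fit_product(X, y, w, m=2, passes=3):
    # Greedily fit a product of m stumps: cyclically re-optimize each
    # factor against virtual labels y * (product of the other factors).
    stumps = [(0, np.inf, 1.0)] * m  # initialize each factor as constant +1
    for _ in range(passes):
        for j in range(m):
            others = np.ones(len(y))
            for k, (f, t, p) in enumerate(stumps):
                if k != j:
                    others *= stump_predict(X, f, t, p)
            stumps[j] = best_stump(X, y * others, w)
    return stumps

def product_predict(X, stumps):
    out = np.ones(len(X))
    for f, t, p in stumps:
        out *= stump_predict(X, f, t, p)
    return out

def adaboost_products(X, y, n_rounds=5, m=2):
    # Plain binary AdaBoost, but the base learner is a product of m stumps.
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        stumps = fit_product(X, y, w, m)
        pred = product_predict(X, stumps)
        err = np.clip(np.sum(w[pred != y]), 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, stumps))
        w *= np.exp(-alpha * y * pred)  # reweight: misclassified points grow
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    score = sum(a * product_predict(X, s) for a, s in ensemble)
    return np.sign(score)
```

A nice sanity check of why products help: a single axis-aligned stump cannot represent XOR, but a product of two stumps can, so even one boosting round fits it exactly.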