Idiap Research Institute
DynaBoost: Combining Boosted Hypotheses in a Dynamic Way
Type of publication: Idiap-RR
Citation: Moerland-99.1a
Number: Idiap-RR-09-1999
Year: 1999
Institution: IDIAP
Abstract: We present an extension of Freund and Schapire's AdaBoost algorithm that allows an input-dependent combination of the base hypotheses. A separate weak learner is used for determining the input-dependent weights of each hypothesis. The error function minimized by these additional weak learners is a margin cost function that has also been shown to be minimized by AdaBoost. The weak learners used for dynamically combining the base hypotheses are simple perceptrons. We compare our dynamic combination model with AdaBoost on a range of binary and multi-class classification problems. It is shown that the dynamic approach significantly improves the results on most data sets when (rather weak) perceptron base hypotheses are used, while the difference in performance is small when the base hypotheses are MLPs.
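To make the combination scheme described in the abstract concrete, here is a minimal sketch (in Python with NumPy) contrasting AdaBoost's fixed hypothesis weights with an input-dependent combination where each weight comes from a simple linear perceptron. All function and variable names are illustrative assumptions, not taken from the paper, and the training of the weighting perceptrons (via the margin cost function) is omitted.

```python
import numpy as np

def adaboost_combine(hypotheses, alphas, x):
    """Standard AdaBoost decision: each base hypothesis h_t contributes
    with a fixed, input-independent weight alpha_t."""
    return np.sign(sum(a * h(x) for a, h in zip(alphas, hypotheses)))

def dynamic_combine(hypotheses, weight_perceptrons, x):
    """Dynamic combination (sketch): the weight of each base hypothesis
    depends on the input x through a simple perceptron w_t(x) = v_t.x + b_t,
    so different regions of the input space can emphasize different
    hypotheses."""
    score = 0.0
    for h, (v, b) in zip(hypotheses, weight_perceptrons):
        w = float(np.dot(v, x) + b)  # input-dependent weight for this hypothesis
        score += w * h(x)
    return np.sign(score)
```

For example, with two base hypotheses and weighting perceptrons that respond to different input components, the ensemble can side with one hypothesis on part of the input space and with the other elsewhere, which a single fixed alpha vector cannot do.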
Projects: Idiap
Authors: Moerland, Perry; Mayoraz, Eddy
  • rr99-09.pdf
  • rr99-09.ps.gz