CONF
Luo_CVPR-OLCV_2010/IDIAP
OM-2: An Online Multi-class Multi-kernel Learning Algorithm
Luo, Jie
Orabona, Francesco
Fornoni, Marco
Caputo, Barbara
Cesa-Bianchi, Nicolo
EXTERNAL
https://publications.idiap.ch/attachments/papers/2011/Luo_CVPR-OLCV_2010.pdf
PUBLIC
https://publications.idiap.ch/index.php/publications/showcite/Luo_Idiap-RR-06-2010
Related documents
In Proceedings of CVPR 2010, Online Learning for Computer Vision Workshop
2010
Efficient learning from massive amounts of information is a hot topic in computer vision. Available training sets contain many examples with several visual descriptors, a setting in which current batch approaches are typically slow and do not scale well. In this work we introduce a theoretically motivated and efficient online learning algorithm for the Multi Kernel Learning (MKL) problem. For this algorithm we prove a theoretical bound on the number of multiclass mistakes made on any arbitrary data sequence. Moreover, we empirically show that its performance is on par with, or better than, standard batch MKL algorithms (e.g., SILP, SimpleMKL).
REPORT
Luo_Idiap-RR-06-2010/IDIAP
OM-2: An Online Multi-class Multi-kernel Learning Algorithm
Luo, Jie
Orabona, Francesco
Fornoni, Marco
Caputo, Barbara
Cesa-Bianchi, Nicolo
EXTERNAL
https://publications.idiap.ch/attachments/reports/2010/Luo_Idiap-RR-06-2010.pdf
PUBLIC
Idiap-RR-06-2010
2010
Idiap
April 2010
Efficient learning from massive amounts of information is a hot topic in computer vision. Available training sets contain many examples with several visual descriptors, a setting in which current batch approaches are typically slow and do not scale well. In this work we introduce a theoretically motivated and efficient online learning algorithm for the Multi Kernel Learning (MKL) problem. For this algorithm we prove a theoretical bound on the number of multiclass mistakes made on any arbitrary data sequence. Moreover, we empirically show that its performance is on par with, or better than, standard batch MKL algorithms (e.g., SILP, SimpleMKL).