CONF Luo_ICCV11_2011/IDIAP
Multiclass Transfer Learning from Unconstrained Priors
Luo, Jie; Tommasi, Tatiana; Caputo, Barbara
EXTERNAL: https://publications.idiap.ch/attachments/papers/2011/Luo_ICCV11_2011.pdf
PUBLIC
Related documents: https://publications.idiap.ch/index.php/publications/showcite/Luo_Idiap-RR-25-2011
Proceedings of the 13th International Conference on Computer Vision, 2011

Abstract: The vast majority of transfer learning methods proposed in the visual recognition domain in recent years addresses the problem of object category detection, assuming strong control over the priors from which transfer is done. This is a strict condition, as it concretely limits the use of such approaches in several settings: for instance, it generally rules out using off-the-shelf models as priors. Moreover, the lack of a multiclass formulation in most existing transfer learning algorithms prevents their use for object categorization problems, where they might be beneficial, especially as the number of categories grows and it becomes harder to gather enough annotated data for training standard learning methods. This paper presents a multiclass transfer learning algorithm that can take advantage of priors built over different features and with different learning methods than those used for learning the new task. We use the priors as experts, and transfer their outputs on the new incoming samples as additional information. We cast the learning problem within the Multi Kernel Learning framework. The resulting formulation efficiently solves a joint optimization problem that determines from where and how much to transfer, with a principled multiclass formulation. Extensive experiments illustrate the value of this approach.

REPORT Luo_Idiap-RR-25-2011/IDIAP
Multiclass Transfer Learning from Unconstrained Priors
Luo, Jie; Tommasi, Tatiana; Caputo, Barbara
EXTERNAL: https://publications.idiap.ch/attachments/reports/2011/Luo_Idiap-RR-25-2011.pdf
PUBLIC
Idiap-RR-25-2011, Idiap, August 2011
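The sketch below is not the authors' formulation; it is a minimal illustration, under assumed details, of the general idea in the abstract: off-the-shelf prior models act as experts, their outputs on the new task's samples are turned into additional kernels, and these are combined with a kernel on the task's own features using weights that, in the paper, would be learned jointly within the MKL problem (here they are simply given).

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def combined_kernel(X, prior_scores, betas, gamma=1.0):
    """Convex combination of a feature kernel and one kernel per prior expert.

    X            : (n, d) features of the new task's samples.
    prior_scores : list of (n, c) score matrices, one per prior model, obtained
                   by running each off-the-shelf prior on the n new samples
                   (the priors may use different features and learning methods).
    betas        : non-negative weights; betas[0] weighs the feature kernel,
                   betas[1:] weigh the prior kernels. In the paper these weights
                   are learned jointly with the classifier, deciding from where
                   and how much to transfer; here they are hand-set.
    """
    K = betas[0] * rbf_kernel(X, X, gamma)
    for b, S in zip(betas[1:], prior_scores):
        K += b * rbf_kernel(S, S, gamma)
    return K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 5))                           # new-task samples
    priors = [rng.normal(size=(20, 3)) for _ in range(2)]  # mock expert outputs
    betas = np.array([0.6, 0.3, 0.1])                      # demo weights only
    K = combined_kernel(X, priors, betas)
    print(K.shape)  # (20, 20) kernel, usable by any standard kernel classifier

The combined kernel K could then be handed to an ordinary kernel machine; the point of the sketch is only that priors trained with arbitrary features and methods enter the new problem purely through their outputs, which is what makes unconstrained, off-the-shelf priors usable.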