Idiap Research Institute
Classification using localized mixtures of experts
Type of publication: Conference paper
Citation: Moerland-98.2b
Booktitle: Proceedings of the International Conference on Artificial Neural Networks (ICANN'99)
Volume: 2
Year: 1999
Publisher: London: IEE
Note: (IDIAP-RR 98-14)
Crossref: moerland-98.2
Abstract: A mixture of experts consists of a gating network that learns to partition the input space and of expert networks assigned to these different regions. This paper focuses on the choice of the gating network. First, a localized gating network based on a mixture of linear latent variable models is proposed, extending a gating network introduced by Xu et al. that is based on Gaussian mixture models. It is shown that this localized mixture of experts model can be trained with the Expectation-Maximization algorithm. The localized model is compared, on a set of classification problems, with mixtures of experts having single- or multi-layer perceptrons as gating network. It is found that the standard mixture of experts with a feed-forward network as gate often outperforms the other models.
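To make the gating mechanism described in the abstract concrete, the following is a minimal NumPy sketch (not the paper's implementation) of a localized gate: each expert is given a Gaussian component, gate weights are the posterior responsibilities P(k | x), and the final prediction is the gate-weighted mixture of simple linear-sigmoid experts. All parameter values and the isotropic-Gaussian choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: K experts, d-dimensional inputs, n samples (illustrative sizes).
K, d, n = 3, 2, 5
X = rng.normal(size=(n, d))

# Localized gate: each expert k has a Gaussian center mu_k, shared isotropic
# variance sigma^2, and mixing prior pi_k. The gate outputs the posterior
# responsibility P(k | x), as in a Gaussian-mixture-based gating network.
mu = rng.normal(size=(K, d))
sigma = 1.0
pi = np.full(K, 1.0 / K)

def gaussian_gate(X):
    # log pi_k + log N(x | mu_k, sigma^2 I), up to a shared constant.
    sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    logits = np.log(pi) - sq / (2 * sigma**2)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    g = np.exp(logits)
    return g / g.sum(axis=1, keepdims=True)        # rows sum to 1

# Experts: linear models with a sigmoid output for a class probability.
W = rng.normal(size=(K, d))
b = np.zeros(K)

def predict(X):
    g = gaussian_gate(X)                           # (n, K) gate weights
    expert_out = 1 / (1 + np.exp(-(X @ W.T + b)))  # (n, K) expert outputs
    return (g * expert_out).sum(axis=1)            # gate-weighted mixture

p = predict(X)
```

In the full model these parameters would be fitted with Expectation-Maximization, with the responsibilities computed by the gate playing the role of the E-step posteriors; this sketch only shows the forward pass.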
Userfields: ipdmembership={learning},
Projects Idiap
Authors Moerland, Perry
  • localized98.pdf
  • moerland.localized.ps.gz