New Entropy Based Combination Rules in HMM/ANN Multi-stream ASR
Type of publication: | Conference paper |
Citation: | misr03 |
Booktitle: | Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) |
Year: | 2003 |
Month: | 4 |
Address: | Hong Kong |
Note: | IDIAP-RR 2002 31 |
Crossref: | misra-rr-02-31 |
Abstract: | Classifier performance is often enhanced by combining multiple streams of information. In the context of multi-stream HMM/ANN systems in ASR, a confidence measure widely used in classifier combination is the entropy of the posterior distribution output by each ANN, which generally increases as classification becomes less reliable. The rule most commonly used is to select the ANN with the minimum entropy. However, this is not necessarily the best way to use entropy in classifier combination. In this article, we test three new entropy-based combination rules in a full-combination multi-stream HMM/ANN system for noise robust speech recognition. Best results were obtained by combining all the classifiers having entropy below average, using weights proportional to their inverse entropy. |
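The best-performing rule in the abstract (keep only streams with below-average entropy, then weight them by inverse entropy) can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's implementation: the function name, the epsilon guard, and the example posteriors are all invented for demonstration.

```python
import numpy as np

def inverse_entropy_combination(posteriors):
    """Sketch of the combination rule described in the abstract:
    keep only streams whose output entropy is below the average entropy,
    then average their posteriors with weights proportional to 1/entropy.
    (Illustrative only; names and the epsilon guard are assumptions.)"""
    posteriors = np.asarray(posteriors, dtype=float)  # shape: (n_streams, n_classes)
    eps = 1e-12                                       # guard against log(0) and divide-by-zero
    entropies = -np.sum(posteriors * np.log(posteriors + eps), axis=1)

    # Select streams whose entropy is at or below the average entropy.
    selected = entropies <= entropies.mean()

    # Weight each selected stream proportionally to its inverse entropy.
    weights = np.where(selected, 1.0 / (entropies + eps), 0.0)
    weights /= weights.sum()

    combined = weights @ posteriors                   # weighted average of posteriors
    return combined / combined.sum()                  # renormalise to a distribution

# Example: two confident streams and one noisy (near-uniform, high-entropy) stream.
streams = [
    [0.80, 0.15, 0.05],
    [0.70, 0.20, 0.10],
    [0.34, 0.33, 0.33],   # high entropy, excluded by the below-average criterion
]
print(inverse_entropy_combination(streams))
```

In this toy example the third stream's entropy exceeds the average, so only the two confident streams contribute, with the lower-entropy one receiving the larger weight.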
Userfields: | ipdmembership={speech}, |
Keywords: | |
Projects: | Idiap |
Authors: | |