Variational Information Maximization in Gaussian Channels
Type of publication: | Idiap-RR |
Citation: | barber:agakov:info:gauss:04:88 |
Number: | Idiap-RR-88-2004 |
Year: | 2004 |
Month: | 4 |
Institution: | IDIAP |
Address: | Rue du Simplon 4, Martigny, CH-1920, Switzerland |
Note: | IDIAP-RR 04-88 |
Abstract: | Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in the application of information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that PCA may be viewed as the solution to maximizing information transmission between a high-dimensional vector and its low-dimensional representation. However, such results rest on the assumption that the sources are Gaussian. In this paper, we show how our mutual information bound, when applied in this setting, yields PCA solutions without the need for the Gaussian assumption. Furthermore, it naturally generalizes to provide an objective function for Kernel PCA, enabling the principled selection of kernel parameters. |
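For reference, the variational bound referred to in the abstract has the standard Barber–Agakov form (the notation below is a sketch, not copied from the report itself):

I(\mathbf{x}, \mathbf{y}) \;\geq\; H(\mathbf{x}) + \left\langle \ln q(\mathbf{x} \,|\, \mathbf{y}) \right\rangle_{p(\mathbf{x}, \mathbf{y})}

Here q(x|y) is an arbitrary variational decoder, and the bound is tight when q(x|y) equals the true posterior p(x|y). Choosing a linear-Gaussian decoder and maximizing the right-hand side is what recovers the PCA solutions mentioned in the abstract.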
Userfields: | ipdmembership={learning} |
Keywords: | |
Projects: | Idiap |
Authors: | D. Barber, F. Agakov |