Variational Information Maximization in Gaussian Channels
Felix Agakov and David Barber
IDIAP Research Report IDIAP-RR 04-88 (Idiap-RR-88-2004)
IDIAP, Rue du Simplon 4, CH-1920 Martigny, Switzerland
April 2004
https://publications.idiap.ch/attachments/reports/2004/agakov_barber_info_gauss_idiap_rr.pdf

Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in applying information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that PCA may be viewed as the solution to maximizing information transmission between a high dimensional vector and its low dimensional representation. However, such results rely on the assumption that the sources are Gaussian. In this paper, we show how our mutual information bound, when applied to this setting, yields the PCA solution without the need for the Gaussian assumption. Furthermore, the bound generalizes naturally to provide an objective function for Kernel PCA, enabling the principled selection of kernel parameters.
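For reference, the bound invoked in the abstract is the variational lower bound on mutual information from the authors' earlier work (Barber and Agakov, "The IM Algorithm", NIPS 2003); the notation below is a sketch of how it is usually stated, with q(x|y) an arbitrary variational "decoder" distribution:

% Variational lower bound on mutual information (Barber & Agakov, 2003).
% q(x|y) is any chosen decoder; equality holds when q(x|y) = p(x|y).
\begin{align*}
I(\mathbf{x}, \mathbf{y}) &\equiv H(\mathbf{x}) - H(\mathbf{x} \mid \mathbf{y}) \\
&\geq H(\mathbf{x}) + \big\langle \ln q(\mathbf{x} \mid \mathbf{y}) \big\rangle_{p(\mathbf{x}, \mathbf{y})}.
\end{align*}

The slack in the inequality is the expected Kullback-Leibler divergence \(\langle \mathrm{KL}\left( p(\mathbf{x} \mid \mathbf{y}) \,\|\, q(\mathbf{x} \mid \mathbf{y}) \right) \rangle_{p(\mathbf{y})} \geq 0\), so the bound is tight when the decoder matches the exact posterior. Since the source entropy H(x) is fixed, maximizing the bound over the channel and decoder parameters requires no Gaussianity assumption on p(x), which is what allows the PCA result to be recovered without the usual Gaussian-source argument.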