CONF Moerland-94.1/IDIAP
Title: Results on the Steepness in Backpropagation Neural Networks
Authors: Moerland, Perry; Thimm, Georg; Fiesler, Emile
Editor: Aguilar, Marc
Keywords: (adaptive) learning rate; activation function; adaptive steepness; backpropagation; bias; connectionism; gain; initial weight; multilayer neural network; neural computation; neural computing; neural network; neurocomputing; optical implementation; sigmoid steepness; slope
In: SI Group for Parallel Systems - Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing
Publisher: Institute of Informatics, University Pérolles, Fribourg, Switzerland
Date: October 1994
Pages: 91-94
Abstract: The backpropagation algorithm is widely used for training multilayer neural networks. This publication investigates the steepness of its activation functions. Specifically, it is shown that changing the steepness of the activation function is equivalent to changing the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.
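The forward-pass half of the equivalence stated in the abstract can be sketched numerically: a unit with a logistic activation of steepness (gain) beta and weights w produces the same output as a unit with steepness 1 whose weights and bias are scaled by beta. This is a minimal illustration under assumed example values (w, b, x, beta are hypothetical), not the paper's full derivation, which also covers the learning rate.

```python
import math

def sigmoid(net, steepness=1.0):
    # Logistic activation with an adjustable steepness (gain) parameter.
    return 1.0 / (1.0 + math.exp(-steepness * net))

# Hypothetical example values: weights w, bias b, input x, steepness beta.
w, b, x = [0.4, -0.7], 0.2, [1.5, 0.9]
beta = 2.5

net = sum(wi * xi for wi, xi in zip(w, x)) + b

# A steep unit with the original weights ...
out_steep = sigmoid(net, steepness=beta)

# ... matches a standard unit whose weights and bias are scaled by beta.
net_scaled = sum(beta * wi * xi for wi, xi in zip(w, x)) + beta * b
out_scaled = sigmoid(net_scaled, steepness=1.0)

assert abs(out_steep - out_scaled) < 1e-12
```

Because the two formulations compute identical outputs, a hardware implementation with a fixed, device-determined sigmoid steepness (e.g. optical) can absorb the steepness into the weights instead of realizing it in the activation function.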