Idiap Research Institute
Results on the Steepness in Backpropagation Neural Networks
Type of publication: Conference paper
Citation: Moerland-94.1
Booktitle: Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing
Year: 1994
Month: 10
Organization: SI Group for Parallel Systems
Address: Institute of Informatics, University Pérolles, Fribourg, Switzerland
Abstract: The backpropagation algorithm is widely used for training multilayer neural networks. In this publication the steepness of its activation functions is investigated. Specifically, it is discussed that changing the steepness of the activation function is equivalent to changing the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.
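The forward-pass half of the equivalence stated in the abstract can be checked numerically: a layer with a logistic activation of steepness β and weights (W, b) produces the same output as a layer with unit steepness and weights (βW, βb). A minimal sketch (all variable names and sizes are illustrative, not from the paper):

```python
import numpy as np

def sigmoid(x, steepness=1.0):
    # Logistic activation with adjustable steepness (gain) beta:
    # f(x) = 1 / (1 + exp(-beta * x))
    return 1.0 / (1.0 + np.exp(-steepness * x))

rng = np.random.default_rng(0)
x = rng.normal(size=5)       # input vector
W = rng.normal(size=(3, 5))  # weights of one layer
b = rng.normal(size=3)       # biases
beta = 2.5                   # steepness of the activation function

# Layer output with steepness beta and weights (W, b) ...
out_steep = sigmoid(W @ x + b, steepness=beta)
# ... equals the output with unit steepness and rescaled weights (beta*W, beta*b),
# since beta * (W @ x + b) == (beta * W) @ x + beta * b.
out_scaled = sigmoid((beta * W) @ x + beta * b, steepness=1.0)

print(np.allclose(out_steep, out_scaled))
```

The paper's full result also relates the steepness to the learning rate during training; the sketch above only illustrates the static weight-scaling part of the equivalence.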
Keywords: (adaptive) learning rate, activation function, adaptive steepness, backpropagation, bias, connectionism, gain, initial weight, multilayer neural network, neural computation, neural computing, neural network, neurocomputing, optical implementation, sigmoid steepness, slope
Projects: Idiap
Authors: Moerland, Perry; Thimm, Georg; Fiesler, Emile
Editors: Aguilar, Marc