Idiap Research Institute
Neural Network Pruning and Pruning Parameters
Type of publication: Conference paper
Citation: Thimm-96.5
Booktitle: The 1st Workshop on Soft Computing
Year: 1996
Month: August
Organization: Dept. of Information Electronics Nagoya University
Address: Furo-cho, Chikusa-ku, Nagoya 464-01, Japan
Note: Published online at http://www.bioele.nuee.nagoya-u.ac.jp/wsc1/
Abstract: The default multilayer neural network topology is a fully interlayer-connected one. This simplistic choice facilitates the design but limits the performance of the resulting neural networks. The best-known methods for obtaining partially connected neural networks are the so-called pruning methods, which are used to optimize both the size and the generalization capabilities of neural networks. Two of the most promising pruning techniques have therefore been selected for a comparative study. It is shown that these novel techniques are hampered by numerous user-tunable parameters, which can easily nullify the benefits of these advanced methods. Finally, based on the results, conclusions about the execution of experiments and suggestions for future research on neural network pruning are drawn.
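Illustrative sketch (not from the paper; the two techniques compared in the study are not named in this abstract): as a generic example of how pruning turns a fully interlayer-connected layer into a partially connected one, the Python snippet below applies magnitude-based pruning. The threshold is a hypothetical user-tunable parameter of exactly the kind the abstract warns can nullify the method's benefits if chosen poorly.

import numpy as np

def prune_by_magnitude(weights, threshold=0.05):
    """Zero out connections whose absolute weight is below `threshold`,
    yielding a partially connected layer. `threshold` is a user-tunable
    parameter; its value strongly affects size and generalization."""
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Example: prune a fully interlayer-connected 4x3 weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.2, size=(4, 3))
pruned, mask = prune_by_magnitude(w, threshold=0.1)
print(f"kept {mask.sum()} of {mask.size} connections")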
Keywords: generalization, network size, neural network, neural network optimization, parameters, pruning
Projects: Idiap
Authors: Thimm, Georg; Fiesler, Emile
Editors: Furuhashi, Takeshi
Attachments
  • prune96.pdf