Reducing a neural network's complexity improves its ability to generalize to future examples. Like an overfitted regression function, a neural network may miss its target because of the excessive degrees of freedom carried in unnecessary parameters. Over the past decade, research on network pruning has produced non-statistical algorithms such as Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon, which remove the connections with the least saliency. Views conflict on whether more than one parameter can be removed at a time. The methods proposed in this research use statistical multiple comparison procedures to remove multiple parameters from the model when no significant difference exists among them. While computationally intensive, the Tukey-Kramer method compares well with Optimal Brain Surgeon in both pruning and network performance. Where the Tukey-Kramer method's sampling requirements become inefficient, Weibull distribution theory relieves the computational burden of bootstrap resampling through single-sample analysis, while maintaining comparable network performance.

Air Force Inst of Tech Wright-Patterson AFB OH is the author of 'Multiple Comparison Pruning of Neural Networks', published 1999 under ISBN 9781423543152 and ISBN 1423543157.
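The pruning algorithms named above all rank connections by saliency, the estimated increase in loss from deleting a weight. As a rough illustration only (not the thesis's own code), here is a minimal NumPy sketch of Optimal Brain Damage-style pruning using the diagonal-Hessian approximation s_i = h_ii * w_i^2 / 2; the function names and toy data are assumptions for this sketch:

```python
import numpy as np

def obd_saliency(weights, hessian_diag):
    """Optimal Brain Damage saliency: estimated loss increase from
    deleting each weight, s_i = h_ii * w_i^2 / 2 (diagonal-Hessian
    approximation)."""
    return 0.5 * hessian_diag * weights ** 2

def prune_least_salient(weights, hessian_diag, k):
    """Zero out the k weights with the smallest saliency.
    Classical one-at-a-time pruning removes only the single minimum;
    here k is a parameter purely for illustration."""
    saliency = obd_saliency(weights, hessian_diag)
    idx = np.argsort(saliency)[:k]   # k least-salient parameters
    pruned = weights.copy()
    pruned[idx] = 0.0
    return pruned, idx

# Toy example: with a unit Hessian diagonal, saliency reduces to w^2 / 2,
# so the two smallest-magnitude weights are the ones removed.
w = np.array([0.01, -2.0, 0.5, 0.003, 1.5])
h = np.ones_like(w)
pruned_w, removed = prune_least_salient(w, h, k=2)
```

The multiple-comparison approach described in the abstract differs precisely at the ranking step: rather than deleting the single least-salient weight, it would test (e.g. via the Tukey-Kramer procedure over resampled saliency estimates) which parameters are statistically indistinguishable from the minimum, and remove that whole group at once.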