Bounds on the complexity of neural‐network models and comparison with linear methods

Abstract

A class of non-linear models structured as combinations of simple, parametrized basis functions is investigated; this class includes widely used neural networks, in which the basis functions correspond to the networks' computational units. Bounds on the complexity of such models are derived in terms of the number of adjustable parameters necessary for a given modelling accuracy. These bounds guarantee a more advantageous tradeoff between modelling accuracy and model complexity than linear methods provide: the number of parameters may grow much more slowly, in some cases only polynomially, with the dimensionality of the input space. Polynomial bounds on complexity allow one to cope with the so-called 'curse of dimensionality', which often makes linear methods either inaccurate or computationally unfeasible. The presented results provide deeper theoretical insight into the effectiveness of neural-network architectures observed in complex modelling applications.
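To make the tradeoff concrete, the following is a sketch of the typical form such bounds take (Maurey-Jones-Barron-type rates versus linear worst-case rates), not a reproduction of the paper's exact statements; the symbols f, n, d, s, C_f, and epsilon are illustrative notation, not taken from the abstract:

% Variable-basis (neural-network) approximation of a suitable target f
% by a combination of n parametrized units; C_f is a norm of f measuring
% its variation with respect to the basis (assumed notation):
\[
\| f - f_n \|_{L_2} \le \frac{C_f}{\sqrt{n}}
\quad\Longrightarrow\quad
n = O\!\left(\varepsilon^{-2}\right) \text{ parameters for accuracy } \varepsilon .
\]
% Fixed-basis linear approximation, worst case over a standard smoothness
% class of order s on a d-dimensional domain:
\[
\inf_{f_n \in \operatorname{span}\{g_1,\dots,g_n\}} \sup_{f} \| f - f_n \|_{L_2} \asymp n^{-s/d}
\quad\Longrightarrow\quad
n = O\!\left(\varepsilon^{-d/s}\right) \text{ parameters.}
\]

Under these classical rates, the parameter count for variable-basis models grows polynomially in 1/epsilon and independently of d, whereas for fixed-basis linear methods it grows exponentially with d, which is the 'curse of dimensionality' mentioned above.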

Authors
  • Hlavackova-Schindler, Katerina
  • Sanguineti, Marcello
Shortfacts
Category: Journal Paper
Divisions: Data Mining and Machine Learning
Journal or Publication Title: International Journal of Adaptive Control and Signal Processing
ISSN: 1099-1115
Volume: 17
Date: 2003