Minimization of error functionals over perceptron networks
V Kurkova
Language: English; Country: United States
NLK
Medline Complete (EBSCOhost)
from 1997-01-01 to 1 year ago
- MeSH
- Algorithms MeSH
- Biological Clocks / physiology MeSH
- Financing, Organized MeSH
- Logic MeSH
- Neural Networks, Computer MeSH
- Normal Distribution MeSH
- Computer Simulation MeSH
- Artificial Intelligence MeSH
Supervised learning of perceptron networks is investigated as an optimization problem. It is shown that both the theoretical and the empirical error functionals achieve minima over sets of functions computable by networks with a given number n of perceptrons. Upper bounds on the rates of convergence of these minima as n increases are derived. The bounds depend on a certain regularity of the training data, expressed in terms of variational norms of functions interpolating the data (for the empirical error) and of the regression function (for the expected error). The dependence of this type of regularity on dimensionality and on the magnitudes of partial derivatives is investigated. Conditions on the data are derived that guarantee that a good approximation of the global minima of the error functionals can be achieved by networks of limited complexity. The conditions are stated in terms of the oscillatory behavior of the data, measured by the product of a function of the number of variables d that decreases exponentially fast and the maximum of the squares of the L1-norms of the iterated partial derivatives of order d of the regression function, or of a function interpolating the data sample. The results are illustrated by examples of data with low and high regularity constructed using Boolean functions and the Gaussian function.
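For orientation, the central objects the abstract refers to can be written out explicitly. The sketch below is a schematic restatement under a standard Maurey-Jones-Barron-type framework; the symbols P_d (functions computable by a single perceptron on d variables), span_n P_d, the interpolating function h, and the exact form of the bound are assumptions for illustration, not formulas quoted from the paper.

% Empirical error of f on a sample z = {(x_i, y_i) : i = 1, ..., m}:
\[
  \mathcal{E}_z(f) = \frac{1}{m} \sum_{i=1}^{m} \bigl( f(x_i) - y_i \bigr)^2 .
\]
% Networks with n perceptrons compute functions of the form
\[
  f(x) = \sum_{k=1}^{n} w_k \, \sigma(v_k \cdot x + b_k),
  \qquad f \in \operatorname{span}_n P_d .
\]
% If h interpolates the sample and has a finite variational norm
% \|h\|_{P_d} with respect to perceptrons, a bound of the assumed type is
\[
  \min_{f \in \operatorname{span}_n P_d} \mathcal{E}_z(f)
  \;\le\; \frac{\|h\|_{P_d}^{2}}{n},
\]
% so the regularity constant \|h\|_{P_d}^2 governs how many perceptrons n
% are needed to approach the global minimum of the error functional.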
- 000
- 00000naa 2200000 a 4500
- 001
- bmc10026373
- 003
- CZ-PrNML
- 005
- 20111210192331.0
- 008
- 101018s2008 xxu e eng||
- 009
- AR
- 040 __
- $a ABA008 $b cze $c ABA008 $d ABA008 $e AACR2
- 041 0_
- $a eng
- 044 __
- $a xxu
- 100 1_
- $a Kůrková, Věra. $7 _AN057212
- 245 10
- $a Minimization of error functionals over perceptron networks / $c V Kurkova
- 314 __
- $a Institute of Computer Science, Academy of Sciences of the Czech Republic, Prague, CZ 18207. vera@cs.cas.cz
- 520 9_
- $a Supervised learning of perceptron networks is investigated as an optimization problem. It is shown that both the theoretical and the empirical error functionals achieve minima over sets of functions computable by networks with a given number n of perceptrons. Upper bounds on the rates of convergence of these minima as n increases are derived. The bounds depend on a certain regularity of the training data, expressed in terms of variational norms of functions interpolating the data (for the empirical error) and of the regression function (for the expected error). The dependence of this type of regularity on dimensionality and on the magnitudes of partial derivatives is investigated. Conditions on the data are derived that guarantee that a good approximation of the global minima of the error functionals can be achieved by networks of limited complexity. The conditions are stated in terms of the oscillatory behavior of the data, measured by the product of a function of the number of variables d that decreases exponentially fast and the maximum of the squares of the L1-norms of the iterated partial derivatives of order d of the regression function, or of a function interpolating the data sample. The results are illustrated by examples of data with low and high regularity constructed using Boolean functions and the Gaussian function.
- 650 _2
- $a Algorithms $7 D000465
- 650 _2
- $a Artificial Intelligence $7 D001185
- 650 _2
- $a Biological Clocks $x physiology $7 D001683
- 650 _2
- $a Computer Simulation $7 D003198
- 650 _2
- $a Logic $7 D008128
- 650 _2
- $a Neural Networks, Computer $7 D016571
- 650 _2
- $a Normal Distribution $7 D016011
- 650 _2
- $a Financing, Organized $7 D005381
- 773 0_
- $w MED00003480 $t Neural computation $g Vol. 20, No. 1 (2008), p. 252-270 $x 0899-7667
- 910 __
- $a ABA008 $b x $y 7
- 990 __
- $a 20110112132033 $b ABA008
- 991 __
- $a 20110117105309 $b ABA008
- 999 __
- $a ok $b bmc $g 801478 $s 666225
- BAS __
- $a 3
- BMC __
- $a 2008 $b 20 $c 1 $d 252-270 $i 0899-7667 $m Neural computation $n Neural Comput $x MED00003480
- LZP __
- $a 2010-B/mk