
Iterative principles of recognition in probabilistic neural networks

J Grim, J Hora

Neural Networks. 2008;21(6):838-846.

Language: English | Country: United States

Document type: grant-supported work

Persistent link: https://www.medvik.cz/link/bmc11006398
E-resources: Online

NLK ScienceDirect (archive) from 1993-01-01 to 2009-12-31

When considering the probabilistic approach to neural networks in the framework of statistical pattern recognition, we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed probabilistic description contradicts the well-known short-term dynamic properties of biological neurons. By introducing iterative schemes of recognition, we show that some parameters of probabilistic neural networks can be "released" for the sake of dynamic processes without disturbing the statistically correct decision making. In particular, we can iteratively adapt the mixture component weights or modify the input pattern in order to facilitate correct recognition. Both procedures are shown to converge monotonically as a special case of the well-known EM algorithm for estimating mixtures.
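The weight-adaptation scheme the abstract describes can be sketched as follows: for a fixed input pattern, the mixture component weights are repeatedly multiplied by the component likelihoods and renormalized, a one-observation EM-style update whose likelihood increases monotonically. This is only an illustrative sketch, not the authors' code; the diagonal-Gaussian product components, function names, and the two-component toy mixture are all assumptions.

```python
import numpy as np

def component_density(x, mu, var):
    # Product component: product of univariate Gaussian densities
    # (i.e., a Gaussian with diagonal covariance), as an illustration.
    return float(np.prod(np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)))

def iterative_weights(x, mus, variances, w, steps=50):
    # Iteratively re-weight mixture components for a fixed input x:
    # each step multiplies the weights by the component likelihoods and
    # renormalizes. Repeating the update concentrates the weight on the
    # component(s) that best match x, while the mixture likelihood of x
    # increases monotonically at every step.
    f = np.array([component_density(x, m, v) for m, v in zip(mus, variances)])
    for _ in range(steps):
        w = w * f
        w = w / w.sum()
    return w
```

For example, with two well-separated components and an input near the second one, the iterated weights converge to (approximately) all mass on the second component.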

000 02171naa 2200325 a 4500
001 bmc11006398
003 CZ-PrNML
005 20121113122311.0
008 110401s2008 xxu e eng||
009 AR
040 __ $a ABA008 $b cze $c ABA008 $d ABA008 $e AACR2
041 0_ $a eng
044 __ $a xxu
100 1_ $a Grim, Jiří. $7 _AN062545
245 10 $a Iterative principles of recognition in probabilistic neural networks / $c J Grim, J Hora
314 __ $a Institute of Information Theory and Automation, Czech Academy of Sciences P.O. BOX 18, CZ-18208 Prague 8, Czech Republic. grim@utia.cas.cz
520 9_ $a When considering the probabilistic approach to neural networks in the framework of statistical pattern recognition we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed probabilistic description contradicts the well known short-term dynamic properties of biological neurons. By introducing iterative schemes of recognition we show that some parameters of probabilistic neural networks can be "released" for the sake of dynamic processes without disturbing the statistically correct decision making. In particular, we can iteratively adapt the mixture component weights or modify the input pattern in order to facilitate correct recognition. Both procedures are shown to converge monotonically as a special case of the well known EM algorithm for estimating mixtures.
590 __ $a bohemika - dle Pubmed
650 _2 $a algoritmy $7 D000465
650 _2 $a lidé $7 D006801
650 _2 $a statistické modely $7 D015233
650 _2 $a nervová síť $7 D009415
650 _2 $a neuronové sítě $7 D016571
650 _2 $a neurony $x fyziologie $7 D009474
650 _2 $a rozpoznávání automatizované $7 D010363
650 _2 $a rozpoznávání obrazu $x fyziologie $7 D010364
650 _2 $a rozpoznávání (psychologie) $7 D021641
655 _2 $a práce podpořená grantem $7 D013485
700 1# $a Hora, Jan. $7 xx0278918
773 0_ $t Neural Networks $w MED00011811 $g Roč. 21, č. 6 (2008), s. 838-846 $x 0893-6080
910 __ $a ABA008 $b x $y 2
990 __ $a 20110414103000 $b ABA008
991 __ $a 20121113122326 $b ABA008
999 __ $a ok $b bmc $g 833992 $s 698499
BAS __ $a 3
BMC __ $a 2008 $b 21 $c 6 $d 838-846 $i 0893-6080 $m Neural networks $n Neural Netw $x MED00011811
LZP __ $a 2011-1B09/jjme
