
Energy complexity of recurrent neural networks

J. Síma,

Neural Comput. 2014;26(5):953-73. [pub] 20140220

Language: English; Country: United States

Document type: journal article, research supported by a grant

Persistent link: https://www.medvik.cz/link/bmc15032061

Recently a new so-called energy complexity measure has been introduced and studied for feedforward perceptron networks. This measure is inspired by the fact that biological neurons require more energy to transmit a spike than not to fire, and the activity of neurons in the brain is quite sparse, with only about 1% of neurons firing. In this letter, we investigate the energy complexity of recurrent networks, which counts the number of active neurons at any time instant of a computation. We prove that any deterministic finite automaton with m states can be simulated by a neural network of optimal size s=Θ(√m) with the time overhead of τ=O(s/e) per one input bit, using the energy O(e), for any e such that e=Ω(log s) and e=O(s), which shows the time-energy trade-off in recurrent networks. In addition, for the time overhead τ satisfying [Formula: see text], we obtain the lower bound of [Formula: see text] on the energy of such a simulation for some constant c>0 and for infinitely many s.
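
To make the measure concrete, the following is a minimal sketch in Python (with NumPy) of how the quantity described in the abstract can be observed: it iterates a small recurrent network of binary threshold units and records how many neurons are active at each time instant. The weight matrix W, thresholds theta, input weights u, and the helper run_threshold_net are illustrative assumptions chosen here, not the construction from the paper; the toy instance simulates the 2-state parity automaton with 2 ticks per input bit.

import numpy as np

def run_threshold_net(W, theta, inputs, u, s0, ticks_per_bit):
    """Synchronously iterate a recurrent network of binary threshold units.

    Update rule: s_{t+1}[i] = 1  iff  (W @ s_t + u * x_t)[i] >= theta[i],
    where the external input bit x_t is applied on the first tick of each
    ticks_per_bit-long phase and is 0 on the remaining ticks. Returns the
    final state and the number of active neurons at every tick.
    """
    s = s0.copy()
    energy_trace = []
    for bit in inputs:
        for tick in range(ticks_per_bit):
            x = bit if tick == 0 else 0
            s = (W @ s + u * x >= theta).astype(int)
            energy_trace.append(int(s.sum()))
    return s, energy_trace

# Hypothetical toy instance (not from the paper): a 3-neuron network
# simulating the 2-state parity DFA with 2 ticks per input bit.
# Neurons: 0 = y (stored parity), 1 = a (computes y OR x), 2 = b (y AND x);
# the update y <- a AND NOT b realizes y <- y XOR x over two ticks.
W = np.array([[0, 1, -1],   # y reads a - b
              [1, 0,  0],   # a reads y (plus the input x)
              [1, 0,  0]])  # b reads y (plus the input x)
u = np.array([0, 1, 1])     # the input bit feeds a and b only
theta = np.array([1, 1, 2]) # firing thresholds

bits = [1, 0, 1, 1]
state, trace = run_threshold_net(W, theta, bits, u, np.zeros(3, dtype=int), 2)
print("parity of input:", state[0])       # 1 xor 0 xor 1 xor 1 = 1
print("active neurons per tick:", trace)  # at most 2 neurons fire at once

On the input 1, 0, 1, 1 the recorded trace never exceeds 2, so this particular computation uses 3 neurons but only energy 2, in the sense of the abstract above.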

Citations provided by Crossref.org

000      
00000naa a2200000 a 4500
001      
bmc15032061
003      
CZ-PrNML
005      
20151014101309.0
007      
ta
008      
151005s2014 xxu f 000 0|eng||
009      
AR
024    7_
$a 10.1162/NECO_a_00579 $2 doi
035    __
$a (PubMed)24555455
040    __
$a ABA008 $b cze $d ABA008 $e AACR2
041    0_
$a eng
044    __
$a xxu
100    1_
$a Síma, Jiří $u Institute of Computer Science, Academy of Sciences of the Czech Republic, P. O. Box 5, 18207 Prague 8, Czech Republic sima@cs.cas.cz.
245    10
$a Energy complexity of recurrent neural networks / $c J. Síma,
520    9_
$a Recently a new so-called energy complexity measure has been introduced and studied for feedforward perceptron networks. This measure is inspired by the fact that biological neurons require more energy to transmit a spike than not to fire, and the activity of neurons in the brain is quite sparse, with only about 1% of neurons firing. In this letter, we investigate the energy complexity of recurrent networks, which counts the number of active neurons at any time instant of a computation. We prove that any deterministic finite automaton with m states can be simulated by a neural network of optimal size s=Θ(√m) with the time overhead of τ=O(s/e) per one input bit, using the energy O(e), for any e such that e=Ω(log s) and e=O(s), which shows the time-energy trade-off in recurrent networks. In addition, for the time overhead τ satisfying [Formula: see text], we obtain the lower bound of [Formula: see text] on the energy of such a simulation for some constant c>0 and for infinitely many s.
650    _2
$a akční potenciály $x fyziologie $7 D000200
650    _2
$a algoritmy $7 D000465
650    _2
$a mozek $x fyziologie $7 D001921
650    12
$a neuronové sítě $7 D016571
650    _2
$a neurony $x fyziologie $7 D009474
655    _2
$a časopisecké články $7 D016428
655    _2
$a práce podpořená grantem $7 D013485
773    0_
$w MED00003480 $t Neural computation $x 1530-888X $g Roč. 26, č. 5 (2014), s. 953-73
856    41
$u https://pubmed.ncbi.nlm.nih.gov/24555455 $y Pubmed
910    __
$a ABA008 $b sig $c sign $y a $z 0
990    __
$a 20151005 $b ABA008
991    __
$a 20151014101459 $b ABA008
999    __
$a ok $b bmc $g 1092937 $s 915187
BAS    __
$a 3
BAS    __
$a PreBMC
BMC    __
$a 2014 $b 26 $c 5 $d 953-73 $e 20140220 $i 1530-888X $m Neural computation $n Neural Comput $x MED00003480
LZP    __
$a Pubmed-20151005
