This record comes from PubMed

Energy complexity of recurrent neural networks

Neural Comput. 2014 May;26(5):953-73. Epub 2014 Feb 20.

Language: English. Country: United States. Media: print-electronic

Document type: Journal Article, Research Support, Non-U.S. Gov't

Recently, a new so-called energy complexity measure has been introduced and studied for feedforward perceptron networks. This measure is inspired by the fact that biological neurons require more energy to transmit a spike than not to fire, and that the activity of neurons in the brain is quite sparse, with only about 1% of neurons firing. In this letter, we investigate the energy complexity of recurrent networks, which counts the number of active neurons at any time instant of a computation. We prove that any deterministic finite automaton with m states can be simulated by a neural network of optimal size s = Θ(√m) with a time overhead of τ = O(s/e) per one input bit, using energy O(e), for any e such that e = Ω(log s) and e = O(s), which shows the time-energy trade-off in recurrent networks. In addition, for the time overhead τ satisfying τ^τ = o(s), we obtain a lower bound of s^(c/τ) on the energy of such a simulation, for some constant c > 0 and for infinitely many s.
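
To make the stated trade-off concrete, the following LaTeX sketch works out its two extremes, assuming the bounds s = Θ(√m), τ = O(s/e), and Ω(log s) ≤ e ≤ O(s) as reconstructed above from the record's formula placeholders (the symbols s, e, τ, m are those of the abstract):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Two extremes of the time-energy trade-off for simulating an m-state DFA
% by a recurrent network of size s = \Theta(\sqrt{m}); both follow from \tau = O(s/e).
\begin{align*}
  e = \Theta(\log s) &\;\Longrightarrow\; \tau = O\!\left(\tfrac{s}{\log s}\right)
    && \text{(least allowed energy, largest time overhead per input bit)}\\
  e = \Theta(s) &\;\Longrightarrow\; \tau = O(1)
    && \text{(constant time overhead, energy linear in the network size)}
\end{align*}
\end{document}

Intermediate choices of e interpolate between these two points; the product τ·e stays O(s) throughout.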
