Metabolic cost of neuronal information in an empirical stimulus-response model
Language: English; Country: Germany; Media: print-electronic
Document type: Journal Article, Research Support, Non-U.S. Gov't
- MeSH
- Adenosine Triphosphate metabolism MeSH
- Action Potentials physiology MeSH
- Information Theory * MeSH
- Humans MeSH
- Models, Neurological * MeSH
- Brain cytology physiology MeSH
- Synaptic Transmission MeSH
- Neurons metabolism MeSH
- Animals MeSH
The limits on the maximum information that can be transferred by single neurons may help us understand how sensory and other information is processed in the brain. According to the efficient-coding hypothesis (Barlow, Sensory Communication, MIT Press, Cambridge, 1961), neurons are adapted to the statistical properties of the signals to which they are exposed. In this paper we employ methods of information theory to calculate, both exactly (numerically) and approximately, the ultimate limits on reliable information transmission for an empirical neuronal model. We couple information transfer with the metabolic cost of neuronal activity and determine the optimal information-to-metabolic-cost ratios. We find that the optimal input distribution is discrete with only six points of support, both with and without a metabolic constraint. However, we also find that many different input distributions achieve mutual information close to capacity, which implies that the precise structure of the capacity-achieving input is less important than the value of the capacity itself.
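The exact (numerical) capacity computation the abstract refers to can be illustrated with the classical Blahut-Arimoto algorithm. This is a minimal sketch for a generic discrete memoryless channel, not the paper's empirical stimulus-response model; the function name and the test channel are illustrative assumptions:

```python
import numpy as np

def blahut_arimoto(P, tol=1e-10, max_iter=10_000):
    """Capacity (bits per channel use) of a discrete memoryless channel.

    P[x, y] is the probability of output y given input x (rows sum to 1).
    Returns (capacity, capacity-achieving input distribution).
    """
    n_in = P.shape[0]
    p = np.full(n_in, 1.0 / n_in)  # start from a uniform input distribution

    def divergences(p_in):
        # D(P(.|x) || q) in bits for each input x, where q is the
        # output distribution induced by p_in
        q = p_in @ P
        q_safe = np.where(q > 0, q, 1.0)
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(P > 0, P * np.log2(P / q_safe), 0.0)
        return terms.sum(axis=1)

    for _ in range(max_iter):
        d = divergences(p)
        p_new = p * np.exp2(d)         # multiplicative reweighting step
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new

    # Mutual information achieved by the final input distribution
    return float(p @ divergences(p)), p
```

A metabolic constraint of the kind the paper imposes would enter this scheme as a Lagrangian penalty on the expected cost of each input symbol (constrained Blahut-Arimoto), which the sketch above omits.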