Shared input and recurrency in neural networks for metabolically efficient information transmission

PLoS Comput Biol. 2024 Feb;20(2):e1011896. Epub 2024 Feb 23

Language: English. Country: United States of America. Medium: electronic-ecollection

Document type: journal article

Persistent link   https://www.medvik.cz/link/pmid38394341

Links
PubMed 38394341
PubMed Central PMC10917264
DOI 10.1371/journal.pcbi.1011896
PII: PCOMPBIOL-D-23-00763
Knihovny.cz E-resources

Shared input to a population of neurons induces noise correlations, which can decrease the information carried by the population activity. Inhibitory feedback in recurrent neural networks can reduce these noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is costly. Moreover, inhibitory feedback decreases the gain of the population, so depolarizing its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic cost, it is unclear whether the increased reliability of information transmission provided by inhibitory feedback compensates for the additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing mutual information under metabolic cost constraints, we show that there is an optimal strength of recurrent connections in the network that maximizes the mutual information per cost. For higher values of input correlation, the mutual information per cost is higher for recurrent networks with inhibitory feedback than for feedforward networks without any inhibitory neurons. Our results therefore show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments, and that decorrelation of the input by inhibitory feedback compensates for the associated increase in metabolic costs.
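To make the mechanism described in the abstract concrete, the sketch below (not the authors' code; the parameter values, the Gaussian input model with correlation `c`, the rate-proportional global inhibition `w_inh`, and all function names are illustrative assumptions) simulates a population of leaky integrate-and-fire neurons driven by partially shared noise and reports how the mean pairwise noise correlation of spike counts changes as the strength of inhibitory feedback increases:

```python
# Minimal sketch: LIF population with shared input and global inhibitory
# feedback. All parameters are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def simulate(w_inh, n=100, c=0.3, T=5.0, dt=1e-3, tau=0.02,
             mu=1.2, sigma=0.7, v_th=1.0, v_reset=0.0, tau_fb=0.01):
    """Simulate n LIF neurons for T seconds; return boolean spike matrix."""
    steps = int(T / dt)
    v = np.zeros(n)
    fb = 0.0                                 # filtered population rate (Hz)
    spikes = np.zeros((n, steps), dtype=bool)
    for t in range(steps):
        shared = rng.normal()                # common noise -> input correlation c
        private = rng.normal(size=n)
        noise = np.sqrt(c) * shared + np.sqrt(1.0 - c) * private
        # leaky integration; w_inh * fb is the global inhibitory feedback current
        v += dt / tau * (mu - v - w_inh * fb) + sigma * np.sqrt(dt / tau) * noise
        fired = v >= v_th
        v[fired] = v_reset
        spikes[:, t] = fired
        # low-pass filter of the instantaneous population firing rate
        fb += dt / tau_fb * (fired.mean() / dt - fb)
    return spikes

def mean_noise_correlation(spikes, bin_steps=100):
    """Average pairwise Pearson correlation of binned spike counts."""
    n, steps = spikes.shape
    usable = steps // bin_steps * bin_steps
    counts = spikes[:, :usable].reshape(n, -1, bin_steps).sum(axis=2)
    r = np.corrcoef(counts)
    return r[np.triu_indices(n, k=1)].mean()

for w_inh in (0.0, 0.01, 0.03):              # 0.0 = purely feedforward
    s = simulate(w_inh)
    rate_hz = s.mean() / 1e-3                # dt = 1 ms by default
    print(f"w_inh={w_inh:.2f}: rate = {rate_hz:5.1f} Hz, "
          f"mean noise correlation = {mean_noise_correlation(s):.3f}")
```

In this toy setting, raising `w_inh` typically lowers both the population firing rate and the spike-count noise correlation, illustrating the decorrelation mechanism. The paper's actual analysis goes further: it evaluates the mutual information transmitted per metabolic cost and locates the recurrent connection strength that maximizes that ratio.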

