Shared input and recurrency in neural networks for metabolically efficient information transmission
T. Barta, L. Kostal
Language: English. Country: United States of America.
Document type: journal articles
NLK
Directory of Open Access Journals (from 2005)
Free Medical Journals (from 2005)
Public Library of Science (PLoS) (from 2005)
PubMed Central (from 2005)
Europe PubMed Central (from 2005)
ProQuest Central (from 2005-06-01)
Open Access Digital Library (from 2005-01-01)
Open Access Digital Library (from 2005-06-01)
Open Access Digital Library (from 2005-01-01)
Medline Complete (EBSCOhost) (from 2005-06-01)
Health & Medicine (ProQuest) (from 2005-06-01)
ROAD: Directory of Open Access Scholarly Resources (from 2005)
- MeSH
- Action Potentials physiology MeSH
- Models, Neurological MeSH
- Nerve Net * physiology MeSH
- Synaptic Transmission * physiology MeSH
- Neural Inhibition physiology MeSH
- Neural Networks, Computer MeSH
- Computer Simulation MeSH
- Reproducibility of Results MeSH
- Publication type
- Journal Article MeSH
Shared input to a population of neurons induces noise correlations, which can decrease the information carried by the population activity. Inhibitory feedback in recurrent neural networks can reduce these noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is costly, and the inhibitory feedback decreases the gain of the population, so depolarizing its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic cost, it is unclear whether the increased reliability of information transmission provided by inhibitory feedback compensates for the additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing mutual information under metabolic cost constraints, we show that there is an optimal strength of recurrent connections in the network which maximizes the mutual information per cost. For higher values of input correlation, the mutual information per cost is higher for recurrent networks with inhibitory feedback than for feedforward networks without any inhibitory neurons. Our results therefore show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments, and that decorrelation of the input by inhibitory feedback compensates for the associated increase in metabolic costs.
Citation provided by Crossref.org
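The setup described in the abstract can be sketched in code. The following is a minimal, illustrative Python sketch (not the authors' implementation): a population of leaky integrate-and-fire neurons driven by partially shared input noise, with a global inhibitory feedback whose strength w_inh is the free parameter. All parameter values, the feedback form, and the cost proxy are assumptions made for illustration; the paper's mutual-information estimation and exact metabolic cost function are not reproduced here.

# Illustrative sketch only: LIF population with shared input and tunable
# inhibitory feedback. Parameters are hypothetical, chosen for readability.
import numpy as np

rng = np.random.default_rng(0)

def simulate_population(w_inh, n=50, t_steps=20000, dt=1e-4,
                        tau=0.02, tau_inh=0.01,
                        mu=1.2, sigma=0.5, c_shared=0.3,
                        v_thresh=1.0, v_reset=0.0):
    """Simulate n LIF neurons for t_steps of length dt (seconds).

    c_shared -- fraction of input-noise variance common to all neurons;
                this shared input is what induces noise correlations.
    w_inh    -- strength of a global inhibitory feedback proportional to a
                low-pass filtered population firing rate (w_inh = 0 gives a
                purely feedforward population).
    Returns per-neuron spike counts and a crude metabolic-cost proxy
    (number of spikes plus integrated excitatory drive).
    """
    v = np.zeros(n)          # membrane potentials
    counts = np.zeros(n)     # spike counts per neuron
    rate = 0.0               # filtered population rate (Hz), drives feedback
    cost = 0.0
    for _ in range(t_steps):
        shared = rng.normal()                # common noise source
        private = rng.normal(size=n)         # independent noise per neuron
        xi = np.sqrt(c_shared) * shared + np.sqrt(1.0 - c_shared) * private
        drive = mu - w_inh * rate            # excitation minus inhibitory feedback
        v += (drive - v) * dt / tau + sigma * np.sqrt(dt / tau) * xi
        spiked = v >= v_thresh
        v[spiked] = v_reset
        counts += spiked
        rate += (spiked.mean() / dt - rate) * dt / tau_inh
        cost += spiked.sum() + mu * n * dt   # spikes + excitatory input (toy proxy)
    return counts, cost

# Sweep the feedback strength: stronger inhibition decorrelates the population
# but lowers its gain, so maintaining the firing rate requires more excitatory
# drive; the paper's argument is that an intermediate strength maximizes the
# information transmitted per unit of metabolic cost.
for w in (0.0, 0.005, 0.02):
    counts, cost = simulate_population(w_inh=w)
    print(f"w_inh={w:.3f}  mean spike count={counts.mean():.1f}  cost proxy={cost:.1f}")

Estimating the mutual information between stimulus and population response, as done in the paper, would require repeating such simulations over a stimulus ensemble; the sketch above only shows how shared input, recurrent inhibition, and a cost term enter the simulation.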
- 000  00000naa a2200000 a 4500
- 001  bmc24007067
- 003  CZ-PrNML
- 005  20240423155716.0
- 007  ta
- 008  240412s2024 xxu f 000 0|eng||
- 024 7_  $a 10.1371/journal.pcbi.1011896 $2 doi
- 035 __  $a (PubMed)38394341
- 040 __  $a ABA008 $b cze $d ABA008 $e AACR2
- 041 0_  $a eng
- 044 __  $a xxu
- 100 1_  $a Barta, Tomas $u Laboratory of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic $u Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology, Onna-son, Okinawa, Japan $1 https://orcid.org/0000000204673240
- 245 10  $a Shared input and recurrency in neural networks for metabolically efficient information transmission / $c T. Barta, L. Kostal
- 520 9_  $a Shared input to a population of neurons induces noise correlations, which can decrease the information carried by a population activity. Inhibitory feedback in recurrent neural networks can reduce the noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is costly. This inhibitory feedback decreases the gain of the population. Thus, depolarization of its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic costs, it is unclear whether the increased information transmission reliability provided by inhibitory feedback compensates for the additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing mutual information with metabolic cost constraints, we show that there is an optimal strength of recurrent connections in the network, which maximizes the value of mutual information-per-cost. For higher values of input correlation, the mutual information-per-cost is higher for recurrent networks with inhibitory feedback compared to feedforward networks without any inhibitory neurons. Our results, therefore, show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments and that decorrelation of the input by inhibitory feedback compensates for the associated increased metabolic costs.
- 650 12  $a nervový přenos $x fyziologie $7 D009435
- 650 _2  $a akční potenciály $x fyziologie $7 D000200
- 650 _2  $a reprodukovatelnost výsledků $7 D015203
- 650 _2  $a počítačová simulace $7 D003198
- 650 12  $a nervová síť $x fyziologie $7 D009415
- 650 _2  $a modely neurologické $7 D008959
- 650 _2  $a neuronové sítě $7 D016571
- 650 _2  $a nervový útlum $x fyziologie $7 D009433
- 655 _2  $a časopisecké články $7 D016428
- 700 1_  $a Kostal, Lubomir $u Laboratory of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic $1 https://orcid.org/0000000227086268 $7 xx0098338
- 773 0_  $w MED00008919 $t PLoS computational biology $x 1553-7358 $g Roč. 20, č. 2 (2024), s. e1011896
- 856 41  $u https://pubmed.ncbi.nlm.nih.gov/38394341 $y Pubmed
- 910 __  $a ABA008 $b sig $c sign $y - $z 0
- 990 __  $a 20240412 $b ABA008
- 991 __  $a 20240423155712 $b ABA008
- 999 __  $a ok $b bmc $g 2081209 $s 1216834
- BAS __  $a 3
- BAS __  $a PreBMC-MEDLINE
- BMC __  $a 2024 $b 20 $c 2 $d e1011896 $e 20240223 $i 1553-7358 $m PLoS computational biology $n PLoS Comput Biol $x MED00008919
- LZP __  $a Pubmed-20240412