Shared input to a population of neurons induces noise correlations, which can decrease the information carried by the population activity. Inhibitory feedback in recurrent neural networks can reduce these noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is metabolically costly: inhibitory feedback decreases the gain of the population, so depolarizing its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic cost, it is unclear whether the increased reliability of information transmission provided by inhibitory feedback compensates for the additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing the mutual information under metabolic cost constraints, we show that there is an optimal strength of the recurrent connections in the network, which maximizes the mutual information per cost. For higher values of the input correlation, the mutual information per cost is higher for recurrent networks with inhibitory feedback than for feedforward networks without any inhibitory neurons. Our results therefore show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments, and that the decorrelation of the input by inhibitory feedback compensates for the associated increase in metabolic costs.
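The first ingredient of the argument above can be illustrated with a toy sketch (not the authors' full recurrent network): a pair of leaky integrate-and-fire neurons driven by partially shared Gaussian input develops spike-count correlations that grow with the shared fraction. All parameter values below are illustrative assumptions.

```python
import numpy as np

def lif_pair_counts(c, n_trials=400, T=1.0, dt=1e-3, seed=0):
    """Spike counts of two LIF neurons whose input noise has a shared
    fraction c; membrane parameters are illustrative, not fitted."""
    rng = np.random.default_rng(seed)
    tau, v_th, mu, sigma = 0.02, 1.0, 1.1, 0.5
    v = np.zeros((n_trials, 2))
    counts = np.zeros((n_trials, 2))
    for _ in range(int(T / dt)):
        shared = rng.standard_normal((n_trials, 1))
        private = rng.standard_normal((n_trials, 2))
        noise = np.sqrt(c) * shared + np.sqrt(1.0 - c) * private
        v += dt / tau * (mu - v) + sigma * np.sqrt(dt / tau) * noise
        fired = v >= v_th
        counts += fired
        v[fired] = 0.0          # reset after a spike
    return counts

def count_correlation(counts):
    return np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]

rho_shared = count_correlation(lif_pair_counts(c=0.9))
rho_indep = count_correlation(lif_pair_counts(c=0.0))
```

Shared input (c = 0.9) yields clearly positive spike-count correlations, whereas independent input (c = 0) does not; the inhibitory feedback studied in the abstract acts to suppress exactly this kind of correlation, at a metabolic price.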
The apparent stochastic nature of neuronal activity significantly affects the reliability of neuronal coding. To quantify the encountered fluctuations, both in neural data and in simulations, notions of the variability and randomness of inter-spike intervals have been proposed and studied. In this article, we focus on the concept of the instantaneous firing rate, which is likewise based on spike timing. We use several classical statistical models of neuronal activity and study the corresponding probability distributions of the instantaneous firing rate. To characterize the firing-rate variability and randomness under different spiking regimes, we employ several indices of statistical dispersion. We find that the relationship between the variability of inter-spike intervals and that of the instantaneous firing rate is, in general, not straightforward. Counter-intuitively, an increase in the (entropy-based) randomness of spike times may either decrease or increase the randomness of the instantaneous firing rate, depending on the neuronal firing model. Finally, we apply our methods to experimental data, establishing that instantaneous-rate analysis can indeed provide additional information about the spiking activity.
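The non-trivial relationship between interval and rate dispersion can be shown in a small self-contained example (parameter values are assumptions, not fitted to data): for gamma-distributed inter-spike intervals, the dispersion of the instantaneous firing rate, taken here as the reciprocal ISI, differs from the dispersion of the intervals themselves.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gamma-distributed ISIs (shape k = 4): a fairly regular spike train.
k, mean_isi = 4.0, 0.2                    # illustrative values
isi = rng.gamma(shape=k, scale=mean_isi / k, size=200_000)
rate = 1.0 / isi                          # instantaneous firing rate

def cv(x):
    """Coefficient of variation, a standard index of dispersion."""
    return x.std() / x.mean()

cv_isi, cv_rate = cv(isi), cv(rate)
# Theory: the ISI CV is 1/sqrt(k) = 0.5, while the reciprocal ISI is
# inverse-gamma distributed with CV 1/sqrt(k - 2) ~ 0.71 -- the rate is
# *more* dispersed than the intervals, so ISI variability does not
# translate one-to-one into firing-rate variability.
```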
- Publication type
- Journal Article MeSH
The Fano factor, defined as the variance-to-mean ratio of the spike count in a time window, is often used to measure the variability of neuronal spike trains. However, despite its transparent definition, careless use of the Fano factor can easily lead to distorted or even wrong results. One of the problems is the unclear dependence of the Fano factor on the spiking rate, which is often neglected or handled insufficiently. In this paper we explore this problem in more detail and study a possible solution, which is to evaluate the Fano factor in operational time. We use equilibrium renewal and Markov renewal processes as spike-train models to describe the method in detail, and we provide an illustration on experimental data.
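The definition itself is easy to state in code. The sketch below (hypothetical simulated data, not the experimental recordings used in the paper) estimates the Fano factor across trials and recovers the classic benchmark values: close to 1 for Poisson spike trains and well below 1 for a more regular gamma renewal process.

```python
import numpy as np

def fano_factor(spike_trains, t_start, t_stop):
    """Variance-to-mean ratio of spike counts in [t_start, t_stop)."""
    counts = np.array([np.sum((st >= t_start) & (st < t_stop))
                       for st in spike_trains])
    return counts.var(ddof=1) / counts.mean()

def renewal_trains(shape, rate, T, n_trials, rng):
    """Gamma renewal spike trains; shape = 1 gives a Poisson process."""
    trains = []
    for _ in range(n_trials):
        isi = rng.gamma(shape, 1.0 / (shape * rate), size=int(3 * rate * T))
        t = np.cumsum(isi)
        trains.append(t[t < T])
    return trains

rng = np.random.default_rng(2)
ff_poisson = fano_factor(renewal_trains(1.0, 5.0, 10.0, 400, rng), 0.0, 10.0)
ff_gamma = fano_factor(renewal_trains(4.0, 5.0, 10.0, 400, rng), 0.0, 10.0)
```

In the operational-time approach described above, the same computation is applied after transforming the spike times by the cumulative firing rate, which removes the confounding dependence of the Fano factor on the rate.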
- Publication type
- Journal Article MeSH
- MeSH
- Mental Processes physiology MeSH
- Congresses as Topic MeSH
- Humans MeSH
- Nerve Net physiology MeSH
- Neurophysiology * MeSH
- Systems Biology * MeSH
- Animals MeSH
- Check Tag
- Humans MeSH
- Animals MeSH
- Publication type
- Introductory Journal Article MeSH
- Editorial MeSH
In this paper we investigate the rate-coding capabilities of neurons whose input signal consists of alterations of a base state of balanced inhibitory and excitatory synaptic currents. We consider different regimes of the excitation-inhibition relationship and an established conductance-based leaky integrator model with an adaptive threshold, with parameter sets recreating biologically relevant spiking regimes. We find that, for a given mean post-synaptic firing rate, an increased ratio of inhibition to excitation counter-intuitively leads to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron. We quantify the joint effect of the SNR and the dynamic coding range by computing the metabolic efficiency: the maximal amount of information per ATP molecule expended (in bits/ATP). Moreover, by calculating the metabolic efficiency we are able to predict the shapes of the post-synaptic firing-rate histograms, which may be tested on experimental data. Likewise, optimal stimulus input distributions are predicted; however, we show that the optimum can essentially be reached with a broad range of input distributions. Finally, we examine which parameters of the neuronal model used are the most important for metabolically efficient information transfer.
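The notion of information per unit metabolic cost can be illustrated on a much simpler channel than the conductance-based model above. The sketch below (all costs and noise levels are invented for illustration, not the paper's values) grid-searches the firing probability of a binary symmetric channel for the maximum of mutual information per expended cost, recovering the generic prediction that cost constraints favour sparse firing.

```python
import numpy as np

def h2(q):
    """Binary entropy in bits."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

eps = 0.1                            # assumed channel noise (flip probability)
cost_spike, cost_rest = 100.0, 1.0   # assumed ATP costs (arbitrary units)

p = np.linspace(0.01, 0.99, 99)      # probability of the "spike" input
q = p * (1 - eps) + (1 - p) * eps    # output spike probability
info = h2(q) - h2(eps)               # mutual information of the BSC, bits
cost = q * cost_spike + cost_rest    # expected metabolic cost per channel use
efficiency = info / cost             # bits per unit cost

p_opt = p[np.argmax(efficiency)]
```

Information alone is maximized at p = 0.5, but the bits-per-cost optimum lies well below it; the paper carries out the analogous optimization for the full biophysical model, in bits/ATP.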
- MeSH
- Adenosine Triphosphate metabolism MeSH
- Action Potentials physiology MeSH
- Excitatory Postsynaptic Potentials physiology MeSH
- Membrane Potentials physiology MeSH
- Models, Neurological * MeSH
- Neural Conduction physiology MeSH
- Synaptic Transmission physiology MeSH
- Neural Inhibition physiology MeSH
- Neurons physiology MeSH
- Computer Simulation MeSH
- Signal-To-Noise Ratio MeSH
- Computational Biology MeSH
- Animals MeSH
- Check Tag
- Animals MeSH
- Publication type
- Journal Article MeSH
- Research Support MeSH
In order to understand how olfactory stimuli are encoded and processed in the brain, it is important to build a computational model for olfactory receptor neurons (ORNs). Here, we present a simple and reliable mathematical model of a moth ORN generating spikes. The model incorporates a simplified description of the chemical kinetics leading to olfactory receptor activation and action potential generation. We show that an adaptive spike threshold regulated by prior spike history is an effective mechanism for reproducing the typical phasic-tonic time course of ORN responses. Our model reproduces the response dynamics of individual neurons to a fluctuating stimulus that approximates odorant fluctuations in nature. The parameters of the spike threshold are essential for reproducing the response heterogeneity in ORNs. The model provides a valuable tool for efficient simulations of olfactory circuits.
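The adaptive-threshold mechanism can be illustrated with a minimal leaky-integrator sketch (parameter values are illustrative assumptions, not the fitted ORN parameters): each spike raises the threshold, which then relaxes back, turning a step stimulus into the phasic-tonic response described above.

```python
import numpy as np

def adaptive_threshold_lif(stim, dt=1e-4, tau_v=0.02, tau_th=0.2,
                           th_rest=1.0, d_th=0.1):
    """Leaky integrator with a spike-triggered adaptive threshold."""
    v, th = 0.0, th_rest
    spike_times = []
    for i, s in enumerate(stim):
        v += dt / tau_v * (s - v)           # membrane integration
        th += dt / tau_th * (th_rest - th)  # threshold relaxation
        if v >= th:
            spike_times.append(i * dt)
            v = 0.0                          # reset after a spike
            th += d_th                       # spike-triggered increment
    return np.array(spike_times)

# Step "odorant" stimulus: 1 s at a constant suprathreshold level.
spikes = adaptive_threshold_lif(np.full(10_000, 3.0))
n_early = np.sum(spikes < 0.1)    # spikes in the first 100 ms (phasic peak)
n_late = np.sum(spikes >= 0.9)    # spikes in the last 100 ms (tonic plateau)
```

The firing rate decays from an initial phasic peak to a lower tonic level, qualitatively matching the ORN response time course the model is built to reproduce.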
- MeSH
- Action Potentials physiology MeSH
- Models, Biological MeSH
- Olfactory Receptor Neurons drug effects physiology MeSH
- Electrophysiological Phenomena MeSH
- Adaptation, Physiological * MeSH
- Moths physiology MeSH
- Sex Attractants pharmacology MeSH
- Animals MeSH
- Check Tag
- Male MeSH
- Animals MeSH
- Publication type
- Journal Article MeSH
- Research Support MeSH
The efficient coding hypothesis predicts that sensory neurons adjust their coding resources to optimally represent the stimulus statistics of their environment. To test this prediction in the moth olfactory system, we have developed a stimulation protocol that mimics the natural temporal structure within a turbulent pheromone plume. We report that responses of antennal olfactory receptor neurons to pheromone encounters follow the temporal fluctuations in such a way that the most frequent stimulus timescales are encoded with maximum accuracy. We also observe that the average coding precision of the neurons adjusted to the stimulus-timescale statistics at a given distance from the pheromone source is higher than if the same encoding model is applied at a shorter, non-matching, distance. Finally, the coding accuracy profile and the stimulus-timescale distribution are related in the manner predicted by the information theory for the many-to-one convergence scenario of the moth peripheral sensory system.
- MeSH
- Olfactory Receptor Neurons physiology MeSH
- Olfactory Pathways physiology MeSH
- Electrophysiological Phenomena MeSH
- Pheromones physiology MeSH
- Moths physiology MeSH
- Neurons, Afferent physiology MeSH
- Probability MeSH
- Reproducibility of Results MeSH
- Models, Statistical MeSH
- Arthropod Antennae physiology MeSH
- Animals MeSH
- Check Tag
- Male MeSH
- Animals MeSH
- Publication type
- Journal Article MeSH
- Research Support MeSH
The value of Shannon's mutual information is commonly used to describe the total amount of information that the neural code transfers between the ensemble of stimuli and the ensemble of neural responses. In addition, it is often desirable to know which features of the stimulus or response are most informative. The literature offers several different decompositions of the mutual information into its stimulus-specific or response-specific components, such as the specific surprise or the uncertainty reduction, but the number of mutually distinct measures is in fact infinite. We resolve this ambiguity by requiring the specific information measures to be invariant under invertible coordinate transformations of the stimulus and the response ensembles. We prove that the Kullback-Leibler divergence is then the only suitable measure of the specific information. On a more general level, we discuss the necessity and the fundamental aspects of coordinate invariance as a selection principle. We believe that our results will encourage further research into invariant statistical methods for the analysis of neural coding.
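In discrete form, the surviving measure is easy to state: the specific information of stimulus s is the Kullback-Leibler divergence between p(r|s) and p(r), and its stimulus average recovers the mutual information. A minimal numerical check, using an arbitrary made-up joint distribution:

```python
import numpy as np

def specific_information(p_joint):
    """i(s) = D_KL( p(r|s) || p(r) ) in bits; rows index stimuli s."""
    p_s = p_joint.sum(axis=1)
    p_r = p_joint.sum(axis=0)
    p_r_given_s = p_joint / p_s[:, None]
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_r_given_s * np.log2(p_r_given_s / p_r)
    return np.nansum(terms, axis=1)

# Arbitrary 3-stimulus, 4-response joint distribution for illustration.
rng = np.random.default_rng(3)
p_joint = rng.random((3, 4))
p_joint /= p_joint.sum()

i_s = specific_information(p_joint)
p_s = p_joint.sum(axis=1)
mutual_info = float(p_s @ i_s)   # stimulus average of i(s) equals I(S;R)
```

Each i_s is non-negative, a property that not all of the alternative specific-information decompositions mentioned above share.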
- MeSH
- Biophysics MeSH
- Physical Endurance MeSH
- Information Theory * MeSH
- Humans MeSH
- Models, Neurological * MeSH
- Neurons physiology MeSH
- Probability MeSH
- Check Tag
- Humans MeSH
- Publication type
- Journal Article MeSH
- Research Support MeSH
- MeSH
- Humans MeSH
- Brain physiology MeSH
- Neural Networks, Computer * MeSH
- Periodicals as Topic * MeSH
- Check Tag
- Humans MeSH
- Publication type
- Editorial MeSH
Recently, it has been suggested that certain neurons with Poissonian spiking statistics may communicate by discontinuously switching between two levels of firing intensity. Such a situation resembles in many ways the optimal information transmission protocol for the continuous-time Poisson channel known from information theory. In this contribution we employ classical information-theoretic results to analyze the efficiency of such transmission from different perspectives, emphasizing the neurobiological viewpoint. We address both the ultimate limits, in terms of the information capacity under metabolic cost constraints, and the achievable bounds on performance at rates below capacity with a fixed decoding error probability. In doing so, we discuss optimal values of experimentally measurable quantities that can be compared with actual neuronal recordings in a future effort.
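The two-level (ON/OFF) intensity scheme can be made concrete with a small numerical sketch (all rates, the window length, and the cost parameters are assumptions for illustration): the mutual information between the binary intensity level and the Poisson spike count in a window, optimized over the ON duty cycle, with and without a metabolic cost in the denominator.

```python
import math
import numpy as np

def poisson_pmf(mu, ks):
    """Poisson probabilities P(N = k) for k in ks, via log-space evaluation."""
    return np.array([math.exp(-mu + k * math.log(mu) - math.lgamma(k + 1))
                     for k in ks])

ks = np.arange(40)
lam_off, lam_on, T = 0.5, 5.0, 1.0        # assumed intensities (Hz), window (s)
p_off = poisson_pmf(lam_off * T, ks)       # count distribution, OFF level
p_on = poisson_pmf(lam_on * T, ks)         # count distribution, ON level

def info_bits(p):
    """I(X;N) for ON-probability p, in bits."""
    p_n = (1 - p) * p_off + p * p_on
    kl = lambda a: np.sum(a * np.log2(a / p_n))
    return (1 - p) * kl(p_off) + p * kl(p_on)

duty = np.linspace(0.01, 0.99, 99)
info = np.array([info_bits(p) for p in duty])
cost = 1.0 + ((1 - duty) * lam_off + duty * lam_on) * T  # baseline + per-spike cost
p_info = duty[np.argmax(info)]             # best duty cycle, information alone
p_cost = duty[np.argmax(info / cost)]      # best duty cycle, information per cost
```

Adding the metabolic cost shifts the optimal duty cycle toward sparser ON usage, the discrete-window analogue of the capacity-per-cost analysis carried out in the paper.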