Causal Inference in Time Series in Terms of Rényi Transfer Entropy
Status: PubMed-not-MEDLINE. Language: English. Country: Switzerland. Media: electronic
Document type: Journal Article
Grant support: 19-16066S, Czech Science Foundation
PubMed: 35885081
PubMed Central: PMC9321760
DOI: 10.3390/e24070855
PII: e24070855
Keywords: Rényi entropy, Rényi transfer entropy, Rössler system, multivariate time series
Publication type: Journal Article (MeSH)
Uncovering causal interdependencies from observational data is one of the great challenges of nonlinear time series analysis. In this paper, we discuss this topic with the help of an information-theoretic concept known as Rényi's information measure. In particular, we tackle the directional information flow between bivariate time series in terms of Rényi's transfer entropy. We show that by choosing Rényi's parameter α, we can appropriately control the information that is transferred only between selected parts of the underlying distributions. This, in turn, is a particularly potent tool for quantifying causal interdependencies in time series, where knowledge of "black swan" events, such as spikes or sudden jumps, is of key importance. In this connection, we first prove that for Gaussian variables, Granger causality and Rényi transfer entropy are entirely equivalent. Moreover, we partially extend these results to heavy-tailed α-Gaussian variables. These results allow us to establish a connection between autoregressive and Rényi entropy-based information-theoretic approaches to data-driven causal inference. To aid our intuition, we employed the Leonenko et al. entropy estimator and analyzed Rényi's information flow between bivariate time series generated from two unidirectionally coupled Rössler systems. Notably, we find that Rényi's transfer entropy not only allows us to detect the synchronization threshold but also provides non-trivial insight into the structure of the transient regime that exists between the region of chaotic correlations and the synchronization threshold. In addition, from Rényi's transfer entropy we could reliably infer the direction of coupling, and hence causality, only for coupling strengths smaller than the onset value of the transient regime, i.e., when the two Rössler systems are coupled but have not yet entered synchronization.
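The two computational ingredients mentioned in the abstract can be sketched in a few lines of Python: the Leonenko et al. k-nearest-neighbour estimator of the differential Rényi entropy, and a Rényi transfer entropy assembled from differences of joint entropies. This is a minimal illustration, not the paper's implementation (for that, see the pyclits repository cited below): it assumes scalar series, history length 1, and the common four-term entropy combination, whereas the conditional Rényi entropies of Jizba et al. are defined via escort distributions. The function names are ours.

```python
import numpy as np
from scipy.special import gamma
from scipy.spatial import cKDTree


def renyi_entropy_knn(x, alpha, k=5):
    """Leonenko-Pronzato-Savani k-NN estimator of the differential Renyi
    entropy H_alpha (in nats) from samples x of shape (N, m); alpha != 1."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, m = x.shape
    # distance of each point to its k-th nearest neighbour
    # (query index 0 is the point itself, hence k + 1 neighbours)
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    v_m = np.pi ** (m / 2) / gamma(m / 2 + 1)      # volume of the unit m-ball
    c_k = (gamma(k) / gamma(k + 1 - alpha)) ** (1.0 / (1.0 - alpha))
    zeta = (n - 1) * c_k * v_m * rho ** m
    return np.log(np.mean(zeta ** (1.0 - alpha))) / (1.0 - alpha)


def renyi_transfer_entropy(x, y, alpha, k=5):
    """T_{X->Y} with history length 1, estimated as the four-term
    combination of joint Renyi entropies (a common estimation route;
    this sketch does not reproduce the escort-distribution definition)."""
    yf, yp, xp = y[1:], y[:-1], x[:-1]   # future of Y, past of Y, past of X
    h = lambda *cols: renyi_entropy_knn(np.column_stack(cols), alpha, k)
    return h(yf, yp) - h(yp) + h(yp, xp) - h(yf, yp, xp)
```

As a sanity check, for α = 2 and a standard Gaussian sample the estimate should approach the closed form H₂ = ½ln(2π) + ½ln 2 ≈ 1.266 nats, and for a strongly, unidirectionally coupled pair the estimated T in the driving direction exceeds the reverse one.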
References
Schreiber T. Interdisciplinary application of nonlinear time series methods. Phys. Rep. 1999;308:1–64. doi: 10.1016/S0370-1573(98)00035-0. DOI
Kantz H., Schreiber T. Nonlinear Time Series Analysis. Cambridge University Press; Cambridge, UK: 2010.
Pecora L.M., Carroll T.L. Synchronization in chaotic systems. Phys. Rev. Lett. 1990;64:821–824. doi: 10.1103/PhysRevLett.64.821. PubMed DOI
Boccaletti S., Kurths J., Osipov G., Valladares D.L., Zhou C.S. The synchronization of chaotic systems. Phys. Rep. 2002;366:1–101. doi: 10.1016/S0370-1573(02)00137-0. DOI
Quiroga R.Q., Arnhold J., Grassberger P. Learning driver-response relationships from synchronization patterns. Phys. Rev. E. 2000;61:5142–5148. doi: 10.1103/PhysRevE.61.5142. PubMed DOI
Nawrath J., Romano M.C., Thiel M., Kiss I.Z., Wickramasinghe M., Timmer J., Kurths J., Schelter B. Distinguishing Direct from Indirect Interactions in Oscillatory Networks with Multiple Time Scales. Phys. Rev. Lett. 2010;104:038701. doi: 10.1103/PhysRevLett.104.038701. PubMed DOI
Sugihara G., May R., Ye H., Hsieh C., Deyle E., Fogarty M., Munch S. Detecting causality in complex ecosystems. Science. 2012;338:496–500. doi: 10.1126/science.1227079. PubMed DOI
Feldhoff J.H., Donner R.V., Donges J.F., Marwan N., Kurths J. Geometric detection of coupling directions by means of inter-system recurrence networks. Phys. Lett. A. 2012;376:3504–3513. doi: 10.1016/j.physleta.2012.10.008. DOI
Wiener N. In: Modern Mathematics for Engineers. Beckenbach E.F., editor. McGraw-Hill; New York, NY, USA: 1956.
Granger C.W.J. Investigating Causal Relations by Econometric Models and Cross-spectral Methods. Econometrica. 1969;37:424–438. doi: 10.2307/1912791. DOI
Ancona N., Marinazzo D., Stramaglia S. Radial basis function approach to nonlinear Granger causality of time series. Phys. Rev. E. 2004;70:056221. doi: 10.1103/PhysRevE.70.056221. PubMed DOI
Chen Y., Rangarajan G., Feng J., Ding M. Analyzing multiple nonlinear time series with extended Granger causality. Phys. Lett. A. 2004;324:26–35. doi: 10.1016/j.physleta.2004.02.032. DOI
Wismüller A., Souza A.M.D., Vosoughi M.A., Abidin A.Z. Large-scale nonlinear Granger causality for inferring directed dependence from short multivariate time-series data. Sci. Rep. 2021;11:7817. doi: 10.1038/s41598-021-87316-6. PubMed DOI PMC
Zou Y., Romano M., Thiel M., Marwan N., Kurths J. Inferring indirect coupling by means of recurrences. Int. J. Bifurc. Chaos. 2011;21:1099–1111. doi: 10.1142/S0218127411029033. DOI
Donner R.V., Small M., Donges J.F., Marwan N., Zou Y., Xiang R., Kurths J. Recurrence-based time series analysis by means of complex network methods. Int. J. Bifurc. Chaos. 2011;21:1019–1046. doi: 10.1142/S0218127411029021. DOI
Romano M., Thiel M., Kurths J., Grebogi C. Estimation of the direction of the coupling by conditional probabilities of recurrence. Phys. Rev. E. 2007;76:036211. doi: 10.1103/PhysRevE.76.036211. PubMed DOI
Vejmelka M., Paluš M. Inferring the directionality of coupling with conditional mutual information. Phys. Rev. E. 2008;77:026214. doi: 10.1103/PhysRevE.77.026214. PubMed DOI
Paluš M., Krakovská A., Jakubík J., Chvosteková M. Causality, dynamical systems and the arrow of time. Chaos. 2018;28:075307. doi: 10.1063/1.5019944. PubMed DOI
Schreiber T. Measuring Information Transfer. Phys. Rev. Lett. 2000;85:461–464. doi: 10.1103/PhysRevLett.85.461. PubMed DOI
Marschinski R., Kantz H. Analysing the Information Flow Between Financial Time Series. Eur. Phys. J. B. 2002;30:275–281. doi: 10.1140/epjb/e2002-00379-2. DOI
Jizba P., Kleinert H., Shefaat M. Rényi’s information transfer between financial time series. Physica A. 2012;391:2971–2989. doi: 10.1016/j.physa.2011.12.064. DOI
Paluš M., Vejmelka M. Directionality of coupling from bivariate time series: How to avoid false causalities and missed connections. Phys. Rev. E. 2007;75:056211. doi: 10.1103/PhysRevE.75.056211. PubMed DOI
Runge J., Heitzig J., Petoukhov V., Kurths J. Escaping the Curse of Dimensionality in Estimating Multivariate Transfer Entropy. Phys. Rev. Lett. 2012;108:258701. doi: 10.1103/PhysRevLett.108.258701. PubMed DOI
Faes L., Kugiumtzis D., Nollo G., Jurysta F., Marinazzo D. Estimating the decomposition of predictive information in multivariate systems. Phys. Rev. E. 2015;91:032904. doi: 10.1103/PhysRevE.91.032904. PubMed DOI
Sun J., Taylor D., Bollt E.M. Causal Network Inference by Optimal Causation Entropy. SIAM J. Appl. Dyn. Syst. 2015;14:73–106. doi: 10.1137/140956166. DOI
Leonenko N., Pronzato L., Savani V. A class of Rényi information estimators for multidimensional densities. Ann. Stat. 2008;36:2153–2182. doi: 10.1214/07-AOS539. Correction in Ann. Stat. 2010;38:3837–3838. doi: 10.1214/10-AOS773. DOI
Lungarella M., Pitti A., Kuniyoshi Y. Information transfer at multiple scales. Phys. Rev. E. 2007;76:056117. doi: 10.1103/PhysRevE.76.056117. PubMed DOI
Faes L., Nollo G., Stramaglia S., Marinazzo D. Multiscale Granger causality. Phys. Rev. E. 2017;96:042150. doi: 10.1103/PhysRevE.96.042150. PubMed DOI
Paluš M. Multiscale Atmospheric Dynamics: Cross-Frequency Phase-Amplitude Coupling in the Air Temperature. Phys. Rev. Lett. 2014;112:078702. doi: 10.1103/PhysRevLett.112.078702. PubMed DOI
Tsallis C. Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World. Springer; New York, NY, USA: 2009.
Thurner S., Hanel R., Klimek P. Introduction to the Theory of Complex Systems. Oxford University Press; London, UK: 2018.
Rössler O.E. An equation for continuous chaos. Phys. Lett. A. 1976;57:397–398. doi: 10.1016/0375-9601(76)90101-8. DOI
Shannon C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948;27:379–423, 623–656. doi: 10.1002/j.1538-7305.1948.tb01338.x. DOI
Jizba P., Arimitsu T. The world according to Rényi: Thermodynamics of multifractal systems. Ann. Phys. 2004;312:17–59. doi: 10.1016/j.aop.2004.01.002. DOI
Burg J.P. The Relationship Between Maximum Entropy Spectra and Maximum Likelihood Spectra. Geophysics. 1972;37:375–376. doi: 10.1190/1.1440265. DOI
Tsallis C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988;52:479–487. doi: 10.1007/BF01016429. DOI
Havrda J., Charvát F. Quantification Method of Classification Processes: Concept of Structural α-Entropy. Kybernetika. 1967;3:30–35.
Frank T., Daffertshofer A. Exact time-dependent solutions of the Rényi Fokker–Planck equation and the Fokker–Planck equations related to the entropies proposed by Sharma and Mittal. Physica A. 2000;285:352–366. doi: 10.1016/S0378-4371(00)00178-3. DOI
Sharma B.D., Mitter J., Mohan M. On measures of “useful” information. Inf. Control. 1978;39:323–336. doi: 10.1016/S0019-9958(78)90671-X. DOI
Jizba P., Korbel J. On q-non-extensive statistics with non-Tsallisian entropy. Physica A. 2016;444:808–827. doi: 10.1016/j.physa.2015.10.084. DOI
Vos G. Generalized additivity in unitary conformal field theories. Nucl. Phys. B. 2015;899:91–111. doi: 10.1016/j.nuclphysb.2015.07.013. DOI
Rényi A. Probability Theory. North-Holland; Amsterdam, The Netherlands: 1970.
Rényi A. Selected Papers of Alfréd Rényi. 2nd ed. Akademia Kiado; Budapest, Hungary: 1976.
Campbell L.L. A coding theorem and Rényi’s entropy. Inf. Control. 1965;8:423–429. doi: 10.1016/S0019-9958(65)90332-3. DOI
Csiszár I. Generalized cutoff rates and Rényi’s information measures. IEEE Trans. Inform. Theory. 1995;41:26–34. doi: 10.1109/18.370121. DOI
Csiszár I., Shields P.C. Information Theory and Statistics: A Tutorial. Now Publishers Inc.; Boston, MA, USA: 2004.
Aczél J., Daróczy Z. On Measures of Information and Their Characterizations. Academic Press; New York, NY, USA: 1975.
Halsey T.C., Jensen M.H., Kadanoff L.P., Procaccia I., Shraiman B.I. Fractal measures and their singularities: The characterization of strange sets. Phys. Rev. A. 1986;33:1141–1151. doi: 10.1103/PhysRevA.33.1141. PubMed DOI
Mandelbrot B.B. Fractals: Form, Chance and Dimension. W. H. Freeman; San Francisco, CA, USA: 1977.
Bengtsson I., Życzkowski K. Geometry of Quantum States. An Introduction to Quantum Entanglement. Cambridge University Press; Cambridge, UK: 2006.
Jizba P., Korbel J. Maximum Entropy Principle in Statistical Inference: Case for Non-Shannonian Entropies. Phys. Rev. Lett. 2019;122:120601. doi: 10.1103/PhysRevLett.122.120601. PubMed DOI
Jizba P., Korbel J. When Shannon and Khinchin meet Shore and Johnson: Equivalence of information theory and statistical inference axiomatics. Phys. Rev. E. 2020;101:042126. doi: 10.1103/PhysRevE.101.042126. PubMed DOI
Lesche B. Instabilities of Rényi entropies. J. Stat. Phys. 1982;27:419–422. doi: 10.1007/BF01008947. DOI
Rényi A. On measures of entropy and information; Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability; Berkeley, CA, USA. 20–30 June 1961; pp. 547–561.
Jizba P., Ma Y., Hayes A., Dunningham J.A. One-parameter class of uncertainty relations based on entropy power. Phys. Rev. E. 2016;93:060104(R). doi: 10.1103/PhysRevE.93.060104. PubMed DOI
Hentschel H.G.E., Procaccia I. The infinite number of generalized dimensions of fractals and strange attractors. Physica D. 1983;8:435–444. doi: 10.1016/0167-2789(83)90235-X. DOI
Harte D. Multifractals Theory and Applications. Chapman and Hall; New York, NY, USA: 2019.
Latora V., Baranger M. Kolmogorov–Sinai Entropy Rate versus Physical Entropy. Phys. Rev. Lett. 1999;82:520–523. doi: 10.1103/PhysRevLett.82.520. DOI
Jizba P., Korbel J. On the Uniqueness Theorem for Pseudo-Additive Entropies. Entropy. 2017;19:605. doi: 10.3390/e19110605. DOI
Geweke J. Measurement of Linear Dependence and Feedback between Multiple Time Series. J. Am. Stat. Assoc. 1982;77:304–313. doi: 10.1080/01621459.1982.10477803. DOI
Barnett L., Barrett A.B., Seth A.K. Granger Causality and Transfer Entropy are Equivalent for Gaussian Variables. Phys. Rev. Lett. 2009;103:238701. doi: 10.1103/PhysRevLett.103.238701. PubMed DOI
Jizba P., Dunningham J.A., Joo J. Role of information theoretic uncertainty relations in quantum theory. Ann. Phys. 2015;355:87–114. doi: 10.1016/j.aop.2015.01.031. DOI
Seth A.K. A MATLAB toolbox for Granger causal connectivity analysis. J. Neurosci. Methods. 2010;186:262–273. doi: 10.1016/j.jneumeth.2009.11.020. PubMed DOI
Jizba P., Korbel J. Multifractal Diffusion Entropy Analysis: Optimal Bin Width of Probability Histograms. Physica A. 2014;413:438–458. doi: 10.1016/j.physa.2014.07.008. DOI
Kečkić J.D., Vasić P.M. Some inequalities for the gamma function. Publ. De L’Institut Mathématique. 1971;11:107–114.
Fisher R.A., Yates F. Statistical Tables for Biological, Agricultural and Medical Research. 3rd ed. Oliver & Boyd; Edinburgh, UK: 1963.
Matsumoto M., Nishimura T. Mersenne Twister: A 623-Dimensionally Equidistributed Uniform Pseudo-Random Number Generator. ACM Trans. Model. Comput. Simul. 1998;8:3–30. doi: 10.1145/272991.272995. DOI
Theiler J., Eubank S., Longtin A., Galdrikian B., Farmer J.D. Testing for nonlinearity in time series: The method of surrogate data. Physica D. 1992;58:77–94. doi: 10.1016/0167-2789(92)90102-S. DOI
Schreiber T., Schmitz A. Improved Surrogate Data for Nonlinearity Tests. Phys. Rev. Lett. 1996;77:635–638. doi: 10.1103/PhysRevLett.77.635. PubMed DOI
Schreiber T., Schmitz A. Surrogate time series. Physica D. 2000;142:346–382. doi: 10.1016/S0167-2789(00)00043-9. DOI
Paluš M. Advances in Nonlinear Geosciences. Springer International Publishing; Cham, Switzerland: 2018. Linked by Dynamics: Wavelet-Based Mutual Information Rate as a Connectivity Measure and Scale-Specific Networks; pp. 427–463.
Rosenblum M.G., Pikovsky A., Kurths J. Phase Synchronization of Chaotic Oscillators. Phys. Rev. Lett. 1996;76:1804–1807. doi: 10.1103/PhysRevLett.76.1804. PubMed DOI
Cheng A.L., Chen Y.Y. Analyzing the synchronization of Rössler systems—When trigger-and-reinject is equally important as the spiral motion. Phys. Lett. A. 2017;381:3641–3651. doi: 10.1016/j.physleta.2017.09.042. DOI
Rössler O.E. Different Types of Chaos in Two Simple Differential Equations. Z. Naturforsch. A. 1976;31:1664–1670. doi: 10.1515/zna-1976-1231. DOI
Virtanen P., Gommers R., Oliphant T.E., Haberland M., Reddy T., Cournapeau D., Burovski E., Peterson P., Weckesser W., et al. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nat. Methods. 2020;17:261–272. doi: 10.1038/s41592-019-0686-2. PubMed DOI PMC
Hunter J.D. Matplotlib: A 2D Graphics Environment. Comput. Sci. Eng. 2007;9:90–95. doi: 10.1109/MCSE.2007.55. DOI
Harris C.R., Millman K.J., van der Walt S.J., Gommers R., Virtanen P., Cournapeau D., Wieser E., Taylor J., Berg S., Smith N.J., et al. Array programming with NumPy. Nature. 2020;585:357–362. doi: 10.1038/s41586-020-2649-2. PubMed DOI PMC
Use Branch tranfer_entropy. [(accessed on 16 March 2022)]. Available online: https://github.com/jajcayn/pyclits.
Dobrushin R.L. A simplified method of experimentally evaluating the entropy of a stationary sequence. Teor. Veroyatnostei I Ee Primen. 1958;3:462–464. doi: 10.1137/1103036. DOI
Vašíček O. A test for normality based on sample entropy. J. Roy. Stat. Soc. Ser. B Methodol. 1976;38:54–59.
Kaiser A., Schreiber T. Information transfer in continuous processes. Physica D. 2002;166:43–62. doi: 10.1016/S0167-2789(02)00432-3. DOI
Silverman B.W. Density Estimation for Statistics and Data Analysis. Chapman & Hall; London, UK: 1986.
Kraskov A., Stögbauer H., Grassberger P. Estimating mutual information. Phys. Rev. E. 2004;69:066138. doi: 10.1103/PhysRevE.69.066138. PubMed DOI
Frenzel S., Pompe B. Partial Mutual Information for Coupling Analysis of Multivariate Time Series. Phys. Rev. Lett. 2007;99:204101. doi: 10.1103/PhysRevLett.99.204101. PubMed DOI