Neglect of publication bias compromises meta-analyses of educational research
Language: English. Country: United States. Medium: electronic-eCollection.
Document type: journal article; grant-supported research.
PubMed: 34081730
PubMed Central: PMC8174709
DOI: 10.1371/journal.pone.0252415
PII: PONE-D-20-29585
- MeSH
  - Biomedical Research *
  - Humans
  - Meta-Analysis as Topic *
  - Publication Bias *
  - Journalism, Medical (mass media)
- Check Tag
  - Humans
- Publication type
  - Journal Article
  - Research Support (grant-funded work)
Because negative findings are less likely to be published, the available studies tend to be a biased sample, inflating effect size estimates to an unknown degree. To see how meta-analyses in education account for publication bias, we surveyed all meta-analyses published in the last five years in the Review of Educational Research and Educational Research Review. The results show that meta-analyses usually neglect publication bias adjustment. In the minority of meta-analyses that did adjust for bias, mostly non-principled adjustment methods were used, and only rarely were the conclusions based on the corrected estimates, rendering the adjustment inconsequential. It is argued that appropriate, state-of-the-art adjustment (e.g., selection models) should be attempted by default, while taking into account the uncertainty inherent in any meta-analytic inference under publication bias. We conclude with practical recommendations for dealing with publication bias.
Faculty of Education, University of Presov, Presov, Slovakia
Institute of Psychology, Faculty of Arts, University of Presov, Presov, Slovakia
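To make the abstract's call for principled adjustment concrete, below is a minimal sketch of a one-cutpoint selection model (in the spirit of Vevea-Hedges weight-function models), fitted by maximum likelihood in Python. The effect sizes, sampling variances, the 0.025 one-tailed cutpoint, and every variable and function name (y, v, neg_loglik, etc.) are illustrative assumptions made for this sketch, not data or code from the paper.

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical example data (not from the paper): observed effect sizes
# (e.g., standardized mean differences) and their sampling variances.
y = np.array([0.51, 0.20, 0.62, 0.15, 0.70, 0.10, 0.55, 0.44, 0.05, 0.48])
v = np.array([0.04, 0.02, 0.06, 0.01, 0.09, 0.02, 0.05, 0.03, 0.01, 0.07])

ALPHA = 0.025                        # one-tailed significance cutpoint
z_crit = stats.norm.ppf(1 - ALPHA)   # ~1.96
sig = y / np.sqrt(v) > z_crit        # which observed studies are "significant"

def neg_loglik(params):
    """Negative log-likelihood of a one-cutpoint selection model:
    significant studies are published with relative probability 1,
    nonsignificant ones with relative probability delta (0 < delta <= 1)."""
    mu, log_tau2, logit_delta = params
    tau2 = np.exp(log_tau2)                    # between-study variance, kept positive
    delta = 1.0 / (1.0 + np.exp(-logit_delta))
    s = np.sqrt(tau2 + v)                      # total SD of each observed effect
    dens = stats.norm.pdf(y, loc=mu, scale=s)  # density before selection
    # Model-implied probability that a study with variance v_i comes out significant
    p_sig = 1 - stats.norm.cdf(z_crit * np.sqrt(v), loc=mu, scale=s)
    A = p_sig + delta * (1 - p_sig)            # per-study normalizing constant
    w = np.where(sig, 1.0, delta)              # relative publication probability
    return -np.sum(np.log(w) + np.log(dens) - np.log(A))

start = np.array([np.mean(y), np.log(0.01), 0.0])
fit = optimize.minimize(neg_loglik, start, method="Nelder-Mead")
mu_hat = fit.x[0]
tau2_hat = np.exp(fit.x[1])
delta_hat = 1.0 / (1.0 + np.exp(-fit.x[2]))

naive = np.sum(y / v) / np.sum(1 / v)          # unadjusted inverse-variance estimate
print(f"unadjusted estimate:      {naive:.3f}")
print(f"selection-model estimate: {mu_hat:.3f} "
      f"(tau^2 = {tau2_hat:.3f}, delta = {delta_hat:.2f})")
```

Comparing the unadjusted estimate with the bias-adjusted one (and inspecting the estimated relative publication probability delta for nonsignificant results) is the kind of routine sensitivity analysis the abstract argues for; such adjusted estimates carry considerable uncertainty and should be read as a sensitivity check rather than a definitive correction.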