The following bibliography offers a thematically structured introduction to Open Science. The curated selection covers central topics, from replication and statistics through open access and data availability to publication culture and meta-research, so that readers can focus on how Open Science practices are developing and taking effect in different areas of research.

Replication and replicability
Aguinis, H., Cascio, W. F., & Ramani, R. S. (2020).
Science’s reproducibility and replicability crisis: International business is not immune. In L. Eden (Ed.), JIBS Special Collections. Research Methods in International Business (pp. 45–66). Springer International Publishing AG.
https://doi.org/10.1007/978-3-030-22113-3_2
Altmejd, A., Dreber, A., Forsell, E., Huber, J., Imai, T., Johannesson, M., Kirchler, M., Nave, G., & Camerer, C. (2019).
Predicting the replicability of social science lab experiments. PloS One, 14(12), Article e0225826.
https://doi.org/10.1371/journal.pone.0225826
Baumeister, R. F., Tice, D. M., & Bushman, B. J. (2023).
A review of multisite replication projects in social psychology: Is it viable to sustain any confidence in social psychology’s knowledge base? Perspectives on Psychological Science, 18(4), 912–935.
https://doi.org/10.1177/17456916221121815
Brandt, M. J., IJzerman, H., Dijksterhuis, A., Farach, F. J., Geller, J., Giner-Sorolla, R., … & Van’t Veer, A. (2014).
The replication recipe: What makes for a convincing replication? Journal of Experimental Social Psychology, 50, 217–224.
Breznau, N., Rinke, E. M., Wuttke, A., Nguyen, H. H. V., Adem, M., Adriaans, J., Alvarez-Benjumea, A., Andersen, H. K., Auer, D., Azevedo, F., Bahnsen, O., Balzer, D., Bauer, G., Bauer, P. C., Baumann, M., Baute, S., Benoit, V., Bernauer, J., Berning, C., . . . Żółtak, T. (2022).
Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty. Proceedings of the National Academy of Sciences, 119(44), Article e2203150119.
https://doi.org/10.1073/pnas.2203150119
Brodeur, A., Dreber, A., Hoces de la Guardia, F., & Miguel, E. (2023).
Replication games: How to make reproducibility research more systematic. Nature, 621(7980), 684–686.
https://doi.org/10.1038/d41586-023-02997-5
Calderon, S., Mac Giolla, E., Ask, K., & Luke, T. J. (2023).
Effects of psychological distance on mental abstraction: A registered report of four tests of construal level theory (Stage 1 registered report). Advances in Methods and Practices in Psychological Science. Manuscript accepted in principle.
https://doi.org/10.31234/osf.io/wqbhd
Crandall, C. S., & Sherman, J. W. (2016).
On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93–99.
Davis, A. M., Flicker, B., Hyndman, K., Katok, E., Keppler, S., Leider, S., Long, X., & Tong, J. D. (2023).
A replication study of operations management experiments in Management Science. Management Science, 69(9), 4977–4991.
https://doi.org/10.1287/mnsc.2023.4866
Derksen, M., & Morawski, J. (2022).
Kinds of replication: Examining the meanings of “conceptual replication” and “direct replication”. Perspectives on Psychological Science, 17(5), 1490–1505.
Duckworth, A. L., & Milkman, K. L. (2022).
A guide to megastudies. PNAS Nexus, 1(5), Article pgac214.
https://doi.org/10.1093/pnasnexus/pgac214
Ebersole, C. R., Atherton, O. E., Belanger, A. L., Skulborstad, H. M., Allen, J. M., Banks, J. B., Baranski, E., Bernstein, M. J., Bonfiglio, D. B., Boucher, L., Brown, E. R., Budiman, N. I., Cairo, A. H., Capaldi, C. A., Chartier, C. R., Chung, J. M., Cicero, D. C., Coleman, J. A., Conway, J. G., . . . Nosek, B. A. (2016).
Many Labs 3: Evaluating participant pool quality across the academic semester via replication. Journal of Experimental Social Psychology, 67, 68–82.
https://doi.org/10.1016/j.jesp.2015.10.012
Fiedler, K., & Trafimow, D. (2024).
Using theoretical constraints and the TASI taxonomy to delineate predictably replicable findings.
Psychonomic Bulletin & Review. Advance online publication.
https://doi.org/10.3758/s13423-024-02521-4
Freese, J., & Peterson, D. (2017).
Replication in social science.
Annual Review of Sociology, 43, 147–165.
Galak, J., Leboeuf, R. A., Nelson, L. D., & Simmons, J. P. (2012).
Correcting the past: Failures to replicate ψ.
Journal of Personality and Social Psychology, 103(6), 933–948.
https://doi.org/10.1037/a0029709
Genschow, O., Westfal, M., Crusius, J., Bartosch, L., Feikes, K. I., Pallasch, N., & Wozniak, M. (2021).
Does social psychology persist over half a century? A direct replication of Cialdini et al.’s (1975) classic door-in-the-face technique.
Journal of Personality and Social Psychology, 120(2), e1.
Gilbert, D. T., King, G., Pettigrew, S., & Wilson, T. D. (2016).
Comment on “Estimating the reproducibility of psychological science”.
Science, 351(6277), 1037.
https://doi.org/10.1126/science.aad7243
IJzerman, H., Ropovik, I., Ebersole, C. R., Tidwell, N. D., Markiewicz, Ł., Lima, T. J. S. de, Wolf, D., Novak, S. A., Collins, W. M., Menon, M., Souza, L. E. C. de, Sawicki, P., Boucher, L., Białek, M., Idzikowska, K., Razza, T. S., Kraus, S., Weissgerber, S. C., Baník, G., . . . Day, C. R. (2020).
Many Labs 5: Registered replication of Förster, Liberman, and Kuschel’s (2008) Study 1.
Advances in Methods and Practices in Psychological Science, 3(3), 366–376.
https://doi.org/10.1177/2515245920916513
Klein, R. A., Cook, C. L., Ebersole, C. R., Vitiello, C., Nosek, B. A., Hilgard, J., Ahn, P. H., Brady, A. J., Chartier, C. R., Christopherson, C. D., Clay, S., Collisson, B., Crawford, J. T., Cromar, R., Gardiner, G., Gosnell, C. L., Grahe, J., Hall, C., Howard, I., . . . Ratliff, K. A. (2022).
Many Labs 4: Failure to replicate mortality salience effect with and without original author involvement.
Collabra: Psychology, 8(1), Article 35271.
https://doi.org/10.1525/collabra.35271
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahník, Š., Bernstein, M. J., Bocian, K., Brandt, M. J., Brooks, B., Brumbaugh, C. C., Cemalcilar, Z., Chandler, J., Cheong, W., Davis, W. E., Devos, T., Eisner, M., Frankowska, N., Furrow, D., Galliani, E. M., . . . Nosek, B. A. (2014).
Investigating variation in replicability.
Social Psychology, 45(3), 142–152.
https://doi.org/10.1027/1864-9335/a000178
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., Aveyard, M., Axt, J. R., Babalola, M. T., Bahník, Š., Batra, R., Berkics, M., Bernstein, M. J., Berry, D. R., Bialobrzeska, O., Binan, E. D., Bocian, K., Brandt, M. J., Busching, R., . . . Nosek, B. A. (2018).
Many Labs 2: Investigating variation in replicability across samples and settings.
Advances in Methods and Practices in Psychological Science, 1(4), 443–490.
https://doi.org/10.1177/2515245918810225
Machery, E. (2020).
What is a Replication?
Philosophy of Science, 87, 545–567.
Maier, M., Wong, Y. C., & Feldman, G. (2024).
Revisiting and rethinking the identifiable victim effect: Replication and extension of Small, Loewenstein, and Slovic (2007).
Collabra: Psychology, 9(1), Article 90203.
https://doi.org/10.1525/collabra.90203
O’Donnell, M., Dev, A. S., Antonoplis, S., Baum, S. M., Benedetti, A. H., Brown, N. D., Carrillo, B., Choi, A. L., Connor, P., Donnelly, K., Ellwood-Lowe, M. E., Foushee, R., Jansen, R., Jarvis, S. N., Lundell-Creagh, R., Ocampo, J. M., Okafor, G. N., Azad, Z. R., Rosenblum, M., . . . Nelson, L. D. (2021).
Empirical audit and review and an assessment of evidentiary value in research on the psychological consequences of scarcity.
Proceedings of the National Academy of Sciences, 118(44), Article e2103313118.
https://doi.org/10.1073/pnas.2103313118
Open Science Collaboration. (2015).
Estimating the reproducibility of psychological science.
Science, 349(6251), aac4716.
https://doi.org/10.1126/science.aac4716
Röseler, L., Kaiser, L., Doetsch, C., Klett, N., Seida, C., Schütz, A., Aczel, B., Adelina, N., Agostini, V., Alarie, S., Albayrak-Aydemir, N., AlDoh, A., Al-Hoorie, A. H., Azevedo, F., Baker, B. J., Barth, C. L., Beitner, J., Brick, C., Brohmer, H., . . . Zhang, Y. (2024).
The Replication Database: Documenting the replicability of psychological science.
Journal of Open Psychology Data, 12(1), 8.
https://doi.org/10.5334/jopd.101
Simons, D. J. (2014).
The value of direct replication.
Perspectives on Psychological Science, 9, 76–80.
Stroebe, W., & Strack, F. (2014).
The alleged crisis and the illusion of exact replication.
Perspectives on Psychological Science, 9, 59–71.
Tipu, S. A. A., & Ryan, J. C. (2022).
Are business and management journals anti-replication? An analysis of editorial policies.
Management Research Review, 45(1), 101–117.
https://doi.org/10.1108/MRR-01-2021-0050
Vohs, K. D., Schmeichel, B. J., Lohmann, S., Gronau, Q. F., Finley, A. J., Ainsworth, S. E., Alquist, J. L., Baker, M. D., Brizi, A., Bunyi, A., Butschek, G. J., Campbell, C., Capaldi, J., Cau, C., Chambers, H., Chatzisarantis, N. L. D., Christensen, W. J., Clay, S. L., Curtis, J., . . . Albarracín, D. (2021).
A multisite preregistered paradigmatic test of the ego-depletion effect.
Psychological Science, 32(10), 1566–1581.
https://doi.org/10.1177/0956797621989733
Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, M. B. (2018).
Making replication mainstream.
Behavioral and Brain Sciences, 41, e120.

Methods, statistics and p-hacking
Brodeur, A., Cook, N., & Heyes, A. (2022).
We need to talk about Mechanical Turk: What 22,989 hypothesis tests tell us about publication bias and p-hacking in online experiments (IZA Discussion Paper No. 15478).
https://doi.org/10.2139/ssrn.4188289
Hou, K., Xue, C., & Zhang, L. (2020).
Replicating anomalies.
The Review of Financial Studies, 33(5), 2019–2133.
https://doi.org/10.1093/rfs/hhy131
Isager, P. M., van Aert, R. C. M., Bahník, Š., Brandt, M. J., DeSoto, K. A., Giner-Sorolla, R., Krueger, J. I., Perugini, M., Ropovik, I., van’t Veer, A. E., Vranka, M., & Lakens, D. (2023).
Deciding what to replicate: A decision model for replication study selection under resource and knowledge constraints.
Psychological Methods, 28(2), 438–451.
https://doi.org/10.1037/met0000438
John, L. K., Loewenstein, G., & Prelec, D. (2012).
Measuring the prevalence of questionable research practices with incentives for truth telling.
Psychological Science, 23(5), 524–532.
https://doi.org/10.1177/0956797611430953
Sarstedt, M., & Adler, S. J. (2023).
An advanced method to streamline p-hacking.
Journal of Business Research, 163, Article 113942.
https://doi.org/10.1016/j.jbusres.2023.113942
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011).
False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant.
Psychological Science, 22(11), 1359–1366.
https://doi.org/10.1177/0956797611417632
Wagenmakers, E.‑J., Sarafoglou, A., Aarts, S., Albers, C., Algermissen, J., Bahník, Š., van Dongen, N., Hoekstra, R., Moreau, D., van Ravenzwaaij, D., Sluga, A., Stanke, F., Tendeiro, J., & Aczel, B. (2021).
Seven steps toward more transparency in statistical practice.
Nature Human Behaviour, 5(11), 1473–1480.
https://doi.org/10.1038/s41562-021-01211-8

Preregistration and Registered Reports
Chambers, C. D., & Tzavella, L. (2022).
The past, present and future of Registered Reports.
Nature Human Behaviour, 6(1), 29–42.
https://doi.org/10.1038/s41562-021-01193-7
Fraser, N., Momeni, F., Mayr, P., & Peters, I. (2020).
The relationship between bioRxiv preprints, citations and altmetrics.
Quantitative Science Studies, 1(2), 618–638.
https://doi.org/10.1162/qss_a_00043
Hardwicke, T. E., & Ioannidis, J. P. A. (2018).
Mapping the universe of registered reports.
Nature Human Behaviour, 2(11), 793–796.
https://doi.org/10.1038/s41562-018-0444-y
Hardwicke, T. E., & Wagenmakers, E.‑J. (2023).
Reducing bias, increasing transparency and calibrating confidence with preregistration.
Nature Human Behaviour, 7(1), 15–26.
https://doi.org/10.1038/s41562-022-01497-2
Henderson, E. L., & Chambers, C. D. (2022).
Ten simple rules for writing a Registered Report.
PLoS Computational Biology, 18(10), Article e1010571.
https://doi.org/10.1371/journal.pcbi.1010571
Kvarven, A., Strømland, E., & Johannesson, M. (2020).
Comparing meta-analyses and preregistered multiple-laboratory replication projects.
Nature Human Behaviour, 4, 423–434.
https://doi.org/10.1038/s41562-019-0787-z
Lakens, D. (2024).
When and how to deviate from a preregistration.
Collabra: Psychology, 10(1), Article 117094.
https://doi.org/10.1525/collabra.117094
Mellor, D. T., & Nosek, B. A. (2018).
Easy preregistration will benefit any research.
Nature Human Behaviour, 2, Article 98.
https://doi.org/10.1038/s41562-018-0294-7
Nosek, B. A., Beck, E. D., Campbell, L., Flake, J. K., Hardwicke, T. E., Mellor, D. T., van’t Veer, A. E., & Vazire, S. (2019).
Preregistration is hard, and worthwhile.
Trends in Cognitive Sciences, 23(10), 815–818.
https://doi.org/10.1016/j.tics.2019.07.009
Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018).
The preregistration revolution.
Proceedings of the National Academy of Sciences, 115(11), 2600–2606.
https://doi.org/10.1073/pnas.1708274114
Nosek, B. A., & Lakens, D. (2014).
Registered reports.
Social Psychology, 45(3), 137–141.
https://doi.org/10.1027/1864-9335/a000192
Ofosu, G. K., & Posner, D. N. (2020).
Do pre-analysis plans hamper publication?
AEA Papers and Proceedings, 110, 70–74.
https://doi.org/10.1257/pandp.20201079
Ofosu, G. K., & Posner, D. N. (2023).
Pre-analysis plans: An early stocktaking.
Perspectives on Politics, 21(1), 174–190.
https://doi.org/10.1017/S1537592721000931
Pham, M. T., & Oh, T. T. (2021a).
On not confusing the tree of trustworthy statistics with the greater forest of good science: A comment on Simmons et al.’s perspective on pre‐registration.
Journal of Consumer Psychology, 31(1), 181–185.
https://doi.org/10.1002/jcpy.1213
Pham, M. T., & Oh, T. T. (2021b).
Preregistration is neither sufficient nor necessary for good science.
Journal of Consumer Psychology, 31(1), 163–176.
https://doi.org/10.1002/jcpy.1209
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2021a).
Pre‐registration is a game changer. But, like random assignment, it is neither necessary nor sufficient for credible science.
Journal of Consumer Psychology, 31(1), 177–180.
https://doi.org/10.1002/jcpy.1207
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2021b).
Pre‐registration: Why and how.
Journal of Consumer Psychology, 31(1), 151–162.
https://doi.org/10.1002/jcpy.1208
van den Akker, O. R., Bakker, M., van Assen, M. A. L. M., Pennington, C. R., Verweij, L., Elsherif, M. M., Claesen, A., Gaillard, S. D. M., Yeung, S. K., Frankenberger, J.‑L., Krautter, K., Cockcroft, J. P., Kreuer, K. S., Evans, T. R. [Thomas Rhys], Heppel, F. M., Schoch, S. F., Korbmacher, M., Yamada, Y., Albayrak-Aydemir, N., . . . Wicherts, J. M. (2024).
The potential of preregistration in psychology: Assessing preregistration producibility and preregistration-study consistency.
Psychological Methods. Advance online publication.
https://doi.org/10.1037/met0000687
van’t Veer, A. E., & Giner-Sorolla, R. (2016).
Pre-registration in social psychology—A discussion and suggested template.
Journal of Experimental Social Psychology, 67, 2–12.
https://doi.org/10.1016/j.jesp.2016.03.004

Open Access
Dallmeier-Tiessen, S., Darby, R., Goerner, B., Hyppoelae, J., Igo-Kemenes, P., Kahn, D., Lambert, S., Lengenfelder, A., Leonard, C., Mele, S., Nowicka, M., Polydoratou, P., Ross, D., Ruiz-Perez, S., Schimmer, R., Swaisland, M., & van der Stelt, W. (2011).
Open access journals – what publishers offer, what researchers want.
Information Services & Use, 31(1-2), 85–91.
https://doi.org/10.3233/ISU-2011-0624
Greussing, E., Kuballa, S., Taddicken, M., Schulze, M., Mielke, C., & Haux, R. (2020).
Drivers and obstacles of open access publishing. A qualitative investigation of individual and institutional factors.
Frontiers in Communication, 5, Article 587465.
https://doi.org/10.3389/fcomm.2020.587465
Shah, D. T. (2017).
Open access publishing: Pros, cons, and current threats.
Marshall Journal of Medicine, 3(3), 1.

Data availability and policy
Asswad, J., & Gómez, J. M. (2021).
Data ownership: A survey.
Information, 12(11), 465.
https://doi.org/10.3390/info12110465
Colavizza, G., Hrynaszkiewicz, I., Staden, I., Whitaker, K., & McGillivray, B. (2020).
The citation advantage of linking publications to research data.
PloS One, 15(4), Article e0230416.
https://doi.org/10.1371/journal.pone.0230416
Golder, P. N., Dekimpe, M. G., An, J. T., van Heerde, H. J., Kim, D. S., & Alba, J. W. (2023).
Learning from data: An empirics-first approach to relevant knowledge generation.
Journal of Marketing, 87(3), 319–336.
https://doi.org/10.1177/00222429221129200
Gomes, D. G. E., Pottier, P., Crystal-Ornelas, R., Hudgins, E. J., Foroughirad, V., Sánchez-Reyes, L. L., Turba, R., Martinez, P. A., Moreau, D., Bertram, M. G., Smout, C. A., & Gaynor, K. M. (2022).
Why don’t we share data and code? Perceived barriers and benefits to public archiving practices.
Proceedings. Biological Sciences, 289(1987), Article 20221113.
https://doi.org/10.1098/rspb.2022.1113
Perrier, L., Blondal, E., & MacDonald, H. (2020).
The views, perspectives, and experiences of academic researchers with data sharing and reuse: A meta-synthesis.
PloS One, 15(2), Article e0229182.
https://doi.org/10.1371/journal.pone.0229182
Rouder, J. N. (2016).
The what, why, and how of born-open data.
Behavior Research Methods, 48(3), 1062–1069.
https://doi.org/10.3758/s13428-015-0630-z
Soeharjono, S., & Roche, D. G. (2021).
Reported individual costs and benefits of sharing open data among Canadian academic faculty in ecology and evolution.
BioScience, 71(7), 750–756.
https://doi.org/10.1093/biosci/biab024

Peer review and publication culture
Berg, J. M., Bhalla, N., Bourne, P. E., Chalfie, M., Drubin, D. G., Fraser, J. S., Greider, C. W., Hendricks, M., Jones, C., Kiley, R., King, S., Kirschner, M. W., Krumholz, H. M., Lehmann, R., Leptin, M., Pulverer, B., Rosenzweig, B., Spiro, J. E., Stebbins, M., . . . Wolberger, C. (2016).
Preprints for the life sciences.
Science, 352(6288), 899–901.
https://doi.org/10.1126/science.aaf9133
Bravo, G., Grimaldo, F., López-Iñesta, E., Mehmani, B., & Squazzoni, F. (2019).
The effect of publishing peer review reports on referee behavior in five scholarly journals.
Nature Communications, 10, Article 322.
https://doi.org/10.1038/s41467-018-08250-2
Dougherty, M. R., & Horne, Z. (2022).
Citation counts and journal impact factors do not capture some indicators of research quality in the behavioural and brain sciences.
Royal Society Open Science, 9(8), Article 220334.
https://doi.org/10.1098/rsos.220334
Kidwell, M. C., Lazarević, L. B., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L.‑S., Kennett, C., Slowik, A., Sonnleitner, C., Hess-Holden, C., Errington, T. M., Fiedler, S., & Nosek, B. A. (2016).
Badges to acknowledge open practices: A simple, low-cost, effective method for increasing transparency.
PLoS Biology, 14(5), Article e1002456.
https://doi.org/10.1371/journal.pbio.1002456
Moshontz, H., Binion, G., Walton, H., Brown, B. T., & Syed, M. (2021).
A guide to posting and managing preprints.
Advances in Methods and Practices in Psychological Science, 4(2).
https://doi.org/10.1177/25152459211019948
Sarabipour, S., Debat, H. J., Emmott, E., Burgess, S. J., Schwessinger, B., & Hensel, Z. (2019).
On the value of preprints: An early career researcher perspective.
PLoS Biology, 17(2), Article 3000151.
https://doi.org/10.1371/journal.pbio.3000151
Walsh, E., Rooney, M., Appleby, L., & Wilkinson, G. (2000).
Open peer review: A randomised controlled trial.
The British Journal of Psychiatry, 176(1), 47–51.
https://doi.org/10.1192/bjp.176.1.47
Wolfram, D., Wang, P., Hembree, A., & Park, H. (2020).
Open peer review: Promoting transparency in Open Science.
Scientometrics, 125(2), 1033–1051.
https://doi.org/10.1007/s11192-020-03488-4
Zong, Q., Huang, Z., & Huang, J. (2023).
Do Open Science badges work? Estimating the effects of Open Science badges on an article’s social media attention and research impacts.
Scientometrics, 128(6), 3627–3648.
https://doi.org/10.1007/s11192-023-04720-7

Meta-research and policy
Bahlai, C., Bartlett, L., Burgio, K., Fournier, A., Keiser, C., Poisot, T., & Whitney, K. (2019).
Open science isn’t always open to all scientists.
American Scientist, 107(2), 78.
https://doi.org/10.1511/2019.107.2.78
Carlsson, R., Danielsson, H., Heene, M., Innes-Ker, Å., Lakens, D., Schimmack, U., Schönbrodt, F. D., van Assen, M., & Weinstein, Y. (2017).
Inaugural editorial of Meta-Psychology.
Meta-Psychology, 1, Article a1001.
https://doi.org/10.15626/MP2017.1001
Ioannidis, J. P. A. (2005).
Why most published research findings are false.
PLoS Medicine, 2(8), Article e124.
https://doi.org/10.1371/journal.pmed.0020124
Korbmacher, M., Azevedo, F., Pennington, C. R., Hartmann, H., Pownall, M., Schmidt, K., Elsherif, M. M., Breznau, N., Robertson, O., Kalandadze, T., Yu, S., Baker, B. J., O’Mahony, A., Olsnes, J. Ø.‑S., Shaw, J. J., Gjoneska, B., Yamada, Y., Röer, J. P. [Jan P.], Murphy, J., . . . Evans, T. (2023).
The replication crisis has led to positive structural, procedural, and community changes.
Communications Psychology, 1, Article 3.
https://doi.org/10.1038/s44271-023-00003-2
Lakens, D., & Ensinck, E. N. F. (2024).
Make abandoned research publicly available.
Nature Human Behaviour, 8, 609–610.
https://doi.org/10.1038/s41562-024-01829-4
Leavitt, K. (2013).
Publication bias might make us untrustworthy, but the solutions may be worse.
Industrial and Organizational Psychology, 6(3), 290–295.
https://doi.org/10.1111/iops.12052
Lehmann, S., & Bengart, P. (2016).
Replications hardly possible: Reporting practice in top-tier marketing journals.
Journal of Modelling in Management, 11(2), 427–445.
https://doi.org/10.1108/JM2-04-2014-0030
Maier, M., Bartoš, F., Stanley, T. D., Shanks, D. R., Harris, A. J. L., & Wagenmakers, E.‑J. (2022).
No evidence for nudging after adjusting for publication bias.
Proceedings of the National Academy of Sciences, 119(31), Article e2200300119.
https://doi.org/10.1073/pnas.2200300119
Oberauer, K., & Lewandowsky, S. (2019).
Addressing the theory crisis in psychology.
Psychonomic Bulletin & Review, 26(5), 1596–1618.
https://doi.org/10.3758/s13423-019-01645-2
Pashler, H., & Wagenmakers, E.‑J. (2012).
Editors’ introduction to the special section on replicability in Psychological Science: A crisis of confidence?
Perspectives on Psychological Science, 7(6), 528–530.
https://doi.org/10.1177/1745691612465253
Schönbrodt, F., Gärtner, A., Frank, M., Gollwitzer, M., Ihle, M., Mischkowski, D., … & Leising, D. (2022).
Responsible Research Assessment I: Implementing DORA for hiring and promotion in psychology.
https://doi.org/10.23668/psycharchives.8162
Skubera, M., Korbmacher, M., Evans, T. R., Azevedo, F., & Pennington, C. R. (2025).
International initiatives to enhance awareness and uptake of open research in psychology: A systematic mapping review.
Royal Society Open Science, 12(3), Article 241726.
https://doi.org/10.1098/rsos.241726
Tipu, S. A. A., & Ryan, J. C. (2022).
Are business and management journals anti-replication? An analysis of editorial policies.
Management Research Review, 45(1), 101–117.
https://doi.org/10.1108/MRR-01-2021-0050
Vazire, S. (2018).
Implications of the credibility revolution for productivity, creativity, and progress.
Perspectives on Psychological Science, 13(4), 411–417.
https://doi.org/10.1177/1745691617751884

On the effectiveness of Open Science practices
Hardwicke, T. E., Thibault, R. T., Clarke, B., Moodie, N., Crüwell, S., Schiavone, S. R., Handcock, S. A., Nghiem, K. an, Mody, F., Eerola, T., & Vazire, S. (2024).
Prevalence of transparent research practices in psychology: A cross-sectional study of empirical articles published in 2022.
Advances in Methods and Practices in Psychological Science, 7(4).
https://doi.org/10.1177/25152459241283477
Ofosu, G. K., & Posner, D. N. (2023).
Pre-analysis plans: An early stocktaking.
Perspectives on Politics, 21(1), 174–190.
https://doi.org/10.1017/S1537592721000931
Soderberg, C. K., Errington, T. M., Schiavone, S. R., Bottesini, J., Thorn, F. S., Vazire, S., Esterling, K. M., & Nosek, B. A. (2021).
Initial evidence of research quality of registered reports compared with the standard publishing model.
Nature Human Behaviour, 5(8), 990–997.
https://doi.org/10.1038/s41562-021-01142-4
Van Vaerenbergh, Y., Hazée, S., & Zwienenberg, T. J. (2025).
Open science: A review of its effectiveness and implications for service research.
Journal of Service Research. Advance online publication.
https://doi.org/10.1177/1094670525133

Open Science and transparency initiatives
Adler, S. J., Röseler, L., & Schöniger, M. K. (2023).
A toolbox to evaluate the trustworthiness of published findings.
Journal of Business Research, 167, Article 114189.
https://doi.org/10.1016/j.jbusres.2023.114189
Adler, S. J., Sharma, P. N., & Radomir, L. (2023).
Toward Open Science in PLS-SEM: Assessing the state of the art and future perspectives.
Journal of Business Research, 169, Article 114291.
https://doi.org/10.1016/j.jbusres.2023.114291
Claesen, A., Gomes, S., Tuerlinckx, F., & Vanpaemel, W. (2021).
Comparing dream to reality: An assessment of adherence of the first generation of preregistered studies.
Royal Society Open Science, 8(10), Article 211037.
https://doi.org/10.1098/rsos.211037
Coles, N. A., Hamlin, J. K., Sullivan, L. L., Parker, T. H., & Altschul, D. (2022).
Build up big-team science.
Nature, 601(7894), 505–507.
https://doi.org/10.1038/d41586-022-00150-2
Deer, L., Adler, S., Datta, H., Mizik, N., & Sarstedt, M. (2025).
Toward Open Science in marketing research.
International Journal of Research in Marketing, 42(1), 212–233.
https://doi.org/10.1016/j.ijresmar.2024.12.005
Forscher, P. S., Wagenmakers, E.‑J., Coles, N. A., Silan, M. A., Dutra, N., Basnight-Brown, D., & IJzerman, H. (2022).
The benefits, barriers, and risks of big-team science.
Perspectives on Psychological Science, 18(3), 607–623.
https://doi.org/10.1177/17456916221082970
Guzzo, R. A., Schneider, B., & Nalbantian, H. R. (2022).
Open science, closed doors: The perils and potential of Open Science for research in practice.
Industrial and Organizational Psychology, 15(4), 495–515.
https://doi.org/10.1017/iop.2022.61
Hostler, T. J. (2023).
The invisible workload of open research.
Journal of Trial and Error, 4(1).
https://doi.org/10.36850/mr5
Parsons, S., Azevedo, F., Elsherif, M. M., Guay, S., Shahim, O. N., Govaart, G. H., Norris, E., O’Mahony, A., Parker, A. J., Todorovic, A., Pennington, C. R., Garcia-Pelegrin, E., Lazić, A., Robertson, O., Middleton, S. L., Valentini, B., McCuaig, J., Baker, B. J., Collins, E., . . . Aczel, B. (2022).
A community-sourced glossary of open scholarship terms.
Nature Human Behaviour, 6(3), 312–318.
https://doi.org/10.1038/s41562-021-01269-4
Pennington, C. R. (2023).
A student’s guide to Open Science: Using the replication crisis to reform psychology.
Open University Press.
Ross-Hellauer, T. (2017).
What is open peer review? A systematic review.
F1000Research, 6, 588.
https://doi.org/10.12688/f1000research.11369.2
Ross-Hellauer, T., & Görögh, E. (2019).
Guidelines for open peer review implementation.
Research Integrity and Peer Review, 4, 4.
https://doi.org/10.1186/s41073-019-0063-9
Sarstedt, M., Adler, S. J., Ringle, C. M., Cho, G., Diamantopoulos, A., Hwang, H., & Liengaard, B. D. (2024).
Same model, same data, but different outcomes: Evaluating the impact of method choices in structural equation modeling.
Journal of Product Innovation Management. Advance online publication.
https://doi.org/10.1111/jpim.12738
Siegfried, D., Scherp, G., Linek, S., & Flieger, E. (2024).
Die Bedeutung von Open Science in den Wirtschaftswissenschaften. Eine empirische Untersuchung der ZBW Leibniz-Informationszentrum Wirtschaft [The significance of Open Science in business and economics: An empirical study by the ZBW Leibniz Information Centre for Economics].
https://hdl.handle.net/10419/303026
Soliman, M., Sarstedt, M., Adler, S. J., et al. (2025).
A tale of Open Science: Emergence of a new normal.
Schmalenbach Journal of Business Research.
https://doi.org/10.1007/s41471-025-00218-5
Spellman, B., Gilbert, E., & Corker, K. S. (2017).
Open science: What, why, and how.
PsyArXiv.
https://osf.io/preprints/psyarxiv/ak6jr
Tennant, J. P., Waldner, F., Jacques, D. C., Masuzzo, P., Collister, L. B., & Hartgerink, C. H. J. (2016).
The academic, economic and societal impacts of Open Access: An evidence-based review [version 3; peer review: 4 approved, 1 approved with reservations].
F1000Research, 5, Article 632.
https://doi.org/10.12688/f1000research.8460.3
Van Vaerenbergh, Y., Hazée, S., & Zwienenberg, T. J. (2025).
Open science: A review of its effectiveness and implications for service research.
Journal of Service Research. Advance online publication.
https://doi.org/10.1177/1094670525133
Vicente-Saez, R., & Martinez-Fuentes, C. (2018).
Open Science now: A systematic literature review for an integrated definition.
Journal of Business Research, 88, 428–436.
https://doi.org/10.1016/j.jbusres.2017.12.043
Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.‑W., da Silva Santos, L. B., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., . . . Mons, B. (2016).
The FAIR guiding principles for scientific data management and stewardship.
Scientific Data, 3, Article 160018.
https://doi.org/10.1038/sdata.2016.18
