Swiss Medical Weekly’s methods review in a nutshell – the whys and hows

DOI: https://doi.org/10.57187/s.3967

Jan A. Roth

Clinical Epidemiologist and Consulting Managing Editor, Swiss Medical Weekly, Basel, Switzerland

Introduction

Peer review is a central pillar of medical research evaluation. Rising numbers of submissions entering the peer review stage, accentuated by misguided incentives among many publishers [1] and, more recently, by commercial paper mills [2], have led some scholars to declare a “peer review crisis” [3, 4]. Scientific journals struggle to find peer-reviewers, which affects authors and journals alike [4]. Consequently, peer review phases take longer, and journals devote significant resources to identifying and contacting overburdened peer-reviewers who have limited incentives to “squeeze in” another review. For the Diamond Open Access journal Swiss Medical Weekly, obtaining a single review often takes more than a dozen review requests. Beyond peer review, increasing attention is being paid to the quality of medical research and its evaluation, reflected in a growing number of paper retractions globally [5]. Although reports of poor research conduct and reporting are decades old, the “scandal of poor medical research” described by Douglas Altman in the British Medical Journal remains valid 30 years later [6].

“Methods first” workflow

As outlined by our editors-in-chief [7], maintaining high scientific quality standards for published articles and supporting young researchers are the top priorities for the Swiss Medical Weekly’s editorial board, which comprises over 30 academic editors covering all major medical disciplines.

Thanks to a grant from the Fondation Leenaards, the Swiss Medical Weekly has established methods reviews early in the journal workflow, rather than only for selected papers after peer review (as done previously [8]). Every qualitative or quantitative study that passes the editorial office’s initial assessment, which includes checks for issues such as plagiarism, is first reviewed by a paid methodologist. The methodologist assesses the study’s quality, including its design, statistics and reporting, against well-established guidelines (e.g. from the EQUATOR network [9]). The recommendations accompany the manuscript when it is assigned to the editor-in-chief or an academic editor, who then decides, based on the initial methods review and their own appraisal, whether the manuscript is suitable for peer review or should be rejected. Unclear or controversial cases are discussed with the editorial board. For borderline cases, authors can revise their manuscript based on the methods reviewer’s comments, with the aim of sending an improved manuscript to the peer-reviewers and avoiding redundant comments on methodology. For manuscripts that enter the peer-review stage directly, the methodological revision points are sent to the authors later, together with the comments of the external peer-reviewers. Using this adapted “methods first” workflow, we have observed several implications for authors, editors and readers (table 1).

Table 1: Implications of the “methods first” review approach at the Swiss Medical Weekly.

Authors
- Faster feedback for submissions that would have been rejected in a standard workflow only after a full peer review
- Certainty that the methodological and statistical aspects are double-checked by a methodologist
- A comprehensive methods review, independent of the publication outcome, with constructive recommendations

Editors/editorial office
- Submissions with relevant methodological issues can be identified at an early stage
- Lower risk that poorly conducted studies are sent to peer review, allowing better allocation of editor, editorial-office and, most importantly, peer-reviewer resources
- Lower risk that published articles must be retracted for methodological reasons

Readers
- Certainty that the study methods and statistics have been checked independently, as many peer-reviewers focus on the medical aspects
- Improved comprehensibility and completeness because manuscripts are required to follow established reporting guidelines
- Independent checks that the study limitations have been clearly outlined and that misinterpretation/spin [10] is avoided

So, what goes wrong and how to avoid it?

At the Swiss Medical Weekly, we analysed the 50 most recent consecutive “methods first” reviews up to April 2024, all of which were based on first submissions of original studies (table 2). Of these, 13 (26%) had severe methodological limitations. Eight of these were rejected directly by the editor (the methods review took a median of one day; the final decision and comments were sent to the authors a median of 8.5 days after the methods review). The remaining five submissions were still with the editor at the time of writing.

Among the 50 submissions, issues with the statistical analysis and the study design were frequent, affecting 75–85% and 25–62% of submissions, respectively, across the appraisal categories. All manuscripts had at least a few reporting issues. Twelve of the 50 submissions (24%) had only minor issues and were generally of excellent quality.

Table 2: Results of 50 consecutive methods reviews at the Swiss Medical Weekly. “Severe issues” usually cannot be addressed as part of a revision and may have severe implications for a study’s internal validity and interpretability. “Major issues” can be addressed as part of a major revision; expert help from a statistician or epidemiologist is often recommended. “Minor issues” can be addressed as part of a minor revision.

Methodological appraisal | Total, n | Issues with statistical analysis, n (%) | Issues with study design, n (%) | Insufficient reporting, n (%)
Severe issues | 13 | 11 (85%) | 8 (62%) | 13 (100%)
Major issues | 25 | 20 (80%) | 10 (40%) | 25 (100%)
Minor issues | 12 | 9 (75%) | 3 (25%) | 12 (100%)
No issues | 0 | 0 | 0 | 0

Methodological issues can occur throughout the entire study process and are heterogeneous. This highlights the importance of the “methods first” approach in addition to standard peer review and more recent developments, such as preprint peer review platforms [11]. Research methods are what count: if the methods are not valid (or are not reported in the manuscript), a study may have been done for nothing, as outlined in table 3.

Table 3: Crude decision matrix for reviewers and editors (adapted from Prof. Dr. Manuel Battegay). “Old results” reflect previous research results and knowledge. “New results” are novel and may contradict previous research results and knowledge.

Study results | Valid(ated) methods | Not valid(ated) or unclear methods
Old results | Valid, but “nothing new”* | Questionable and “nothing new”
New results | Valid and “exciting”* | Questionable but possibly “exciting”

* Can be part of relevant replication and reproducibility efforts. Study results (e.g., null findings) should not influence reviewer and editor decisions as long as the study uses valid methods to answer a relevant question.

Authors should consider several straightforward points to ensure they get it right from the beginning.

Firstly, when planning and designing a study, a biostatistician and/or epidemiologist should be involved whenever possible. This is rarely done, particularly in observational research. As a service for the members of the SMW supporting association, researchers from member institutions can have their study proposals assessed at an early stage by an epidemiologist/methodological reviewer [12]. 

Secondly, write a study protocol, even if no ethical approval is necessary: a missing study or analytical focus is one of the main reasons for rejection (e.g., p-hacking, hypothesising after the results are known, no clearly defined outcomes). Primary and secondary outcome measures can be defined for observational studies just as for intervention studies. Exploratory studies should not be reported selectively.
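To illustrate why a clear analytical focus and pre-specified outcomes matter, the following minimal sketch (a hypothetical simulation, not part of this viewpoint; it assumes Python with the numpy and scipy libraries) estimates how often at least one “significant” result appears when 20 outcomes are tested even though no true effect exists:

# Hypothetical simulation: 20 outcomes compared between two groups with no true
# difference; how many simulated "studies" yield at least one p-value < 0.05?
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_sim, n_outcomes, n_per_group = 2000, 20, 50

studies_with_false_positive = 0
for _ in range(n_sim):
    p_values = []
    for _ in range(n_outcomes):
        a = rng.normal(size=n_per_group)  # group A, no true effect
        b = rng.normal(size=n_per_group)  # group B, no true effect
        p_values.append(stats.ttest_ind(a, b).pvalue)
    if min(p_values) < 0.05:
        studies_with_false_positive += 1

print(f"Studies with at least one 'significant' outcome: {studies_with_false_positive / n_sim:.0%}")
# Roughly 1 - 0.95**20, i.e. about 64%, despite the complete absence of a real effect.

Pre-specifying a primary outcome in the protocol, and handling secondary outcomes transparently, protects against exactly this kind of spurious “finding”.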

Thirdly, get biostatistical support for your analyses whenever possible. Not all studies need extensive analyses; some should remain descriptive [8]. To avoid analytical overkill, use more complex methods only when there is a clear rationale and simpler methods are insufficient. Additionally, statistical methods rest on assumptions that should be checked.
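As a simple illustration of checking such assumptions (a hypothetical example, not part of this viewpoint; it assumes Python with the numpy, scipy and statsmodels libraries), the residuals of a linear regression can be inspected for approximate normality and constant variance before the results are interpreted:

# Hypothetical example: fit a linear regression and check two common assumptions
# (approximately normal residuals and constant residual variance).
import numpy as np
import statsmodels.api as sm
from scipy import stats
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(0)
x = rng.normal(size=200)            # hypothetical exposure
y = 1.5 * x + rng.normal(size=200)  # hypothetical outcome

X = sm.add_constant(x)              # design matrix: intercept + exposure
model = sm.OLS(y, X).fit()
resid = model.resid

shapiro_stat, shapiro_p = stats.shapiro(resid)    # normality of residuals
bp_stat, bp_p, _, _ = het_breuschpagan(resid, X)  # constant variance (homoscedasticity)

print(f"Shapiro-Wilk p = {shapiro_p:.3f}, Breusch-Pagan p = {bp_p:.3f}")
# Small p-values would flag violated assumptions; a transformation or a different
# model may then be needed, ideally chosen together with a statistician.

Such checks do not replace biostatistical support, but they make violated assumptions visible before reviewers (or readers) notice them.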

Fourthly, take enough time to compose the manuscript, starting with tables and figures, and follow established guidelines (e.g. from the EQUATOR network [9]). Avoid over- and misinterpretations/spins [10].

Lastly, take the revision process seriously and write a comprehensive point-by-point reply: changes should be clearly described in the reply and highlighted in the revised manuscript. Some authors ignore certain reviewer comments or omit important revision points, which unnecessarily prolongs the review process.

If you want to know more about the success criteria for medical research from the perspective of a methods reviewer and editor, the author of this viewpoint has recently published a pocket guide that addresses these questions in more detail: www.epidemos.ch/get-published. We at the Swiss Medical Weekly are always eager to learn how our authors can perform at their best and where more methodological support is needed. If you have specific suggestions, do not hesitate to contact us or to comment in the Op-ed section of the Swiss Medical Weekly.

Notes

Jan A. Roth is an independent epidemiological advisor for academic researchers. He is the author of the pocket guide “Get Your Health Research Published”.

Jan A. Roth, MD, MSc

SMW supporting association

c/o Gyr Gössi Olano Staehelin

Malzgasse 15

CH-4052 Basel

jan.roth[at]smw.ch

References

1.Aguzzi A. ‘Broken access’ publishing corrodes quality. Nature. 2019 Jun;570(7760):139. 10.1038/d41586-019-01787-2

2.Else H, Van Noorden R. The fight against fake-paper factories that churn out sham science. Nature. 2021 Mar;591(7851):516–9. 10.1038/d41586-021-00733-5

3.Tropini C, Finlay BB, Nichter M, Melby MK, Metcalf JL, Dominguez-Bello MG, et al. Time to rethink academic publishing: the peer reviewer crisis. MBio. 2023 Nov;14(6):e0109123. 10.1128/mbio.01091-23

4.Künzli N, Berger A, Czabanowska K, Lucas R, Madarasova Geckova A, Mantwill S, et al. «I Do Not Have Time»-Is This the End of Peer Review in Public Health Sciences? Public Health Rev. 2022 Nov;43(1):1605407. 10.3389/phrs.2022.1605407

5.‘The situation has become appalling’: fake scientific papers push research credibility to crisis point. The Guardian [Internet]. [cited 2024 Apr 25]. Available from: https://www.theguardian.com/science/2024/feb/03/the-situation-has-become-appalling-fake-scientific-papers-push-research-credibility-to-crisis-point

6.Altman DG. The scandal of poor medical research. BMJ. 1994 Jan;308(6924):283–4. 10.1136/bmj.308.6924.283

7.Aguzzi A, Waeber G. Swiss Medical Weekly: quo vadis? Swiss Med Wkly. 2022 Nov;152(4546):40030. 10.57187/smw.2022.40030

8.Young J. When should you use statistics? Swiss Med Wkly. 2005 Jun;135(23-24):337–8. 

9.EQUATOR Network | Enhancing the QUAlity and Transparency Of Health Research [Internet]. [cited 2024 Apr 26]. Available from: https://www.equator-network.org/

10.Andaur Navarro CL, Damen JA, Ghannad M, Dhiman P, van Smeden M, Reitsma JB, et al. SPIN-PM: a consensus framework to evaluate the presence of spin in studies on prediction models. J Clin Epidemiol. 2024 Jun;170:111364. 10.1016/j.jclinepi.2024.111364

11.Review Commons – Improve your paper and streamline publication through journal-independent peer-review. [Internet]. [cited 2024 Apr 26]. Available from: https://www.reviewcommons.org/

12.Review of study proposals [Internet]. [cited 2024 May 8]. Available from: https://smw.pub/support-for-researchers-review-of-study-proposals/