Supplementary materials are a significant part of many research manuscripts. With the increasing focus on data sharing and transparency, researchers choose to include detailed methods, raw data, extended analyses, and additional figures and tables outside the main text as supplementary materials. This adds another layer of assessment for peer reviewers. If you need a better understanding of how you can approach supplementary materials as a peer reviewer, you will find this post helpful.
Why supplementary materials matter
Supplementary materials typically comprise additional information that substantiates a study's methodological approach. For example, detailed protocols, statistical analysis code, raw data tables, or additional validation experiments are not peripheral to a study's validity; they are often central to it. A manuscript that presents clean, convincing results in the main text while containing important caveats or unresolved issues in its supplementary files is not a well-reported paper.
Reviewers who skip the supplementary materials miss a significant portion of what they are being asked to evaluate. The conclusions in the main text often depend directly on analyses reported only in the supplementary files. If those analyses are flawed or incompletely reported, the conclusions may not hold.
What to look for
When you open the supplementary files, first check whether they actually contain what the main text says they contain. If the main text refers readers to “Supplementary Table 3” for participant demographics, you should be able to find that table easily. It should be labeled clearly and contain the information described. Inconsistencies between what the main text promises and what the supplementary files deliver are a significant concern.
Beyond that, evaluate the supplementary materials in the same spirit as the main text. Ask yourself these questions:
- For extended methods: Are the procedures described in sufficient detail to be reproduced? Are the steps logical and complete?
- For supplementary data or analysis files: Are they clearly labeled and explained? Could a reader follow the analysis from the raw data to the reported result?
- For supplementary figures and tables: Do they support the main text conclusions, or do they contain patterns or results that are inconsistent with what the main text claims?
Data availability and transparency
Authors are increasingly expected to make the underlying data available, either as supplementary files or in a public repository. If the journal has a data sharing policy, check whether the authors have complied with it. If data have been deposited in a repository, check that the link is functional and that the deposited data match what is described in the methods.
Data availability matters for reproducibility. A study that presents results without providing any access to the underlying data is making a fundamentally different kind of claim than one that provides full transparency. The expectation that research data should follow the FAIR principles (Findable, Accessible, Interoperable, Reusable) is increasingly reflected in journal policies. Checking data availability is now a routine part of a thorough peer review.
Analysis code and computational reproducibility
For studies that involve computational analyses, statistical modeling, bioinformatics pipelines, or machine learning, the availability and quality of the analysis code is increasingly important to evaluate. Here's what you should ask yourself:
- Can you follow the analytical steps from the raw data to the reported results?
- Is the code documented well enough to understand what it does?
- Does the deposited code match the analysis described in the methods?
You do not need to run the code yourself to assess whether it is present and reasonably described. But flagging the absence of analysis code where it would be expected, or noting that code is present but so poorly documented as to be impossible to follow, is a legitimate and useful contribution to your review.
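To make "documented well enough to follow" concrete, here is an illustrative sketch (not from any particular manuscript) of the level of documentation a reviewer might hope to find in a deposited analysis script. The file name, column name, and reported statistics are hypothetical.

```python
# Illustrative example of a traceable analysis script: a reviewer
# should be able to see which raw file is read, which column is
# analyzed, and which reported numbers the output corresponds to.
import csv
import statistics


def summarize(raw_csv: str) -> dict:
    """Compute the summary statistics reported in the manuscript.

    Input:  raw_csv -- path to the raw data file described in the methods,
                       with one measurement per row in a 'value' column.
    Output: dict with n, mean, and SD, matching the figures in the text.
    """
    with open(raw_csv, newline="") as fh:
        values = [float(row["value"]) for row in csv.DictReader(fh)]
    return {
        "n": len(values),
        "mean": statistics.mean(values),
        "sd": statistics.stdev(values),
    }
```

Code at roughly this level of annotation lets a reviewer trace each reported result back to the raw data without running anything; its absence, or code far below this standard, is worth flagging.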
When supplementary materials raise concerns
If you identify a problem in the supplementary materials – such as an inconsistency with the main text, missing data, or an analysis that does not support the conclusion it is supposed to support – raise this in your report with a specific reference to the relevant supplementary file and the exact concern. Treat concerns arising from supplementary materials the same as concerns arising from the main text. A methodological flaw is a methodological flaw irrespective of whether it appears on page 4 of the manuscript or in Supplementary File 3.
It is also entirely reasonable to note in your report that you were unable to review certain supplementary materials due to technical issues, for instance in cases where a linked dataset was inaccessible, an analysis script was in a format you cannot open, or supplementary files were not included in the review materials provided. Editors need to know when key materials are missing.
When there are no supplementary materials
Not every manuscript requires supplementary materials, and their absence is not in itself a problem. But if a study seems to require more methodological detail, data transparency, or validation than the main text provides, recommending that specific elements be added as supplementary materials is a legitimate suggestion. Make your suggestion specific by stating what should be added and why it matters for the paper's validity.
We’d like to hear from you
Have you checked supplementary materials as part of your peer reviews? What has been the most significant thing you have found in them? Share your experience in the comments below.
For structured guidance on all aspects of thorough peer review, join the ReviewerOne Community.