Hidden inequities in peer review
In this article, Maria Machado examines how peer review, often seen as a fair and merit-based system, is influenced by less visible forms of bias.
The article brings together research on linguistic, geographic, and ethnic disparities to show how these factors intersect. Authors whose English departs from the dominant academic standard may be judged more critically, even when their research is sound. At the same time, reviewer homophily and editorial decision-making can favor certain regions or institutions, reinforcing existing hierarchies in global research.
What stands out is the idea that these are not isolated issues. They work together to create a cumulative disadvantage for underrepresented researchers. The article also highlights that common solutions like anonymized review do not fully address the problem, since bias can persist through shared academic norms and editorial practices.
To move forward, Machado argues for structural changes: diversifying reviewer pools, improving transparency in editorial decisions, and offering language support in a way that keeps writing style from influencing acceptance outcomes. The goal is not just fairness in process, but a more inclusive and representative research ecosystem.
Rethinking publishing models and access
The MIT Graduate Student Council’s resolution on scientific publishing, shared via MIT Libraries, focuses on the growing imbalance in the academic publishing system and its impact on institutions and researchers.
The resolution outlines concerns about the dominance of high-prestige journals and the financial strain they place on universities. High subscription costs, combined with restricted access and unpaid peer review labor, are seen as misaligned with the broader goals of research and knowledge sharing.
In response, MIT has already begun shifting away from large subscription agreements toward more flexible access models. The resolution supports these efforts and calls for a stronger move toward open access, including the use of preprints and institutional open access policies.
Another key theme is how research quality is evaluated. The document encourages institutions to move beyond proxies like journal reputation and instead focus on the substance of the work and mentorship contributions.
Overall, the resolution reflects a broader shift across the industry toward more sustainable and equitable publishing practices, while also acknowledging the complexity of transitioning away from long-standing systems.
Could AI help fix peer review, or make it worse?
Dr. Michael A. Bruno’s guest post on The Scholarly Kitchen explores the role of artificial intelligence in a peer review system that is already under strain.
He begins by outlining the scale of the challenge. Submission volumes continue to rise, while the number of available reviewers is shrinking. This imbalance has led to delays, reviewer fatigue, and concerns about the consistency and quality of reviews.
AI is presented as both an opportunity and a risk. On one hand, it can support reviewers by summarizing literature, identifying plagiarism, and detecting issues like image manipulation. These capabilities could help reviewers work more efficiently and make more informed decisions.
On the other hand, the article raises concerns about misuse. Reviewers may rely too heavily on AI-generated feedback, while authors may use AI to produce large volumes of low-value submissions. There are also questions around accuracy, as AI tools can present information confidently even when it is incorrect.
The piece also looks ahead to alternative models, including more open and community-driven approaches to peer review. While these models offer speed and transparency, they are not yet widely scalable across disciplines.
The central message is that AI can support the peer review process, but it cannot replace human judgment. Any meaningful improvement will depend on how these tools are integrated alongside broader changes to the system.
If you’ve come across a piece lately that sparked reflection or raised important questions, feel free to share it with the ReviewerOne community.