
Scholarly publishing round-up: Navigating AI, authorship, and research integrity


ReviewerOne

20 Mar 2026 | Read Time: 3 mins


Responsible use of AI in research: Aligning innovation with integrity

The European Commission has outlined a coordinated approach to ensure the responsible use of generative AI in research. Developed in collaboration with the European Research Area Forum, the new set of guidelines aims to bring clarity and consistency to the use of AI across the European research community.

The recommendations emphasize that while generative AI can improve efficiency and accelerate discovery, its use must not compromise research integrity. Researchers are advised to avoid using AI in sensitive contexts such as peer review and to remain mindful of risks such as bias, plagiarism, and data privacy breaches.

The guidance extends to institutions and funders. Institutions are encouraged to monitor AI use and support responsible adoption, while funders are asked to promote transparency in how AI tools are used in research proposals and outputs. Importantly, the framework is designed to evolve, recognizing that both the technology and its implications are still developing. Read the full article here

Rethinking authorship for modern science

A recent article in Proceedings of the National Academy of Sciences (PNAS) presents a reflection on the need to reform authorship practices, especially in an increasingly collaborative research landscape. The authors argue that traditional authorship models do not adequately reflect how research is conducted. Instead of relying solely on rigid criteria, they propose anchoring authorship decisions in three interconnected principles: transparency, credit, and accountability. These principles offer a more flexible and meaningful way to determine who qualifies to be an author and why.

The paper highlights that credit and accountability cannot be separated. Those who receive recognition must also take responsibility for their contributions. This approach could help clarify how practices such as honorary or ghost authorship undermine research integrity. The authors also call for practical changes. Research teams should initiate authorship discussions early, revisit them as projects evolve, and clearly document contributions. Institutions and journals also play a critical role in fostering a culture that supports fair decision-making, transparent reporting, and better alignment between authorship and research assessment. Read the full article here

Unreliable data in clinical models raises concerns over research integrity

A recent medRxiv preprint highlights evidence of unreliable data and poor data provenance in clinical prediction model research. The study examines two widely used health datasets from Kaggle and finds that both lack basic information about their origin, collection, and authenticity. Using the TRIPOD+AI framework, the authors show that neither dataset meets minimum reporting standards, with patterns suggesting the data may be simulated or fabricated. Despite this, these datasets have been used in over 100 published studies, many of which make clinical recommendations. Some models built on these datasets have already been cited widely and even tested in clinical settings.

The findings raise broader concerns about how quickly questionable data can influence research and practice. The study calls for stricter data provenance requirements, better oversight from journals and repositories, and greater caution among researchers and clinicians to ensure that clinical decisions are based on reliable evidence. Read the full article here

Peer review at the crossroads: Evolving models, persistent challenges

In this Learned Publishing article, Dmitry Kochetkov examines how peer review has evolved and why it is now at a turning point. While long seen as the backbone of scholarly communication, traditional pre-publication peer review is increasingly strained by inefficiencies, bias, lack of transparency, and reviewer fatigue. The study highlights how these structural issues, combined with growing submission volumes and evolving research practices, have pushed the system into a state of ongoing tension.

In response, multiple alternative models are emerging. These include registered reports (focused on methodological rigor), modular publishing (breaking research into reviewable components), and the Publish–Review–Curate (PRC) model that separates dissemination from evaluation through preprints and post-publication review. The study emphasizes that no single model is universally superior. Each serves different disciplinary and institutional needs. Instead, the future of peer review lies in flexible, hybrid approaches that balance rigor, transparency, and accessibility while addressing long-standing gaps in evidence around their real-world impact. Read the full article here

If you’ve come across a piece lately that sparked reflection or raised important questions, feel free to share it with the ReviewerOne community.

About the Author


ReviewerOne

ReviewerOne is a reviewer-centric initiative focused on strengthening peer review by supporting the people who make it work. ReviewerOne provides current and aspiring reviewers with AI-powered tools and resources to help them review more confidently, consistently, and fairly, without removing the human judgment that peer review depends on.

The ReviewerOne ecosystem brings together a reviewer-friendly peer review platform with structured guidance and AI-assisted checks; a community forum to foster networking and collaboration; a Reviewer Academy with practical learning resources on peer review, AI, ethics, and integrity; and meaningful recognition through verified credentials and professional profiles. ReviewerOne aims to reduce friction in peer review while elevating reviewer expertise, effort, and contribution.

