Image manipulation is a significant and growing problem in scholarly publishing. Research by scientific integrity consultant Elisabeth Bik and colleagues found that close to four percent of biomedical papers contained problematic images, with a substantial portion showing evidence of deliberate manipulation rather than innocent error. For reviewers working in fields that rely on photographic or graphical data – such as microscopy, western blots, gel electrophoresis, histology, or flow cytometry – developing the ability to spot image irregularities is an important skill.
What is acceptable image processing?
Not all image editing is manipulation. Most publishers accept that a degree of processing is necessary, such as adjusting overall brightness and contrast for visibility, converting color to grayscale, or cropping for presentation. The key principle, articulated consistently by major publishers and the Office of Research Integrity (ORI), is that any adjustment must be applied uniformly to the whole image, must not selectively alter specific regions, and must not change the scientific meaning of what the image shows.
Acceptable practices include applying the same contrast adjustment to an entire image, cropping to remove irrelevant surrounding material (with this disclosed), and assembling composite figures from separate images with clear delineation between panels.
Impermissible practices include:
- Removing or adding structures within an image using cloning or stamping tools
- Selectively brightening or suppressing specific areas to make results look cleaner or stronger than the raw data supports
- Duplicating regions within an image to cover unwanted features
- Using an image from one experimental condition to represent a different condition
Image discrepancies identified during review are often sufficient grounds for further investigation, including a request for the raw image files.
The three main categories of problematic images
Elisabeth Bik’s systematic analysis of the biomedical literature describes three principal categories of image manipulation:
- Duplication: The same image, or a substantial portion of it, is used more than once, either within the same paper to represent different experimental conditions, or across different papers published by the same authors. This can be easy to miss if panels are presented at different magnifications or orientations.
- Duplication with repositioning: A copied image area is moved, flipped, or rotated before being presented as a distinct result. Rotating or flipping a cell culture image and presenting it as a different sample is a clear example.
- Duplication with alteration: An image area is duplicated and then digitally modified, using stamping, patching, or cloning tools, to disguise the duplication. This is harder to detect than straightforward duplication but often leaves telltale artifacts in the image texture.
Western blots and gel images are the most common sites of manipulation in biomedical literature, followed by microscopy images. The same types of problem can arise in any field using photographic or graphical data, including flow cytometry, magnetic resonance imaging (MRI), electron micrographs, and histological sections.
Practical detection: What to look for
A systematic approach to reviewing images is worth developing as a habit, particularly for studies with substantial figure data. The following checks are practically useful:
Texture and background consistency: Look for regions within an image that have a noticeably different texture, grain, or background from their surroundings. A cloned or stamped region often has a slightly different noise pattern, even after editing, particularly visible when viewed at high zoom. Abrupt differences in the background within a supposedly uniform lane or field are a warning sign.
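The noise-consistency idea above can be made concrete in a few lines of code. The sketch below (a rough illustration, not a standard forensic tool; the function names, tile size, and threshold are all arbitrary choices) divides a grayscale image into tiles, estimates each tile's noise level, and flags tiles whose noise deviates sharply from the rest, as a cloned or smoothed patch often does:

```python
import numpy as np

def tile_noise_map(img, tile=32):
    """Estimate per-tile noise as the standard deviation within each
    tile; cloned or heavily smoothed patches often stand out as tiles
    with atypically low (or high) noise."""
    h, w = img.shape
    rows, cols = h // tile, w // tile
    noise = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            noise[r, c] = img[r*tile:(r+1)*tile, c*tile:(c+1)*tile].std()
    return noise

def flag_outlier_tiles(noise, z=3.0):
    """Return (row, col) indices of tiles whose noise deviates from
    the median by more than z robust (MAD-based) standard deviations."""
    med = np.median(noise)
    mad = np.median(np.abs(noise - med)) + 1e-9
    return np.argwhere(np.abs(noise - med) / (1.4826 * mad) > z)
```

A flagged tile is not proof of anything; it simply tells you where to zoom in and look by eye.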
Comparing panels within and across figures: Do any nominally different panels appear identical, or suspiciously similar? Include rotated and mirrored comparisons in your mental check. In a time-course or dose-response experiment where photographs are taken at different time points or concentrations, the images should show genuine biological variation; images that look like slight crops or zooms of each other are worth questioning.
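The rotated-and-mirrored comparison can also be partially automated with a coarse perceptual hash checked against all eight rotations and mirror images of a panel. This sketch assumes panels are available as grayscale NumPy arrays; the hash size and bit threshold are illustrative choices, not a published standard:

```python
import numpy as np

def ahash(img, size=8):
    """Coarse average hash: downsample to size x size by block means,
    then threshold at the mean. Tolerant of small contrast changes
    and re-compression."""
    h, w = img.shape
    bh, bw = h // size, w // size
    small = img[:bh*size, :bw*size].reshape(size, bh, size, bw).mean(axis=(1, 3))
    return small > small.mean()

def dihedral_variants(img):
    """Yield all 8 rotations and mirror images of a panel."""
    for k in range(4):
        r = np.rot90(img, k)
        yield r
        yield np.fliplr(r)

def panels_match(a, b, max_diff=4):
    """True if panel b matches any rotation/mirror of panel a within
    max_diff differing hash bits (out of 64)."""
    hb = ahash(b)
    return any((ahash(v) != hb).sum() <= max_diff for v in dihedral_variants(a))
```

A match from a check like this is a reason to compare the panels carefully by eye, not a verdict in itself.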
Western blot lane consistency: For western blots, check whether individual lanes appear to have a consistent background, or whether some lanes seem to have been placed against a different background from others. Non-linear splices, where the band contrast or exposure changes abruptly between adjacent lanes, suggest lanes from different blots may have been combined without appropriate disclosure.
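A column-intensity profile can make abrupt splices of this kind easier to see. A minimal sketch, assuming a grayscale NumPy array of the blot and an illustrative jump threshold:

```python
import numpy as np

def lane_background_jumps(blot, threshold=10.0):
    """Average each column of the blot image and flag positions where
    adjacent column means jump abruptly; such jumps can indicate lanes
    spliced together from different blots or exposures."""
    profile = blot.mean(axis=0)          # per-column mean intensity
    jumps = np.abs(np.diff(profile))     # change between neighbors
    return np.flatnonzero(jumps > threshold)
```

Gradual background drift across a blot is normal; it is the sharp, step-like change between adjacent columns that warrants a question about undisclosed splicing.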
Scale bar consistency: Check whether the scale bar, if present, is consistent with the stated magnification level and appears in the same relative position across panels representing the same type of data. Inconsistencies can indicate panels have been cropped or resized to different scales without adjustment.
Cross-paper duplication: Where you have read other papers from the same research group, consider whether any figures look familiar. Duplicated images across papers are harder to detect but not impossible if you have broad familiarity with the field.
Free tools to support your assessment
Several accessible tools can support more detailed image analysis when something looks wrong:
- Forensically (29a.ch/photo-forensics): A browser-based tool providing error level analysis (ELA) and other forensic techniques. ELA highlights areas of an image that have been saved at different compression levels, which can reveal regions that have been edited or inserted.
- FotoForensics: A similar web-based tool offering ELA and metadata inspection. Useful for detecting post-processing inconsistencies in JPEG images.
- Fiji/ImageJ: This is an open-source image analysis tool used across life sciences. It can be used to examine pixel-level data, apply contrast adjustments systematically, and compare image regions.
These tools can support your assessment, but their outputs require interpretation. ELA anomalies can arise from normal image processing as well as manipulation. The purpose of using these tools is to help you describe a specific, testable concern for the editor, not to produce a definitive verdict.
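For readers comfortable with a little code, the core of error level analysis is straightforward to reproduce. A minimal sketch using the Pillow library (the quality setting is an arbitrary choice, and real forensic tools add normalization and visualization on top of this):

```python
import io
from PIL import Image, ImageChops

def error_level_analysis(img, quality=90):
    """Re-save the image as JPEG at a known quality and diff it against
    the original. Regions edited after the image was last saved tend to
    recompress differently, so they show a different error level than
    untouched regions."""
    buf = io.BytesIO()
    img.convert("RGB").save(buf, "JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)
    return ImageChops.difference(img.convert("RGB"), resaved)
```

As noted above, a bright region in the resulting difference image is only a pointer to where the compression history differs, which can have innocent causes; it is not a finding of manipulation on its own.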
Dealing with a problematic image
If you believe you have identified an image integrity problem, do not describe it in the main body of your review in a way that could be read by the authors as an accusation. Raise it in the confidential comments to the editor instead, with specific information about which image concerns you and why.
Request the editor to ask the authors for original, unprocessed image files. Raw image files should show consistent metadata, capture timestamps, and full-frame data that allow independent verification.
Remember that image discrepancies should be treated as concerns requiring investigation, not automatic proof of misconduct. Your goal as a reviewer is to flag the concern precisely enough that an investigation can take place, not to make a judgement.
If you want to approach peer review with more clarity and confidence, join the ReviewerOne community.
