Fake Publications in Biomedical Science: Red-flagging Method Indicates Mass Production

ABSTRACT

Background The integrity of academic publishing is increasingly undermined by fake science publications mass-produced by commercial “editing services” (so-called “paper mills”). These services use AI-supported, automated production techniques and sell fake publications to students, scientists, and physicians under pressure to advance their careers. Because the scale of fake publications in biomedicine is unknown, we developed a simple method to red-flag them and estimate their number.

Methods To identify indicators able to red-flag fake publications (RFPs), we sent questionnaires to authors. Based on the author responses, three indicators were identified: “author’s private email”, “international co-author”, and “hospital affiliation”. These were used to analyze 15,120 PubMed®-listed publications regarding date, journal, impact factor, and country of author, and were validated in a sample of 400 known fakes and 400 matched presumed non-fakes using classification (tallying) rules to red-flag potential fakes. For a subsample of 80 papers we used an additional indicator based on the percentage of RFP citations.
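For illustration, a tallying rule of this kind simply counts how many indicators are present and flags a paper once the count reaches a threshold. The sketch below is a minimal, hypothetical rendering of such a rule; the binary coding of the indicators and the threshold are assumptions for illustration, not the validated cut-offs reported in the study.

```python
# Minimal sketch of a tallying (unit-weight) classification rule.
# Assumptions: each indicator is coded as a binary red-flag signal, and the
# threshold is illustrative; neither the coding nor the cut-off reproduces
# the validated rules reported in the study.

def red_flag(indicators: dict[str, bool], threshold: int = 2) -> bool:
    """Red-flag a publication if at least `threshold` indicators are present."""
    return sum(indicators.values()) >= threshold

# Hypothetical example: private author email and hospital affiliation present,
# no-international-co-author indicator absent.
paper = {
    "private_email": True,
    "no_international_coauthor": False,
    "hospital_affiliation": True,
}
print(red_flag(paper))               # True  (2 of 3 indicators present)
print(red_flag(paper, threshold=3))  # False (stricter three-indicator rule)
```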

Results The classification rules using two (three) indicators had sensitivities of 86% (90%) and false alarm rates of 44% (37%). From 2010 to 2020 the RFP rate increased from 16% to 28%. Given the 1.3 million biomedical Scimago-listed publications in 2020, we estimate a scope of >300,000 RFPs annually. The countries with the highest RFP proportions are Russia, Turkey, China, Egypt, and India (39%-48%), with China being the largest contributor in absolute terms, accounting for 55% of all RFPs.
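The order of magnitude of that estimate can be checked with a rough back-of-the-envelope calculation; the assumption that the 2020 RFP rate applies uniformly to all Scimago-listed biomedical publications is ours, and the authors’ estimate may additionally correct for sensitivity and false alarm rates.

```python
# Back-of-the-envelope consistency check (assumption: the 28% RFP rate for 2020
# is applied directly to 1.3 million biomedical publications; no correction for
# sensitivity or false alarms is made here).
publications_2020 = 1_300_000
rfp_rate_2020 = 0.28
print(f"{publications_2020 * rfp_rate_2020:,.0f}")  # 364,000 -> consistent with ">300,000"
```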

Conclusions Potential fake publications can be red-flagged using simple-to-use, validated classification rules to earmark them for subsequent scrutiny. RFP rates are increasing, suggesting higher actual fake rates than previously reported. The scale and proliferation of fake publications in biomedicine can damage trust in science, endanger public health, and impact economic spending and security. Easy-to-apply fake detection methods, as proposed here, or more complex automated methods can help prevent further damage to the permanent scientific record and enable the retraction of fake publications at scale.

Competing Interest Statement

The authors have declared no competing interest.

Funding Statement

The study was funded by the Otto-v.-Guericke University of Magdeburg, Germany.

Author Declarations

I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained.

Yes

I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients or participants themselves) outside the research group so cannot be used to identify individuals.

Yes

I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance).

Yes

I have followed all appropriate research reporting guidelines, such as any relevant EQUATOR Network research reporting checklist(s) and other pertinent material, if applicable.

Yes

Data Availability

All data produced in the present study are available upon reasonable request to the authors.
