As a dedicated journal publishing company committed to advancing rigorous, high-quality scholarship across the sciences, we at Journal of Business Management and Innovation have always placed the integrity of peer-reviewed research at the heart of our mission. Recently, a groundbreaking report led by Professor Ilka Agricola of the University of Marburg, Germany, has shone a stark light on a pervasive issue in the mathematical sciences: systematic fraudulent publishing practices. Commissioned by the German Mathematical Society (DMV) and the International Mathematical Union (IMU), this investigation—detailed in preprints on arXiv and published in the Notices of the American Mathematical Society—reveals how commercial pressures are undermining the very foundations of scientific progress. It’s a wake-up call not just for mathematicians, but for the entire publishing ecosystem, and it prompts us to reflect deeply on our role in fostering trustworthy science.
The Shadow of Commercial Metrics Over Scientific Merit
In today’s academic landscape, the quality of research is too often distilled into cold, quantifiable metrics: the sheer volume of publications, citation counts, h-indexes, and journal impact factors. These indicators, calculated opaquely by commercial giants like Clarivate (Web of Science) and Elsevier (Scopus), prioritize sales of their databases over genuine scientific value. With revenues in the billions—Clarivate alone reported $2.66 billion in 2022—these companies have little incentive to scrutinize the integrity of the data they index, even when it includes predatory journals or low-quality megajournals.
This system creates fertile ground for fraud. Fraudulent services openly advertise on platforms like Telegram or dark web forums, offering everything from ghostwritten articles to bought citations and even fabricated affiliations—all for a fee. The payoff? For individual researchers, it means inflated CVs and career advancement; for institutions, higher rankings that unlock funding, elevated tuition fees, and top talent. But the cost to science is immense: a flood of unread, flawed, or outright fake papers that dilute the literature and erode public trust.
Consider the megajournals at the epicenter of this crisis. Journals like MDPI’s Mathematics churn out over 6,000 articles annually, generating millions in Article Processing Charges (APCs)—estimated at 10 million Swiss francs for that journal alone in 2023. These pay-to-publish models now eclipse the output of all reputable, non-APC mathematics journals combined, often with minimal peer review. Scandals abound: MDPI journals faced delisting from Web of Science in 2023, and journals like Hindawi’s Journal of Function Spaces have been hit with mass retractions due to paper mill involvement. As the report notes, the correlation between these journals and quality is negligible; they’re engines for metric gaming, not knowledge advancement.
Striking Examples That Defy Belief
The Agricola-led study pulls no punches with its examples, illustrating how deeply entrenched these practices have become. One jaw-dropping case: In 2019, Clarivate’s Highly Cited Researchers (HCR) list crowned China Medical University in Taiwan as the global leader in mathematics, boasting 11 “world-class” researchers. The catch? The university doesn’t even offer mathematics as a subject. This anomaly propelled the institution into the top 300 worldwide universities per the Academic Ranking of World Universities (ARWU) in 2022—only for it to slip to the top 500 by 2024 after Clarivate excluded mathematics from HCR lists due to rampant manipulation.
Then there are citation cartels, groups that mutually inflate each other’s counts irrespective of merit. A notorious instance involved Juan Manuel Corchado of the University of Salamanca, 75 of whose papers were retracted for such schemes. Paper mills, industrial-scale operations that fabricate entire manuscripts, are another scourge; the report dissects Russia-based operations that churn out AI-generated or plagiarized content and sell it anonymously. And predatory empires like India-based OMICS Group Inc. spawned 700 fake journals and 3,000 sham conferences, raking in millions before a U.S. Federal Trade Commission crackdown.
The scale is staggering. Retractions in mathematics hit 1,009 by December 2024, up from negligible numbers two decades ago, with countries like Saudi Arabia, Pakistan, and China leading per capita rates (0.2% of publications in 2023, versus 0.02% in 2002). Globally, over 10,000 scientific articles were retracted in 2023 alone. Seven of the 89 mathematics HCRs from Clarivate even appear in retraction watchlists, and self-citation rates among them skew alarmingly high (median Self-Citing Score of 15.63, double that of top mathematicians or prizewinners).
A Danger to Science and Society
As IMU Secretary General Prof. Christoph Sorger warns, “‘Fake science’ is not only annoying, it is a danger to science and society.” Without reliable ways to distinguish valid results from fabrications, mathematicians can’t build on solid foundations, and disinformation seeps into policy and education. DMV President Prof. Jürg Kramer echoes this urgency: “The recommendations developed by the commission are a call to all of us to work toward a system change.” The broader ripple effects? Wasted resources on junk research, skewed funding away from genuine innovators, and a public increasingly skeptical of expert knowledge—exacerbated, no doubt, by AI tools that make fraud even easier to perpetrate.
Pathways Forward: Recommendations for a Healthier Ecosystem
The good news is that the study doesn’t stop at diagnosis; its companion report, “How to Fight Fraudulent Publishing in the Mathematical Sciences,” offers actionable joint recommendations from the IMU and the International Council for Industrial and Applied Mathematics (ICIAM), endorsed in May/June 2025. These guidelines target everyone in the chain: researchers, institutions, evaluators, publishers, and policymakers.
For researchers: Prioritize reading papers over chasing metrics, avoid predatory journals (and disclose past unwitting involvement on your CV), cite judiciously, and report misconduct boldly—using tools like ORCID to verify identities. Editorial board members should vet journals rigorously and resign publicly from predatory ones.
Institutions must ditch bibliometric quotas in hiring and promotions, focusing instead on a researcher’s best work and overall activity. Degrees should no longer be tied to publication tallies, and education about predatory pitfalls is essential.
Evaluators and policymakers: Champion expert-led assessments over commercial rankings like SJR or JCR, which correlate poorly with quality anyway. Governments should fund transparency initiatives and large-scale fraud monitoring.
As publishers, we find these directives particularly resonant. The report urges us to maintain transparent editorial processes, screen for plagiarism and fraud, and reject cooperation with dubious outlets. It also calls for “low entry” publication venues for solid but non-elite work, reducing the desperation that drives fraud.
At Journal of Business Management and Innovation, we’re proud that our journals already embody many of these principles: rigorous, independent peer review without APCs for core titles, transparent metrics, and a commitment to indexing only in reputable databases.
A Call to Reclaim Scientific Integrity
The revelations from Agricola’s team are a sobering reminder that the pursuit of knowledge shouldn’t be commodified at the cost of truth. As publishers, researchers, and stewards of scholarship, we must heed this call for change. By embracing expert evaluation, fostering transparency, and rejecting metric-driven shortcuts, we can protect the mathematical sciences—and science at large—from the fraud that’s threatening to undermine them. Let’s turn this moment of reckoning into sustained momentum toward a more trustworthy future.
For more on the reports:
- “Fraudulent Publishing in the Mathematical Sciences” (arXiv:2509.07257)
- “How to Fight Fraudulent Publishing…” (arXiv:2509.09877)
What are your thoughts on tackling publishing fraud? We’d love to hear from our community in the comments below.