Fraudulent Publishing in the Mathematical Sciences

75 points, posted 15 hours ago
by bikenaga

37 Comments

Hackbraten

3 hours ago

I love the table of tortured phrases [0], which shows hilarious examples of synonyms of established scientific phrases, machine-generated by fraudulent authors to stay below the radar of plagiarism detectors.

My favorites from that table:

- “fuzzy logic” becomes “fluffy rationale”

- “spectral analysis” becomes “phantom examinations”

- “big data” becomes “enormous information”

[0]: https://arxiv.org/pdf/2509.07257#table.3

tho23i423423423

13 hours ago

Are "publication metrics" also used heavily in China by the bureaucracy ?

I know for a fact that the number of fake-journals exploded once the Govt. of India decided to use this for promotions.

It's a bit sad really: in the classical world, both these countries spent inordinate amounts of time on questions of epistemology (India esp.). Now reduced to mimicking some silly thing that only vaguely tracks knowledge-production even in the best case in the West.

porridgeraisin

an hour ago

Yes. And filtering out publications in "paper mills" and then judging the person properly doesn't scale beyond the top few institutions. So you'll find a sudden drop-off in research quality once you reach the nth university. It really is almost like a threshold.

pfdietz

10 hours ago

What a wonderful illustration of Goodhart's Law.

kaladin-jasnah

13 hours ago

Things like citation brokers (paid to cite papers), abuse of power, paper mills, and blackmail (pg. 10) are appalling to me. I have to question how we ended up here. Academia seems very focused on results and output, and this is used as a metric to measure a researcher's worth or value.

Has this always been an issue in academia, or is this an increasing or new phenomenon? It seems as if there is a widespread need to take shortcuts and boost your h-index. Is there a better way to determine the impact of research and to encourage researchers to not feel so pressed to output and boost their citations? Why is it like this today?
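For readers unfamiliar with the h-index mentioned above, a minimal sketch of its usual definition (the largest h such that an author has h papers with at least h citations each):

```python
def h_index(citations):
    """Return the largest h such that the author has h papers
    with at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:   # this paper still "supports" an h of `rank`
            h = rank
        else:
            break
    return h

# An author with papers cited [10, 8, 5, 4, 3] times has h-index 4:
# four papers with at least 4 citations each.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The definition makes the gaming incentive concrete: adding a handful of citations to papers sitting just below the threshold is the cheapest way to move the number.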

Academic mathematics, from what I've seen, seems incredibly competitive and stressful (to be fair, so does competition math from a young age), perhaps because the only career for many mathematicians (outside of topics with applications, such as but not limited to number theory, probability, and combinatorics) is academia. Does this play into what this article talks about?

cycomanic

13 hours ago

In my time in academia (~20 years) I have seen the demands and competition increase quite significantly. However, talking to older researchers, this really started in the 90s: the demands to demonstrate measurable outcomes increased dramatically, and funding moved to being primarily through competitive grants (compared to significant base funding for researchers previously). The issue is that while previously it was common for academics to have funding for 1-2 PhD students to look into new research areas, now many researchers are required to bring in competitive grants even to cover part of their salary.

What that means is that researchers become much more risk averse and stay in their research area even if they believe it is not the most interesting/impactful. You just can't afford not to publish for several years, to e.g. investigate a novel research direction, because without the publications it becomes much, much harder to secure funding in the future.

throwawayqqq11

3 hours ago

So it's economic pressure again, I assume, put on academic institutions that in turn pass it on as lower funding/wages.

It's important to note that somehow we see the erosion of families, infrastructures, and institutions everywhere, but we never talk about the giant f'ing elephant in the room.

kaladin-jasnah

11 hours ago

This is interesting. Is there a reason why this started happening in the 90s?

japanuspus

6 hours ago

I think a lot of it is covered under "New Public Management" [0], which was maybe a result of the financialization happening in the 80's [1].

And I completely agree with GP. Having been in or in contact with academic research since the late 90's, there has been a very strong shift from a culture where the faculty had means for independent research and were trusted to find their own direction, to the system we have today where a research project has much tighter oversight and reporting than most corporate projects.

A professor with a 4-5 person group will typically need two staggered pipelines of 4-5 year funding projects to run risk free. In the EU it is virtually impossible to get funding for projects that do not involve multiple countries, so you need to set up and nurture partnerships for each project. Coordinating the application process for these consortia is a major hassle and often outsourced at a rate of 50kEUR + win bonus. And you of course need to run multiple applications to make sure you get anything. When I talked to mentors about joining academia around 2010, the most common response was "don't".

[0]: https://en.wikipedia.org/wiki/New_public_management

[1]: https://en.wikipedia.org/wiki/Financialization

thallium205

11 hours ago

Because the supply of academics has outpaced demand.

jackyinger

10 hours ago

I’d say that it is precisely because of superficial demand by bureaucracy that academic output has become superficial.

The demand for novel knowledge is always high. It is the supply that is short.

That’s why we hang around on HN hoping for something novel of true interest. You get a good find every once in a long while.

adgjlsfhk1

10 hours ago

education funding cuts

kaonwarb

9 hours ago

How does that square with the cost of education significantly outpacing inflation?

chowells

6 hours ago

Funding cuts. Like from the state and federal levels. This resulted in costs to students increasing while also forcing researchers to generate income. This is the natural result of treating college like a business. They raise costs on every side.

aaviator42

7 hours ago

The money doesn't go to the researchers.

grandinj

6 hours ago

Unfunded pension and health care

guyomes

13 hours ago

> Has this always been an issue in academia, or is this an increasing or new phenomenon?

The introduction of this article [1] gives insight into the metrics used in the Middle Ages. Essentially, to keep his position in a university, a researcher could win public debates by solving problems nobody else could solve. This led researchers to keep their work secret. Some researchers even got angry about having their work published, even with proper credit.

[1]: https://www.jstor.org/stable/27956338

aoki

12 hours ago

The issue in all fields became significantly worse as developing countries decided their universities needed to become world class and demanded more international publications for promotion. Look at the universities in the table in the paper and you can see which countries are clearly gaming the system. If your local bureaucrats can’t tell which journals are good and which are fake, the fake journals become the most efficient strategy. Even worse, publishers figured out that if you can attract a few high-citation papers, your impact factor will go way up (it’s an arithmetic mean) and your fake journal becomes “high quality” according to the published citation metrics!
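The arithmetic-mean skew is easy to demonstrate with a toy calculation (all citation counts below are invented for illustration):

```python
# Hypothetical journal: 40 papers with modest citation counts, plus
# 2 heavily cited papers. Every number here is made up for illustration.
citations = [1, 0, 2, 1, 0] * 8 + [150, 90]

# Impact factor is computed as an arithmetic mean over the journal's papers.
mean_if = sum(citations) / len(citations)
median_cites = sorted(citations)[len(citations) // 2]

print(round(mean_if, 2), median_cites)  # → 6.48 1
```

Two outlier papers lift the mean to ~6.5 even though the typical paper in this hypothetical journal has a single citation, which is exactly why a few attracted high-citation papers can make a weak journal look "high quality".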

Math is particularly susceptible to this because there are few legitimate publications and citation counts are low. If you are a medical researcher you can publish fake medical papers but more easily become “high impact” on leaderboards (scaled by subject) by adding math topics to your subjects/keywords.

non_aligned

13 hours ago

I've seen similar stuff in a couple of other places, including IT back in the 1990s (back when it wasn't nearly as glamorous as it is today).

I think some of this has to do with... resentment? You're this incredibly smart person, you worked really hard, and no one values you. No one wants to pay you big bucks, no one outside a tiny group knows your name even if you make important contributions to the field. Meanwhile, all the dumb people are getting ahead. It's easy to get depressed, and equally easy to decide that if life is unfair, it's OK to cheat to win.

Add to this the academic culture where, frankly, there are fewer incentives to address misbehavior and where many jobs are for life... and the nature of the field, which makes cheating easy (as outlined in the article)... and you have an explosive mix.

moc_was_wronged

11 hours ago

> I think some of this has to do with... resentment? You're this incredibly smart person, you worked really hard, and no one values you. No one wants to pay you big bucks, no one outside a tiny group knows your name even if you make important contributions to the field. Meanwhile, all the dumb people are getting ahead.

Part of it, too, is that, while no one goes into academia to get rich, people quickly find out that the academic world runs on money. If you don’t get grants, you die, even with tenure. So what’s the point?

The reality of academia is so dismal that most people, by 30, wish they had sold out and chased money like the dumb-dumbs, who are, as you correctly note, farther ahead.

SilverElfin

12 hours ago

Abuse of power is definitely not new. Professors have historically overworked their grad students and withheld support for their progress towards a PhD or a paper unless they get something out of it. For women it’s extra bad because they can use their power in other ways.

empiko

6 hours ago

Bibliometrics in science is just an unworkable approach in general, and IMO it causes more harm than good. Research is one of the least suitable human activities you could possibly try to quantify, yet the entire scientific establishment runs on these metrics by now. I more or less believe that this strategy hinders scientific progress, as it pushes researchers into more and more risk-averse behavior.

beezle

13 hours ago

Sabine Hossenfelder has been on about this topic in the field of physics publishing for quite some time now.

It really is a terrible thing, though I can understand how some researchers feel trapped in a system that gives them little if any alternative if they wish to be employed the next year. Not just one thing needs to be changed to fix it.

mlpoknbji

10 hours ago

Citation-based metrics are much more prevalent in physics than in math (at least in the US and most countries in Europe). Compared with physics, my impression is that mathematics has a tradition of "slow, long term" over "rapid, incremental." Of course, it's not perfect.

ykonstant

4 hours ago

>When compared with physics, my impression is that mathematics has the tradition "slow, long term" over "rapid, incremental."

Not anymore :(

HPsquared

29 minutes ago

Ironic that mathematics suffers due to an overemphasis on numbers.

mlpoknbji

11 hours ago

This article does not seem to quite convey the experience of a pure mathematician. Yes, citation fraud is happening on an appalling scale, but no, it is not a serious issue for mathematicians.

The problem of AI generated papers is much more serious, although not happening on the same scale (yet!).

_alternator_

13 hours ago

TLDR: The publication culture of mathematics (with relatively few papers per researcher, few authors per paper, and few citations per paper) makes abuse of bibliometrics easier. The evidence suggests widespread abuse.

My take: I’ve published in well-regarded mathematical journals and the culture is definitely hard to explain to people outside of math. For example, it took more than two years to get my key graduate paper published in Foundations of Computational Mathematics, a highly regarded journal. The paper currently has over 100 citations, which (last I checked) is a couple times higher than the average citation count for the journal. In short, it’s a great, impactful work for a graduate student. But in a field like cell biology, this would be considered a pretty weak showing.

Given the long timelines and low citation counts, it’s not surprising that it’s so easy to manipulate the numbers. It is kinda ironic that mathematicians have this issue with numbers though.

aoki

11 hours ago

Pure math has a far greater vulnerability to this than applied math. Top journals have impact factors of around 5.0. Respectable but tiny specialist journals can have impact factors less than 1.0 (like, 0.4). Meanwhile, MDPI Mathematics is a Q1 journal with an impact factor over 2.0.

The now-standard bibliometrics were not designed by statisticians :-)

mlpoknbji

11 hours ago

The key is that mathematicians in the US and most parts of Europe do not count citations. So this is not really an issue.

Jaxan

3 hours ago

It is an issue if a mathematician has to apply for grants. Often they are in the same competition as physicists, for instance, and then metrics do matter.

mathattack

14 hours ago

Easy to see how the social sciences can be gamed. Much sadder to see Mathematics get gamed too. It provides ammo to folks looking to defund these topics.

aleatorianator

13 hours ago

Mathematics did invent game theory, so in a way it simply takes more math to do math, which isn't good.

bee_rider

7 hours ago

Maybe the way forward is to break the impact factor game. Everybody in a field get together and publish a paper: literally every topologist could put their name on the paper “Generally we all agree that Topology is an interesting topic.”

Then everybody in the field cites that paper going forward, giving it a massive impact factor and making impact factor useless. Do this occasionally, randomly, do it in niche subfields; everybody who goes to a conference puts their name on the paper “we had a nice time at <conference> this year.”

I mean, it is something everybody hates, right? There’s no point in preserving it.

paulpauper

10 hours ago

Publishing math is one of the most time-consuming things ever, between the submission, review/revising, and editing. I wish there were a faster way of doing it outside of arXiv. Without having to review the paper closely, an experienced editor can typically tell at first glance if it's correct or sound.

I think this is more common in computer science papers. I see this all the time, where 5-10 authors will collaborate on a short paper, then collaborate on each other's papers in such a way that the effort is minimized and publication count and citation count are maximized.