A Philosopher Asks, 'Why Do Scientists Lie?'


Philosopher Liam Kofi Bright’s article “Why Do Scientists Lie?” starts by mentioning some modern diet wisdom commonly repeated in the news and online: Shopping for groceries while hungry means you buy more calories. Eating from larger bowls leads people to eat more. People choose healthier food when they pre-order lunch. All of these claims rest on inaccurate, misleading research conducted by Cornell nutrition scientist Brian Wansink. Many of his findings were the result of either misreporting his own data or using statistical techniques he should have known were invalid. Bright describes this behavior more bluntly and concisely: he committed fraud.


In his article, Bright raises the naive position we might take on scientific fraud. It goes something like this: “There are lots of careers that make more money (and include better hours) than science, so if a comfortable job is what you want, pursuing science doesn’t make much sense. Instead, scientists must be more motivated by a deep desire to uncover truths about the world. If that’s really their deepest motive, why would scientists lie? Doesn’t lying contradict their deepest goals?” Of course, Bright makes clear that this naive position misses most of the key developments in a modern understanding of scientists since the 1960s.


A modern understanding of scientists (or at least one informed by Robert Merton’s writing in the 1960s) sees them not as motivated by a pure quest for truth, but as “credit seekers.” That is, they are competing for the recognition of their scientific colleagues and other relevant communities of experts. They do this by racing to establish priority on claims, typically by publishing novel claims in peer-reviewed journals. This community of peers, who review claims and award tokens of success, is the real audience for new science. Even if individual scientists don’t particularly like playing “the game” of catering to this audience, they need to play just to be given the funds (or a job with dedicated time) for research.


Chasing credit from a community of peers has obvious downsides, as Bright suggests in his opening story. Wansink was (presumably) so motivated by a desire to get credit that he was willing to lie. The biography of Diederik Stapel, a former Dutch psychologist who committed fraud, explicitly identifies community pressure as a motivator. He saw other social psychologists getting exciting results, and didn’t know why he couldn’t. Surely his theory was right, and he wanted to contribute, so he was justified in making up data to support it. This system also gives little credit (and may implicitly penalize) those who point out errors in prior work.



An advantage of the credit-seeking paradigm is that it encourages exploring many different scientific routes.
Source: Mark Neal/Pexels
Yet Bright also points out upsides at the community level to this system of credit-seeking. It means that scientists are encouraged to “fan out” and explore new directions of research, in order to be able to make novel contributions. (The idea that scientists need to explore many different ways of generating knowledge to hit upon the truth was also recently advocated by another group, including psychologists.) It also encourages what Merton called the “communist norm” of science: the idea that you succeed by giving away knowledge. Credit comes from sharing discoveries that others can then learn from and build on. Scientists eschew a private patent on their knowledge in return for glory bestowed on them by their community.


Bright ends his article without a clear prescription for countering scientific fraud, but suggests that it presents a great opportunity for doing philosophy. His description also calls out for doing psychology. I have two strong reactions to his characterization of the situation in scientific fraud.



The community standard in peer review can replace the need to question yourself.
Source: Jopwell/Pexels
First, very few people think they are explicitly lying, even when they are. The way Wansink’s fraud was uncovered speaks to this: he wrote a blog post laying out advice for graduate students that explicitly encouraged them to engage in the same misleading practices that had led to his success. He presented these practices as a public student resource, presumably because he didn’t see them as fraudulent (even though researchers are often explicitly taught not to use them in first-year statistics and methodology courses).


Instead, I think that for researchers like Wansink, all external standards fall away except whether their community of peers will accept a set of results. In other words, if no peer ever told him that he couldn’t publish a set of results because of misleading practices, then he would never question whether he was doing things right. This leads to the view that “it’s not fraud if it got past the reviewers.” The community standard can become an excuse not to do your own due diligence or ask yourself hard questions. Persuasion becomes more important than precision.


My other major reaction is that psychology research’s current “crisis” is not about whether to embrace peer review, with both its upsides and downsides. It’s about a split over whose expert review should be respected. Based on comments I have seen in the last decade, there are prestigious scientists whose scientific opinions no longer carry much weight with me. I don’t particularly care whether research gets their stamp of approval, because I don’t think they would ask the necessary, hard questions when they do peer review. Similarly, I would guess some of these prestigious scientists would dismiss the concerns (and reviews) of certain outspoken science reformers.


So the relevant concern isn’t just why scientists lie, but what happens when consensus on what is and isn’t a lie breaks down. In that way, I believe Bright’s call to action touches on many deeper themes of modern life: the dissolution of social norms, the fracturing of communities, and the question of when people actually understand themselves to be lying.
 
That's actually addressed in the article. Scientists could make a lot more money by doing basically anything other than publishing research. So money isn't really the motivation here.
The conclusion this story's pushing is that it's more about clout chasing. Scientists are massive narcissists who want to have their name attached to some discovery and/or be seen as experts.
 
I think we'd need to see evidence for the claim that the liars would make more money elsewhere.


Engineers make, on average, about as much as the top earners in scientific research. And they don't need a PhD.
 
The insinuation I am making is that they wouldn't be able to cut it there.

I can't imagine the examples cited in your article would.
 
As a general rule, any practical application of a field is more profitable than research in the field.
The psychologist mentioned probably couldn't be an engineer. But he'd probably make more money as a therapist or advertising consultant of some sort than he did publishing bullshit.
 
Did you guys ever see the British political series The Thick of It? I think a lot about Peter Mannion, the stodgy old-guard Tory minister from the 3rd/4th series. He had as many fuckups as the other characters but, uniquely, it seemed like he had a social life and self-worth outside of politics. Every time he cocked up and looked bad in the media, it seemed like he could at any point just say "fuck this, I'm having a Twix" and go off to live in his country home and watch snooker.

EVERY other character in that show had made politics their entire life and had no friends outside of work. In the modern day, with the traditional community and fraternal organisations destroyed, work life is all people have, on the show and IRL. Every single job in every profession - journalist, municipal politician, game reviewer, scientist - also has to do double-work as your surrogate social club. Unless you're one of the Peter Mannions of the world, you can't afford to say "fuck it" and quit at any time.
 
As a general rule, any practical application of a field is more profitable than research in the field.
The psychologist mentioned probably couldn't be an engineer. But he'd probably make more money as a therapist or advertising consultant of some sort than he did publishing bullshit.
Right, but I am suggesting the practical side requires a different mentality.

For example, you brought up therapy. Psychiatrists are licensed to be able to do therapy, but many don't because they make more money just writing prescriptions, so they suck at it. Turns out those are two different skillsets.
 
Right. Just like how a psychologist who specializes in what makes advertising effective could be pretty shit at consulting on the matter when actually working at a firm, opting to just provide the same exact advice he picked up in psych 101. The point isn't that they'd be great at it. It's that they'd make more money doing it. Ergo, money probably isn't the thing that motivates researchers.
 
I mean, you might be right, but I'd just think he just wouldn't get that many consulting gigs if he sucked at it. Might be me being naive.
 
Did you know? Smoking cigarettes is shown to boost your weight loss! The nice scientist who’s friends with the tobacco representative told me so!

Pure income is not where the money comes from, you dense motherfuckers.
 
I mean, you might be right, but I'd just think he just wouldn't get that many consulting gigs if he sucked at it. Might be me being naive.
In private practice? Sure. Working for a firm? He only needs to be hired once and then proceed to not drastically fuck anything up.
Every ad produced by any major company had a psychologist assess it. You tell me if you think most of these people are competent.
 
Did you know? Smoking cigarettes is shown to boost your weight loss! The nice scientist who’s friends with the tobacco representative told me so!

Pure income is not where the money comes from, you dense motherfuckers.
Would you mind elaborating?
 
Comparing the wages that a top level scientist makes against the wages of an engineer is retarded; the money isn’t coming from his income, it’s coming from bribes, insider trading, and other gifts.
 
Bribes are a really risky business here (not that they don't happen of course).
Get caught misrepresenting data? Not too bad for you, the editor should have caught it!
Get caught accepting "financial contributions" that you "failed to disclose" in your conflict of interest statement? Nobody will ever publish your research again.
 
Disclosed? Certainly! Published, discussed, or made widely known? Not so much.
 
I have thought about this a lot, having known some people on the path to becoming scientists (some of whom now are). It seems to be a mix of people who really love it, people who are just autistically on that path for some reason they can't or won't explain (possibly literal autism, can also be family pressure), and yes, people who are narcissists and like the recognition.

One of my favorite professors (not science) was a guy who did all the big money making stuff and just became a professor in retirement for fun. That guy really knew his stuff.
 
Disclosed? Certainly! Published, discussed, or made widely known? Not so much.
A conflict of interest statement is part of a paper. If the paper gets published, so is the conflict of interest.
To take money from tobacco companies while studying the effects of smoking is not necessarily a deal breaker. But you have to disclose it in the paper and that may raise a red flag at peer review (which is kinda the point). They'll scrutinize the fuck out of your work but will publish if it passes the sniff test.
 