A Philosopher Asks, 'Why Do Scientists Lie?'


Philosopher Liam Kofi Bright’s article “Why Do Scientists Lie?” starts by mentioning some modern diet wisdom commonly repeated in the news and online. Grocery shopping while hungry leads you to buy more calories. People eat more when served from larger bowls. People choose healthier food when they pre-order lunch. All of these claims rest on inaccurate, misleading research conducted by Cornell nutrition scientist Brian Wansink. Many of his findings were the result of either misreporting his own data or using inappropriate statistical techniques that he should have known were invalid. Bright describes this behavior more bluntly and concisely: He committed fraud.


In his article, Bright raises the naive position we might take on scientific fraud. It goes something like this: “There are lots of careers that make more money (and include better hours) than science, so if a comfortable job is what you want, pursuing science doesn’t make much sense. Instead, scientists must be more motivated by a deep desire to uncover truths about the world. If that’s really their deepest motive, why would scientists lie? Doesn’t lying contradict their deepest goals?” Of course, Bright makes clear that this naive position misses most of the key developments in a modern understanding of scientists since the 1960s.


A modern understanding of scientists (or at least one informed by Robert Merton’s writing in the 1960s) is one that sees them not as motivated by a pure quest for truth, but as “credit seekers.” That is, they are competing for the recognition of their scientific colleagues and other relevant communities of experts. They do this by racing to establish priority on claims, typically by publishing novel claims in journals reviewed by their peers. This community of peers, who review claims and award tokens of success, is the real audience for new science. Even if an individual scientist doesn’t particularly like playing “the game” of catering to this audience, they need to play just to be given the funds (or a job with dedicated time) for research.


Chasing credit from a community of peers has obvious downsides, as Bright suggests in his opening story. Wansink was (presumably) so motivated by a desire to get credit that he was willing to lie. The biography of Diederik Stapel, a former Dutch psychologist who committed fraud, explicitly identifies community pressure as a motivator. He saw other social psychologists getting exciting results and didn’t understand why he couldn’t get them himself. Surely, he reasoned, his theory was right, and he wanted to contribute, so he was justified in making up data to support it. This system also gives little credit to (and may implicitly penalize) those who point out errors in prior work.



An advantage of the credit-seeking paradigm is exploring many different scientific routes.
Source: Mark Neal/Pexels
Yet Bright also points out upsides at the community level to this system of credit-seeking. It means that scientists are encouraged to “fan out” and explore new directions of research, in order to be able to make novel contributions. (The idea that scientists need to explore many different ways of generating knowledge to hit upon the truth was also recently advocated by another group, including psychologists.) It also encourages what Merton called the “communist norm” of science: the idea that you succeed by giving away knowledge. Credit comes from sharing discoveries that others can then learn from and build on. Scientists eschew a private patent on their knowledge in return for glory bestowed on them by their community.


Bright ends his article without a clear prescription for countering scientific fraud, but suggests that it presents a great opportunity for doing philosophy. His description also calls out for doing psychology. I have two strong reactions to his characterization of the situation in scientific fraud.



The community standard in peer review can replace the need to question yourself.
Source: Jopwell/Pexels
First, very few people think they are explicitly lying, even when they are. The way Wansink’s fraud was uncovered speaks to this: He wrote a blog post laying out advice for graduate students that explicitly encouraged them to engage in the types of misleading practices that had led to his success. He posted his practices on a public blog as a student resource, presumably because he didn’t see these practices as fraudulent (even though researchers are often explicitly taught not to do them in first-year stats and methodology courses).


Instead, I think that for researchers like Wansink all external standards fall away except whether their community of peers will accept a set of results. In other words, if no peer ever told him that he couldn’t publish a set of results because of misleading practices, then he would never question whether he was doing things right. This leads to a view that “It’s not fraud if it got past the reviewers.” The community standard can become an excuse not to do your own due diligence or ask yourself hard questions. Persuasion becomes more important than precision.


My other major reaction is that psychology research’s current “crisis” is not about whether to embrace peer review, with both its upsides and downsides. It’s about a split over whose expert review should be respected. Based on comments I have seen in the last decade, there are prestigious scientists whose scientific opinions no longer carry much weight with me. I don’t particularly care whether research gets their stamp of approval, because I don’t think they would ask the necessary, hard questions when they do peer review. Similarly, I would guess some of these prestigious scientists would dismiss the concerns (and reviews) of certain outspoken science reformers.


So the relevant concern isn’t just why scientists lie, but what happens when consensus on what is and isn’t a lie breaks down. In that way, I believe Bright’s call to action touches on many deeper themes in modern life: the dissolution of social norms, the fracturing of communities, and the question of when people actually understand themselves to be lying.
 