Lying increases trust in science, study finds
If science isn't trusted, society becomes more vulnerable to misinformation and less able to effectively respond to complex challenges such as pandemics.

https://phys.org/news/2025-07-science.html
https://archive.is/bLSeQ
Research by Byron Hyde, a philosopher of science and Honorary Research Associate at Bangor University, looked at the role of transparency in fostering public trust in science.

The paper, published in the journal Theory & Society, starts by outlining the "bizarre phenomenon" known as the transparency paradox: that transparency is needed to foster public trust in science, but being transparent about science, medicine and government can also reduce trust.
Hyde argues that, to find a solution to this paradox, it is important to consider what institutions are being transparent about.

The study revealed that, while transparency about good news increases trust, transparency about bad news, such as conflicts of interest or failed experiments, decreases it.

Therefore, one possible solution to the paradox, and a way to increase public trust, is to lie: hide the bad news so that there is only ever good news to report. Hyde points out that this approach is unethical and ultimately unsustainable.

Instead, he suggests that a better way forward would be to tackle the root cause of the problem, which he argues is the public overidealising science. People still overwhelmingly believe in the 'storybook image' of a scientist who makes no mistakes, which creates unrealistic expectations.
Hyde is calling for a renewed effort to teach the public about scientific norms, which would be done through science education and communication to eliminate the "naïve" view of science as infallible.

Hyde said, "Scientists and government leaders know that public trust in science is important because it enables informed decisions, guides public policy, and supports collective action on critical issues like health, climate, and technology. If science isn't trusted, society becomes more vulnerable to misinformation and less able to effectively respond to complex challenges such as pandemics. Though it is often assumed transparency will increase trust in science, I argue that it can decrease trust in science instead.

"The truth is science isn't perfect. Scientists are just as biased and equally as liable to make mistakes as everyone else. Most people think that science is and ought to be a lot better than it is or is even capable of being. I argue that people lose trust in science when it doesn't match their expectations. This means that they distrust science that's untrustworthy but, if their expectations are too high, it also means that they don't trust science that's imperfect but still trustworthy."

Hyde says that the problem is that, although scientific facts are taught at school, the facts "about" science are not taught well enough.

He added, "For example, most people know that global temperatures are rising, but very few people know how we know that. Not enough people know that science 'infers to the best explanation' and doesn't definitively 'prove' anything. Too many people think that scientists should be free from biases or conflicts of interest when, in fact, neither of these are possible. If we want the public to trust science to the extent that it's trustworthy, we need to make sure they understand it first."
 
Too many people think that scientists should be free from biases or conflicts of interest when, in fact, neither of these are possible.
The truth is science isn't perfect. Scientists are just as biased and equally as liable to make mistakes as everyone else.
Then it seems like a healthy distrust of science is completely rational behavior. This article is saying "we're just as fallible as anyone else, therefore it's vital that you trust us implicitly no matter what".
 
I understand scientists being frustrated that they explain something, then a person jumps to the wrong conclusion based on a different part of science/life that doesn't apply to the situation at hand. But it's ok to question things. Scientists need to learn how to explain things clearly to people outside their space.
 
The issue isn’t that science can result in unreliable outcomes or make mistakes. It’s when “science” is used by unelected global bureaucracies and governments to extract wealth and crush the rights of citizens. All while declaring anyone who doesn’t instantly embrace the newest narratives as ignorant, racist, phobic, etc etc.

Science is one of humanity's best tools for advancing the species. Unfortunately the ones at the top embrace shit like COVID restrictions and gender bullshit. They decided to use science as a weapon and it hurts all of us.
 
I call it science fetishism. Boomers grew up in awe of the atom bomb and our space race. Americans basically masturbated to themselves over our achievements. War of the Worlds and Star Trek were extremely popular, then Star Wars, and plenty of media in between the release of those (superheroes created through radiation, time travel, other space stuff).

That’s really where the over-idealization began. And, like all bad things, it ends with the worst generation.
 
Scientists need to learn how to explain things clearly to people outside their space.
Yeah in my experience, engineers usually are too autistic and can't explain clearly to others who aren't engineers without using technical terms.
 

Facts and Science are useful only to the extent that they support the Narrative. If they don't, then they must be destroyed like everything else.
 
Even the progressives favorite black science man said something about how he doesn't want truth, he wants concurrence.
 
The study revealed that, while transparency about good news increases trust, transparency about bad news, such as conflicts of interest or failed experiments, decreases it.
Did the study also look at the effect of trust if you lie about bad news and pretend it is good, but then the audience finds out it's bad on their own?
 
The usual:

1) It's not happening
2) It's happening, but it's no big deal
3) Here's why it's good it's happening <-YOU ARE HERE
4) The people who notice it are the real problem.
 
Science sucks and is basically unreliable.

You should only trust what you yourself have observed first-hand and doubt everything else until you can prove it yourself.
 
We investigated ourselves, found that we did in fact do wrong, but, you shouldn't worry about that... we're still right.
 
This "journalist" deserves to burn to death in a fire. Science doesn't deserve even a modicum of trust, in fact the idea of trust is antithetical to the scientific method. Good scientific principles stand up to any and all scrutiny. Attacking the conclusions of an experiment is central to the process.

The problem is the "journalist" isn't describing scientists, but rather communists with a lab coat. Many such cases.
 
Even the progressives favorite black science man said something about how he doesn't want truth, he wants concurrence.
Consensus, not concurrence, which unironically is why Galileo got punished. Galileo literally didn't own his work, and part of his struggles with the Church was, iirc, that his patron had beef with them and he couldn't actually prove half of his conclusions even with his work. He got fucked in his trial because he antagonized multiple orders of priests even after two popes pulled his ass out of the fire.

Seriously, Medieval astronomy was more bullshit artists than scientists. Copernicus didn't discover shit and more or less guessed something that he was still wrong about, but it was more politics and his successors propped him up. Modern Science is much the same. Scientists play politics to get funding, Jeffrey Epstein used to prowl around Harvard, CalTech, and other schools. Scientists are cheap whores without a solid philosophical core.

Modern Science was very happy to censor anyone during covid or how vax companies are often irresponsible. They don't give a shit about ethics, morality, or truth.
 
I understand scientists being frustrated that they explain something, then a person jumps to the wrong conclusion based on a different part of science/life that doesn't apply to the situation at hand. But it's ok to question things. Scientists need to learn how to explain things clearly to people outside their space.
It also doesn't help that they can't tell us what a woman is.
 
"Science" is not an ideology to be believed or trusted. Science is a means of understanding the observable universe. Even the deists who spawned from the Enlightenment knew this. Anyone who says otherwise is usually a self-described "atheist" who really just replaced God with "science," which is just studies and articles that reinforce their worldview.
 
The study revealed that, while transparency about good news increases trust, transparency about bad news, such as conflicts of interest
Yeah funny enough, I do distrust the results of a study that has an obvious conflict of interest. The solution to me seems like maybe biased bought and paid for studies with obvious conflicts of interest should be disregarded or not published at all.
 