Part 2: Science literacy (or, how not to be seduced by “quantum” anything)

There’s a quiet problem that sits underneath a lot of the confusion I see online about mental health, trauma, and therapy. It’s not that people are stupid, or that they don’t care. It’s that science literacy is, frankly, not something most of us were ever properly taught.

Science literacy isn’t about memorising the periodic table or being able to explain string theory at dinner parties (please don’t). According to the National Academies of Sciences, Engineering, and Medicine, being scientifically literate involves three broad things: knowing some basic scientific facts and concepts; understanding how science actually works (think hypothesis testing, probability, and the difference between correlation and causation); and recognising that science is a social process shaped by peer review, funding, conflict of interest, and the slow accumulation of evidence over time.

That’s a big ask. No one is perfectly scientifically literate. Even experts are experts in a very narrow lane. I’m a clinical psychologist, but I’m not a neuroscientist, an immunologist, or a climate scientist. Becoming scientifically literate does not mean becoming an expert in everything. It means developing intellectual humility. It means knowing when to trust expertise and when to ask better questions. It means being comfortable saying, “I don’t actually know enough about this to have a strong opinion.”

If it has the word “quantum” in it, take a breath

Which brings us to therapies with names that sound like they were generated by a late-night marketing team armed with a thesaurus and a particle accelerator. These impressive-sounding modalities promise to cure trauma, chronic pain, emotional blocks, and your fear of public speaking, and possibly to fix your Wi-Fi connection.

Here is a helpful rule of thumb: if a therapy has the word “quantum” in it, and it’s not being delivered by a physicist in a lab, you should be extremely sceptical.

Quantum mechanics is a complex, counterintuitive field of physics. It is not a metaphor for your nervous system having a bad day. When scientific language is used in a vague, grandiose way without clear mechanisms, peer-reviewed trials, or plausible explanations, that’s usually a red flag. It’s meant to sound impressive. It’s not meant to be understood.

Science literacy protects you from this. Not because you need to understand quantum physics, but because you understand just enough to recognise when it’s being misused.

Why this actually matters

Research consistently shows that lower science literacy is associated with greater susceptibility to pseudoscience and misinformation. People with weaker science literacy skills are more likely to believe that vaccines cause autism, that we only use 10% of our brains, that homeopathy can cure diseases, or that there’s a suppressed cure for cancer being hidden by shadowy forces. These beliefs aren’t harmless quirks. They shape health decisions. And health decisions have consequences.

At the same time, we need to avoid the comforting fantasy that “smart people don’t get fooled.” You could have multiple degrees and still fall prey to misinformation. Our minds are wired for shortcuts. We’re influenced by our prior beliefs, our social circles, our anxieties, and our need for certainty. Science literacy doesn’t make you invincible. It simply makes you a little harder to con.

The zipper test: You know less than you think (and so do I)

There’s a cognitive bias that illustrates this beautifully: the illusion of explanatory depth. If I asked you to explain how a zipper works, you might feel fairly confident. You’ve used thousands of them. But if I asked you to explain, in detail, the exact mechanism by which the interlocking teeth align and lock, your confidence might start to wobble. We often think we understand things much better than we actually do.

The same thing happens with complex topics like trauma, neurobiology, or psychiatric medication. We read a few articles or watch a couple of reels. Suddenly, we feel quite certain. But when asked to explain the mechanisms, the evidence base, the limitations, and the counterarguments, that certainty tends to shrink.

This is where scientific consensus becomes useful. Scientific consensus isn’t about blind faith. It’s the cumulative position that has survived repeated scrutiny, replication, and critique from experts in a given field. It’s not perfect, and it can change over time. But it’s generally a far more reliable guide than your neighbour’s Facebook post or a charismatic wellness influencer with a ring light.

Interestingly, research published in Science Advances found that people who most strongly rejected scientific consensus on various topics tended to know less about those topics while believing they knew more. Confidence and knowledge are not always correlated. In fact, sometimes they’re inversely related.

“But it worked for my cousin”: The problem with anecdotes

Another common trap I see is over-reliance on anecdotal evidence. Someone says, “I tried X therapy and my depression improved, therefore it works.” I am genuinely glad their symptoms improved. Truly. But from a scientific perspective, that’s not enough.

The plural of anecdote is not data.

Anecdotes are useful starting points. They generate hypotheses. They don’t establish effectiveness. To ethically recommend a treatment, particularly in mental health, we need rigorous testing: randomised controlled trials, replication, systematic reviews. We need to ask: compared to what? Under what conditions? For whom? At what cost? With what risks?

Our brains are particularly vulnerable to the post hoc fallacy: “after this, therefore because of this.” If your symptoms improved after a therapy, it’s very tempting to conclude that the therapy caused the improvement. But symptoms fluctuate. Time passes. Placebo effects are powerful. Other variables shift quietly in the background. Correlation is not causation, even when the timing feels compelling.

This is also why, in Australia, psychologists are prohibited from using patient testimonials in advertising. Testimonials can create the illusion of guaranteed effectiveness, blur power dynamics, and mislead potential clients. It’s not because we don’t have success stories. It’s because anecdotes are not a reliable substitute for evidence.

Science is a tool, not a personality trait

Thinking scientifically doesn’t mean worshipping science. Science is not a belief system. It’s a method. It’s a tool. It’s an evolving, imperfect, human process that attempts to reduce error over time. To “believe in science” as though it’s a deity misses the point entirely. The point is to understand how it works: how evidence is weighed, how studies differ in quality, how peer review functions, and how findings accumulate (or fail to).

In mental health, not all evidence is created equal. A single small, poorly controlled study published in an obscure journal does not carry the same weight as a large, well-designed randomised trial replicated across multiple settings. Animal studies, case reports, and mechanistic speculation all have their place. But they sit lower on the hierarchy of evidence when we’re making treatment decisions for real people.

Trust, but with receipts

Science literacy, then, is less about memorising facts and more about developing a posture. A posture of curiosity, humility, and measured scepticism. Of knowing that expertise matters, and that not all opinions are equally valid when it comes to complex scientific questions.

In a world saturated with confident voices, bold claims, and very convincing jargon, that posture is protective. It won’t make you immune to misinformation. But it will make you pause before handing over your money to anything that promises a “quantum reset” of your nervous system.

And honestly, that’s a very good start.
