Every so often a new health documentary appears on Netflix, and every time I can feel the dread rise in my chest as I anticipate the questions and requests that typically come my way (“Have you seen it?”, “What did you think of it?”) along with the inevitable misinformation that I’m asked to correct.
When the trailer for (Un)Well appeared, I found myself filled with curiosity instead of dread. It seemed… interesting? That feeling lasted about as long as the intro sequence to the first episode. It was largely downhill from there.
(Un)Well is a 6-part documentary series in typical Netflix style: no narrator, lots of fancy B-roll, and a hell of a lot of fence-sitting. Each episode covers a particular trend in the Wellness industry: essential oils, tantric sex, breast milk for athletes, fasting, Ayahuasca, and bee-sting therapy. The description states that it aims to take a deep dive into these trends and shine a light on them, which is a noble goal in theory, but in practice they made such a mess of it that even the few good parts can’t make up for the over-confident characters and the dangerous messages they spread.
First and foremost, this documentary series falls prey to the common issue of false balance. Instead of outright condemning the various risky practices, they simply tell a few stories, bring in a few experts, and leave you to make up your own mind. To portray someone’s anecdote as being equivalent to an expert’s researched verdict is misleading. The two are far from the same. But how can an expert saying that consuming breast milk ordered via Facebook carries great risks possibly match up to someone claiming it cured their prostate cancer? The emotional weight of the story is far more convincing to a susceptible lay audience.
Dr Joy Bowles
Speaking of experts, they don’t receive anywhere near enough airtime in this series. We see incredibly articulate and intelligent arguments made by the likes of Steven Novella, Christy Harrison, and Dr Joy Bowles (who utters the most iconic line: “People say there are lots of studies – I used to think that was impressive too”. BURN) but they are cut short to make way for footage of vulnerable and sick individuals desperately trying whatever it takes to heal themselves. We see Dr Steven Novella expertly break down the components of bee venom only to immediately return to ‘The Heal Hive’ where founder Brooke Geahan (who has the audacity to name her social media @everydayexpert) says “we don’t know why it works”. We definitely needed more from the experts here.
In addition, there simply wasn’t enough emphasis on the potential for harm. During most episodes we hear from someone who has suffered greatly, or even died, as a result of the therapies and practices mentioned: whole-body rashes from essential oils, sexual assault in tantra workshops, death by water fasting, death by ayahuasca, and death by bee sting. We watch a group of people partaking in a (likely illegal) ayahuasca ceremony in the US where one woman has a bad reaction and stops breathing. The facilitators group around her asking each other “what do you want to do?” as if the answer was not blatantly obvious. Cue me shouting at my TV screen “call an ambulance now!” to no avail. Spoiler alert: she survived, and left terrified at the idea of trying that again. These stories are an important component, and yet the absence of a narrator meant the audience’s attention couldn’t be fully directed to where it was most needed.
It’s worth pointing out a few things that this documentary series did well. They discussed the issue of cultural appropriation in the Wellness industry, which is a huge problem that is too often swept under the carpet. During the ayahuasca episode we hear a local healer say that westerners want to take everything from them, including their spirituality. He’s not wrong. There are also some interesting overlaps between Wellness practices and religion. “Dr” Z, a blogger and chiropractor who sells essential oil masterclasses, claims that Jesus cured his depression (but Jesus couldn’t extend that to acne, apparently that was too hard); a father in Gaza states that “Bees receive divine messages from God” while holding down his son so he can be stung behind the ears to cure his hearing issues; and a tech-CEO in Silicon Valley asks “did God tell us to eat three meals a day?” as part of his argument for fasting (the irony of his name being Woo was not lost on me). Wellness can be cult-like, with a community to rival that of organised religion, and when so much of the documentary was focused on the US it didn’t surprise me to see so much religious narrative present. While they didn’t explicitly delve into this, it was interesting.
It’s clear to see why many of the individuals we follow through the episodes try these unusual and unproven therapies – they’re desperate, vulnerable, and willing to try whatever it takes to improve their lives in the face of difficult diagnoses and years of chronic illness. Of course they are. I think many of us would struggle not to do the same if we were in the same position. I have no anger to direct towards these individuals; I reserve it entirely for those creating shiny healing centres with false promises and false hope, for those profiting from people’s vulnerability. These are the people who say “I feel like I’ve played a positive part in these changes [in people]”, yet would never say “I feel like I’ve played a part in the negative things that happen”. Willing to accept all the praise yet none of the responsibility.
Overall, (Un)Well was majorly disappointing. I wouldn’t recommend wasting your time with it. What I would suggest, instead, is that you listen to myself and cardiothoracic surgeon Dr Nikki Stamp analysing the claims on our podcast, In Bad Taste, so that you can argue with Auntie Karen when she brings up essential oils as a cure for all your ills.
This image depicting fairies dancing in a ring comes from around 1600 and was in its original form a woodcut, before being beautifully hand-coloured to appear in an issue of The Strand magazine in 1892.
A group of fairies dancing in a ring close to the doorway of their fairy mound
circa 1600
Before the English Reformation fairies seem to have been a part of normal folklore, but after it they acquired a Satanic tone as the new Protestant theology held that everything not made for good by God was against Him.
Fairy folklore throughout the British Isles was influenced by the presence of (often circular) iron-age earthworks whose human origins were not properly understood. Notice also that there is a door which goes into the mound, into fairyland.
If you see a fairy gathering like the one depicted, remember not to join in with the dance – you will not escape alive.
Picture the scene: someone you care a lot about buys a medical test over the counter, but instead of buying the one that’s a few pence per test, they insist they want the more expensive one because they think it’s “more accurate”.
Being the good skeptic that you are, you wonder if that’s really true. How does the pricey test compare to the budget one? You’ve heard all the stories about people being scammed out of money for little more than branding. So you buy the test. You take it apart to see how it works. And it turns out it’s exactly the same as the budget one, only 20 times the price. You fire out a ranty tweet explaining your findings and excoriating the company who produced the needlessly expensive test. Case closed, right?
That’s what one Twitter user did when their wife wanted to confirm their pregnancy with a digital test. She’d had a positive test from a few of the cheaper strip tests but she wanted to confirm it with a “more accurate” digital test. The husband was skeptical, so he took apart the test to look inside. And, lo and behold, the test was a simple strip packaged up in some plastic casing with electronic detectors and an LCD screen that announces if you are “pregnant” or “not pregnant”. A few months later, once the pregnancy was announced, he fired off the tweet showing the scam and the tweet went viral. At the time of writing the initial tweet has over ten thousand likes and hundreds of replies.
I have to confess: I liked the tweet myself. It’s a good tweet: it illustrates very neatly that more expensive pregnancy testing methods work the same way as the cheaper ones. But there is a problem here. I mean it’d be a fairly dull story for me to tell you if there wasn’t.
Once this first tweet went viral, a second Twitter user did the same thing, buying a test from a brand called Equate, and took it apart. Being a techy person they stripped it all the way back, talked about how it all worked and how much of a waste of plastic and technology it was to generate these single use devices and then charge people more for something they don’t need.
But what neither tweeter – nor many of the responders – considered was whether it really is something that people who want to test for pregnancy don’t need. Is it really a pointless piece of e-waste that’s filling up landfill and scamming people who might be pregnant, or does it serve a genuine purpose?
To understand this, it’s important to understand how pregnancy tests work, and how the simple strip tests differ from the digital tests. So first let’s take a look at the simple strip test. On Amazon you can get a pack of 15 for less than £3 – that’s 20p per test. The packaging claims it is “99% accurate”.
The tests are a fairly straightforward antibody test looking for a hormone called human chorionic gonadotropin (HCG). HCG is produced by the placenta. You only have a placenta if you’re pregnant. Clever, eh? The small absorbent strips are infused with an antibody, linked to a gold nanoparticle, that recognises and binds to HCG. If there’s any HCG in your urine, it binds to the gold-labelled antibody and, along with the urine, absorbs its way up the strip until it reaches the test line. The test line has another antibody on it and if (and only if) your urine sample has interacted successfully with the first antibody (i.e. there’s HCG in the urine), then this second antibody binds to it and the test strip turns pink.
Positive result; pregnancy. However, tests can go wrong for loads of reasons. Maybe during manufacture the machine misses the strip and fails to add any antibody to that test strip. If that happened, then there’s nothing to turn pink if there’s HCG in your urine, which means a negative result doesn’t necessarily mean no pregnancy. So you need a way to prove the test is definitely OK and definitely working. There’s a third antibody on a second line (keeping up?). As the urine continues to travel up the absorbent paper, it reaches this second line – the control line – and, whether or not there’s any HCG, it recognises the gold-linked antibody and turns pink. The control line proves that the antibodies definitely got added to the paper, and that all the antibodies are doing what they should do and reacting in the right way.
You have three potential results:
The test line is pink and the control line is pink = positive result; the test worked and you’re pregnant
The test line is not pink and the control line is pink = negative result; the test worked and you’re not pregnant
The test line is either pink or not pink, and the control line is not pink = inconclusive result; the test failed, you can’t be sure the result is accurate, you need to take another test
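For the logically minded, the three outcomes above boil down to one simple rule: the control line decides whether any result can be trusted at all. A minimal sketch of that logic (purely illustrative – the function name and wording are my own, not from any test’s instructions):

```python
def interpret_strip(test_line_pink: bool, control_line_pink: bool) -> str:
    """Interpret a lateral-flow pregnancy test strip.

    Each line is observed as pink (True) or not pink (False).
    """
    if not control_line_pink:
        # No pink control line: the antibodies may never have been added,
        # or aren't reacting properly, so no result can be trusted.
        return "inconclusive"
    # The control line worked, so the test line can be read at face value.
    return "positive" if test_line_pink else "negative"
```

Note that the test line’s state is irrelevant whenever the control line fails – exactly the third case in the list above.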
These simple test strips are great. They’re really, really accurate, and they’re really, really cheap. But they are not perfect.
They aren’t always easy to interpret – it’s not a yes or no answer, it’s contingent on that control line showing that the test worked properly. There are many reasons why a person might find the control line confusing and make a mistake in reading the test. Especially if the result holds some emotion for them.
The other problem is that you need to dip them into a urine sample. These test strips have a small line part way up the test that is the “max dip” line. You need the urine sample to absorb slowly up the strip so it encounters all the antibodies in the right order. So you can’t just pee on the strip. You need to pee in a clean, disposable pot and dip the strip into the pot and hold it there for 5 seconds. Not everyone has access to the facilities that allow them to pee into such a pot. Either they don’t have access to a pot they can pee into, or they don’t have access to a safe bathroom that gives them the space and privacy to pee into a pot.
And then there’s the issue of ability, or disability. The test strips are a small, thin piece of paper about the size of a coffee stirrer. If you have mobility problems in your hands you might struggle to hold it. It’s also a visual test, the lines are quite small. If you have visual impairment you might struggle to read the test. You might not have someone you feel comfortable (or safe) asking for help.
The plastic, digital tests that have been doing the rounds on Twitter work in exactly the same way – a simple test strip. They cost around £8.50 for two tests on Amazon. But inside the plastic casing are some small LED lights that light up the test and control line and two light sensors that detect the result. They then use a small computer to convert the result into words. They have multiple benefits. Firstly – they’re really easy to interpret. It says “pregnant” or “not pregnant”, no checking and rechecking the instructions to be absolutely sure you got it right. I can’t overstate how reassuring that is for people who want to be really, really sure about the result.
Secondly, they’re much easier to use. These tests keep the strip encased in plastic and they have another absorbent strip that sticks outside of the plastic. You can hold that part into the flow of your urine. No need for a pot, no dipping. Which means you can take the test anywhere. If you don’t have access to clean, safe, private bathroom facilities, you can still find a place to take these tests. The tests are also much bigger, they’re easier to hold if you have mobility issues and the test identifies the line for you. You don’t need to worry if you can’t quite see a faint line. The sensors are way more sensitive than your eyes. And if you have visual impairment that allows you to read a digital screen, they are easier to read. Not perfect for people who are blind or visually impaired but certainly better. Because of that, you could argue, as tech blogger Naomi Wu does, that these digital tests might be more accurate, purely because they’re easier to use: you’re less likely to make a mistake and the machine can detect weak results. But, the companies who sell these tests do not claim that they are more accurate. Their promotional material says the same as the paper test strips: “99% accurate”. Any notion that these are more accurate comes from the perception and assumptions of the consumers (albeit a perception the manufacturers and advertisers may not go out of their way to dispel).
But it’s still important: if people think they’re more accurate and are buying them solely for that reason, then that is wasteful. And these digital tests are single-use tech and plastic that are thrown away after use. They can both be useful tests for accessibility purposes, and a waste of plastic, electronics and money for people who don’t need them. Companies could work to make streamlined, less wasteful digital tests or even reusable plastic tests that can be fitted with paper test strips. But I guess that nuance is hard to get across in 280 characters.
If you want to hear more about pregnancy tests, including how the tests that tell you how far along you are work, and that weird TikTok trend, listen to episode 285 of Skeptics with a K.
Every day, in real life and on social media, we encounter a vocal antivaxxer brigade, evangelical proponents of grain free and raw meat diets, online stores selling magnetic therapy, and chiropractors, homeopaths and reiki healers promoting their businesses. This may sound pretty familiar to those involved in Skeptic societies. But I’m not a doctor. I’m a vet.
It comes as a surprise, even to those who have been discussing pseudoscience for years, that the use of alternative medicine and implausible products is as common in veterinary medicine as in human healthcare.
Over recent years, the aggressive promotion of alternative therapies to prevent and treat animal diseases has become widespread.
As with human medicine, conventional veterinary medicine is also a victim of its own success. The fact that national vaccination campaigns have been so effective against previously common dog diseases such as distemper and parvovirus means that people are less aware of the dangers of the diseases and more concerned about the potential side effects of the vaccines.
There is a widespread and very pervasive conspiracy theory that vets and pharmaceutical companies are promoting dangerous conventional medicines in order to make additional profit when pets become sick. Many of these groups believe that, in order to protect their profits, vets and pharmaceutical companies are trying to suppress knowledge about effective and safe alternative medicines.
Social media has undoubtedly played a role. Facebook groups act as echo chambers where medically implausible claims go unchallenged, and those who express different viewpoints are removed. Facebook groups containing thousands of UK pet owners discuss their mistrust of vets and the pharmaceutical industry. A common discussion point is that vaccines and many commercially available pet foods contain dangerous “chemicals” that make animals sick or cause illnesses such as cancer.
Members of these groups urge the use of “safer” alternative therapies such as homeopathy, reiki, Chinese medicine and chiropractic interventions. Anti-vaxxers recommend orally administered “like cures like” nosodes which they claim are a safe and effective alternative to conventional vaccines, and in some groups animal psychics even offer therapy via Facebook Messenger or phone. There are dog and cat groups containing thousands of members purely discussing what they believe to be ‘natural’ diets for animals, motivated by the supposedly dangerous content of commercially available dog and cat foods. Members of these groups champion unproven health benefits of feeding their pets on raw meat-based, grain-free, organic, or even vegan diets. Some of these raw diets have been shown to be a public health risk.
To add weight to the opinions of those who distrust conventional medicine, a small number of vets are very vocal advocates of alternative medicines, and have huge social media followings. Their clinics offer therapies including homeopathy and bioresonance, and the British Association of Homeopathic Veterinary Surgeons has multiple case reports of cancer cured by homeopathic treatments on their website. These vets have a dedicated following of animal owners who believe that they are whistleblowers speaking out against a corrupt profession in the pocket of ‘Big Pharma’, just as discredited ex-doctor Andrew Wakefield purported to do in human healthcare.
To address potential animal welfare harm caused by a small number of vets promoting anti-vaccination advice, homeopathy and other treatments with no scientific plausibility, the Royal College of Veterinary Surgeons (RCVS) published a position statement in November 2017. That statement advised that vets should only use complementary and alternative medicines in conjunction with appropriate treatment, so as not to cause animal suffering by replacing or delaying treatment. The statement went on to explain that homeopathy exists without a recognised body of evidence for its use and that it is not based on sound scientific principles. Such was the strength of opposition that protestors, led by an MP, marched outside the RCVS Headquarters and submitted a petition signed by over 11,000 members of the public, urging them to retract the statement.
Former MP David Tredinnick addresses a crowd as part of a protest against the RCVS.
Delaying or withholding effective treatments can cause animals to become more ill as a disease progresses. In humans, people argue (rightly or wrongly) that the placebo effect can be very real and dramatic. However, there is no evidence that animals experience a psychological placebo effect when given ineffective remedies, most probably because they are unaware that they are receiving any treatment. Fascinatingly, as with parents of young children, owners may experience a ‘caregiver placebo effect’ whereby any improvement in the pet’s demeanour is attributed to the effectiveness of the treatment – rather than recognising that many animals with long-term health conditions such as arthritis have ‘good weeks’ and ‘bad weeks’ which are affected by a huge variety of factors.
In conditions such as osteoarthritis, substituting actual painkillers with implausible alternative therapies would lead to an animal experiencing avoidable pain. In cases of severe bacterial infection, death is a real risk if appropriate antibiotics are not administered in a timely manner.
Pet owners often fail to understand that delaying or withholding effective treatments can cause animals to become very ill, and that they could even be prosecuted if their animal suffers because they did not seek veterinary advice.
It is difficult to engage in constructive dialogue with people who believe that anyone with a different opinion is part of a conspiracy. Focussing purely on educational interventions and explaining statistics to anti-vaxxers seems to have little effect. Understanding how these beliefs form in the first place is key. Anti-science attitudes are often more closely tied to an individual’s personal values, and even their politics, than to their level of education. This means that epidemiologists and advocates of evidence based medicine will probably have to rely on the expertise and research of political science and social psychology to develop successful techniques or campaigns to influence animal owners and their decision making.
Adult humans can choose to ignore medical advice and elect to use unproven or dangerous alternatives to genuine medical treatment. However, animals are entirely dependent on their carers to make medical decisions for them. It is therefore unethical to force them to receive treatments that are implausible in place of proven treatments. They have no choice.
Over the coming weeks and months we will be discussing further the various alternative therapies and medicines commonly used in animals, and, importantly, we’ll look at how to engage with animal owners in a compassionate and constructive way to ensure animal welfare is not compromised.
After all, we must remember that we have a lot in common, as those who champion alternative medicine want the same outcome that we vets are striving for: happy, healthy pets.
The Skeptic would like to thank Zoe Belshaw for her contributions to this article.
This twice-weekly podcast, running since 2018, looks at some of the worst people in history. As you might imagine from the title, Behind the Bastards takes in a lot of obvious targets, from the Ku Klux Klan to Jeffrey Epstein, via Saddam Hussein’s erotic fiction and Joseph Stalin’s childhood.
While almost everyone can get on board with hating the above-mentioned bastards, the show also covers a lot of targets that skeptics in particular may consider worthy of vitriol and discussion. Episodes have focused on everyone from Samuel Hahnemann, founder of homeopathy, to discredited ex-physician Andrew Wakefield, the man behind the MMR anti-vaccine scare that lingers to this day.
The podcast is hosted by former Cracked.com editor and Bellingcat journalist Robert Evans. Evans is fairly open and vocal about his opinions, and this makes the podcasts very entertaining, and also leads to fun political connections, from the links between Nazis and Flat Earth to the racist origins of phrenology.
Evans is always joined by a guest – either a writer, actor, podcaster or comedian, including David X Cohen, Edgar Momplaisir and Sofiya Alexandra – with whom he has excellent chemistry. The style is fairly irreverent and despite the often extremely dark subject matter remains an upbeat and funny listen. This is primarily down to Robert Evans’ infectious enthusiasm, his endless jokes about starting a cult and podcasting machetes, and the super-awkward cuts to adverts, during which he typically damns the upcoming products and services with extremely faint praise that relates to the podcast topic at hand, promising that they definitely won’t murder your children or are highly unlikely to commit genocide.
Finally, for a skeptic it’s reassuring to note that all of the episodes are well-referenced, in case you want to go behind-Behind the Bastards and check Evans’ sources.
With over 40,000 deaths in the UK and over 800,000 worldwide related to COVID-19 at the time of writing, you’d be forgiven for not having dental issues at the forefront of your mind at the moment. Unless that is, you’re one of the many people who have read about the new dental phenomenon known as ‘mask mouth’.
I hadn’t heard about mask mouth until a dentist friend of mine pointed me in the direction of an article on David Icke’s website. Icke’s site links to an original piece in the Daily Mail, and the story also made it into The Sun and Glamour magazine (now removed but available via archive), among others. You might think that these aren’t the most reliable of sources, and you’d probably be right, so it’s worthwhile looking at these claims to see if they stand up to scrutiny.
As the story goes, the wearing of facemasks during the pandemic is causing an ‘explosion’ in the number of patients suffering from both cavities in their teeth and gum disease. People who have had no previous history of dental problems are starting to have issues with their teeth, and according to the dentists interviewed for the articles, it’s affecting up to half of their patients.
But are these claims realistic, or are they simply a confection? Let’s start with the claims that wearing a mask leads to tooth decay.
Decay, or ‘dental caries’ as dentists like to call it, is one of the most common diseases worldwide. To experience decay, you need four things: teeth (obviously), bacteria, a fermentable substrate (for this, read: sugar) and time. Remove any of these, and you remove the chance of decay.
Of course, it’s a little bit more complicated than that. The decay process occurs when the bacteria in the mouth break down the sugars you ingest into acids. If these acids demineralise the surface of the tooth faster than your saliva (or artificial alternative) remineralises it, your tooth begins to decay. The bacteria in the mouth live in a complex colony called a biofilm, which is removed for a short period when you brush your teeth.
It’s worth a brief interlude here to discuss the anatomy of the tooth. Teeth consist of three types of hard tissue: enamel, dentine and cementum. These protect the pulp, which is where the tooth receives its nerve and blood supply. If decay reaches the pulp, you tend to end up with toothache of the agonising kind.
Enamel is the hard outer layer; the first line of defence for the tooth against the outside world. You probably help strengthen this layer by using a fluoride-containing toothpaste. And if you don’t, you should.
The dentine layer of the tooth lies just below the enamel and consists of microscopic canal-like tubules. These tubules contain projections of the pulp, and it’s these pulpal projections that cause the sensitivity you may experience when you have something particularly cold. Sensitive toothpastes work by either blocking the tubules or reducing the responsiveness of the pulp.
Cementum covers the root surfaces of the teeth and helps attach the tooth to the bone of the jaw via the periodontal ligament. You’re only likely to experience decay of the cementum if you’ve had a significant amount of recession of the gums due to long-standing gum disease or trauma.
Dentine and cementum have lower mineral content than enamel. If the decay reaches these deeper layers, it’s likely to spread more quickly, so what we characteristically see is decay breaching the enamel and blooming at the junction between the enamel and dentine. It’s not unusual for what appears to be an insignificant cavity on the surface to be much more sizeable when you get into it.
With that short dental biology lesson complete, we can return to ‘mask mouth’, and ask: is the wearing of masks likely to be a factor in the decay process? In short, the answer is no. Wearing a face covering doesn’t affect the bacteria in the mouth, and it doesn’t increase the amount of sugar you take in. For the short amount of time that most of us will be wearing masks in our everyday lives, the effect on the oral environment is likely to be negligible.
But what about gum disease? Perhaps unsurprisingly this looks unlikely too.
Gum disease is an inflammatory disease of the tissues that surround the teeth. We generally divide this into two types, gingivitis and periodontitis.
Gingivitis is the gum’s inflammatory response to plaque. Importantly, gingivitis doesn’t cause loss of the bone that supports the teeth – that bone loss is a sign of the less common, but more severe, periodontitis. Again, we know that gum disease can be controlled, for most people, by efficient biofilm disruption. And by that, I mean brushing your teeth well. Make sure to concentrate on the gum line, where the biofilm starts to grow; and use interdental brushes or floss to clean in between your teeth regularly.
For most people, the bone loss associated with periodontitis may take years to become apparent. By the time patients have noticed symptoms, which usually involve loose or drifting teeth, the disease can be at an advanced stage. Wearing a mask for the last few months, even if you were doing it for hours every day, is not going to accelerate that process.
Speaking of which, do you know who is wearing masks for many hours a day, every single day at the moment? Dentists. Sure, we are probably a bit better than most at keeping our teeth clean, but there simply haven’t been any reports of dentists developing ‘mask mouth’. And, come to think of it, other than the one or two dentists cited in the mask mouth articles, I can’t think of a single dentist who has even heard of it.
What is ‘mask mouth’ then? One answer might be that it is a cynical ploy to drum up trade by a handful of dentists who haven’t been able to work during the coronavirus pandemic. And that’s what really leaves a bad taste in the mouth.
Since 2012, The Skeptic has had the pleasure of awarding the Ockham Awards – our annual awards celebrating the very best work from within the skeptical community. The awards were founded because we wanted to draw attention to those people who work hard to get a great message out. The Ockhams recognise the effort and time that have gone into the community’s favourite campaigns, activism, blogs, podcasts, and outstanding contributors to the skeptical cause.
I am pleased to announce that nominations for the 2020 Ockham Awards are now open!
In fact, prior to becoming editor of The Skeptic, I was honoured to receive the 2018 Ockham for Skeptical Activism and the 2016 Ockham for Best Skeptical Campaign for my work in stopping NHS homeopathy, as part of the Good Thinking Society. Needless to say, I won’t be eligible for nomination this year!
While we recognise the best in skepticism, our awards are also an opportunity to highlight the danger posed by promoters of pseudoscience with our Rusty Razor award. The Rusty Razor is designed to spotlight individuals or organisations who have been prominent promoters of unscientific ideas within the last year.
Previous Rusty Razor winners have included Andrew Wakefield for his ongoing promotion of anti-vaxx misinformation, and Gwyneth Paltrow for her pseudoscience-peddling wellness empire, Goop.
One of the most important elements of our awards is that the nominations come from you – the skeptical community. We’d like you to tell us who you think deserves to receive the Skeptic of the Year award, and who deserves to receive the Rusty Razor.
Submit your nominations now!
Nominations are open now and will close on October 31st. Winners will be chosen by our editorial board from the nominations we receive, and they will be announced at Skeptics in the Pub Online on November 19th, at 7pm.
What follows is a slightly modified and abbreviated version of the introduction to Professor Edzard Ernst’s recently published book, Don’t Believe What You Think.
Why do even intelligent, well-educated people so often draw wrong conclusions about so-called alternative medicine (SCAM)? The main reasons, I submit, are misinformation, motivated ignorance, motivated reasoning, confirmation bias, and denialism.
Misinformation overload
The popular press has an insatiable appetite for SCAM. There are currently some 50 million websites on the topic. This amount is extraordinary; no other medical subject comes even remotely close. Amazon UK currently lists over 50,000 books on SCAM. This figure dwarfs the number of books on most other medical subjects – gynaecology: 10,000, orthopaedics: 10,000, rheumatology: 4,000, for instance. These numbers become even more impressive once we realise that the vast majority of all SCAM books are written for a lay readership, while most of the books on other medical topics are for healthcare professionals. It is clear from these data: consumers are currently bombarded with information about SCAM.
But such an abundance of material is not necessarily a bad thing. What makes this plethora of information worrying is the fact that the information includes an opulence of falsehoods. SCAM has been an industry of ‘fake news’ and ‘post truths’ long before these terms even entered our vocabulary. Crucially, most SCAM proponents seem to be unwilling or unable to differentiate between reliable and unreliable sources of information; those that agree with their preconceived ideas tend to be viewed as reliable, and those that don’t are dismissed as unreliable.
Some 15 years ago, my team systematically investigated this ‘industry of untruths’ by extracting from 7 best-selling SCAM books the number of treatments being recommended for given diseases. The results were staggering; every SCAM seemed to be recommended for every condition, regardless of any common sense or scientific evidence. In fact, for most of the recommendations, the evidence was either non-existent or outright negative. Even worse, there was no agreement between the 7 books, indicating that the recommendations were entirely random. Here are a few examples (the numbers in brackets refer to the number of different SCAMs recommended for that condition):
Addictions (120)
Arthritis (131)
Asthma (119)
Cancer (133)
Depression (87)
Diabetes (89)
Hypertension (95)
When we evaluated the content of SCAM websites for certain diseases, such as cancer, our findings revealed that most sites recommended an abundance of treatments not supported by sound evidence. The potential for patients to be harmed by these sites was undeniable. We concluded that “the most popular websites on SCAM for cancer offer information of extremely variable quality. Many endorse unproven therapies, and some are outright dangerous.”
We also assessed the content of the websites of 200 chiropractors. Our findings showed that 95% of these sites made unsubstantiated claims. We concluded that “the majority of chiropractors and their associations in the English-speaking world seem to make therapeutic claims that are not supported by sound evidence”.
But this bombardment of consumers with misinformation is not just confined to the written word. When the falsehoods come directly from the mouth of a SCAM enthusiast, they get even more dangerous. SCAM practitioners issue wrong advice to their patients all the time, and the same sadly applies to many pharmacists. Another of our investigations determined what advice health food store employees present to individuals seeking treatment options for breast cancer. Eight researchers asked employees of all health food stores in a major Canadian city what they would recommend for a patient with breast cancer. Thirty-four stores were examined, and a total of 33 different products were recommended. Crucially, none of these products were supported by good evidence of efficacy.
Many consumers will readily accept the false information as true. Others might initially have some doubts but will eventually yield to the constant onslaught. The plenitude of cleverly marketed untruths acts like brainwashing and is unquestionably a prime reason for consumers’ erroneous beliefs about SCAM.
Motivated ignorance
Motivated ignorance has been studied extensively. In one investigation, for instance, volunteers were given the choice of reading either a pro or a contra article on the topic of gay marriage. Those who chose to read the opinion they agreed with would be given a chance to win $7, while those who chose to read the opinion they disagreed with would have the chance to win $10. The results showed that 63% of the volunteers opted for motivated ignorance and chose not to read the opinion they disagreed with, thus preferring to forfeit the chance to win more money.
We all regularly succumb to motivated ignorance when we want to avoid disconcerting information. Such avoidance of knowledge occurs particularly in the area of healthcare. In SCAM, it is an important reason for people to shut their eyes and ears to the uncomfortable reality that their favourite SCAM might not be as fabulous as they think. The phenomenon is especially powerful when someone has acquired a quasi-religious belief in SCAM. On my blog, I often receive comments from such believers. Whenever they are confronted with the evidence that their SCAM might not be as beneficial as they had assumed, they deny the validity of this information. They might state, for instance, that they reject the evidence because they assume:
the evidence is fabricated,
the research is paid for by ‘Big Pharma’,
the study is fatally flawed.
It is hard to convince a religious person that her god does not exist. In my experience, it is equally difficult to convince SCAM-believers that their SCAM is bogus, and perhaps even harder to convince them that their guru is a charlatan. In fact, in many instances, this proves to be impossible. For people who are bent on rejecting science, even the most persuasive evidence will have little impact. In the extreme, motivated ignorance can kill the person who is afflicted by it; think, for instance, of patients refusing life-saving therapy and preferring to opt for an ineffective SCAM despite overwhelming evidence that the conventional treatment would save their lives.
Motivated reasoning
Motivated reasoning can be seen as an attempt to avoid cognitive dissonance – the mental discomfort we experience when faced with contradictory beliefs – frequently at the expense of integrity and sometimes even honesty. Yet motivated reasoning is common, and few of us are immune to it. In SCAM, it is applied abundantly by individuals and organisations alike, and there are countless examples to demonstrate it. Motivated reasoning does not merely distort the views of SCAM providers and their patients; it also has the potential to endanger public health.
Confirmation bias
Confirmation bias describes our preference for information that is in line with our prior beliefs. In explaining why consumers often believe falsehoods about SCAM, it is probably, next to misinformation overload (see above), the most important phenomenon.
While motivated reasoning and motivated ignorance are largely based on our conscious decisions, confirmation bias occurs without us even noticing it. It functions like a filter that eliminates disagreeable information and selectively allows material to enter our mind that agrees with our views. Thus, we subconsciously focus on information that confirms our opinion, while rejecting data that contradict it.
The effects of confirmation bias can be profound and alarming: we might pride ourselves on having an open mind and on fairly considering all the available evidence, while our preconceived beliefs continuously eliminate much of the information that would otherwise contradict them. In the end, we might feel entirely confident that we are well-informed and correct, while we might, in fact, be totally misinformed and wrong.
The best protection from falling victim to confirmation bias is to systematically follow where robust science leads us. The criterion must be the reliability of the science, not the direction of its results. Avoiding being influenced by confirmation bias can be a big ask indeed. It often requires not merely accessing the scientific literature but also differentiating between rigorous science, poor science, pseudoscience and fake science.
Patients, particularly those suffering from serious conditions, want the benefit of their pet therapy to be true, and thus it becomes true. They filter out unwelcome information and what remains is a positive impression.
Science denial
Science denial is defined as the dismissal of established scientific evidence for irrational reasons. Even though the two sometimes get confused, denialism is fundamentally different from scepticism. In some ways, it is even the exact opposite. Scepticism is based on rational analysis, scientific information, theoretical predictions and empirical evidence. By contrast, science denial is irrational and disregards scientific information and evidence, even in the face of overwhelming data.
Science denial can be naively innocent and inconsequential, for example, in the case of members of the flat earth society. However, in many instances, it causes considerable harm. The denial of the benefits of immunisations, for instance, has brought back measles epidemics and caused the deaths of hundreds of children. The South African AIDS denialists claimed that AIDS was caused not by the HIV virus but by poverty-induced malnutrition, recommended nutritional treatments for AIDS, and banned the use of antiretroviral drugs. This led to around 171,000 HIV infections, which caused some 343,000 deaths between 1999 and 2007.
Science denial is often paired with conspiratorial thinking, which can be viewed as critical thinking gone badly wrong. Conspiracy theorists tend to explain certain events as the result of the secret machinations of powerful, dark forces. Thus, anti-vaccinationists as well as AIDS denialists tend to believe that the pharmaceutical industry is conspiring against the public in order to make money at the cost of our health.
In a similar vein, some SCAM proponents believe that their SCAM is suppressed by the powerful interests of the establishment. A study from the US, for example, found that belief in conspiracy theories is rife in the realm of SCAM. The investigators presented people with 6 different conspiracy theories, and the one that was most widely believed was the following: “The Food and Drug Administration is deliberately preventing the public from getting natural cures for cancer and other diseases because of pressure from drug companies.” A total of 37% agreed with this statement, 31% had no opinion on the matter, and 32% disagreed. What is more, the belief in this particular conspiracy correlated positively with the usage of SCAM. Such findings imply that the current popularity of SCAM is at least partly driven by the conviction that there is a sinister plot by ‘the establishment’ that prevents people from benefiting from SCAM.
Perhaps the most tragic examples of the damage caused by confirmation bias combined with science denial and conspiratorial thinking are those of parents preventing their own children from receiving life-saving treatments.
The three-year-old Noah was diagnosed with acute lymphoblastic leukaemia, a blood cancer with a very good prognosis when treated (~85% of all children affected can be completely cured and expect to live a normal life). The child was admitted to hospital and, initially, chemotherapy was started. But the treatment was not finished, because the parents took their child home prematurely. The mother, a 22-year-old ‘holistic birth attendant’, had been against conventional treatments from the start. She nevertheless agreed to the first two rounds of chemotherapy “because they can get a medical court order to force you to do it anyways for a child with his diagnosis”. Then she absconded and treated her son with several forms of SCAM: rosemary, colloidal silver, reishi mushroom tea, and apricot seeds.
In a matter of hours, they were found by the police. Noah was then taken from his parents to be medically treated. The parents, meanwhile, were being investigated on suspicion of child neglect. They insisted that they were merely trying to give their son alternative medical care, accusing the police and medical officials of stripping them of the right to choose their own treatment plan for their son. Their supporters called the state’s decision to take custody of Noah a “medical kidnapping”.
The parents of a seven-year-old boy who died after they decided to treat his otitis with homeopathy were convicted of manslaughter. Francesco, from Cagli near Pesaro, died on May 27, 2017 from bilateral bacterial otitis. His parents were found guilty of complicity in aggravated culpable manslaughter. They were given a suspended sentence of three months in jail. The parents had entrusted their son’s care to Pesaro homeopathic doctor Massimiliano Mecozzi, who is set to go on trial on September 24. The homeopath advised treatment with homeopathic products instead of the antibiotics which would have saved him, the court found. Francesco died after the otitis degenerated into encephalitis.
Such cases exemplify how misinformation overload, motivated ignorance, motivated reasoning, confirmation bias and science denial can generate the wishful thinking that misleads many of us into believing things that are not true. I therefore feel that it is wise and timely to caution: do not believe what you think!