To detect fake news and conspiracy theories faster, we need to train our cognitive skills. One way to do this is to mentally inoculate people: administer misinformation in diluted doses, honing their knowledge of deceptive techniques and boosting their cognitive immune system.
This is the simple but brilliant idea that Cambridge University professor Sander van der Linden, a social psychologist and authority on decision-making, influence, polarisation and psychological inoculation, presents in Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. If you want to know why people believe fake news and how to defend yourself against it, this is the book to read.
In Foolproof, van der Linden delivers his thesis in three parts. The first is that viruses of the mind really do exist, in the form of misinformation, conspiracy theories and other not-so-innocent ideas, and that they disrupt our view of the world. Just as viruses in our bodies are difficult to combat – antibiotics don’t help against them, for example – misinformation turns out to be enormously difficult to get out of our brains once it has nestled there.
His second contention is that viruses of the mind are even more contagious than biological ones. No virus can exist without hosts, but a virus of the mind doesn’t even require physical contact to transmit. Moreover, such viruses are harmful to the health of individuals and democracies.
Van der Linden’s final, more hopeful thesis is that there is a psychological vaccine against fake news – with the convenient advantage that no needles are involved.
Developing such a vaccine requires understanding how the virus works. Van der Linden identifies six manipulation methods such viruses use:
- discrediting authorities and critics
- polarising groups in society
- capitalising on people’s emotions to manipulate them
- circulating conspiracy theories
- boosting fake experts and organisations
- trolling people and online discussions.
Van der Linden ends each chapter with an overview of four to six fake news antigens. That makes the book both insightful and practical, along the lines of Thinking, Fast and Slow (2011) by Daniel Kahneman, The Believing Brain (2011) by Michael Shermer, Irrationality (1992) by Stuart Sutherland, How We Know What Isn’t So (1991) by Thomas Gilovich, and The Parasitic Mind (2020) by Gad Saad.
In Foolproof, readers will find an accessible crash course on how human psychology functions in processing information and assessing its truthfulness. Van der Linden is highly skilled at condensing the most relevant knowledge into a coherent text.
Critical Reflection
Van der Linden puts previous research, both his own and that of others, into perspective, pointing out its limitations and elaborating on experiments that attempt to address them. He examines the findings to see whether the studies measured what they set out to measure. Did they, for example, collect enough data to draw valid conclusions? Are findings obtained in the lab comparable to effects in the real, online world?
Two examples illustrate this. In one experiment, van der Linden and his colleagues inoculated people against misinformation about climate change by having them come into the lab and read a prebunk: a pre-emptive refutation of a false petition circulating online, which claimed that over 30,000 scientists agree that global warming is not real. They did this by deconstructing the false appearance of expertise and highlighting that many of the signatories were clearly fake – one of the names on the list, for example, was Charles Darwin.
Although the inoculation was successful across the political spectrum, his team then realised that this approach wouldn’t readily scale across different issues or to the population at large. It was a narrow-spectrum vaccine. Another limitation concerned the setting of the intervention: you simply cannot have everyone come into the lab.
Van der Linden and his team subsequently developed a real-world fake news game to generalise this principle and expose people to weakened doses of common manipulation techniques, such as conspiracy theories and polarising headlines.
However, the initial research was based on a card game tested on relatively small numbers of high school students, and the sample was not big enough to support reliable claims. Accordingly, they designed an online game, Bad News, to offer a more realistic social media experience and to test thousands of people in a short period of time using in-game surveys. They then questioned those first results too, because the number of test (fake news) items was relatively small, so van der Linden and his team conducted randomised trials and replications with more items to confirm the findings. His modest and critical attitude – including toward his own research – is particularly pleasing.
Microtargeting
In the second part of the book, van der Linden discusses at length the question that fascinates many: can social media influence people’s opinions and, ultimately, their voting behaviour? In doing so, he focuses on the American-British political consulting firm Cambridge Analytica, which tried to use data mining to steer political campaigns in favour of Trump and the Leave camp in the Brexit referendum. The answer is not straightforward, requires some essential nuance, and still leaves some issues unresolved.
To summarise this briefly is not easy, but I’ll try to paint a picture. Van der Linden discusses the results of empirical studies of microtargeting, which show that although many traits can be predicted fairly easily from your online digital footprint, complex psychological traits, such as personality, require much more data. The accuracy of such models in predicting personality is generally lower than for simpler attributes like gender.
However, crucial newer evidence comes from causal experiments that actually tested the direct impact of personality-based microtargeted messages on voting intentions and behaviour, and these found that microtargeting can influence voting more effectively than non-targeted messages. The nuance is that we will never know whether Cambridge Analytica actually swayed the US election, but the evidence suggests it might have contributed by targeting susceptible swing voters.
Many scientists are skeptical of the idea that ads, messages and videos on social media can influence us to, for example, switch our votes. But what if, through well-chosen data points, one can build a largely accurate psychological profile and use it as the basis for messages crafted in a style and tone that appeal to one person more than another? When information is aimed at you in the right context, van der Linden argues, the research paints a more disturbing picture. This microtargeting through social media channels, based on your psychological profile, is what he calls weapons of mass persuasion.
We should not be too complacent and assume that we can escape the influence of fake news and effortlessly detect nonsense when it is presented to us. Still fancy yourself an expert at recognising fake news? Here’s just one example of an essentially simple exercise that doesn’t even use microtargeting:
- Putin issued an international arrest warrant against George Soros.
- A baby was born in California named “heart eyes emoji” 😍
- A criminal farted so loud that he betrayed his hiding place.
Van der Linden presented six such statements to 1,500 people. Half said they were quite confident in their ability to distinguish true from false; unfortunately, only 4% could correctly classify all of them. In the list above, for example, only one statement is true and the other two are made up. Do you know which? No spoilers here; to find out, you will have to read the book.
The illusory truth effect is another telling example. It shows how repeating a lie makes it feel more true, because the brain mistakes fluency and familiarity for truth. Unfortunately, prior knowledge does not protect us against illusory truth. Former President Donald Trump, for example, falsely claimed that Democrats had stolen the election by committing widespread voter fraud, claims that were frequently repeated on social media and ranged from ‘mysterious vote dumps for Joe Biden’ to ‘dead people voting’. They had a real-world effect: while the violent Capitol riots of January 6, 2021 weren’t solely the consequence of conspiracy theories, an extensive investigation by the independent fact-checker PolitiFact showed that, in at least half of the cases where rioters were prosecuted for their participation, misinformation had shaped the defendants’ beliefs.
Prebunking
It was during a United Nations meeting in 2016 that van der Linden began to realise that top-down approaches via legal restrictions and deplatforming are insufficient to tackle the core of the problem: our psyche. The discussions during that three-day meeting on ‘fake news’ quickly converged on the insight that law and technology are blunt tools; whatever legislation is introduced, it will not address the spread of misinformation in and of itself. Van der Linden realised that inoculating people against disinformation has another obvious advantage: it requires no infringement of free speech. After all, you are simply empowering people to spot manipulation so they can make up their own minds.
Admittedly, neither the idea that you can inoculate people against disinformation and conspiracy theories nor the list of those six manipulation techniques is in itself an earth-shattering insight. Inoculation theory has been around for many years, which makes the title of chapter 8, “The New Science of Prebunking”, slightly misleading: skeptics have long been racking their brains for a method of making people immune to nonsense. Still, the studies and charts in Foolproof provide a fascinating look at how, and to what extent, prebunking actually works, and the author briefly explains, in passing, why Aristotle was probably the first to see its usefulness.
To illustrate how prebunking has been implemented in the real world, van der Linden and his team created short “prebunking” videos that ran in YouTube’s ad slots and were scaled across millions of users. His team then tested people with a quiz within 24 hours of their having seen either an inoculation video or a control video. The prebunk here administered a weakened dose of a logical fallacy, for example a false dilemma. Politicians and extremists often use this technique, presenting people with only two options (when in reality there are more) and stripping away all nuance: “either you support AR15 rifles or you’re against the 2nd Amendment”, or “either you join ISIS or you’re not a good Muslim”.
The weakened dose (the prebunk) in the videos is something rather non-political and innocuous: a clip from Star Wars in which Anakin Skywalker (who becomes Darth Vader) challenges Obi-Wan Kenobi by saying “If you’re not with me, then you’re my enemy!”, to which Obi-Wan replies: “Only a Sith deals in absolutes”. Note that the weakened dose has the same structure as the other false dilemmas. Van der Linden found that these short prebunks help people recognise such manipulation techniques better.
Yet van der Linden sometimes seems just a little too enthusiastic to me, as when he states, “Now that we had discovered an effective psychological vaccine, we had to find a way to spread it to the population” (p. 229). His team did develop several online games, including Bad News, in which you can try out some misinformation techniques yourself. But while mental immunity does increase by as much as 20-25% that way, I can hardly call that an effective vaccine, especially when van der Linden himself points out that viruses of the mind are even more contagious than biological ones. How high would herd immunity need to be?
Tone and Approach
The book ends with a chapter on how to mentally inoculate friends and family members. Van der Linden distinguishes between an inoculation focused on facts and one on techniques. This is particularly useful because it helps you switch from one method to another if you find that your interlocutor is not amenable to a particular strategy.
Although he seems to tread lightly around it, it must be said that his approach is quite intense. That probably can’t be helped, since I too suspect that turning someone’s ideas around easily takes months, if not years. Still, I fear you won’t make yourself very popular if you regularly apply van der Linden’s advice: you run the risk of being labelled a tiresome pedant at family parties and gatherings.
Nevertheless, this book is engaging as well as solid, and its content is practical. Those who are not afraid of being challenged will find it one of the better works on the subject.
Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity by Sander van der Linden is out now, published by HarperCollins