When was the last time you realised you were wrong about something? Was it hard to change your mind? How about when someone else misinterpreted what you said or wrote? Was it harder to change their mind?
Over the last few months, the work of many scientists around the world has been publicly scrutinised, sometimes misunderstood and occasionally misrepresented to an unnerving degree. For example, confusion around earlier work on other coronaviruses generated claims that the existence of SARS-CoV-2 was known before its official identification and that a vaccine already exists.
A huge wave of fake news and conspiracy theories has spread in parallel with the COVID-19 pandemic. Treatments to “boost” the immune system, claims that “washing hands does not help”, suggestions to inject bleach or apply UV radiation directly to the skin, and conspiracy theories about the origin of the virus or Bill Gates’ leading role in spreading it: falsehoods range from the bizarre to the outright dangerous. This disinformation has caused confusion and alarm, and has already cost lives – as tragically illustrated by the accidental deaths from alcohol poisoning in Iran among people trying to prevent infection with SARS-CoV-2. The disinformation also poses a risk to the public’s perception of science.
Like a lie that borrows from the truth, fake news resembles legitimate news. A video of a real event might be used in a different context; an audio recording of an 'expert' might be shared without details about the speaker; a familiar headline might lead to a misleading article. Synthetic images or videos (deepfakes) and text generated by artificial intelligence (readfakes) are increasingly common.
While format is important, content is key. Disinformation often seems plausible, uses relevant vocabulary, and might even contain some truth, albeit usually distorted.
Disinformation also has powerful engines. It is a profitable business, whether through programmatic advertising or by harvesting people’s personal information. It can also be used to undermine governments and organisations, or to foster criminal activities such as cybercrime and fraud, as reported by Europol, the European Union’s law enforcement agency.
But disinformation is also driven by our needs. It will flood into an information vacuum. If we have questions or feel uncertain, we will be open to information. If that information is not available or adequate, disinformation might move in instead.
Disinformation messages are designed to stir emotions, such as anger or fear, that might cloud our critical thinking and make us seek validation from others (i.e. more sharing). Feeling powerless, vulnerable or uncertain makes us more receptive to disinformation.
And importantly, disinformation appeals to our preconceived ideas. This is particularly relevant for conspiracy theories and pseudoscience. The Conspiracy Theory Handbook has compiled the seven traits of conspiratorial thinking, which include being immune to evidence and interpreting random events as highly significant, both important factors when choosing what to believe.
Generally, Europeans think that science and technological innovation have a beneficial effect on our society and economy, and in particular on health and medical care, as reported by the European Commission. Obviously, anti-vaxxers, homeopaths and other pseudo-therapists, creationists, climate change deniers, and those who burn down 5G telecom towers might disagree with that view. It might be too difficult to change their minds, but it is essential to stop them from influencing the opinion of others.
One way to do this is to stop the spread of disinformation and provide accurate information instead. Another important approach is to “inoculate” the public against disinformation by raising awareness of the problem and showing how it spreads. This works best before people have been exposed to false information, as it is harder to dispel afterwards. For example, putting forward “fake experts” and offering “false balance” (when a divergent opinion is given equal weight in media coverage to the scientific consensus) are techniques used to confuse the public about the level of agreement among scientists on certain topics, such as vaccines or climate change. Encouragingly, research shows that the impact of these techniques can be neutralised by letting the public know about them in advance, as well as by highlighting the existence of a consensus around the topic.
Much is being done to contain the effect of disinformation. The World Health Organization (WHO) is working with Google, Facebook, Twitter, YouTube and other search and media companies to limit the spread of disinformation and provide useful information in its place. Facebook is optimising the use of artificial intelligence to remove false or banned content, but it still relies on fact-checking organisations to confirm COVID-19 hoaxes because of the challenges in distinguishing false from legitimate content.
The Poynter Institute has set up the International Fact-Checking Network to coordinate the work of fact-checking organisations around the world. The network has released a WhatsApp chatbot that links users to its extensive database of debunked hoaxes, to help them check the messages they receive.
News agencies such as France 24 are training their audiences to identify disinformation with videos and verification guides. People can reverse-search images in Google Images (see the sketch below) and check videos on the InVID verification platform, but fact-checkers remind us that one of the most powerful tools is caution: do not share information unless you can verify it.
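For readers comfortable with a little scripting, a reverse image search can also be triggered programmatically. The minimal Python sketch below simply builds a Google “search by image” link for a publicly hosted picture; note that the searchbyimage endpoint is long-standing but undocumented, so it should be treated as an illustration rather than an official API, and the example image URL is hypothetical.

```python
# A minimal sketch: build a Google reverse image search link for a
# publicly hosted image. The 'searchbyimage' endpoint is undocumented
# and may change or be retired; this is an illustration, not an API.
from urllib.parse import urlencode

def reverse_image_search_url(image_url: str) -> str:
    """Return a 'search by image' URL for a publicly hosted image."""
    query = urlencode({"image_url": image_url})
    return f"https://www.google.com/searchbyimage?{query}"

if __name__ == "__main__":
    # Hypothetical example image; open the printed URL in a browser
    # to see where else (and when) the picture has appeared online.
    print(reverse_image_search_url("https://example.com/viral-photo.jpg"))
```

Seeing where and when an image first appeared online is often enough to show that a “breaking” photo is actually years old or from a different event entirely.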
Two behavioural scientists from Cambridge University have taken a different approach to stem the sharing of fake news. They developed an online game, called Bad News, that invites players to become fake news producers who use disinformation techniques to amass followers and win the game. The techniques include polarisation, invoking emotions, spreading conspiracy theories, trolling people online, deflecting blame, and impersonating others through fake accounts. An early evaluation has shown that players of all ages and backgrounds become better at resisting disinformation by learning how it spreads.
Some efforts are going straight to the source of the problem. Communication professionals in Macedonia have launched an initiative to teach high school students about the impact of fake news on society. Macedonia became notorious after the 2016 US election because of the huge amount of false information generated there, often by teenagers, who made thousands of dollars by luring Trump supporters to ad-laden websites with fake news. However, despite the initiative, some young Macedonians are still producing fake content and argue that it is the responsibility of Americans to become more media literate.
Counteracting the wave of disinformation in science will require scientists to talk to the public more often about science and research. If you are interested in becoming a science advocate, or want to improve your communication skills, ask your university or workplace communication team for resources and training opportunities. You can also search online for science communication resources, such as those from SciDev.Net and the American Association for the Advancement of Science, or free online science communication courses from FutureLearn, Coursera or edX.
Other posts from the FEBS Network on fake news and the importance of science communication:
https://network.febs.org/channels/728-viewpoints/posts/44035-cultivated-ignorance