The Big Idea: How to Win the Fight Against Disinformation


Over the past few years, the Internet has become the site of a general breakdown in trust. Trolling, fake news and “doing your own research” are so much a part of public discourse that it’s sometimes easy to imagine that the online revolution has brought us myriad new ways to be confused about the world.

Social media has played a particularly important role in spreading misinformation. Malicious state operations, such as Russia’s notorious “troll farm”, are certainly part of the problem. But there’s a more powerful mechanism: the way it brings together people, whether flat-earthers or anti-vaxxers, who would struggle to find the like-minded in the real world. Today, if you are convinced that our planet is not round, you do not need to stand on a street corner with a sign, shouting at passers-by. Instead, you have access to an online community of tens of thousands of people churning out content that not only tells you you’re right, but also creates a web of pseudo-knowledge you can tap into whenever you feel your beliefs are being challenged.


The same types of “counterfactual communities” arise around any topic that generates enough general interest. I have witnessed this myself over the past decade investigating war crimes in Syria, Covid-19 disinformation and now the Russian invasion of Ukraine.

Why do counterfactual communities form? A key factor is distrust of established authority. For some, it is partly a reaction to the fabrications of the British and American governments in the run-up to the invasion of Iraq in 2003. For others, it stems from a sense of injustice around the Israeli-Palestinian conflict. These positions are, of course, legitimate and are not in themselves indicative of a tendency to believe in conspiracy theories. But a pervasive sense of distrust can make you more vulnerable to slipping down the rabbit hole.

One way to look at this is that government deception or hypocrisy has caused a kind of moral injury. As with the saying “once bitten, twice shy”, that hurt can lead to an instinctive rejection of anything coming from anyone perceived to be on the side of the establishment.

This creates a problem for traditional approaches to countering misinformation, such as top-down fact-checking by a mainstream media outlet or other organization. More often than not it will be discredited, dismissed with, “They would say that, wouldn’t they?” Fact-checking teams can do a good job, but they lack a crucial element: crowd power. Because, alongside counterfactual communities, we have also seen the emergence of what you might call truth-seeking communities around specific issues. These are Internet users who want to be informed while avoiding being manipulated by others or misled by their own preconceptions. Once established, they not only share and propagate fact checks in a way that gives them credibility; they often drive the fact-checking process themselves.

What is important about these communities is that they respond quickly to information disseminated by various actors, including states. In 2017, the Russian Defense Ministry posted images on social media that it said showed evidence of US forces assisting Islamic State in the Middle East. Damning, if true – except it was instantly debunked when social media users realized within seconds that the ministry had used screenshots from a computer game.

I would go so far as to say that internet users who are highly engaged with particular topics are our best defense against misinformation. At Bellingcat, a collective of researchers, investigators and citizen journalists that I founded in 2014, we saw this play out in real time during the Russian invasion of Ukraine. Our investigation into the downing of Malaysia Airlines flight MH17 over eastern Ukraine helped create a community focused on the conflict in that country, one that uses open source techniques to examine, verify and debunk a myriad of claims. In the weeks leading up to the Russian invasion, members began collecting videos and photographs of Russian troop movements that warned of the planned attack, and proactively debunked misinformation spread by separatists, including a supposed IED attack staged with bodies shown to have been autopsied before they arrived at the scene.

After the invasion began, many of the same people helped collect and geolocate videos and photographs, including images of potential war crimes, which Bellingcat and its partners verified and archived for possible use in future accountability processes. These seemingly disjointed and chaotic contributions saved our verification team considerable time.

But how do you grow and nurture what are essentially decentralized, self-organizing, ad hoc groups like this? Our approach has been to engage with them, link to their helpful social media posts from our own (all carefully vetted by our team) and thank them for their efforts. We also produce guides and case studies so that anyone inspired to try it for themselves can learn how.

But there’s more to it than waiting for throngs of investigators to emerge and hoping they’re interested in the same things we are. We need to take a broader approach. The answer lies in creating a society that is not only resistant to misinformation, but equipped with the tools to actively contribute to transparency and accountability efforts. For example, the digital media education charity The Student View has visited schools and shown young people aged 16 to 18 how to use investigative techniques to explore issues that affect them. In one case, Bradford students used Freedom of Information requests to uncover the unusually high number of high-speed police chases in their area.

Teaching young people how to engage positively with the issues they face, and then extending that work to online investigation, is not only empowering, it gives them skills they can use throughout their lives. It is not about turning every 16- to 18-year-old into a journalist, a police officer or a human rights investigator, but about giving them tools they can use to contribute, no matter how small the contribution, to the fight against misinformation. In their home towns, in conflicts like Ukraine’s and across the rest of the world, they can really make a difference.

Eliot Higgins is founder of the investigative journalism network Bellingcat and author of We Are Bellingcat: An Intelligence Agency for the People.

Further reading

This Is Not Propaganda: Adventures in the War Against Reality by Peter Pomerantsev (Faber, £14.99)

The Misinformation Age: How False Beliefs Spread by Cailin O’Connor and James Owen Weatherall (Yale, £11.99)

A Field Guide to Lies and Statistics by Daniel Levitin (Viking, £14.99)
