By Lwazi Maseko and Stuart Dickinson

Misinformation and disinformation are not only a threat to democracy but also a direct threat to the sustainability of credible news media, said media watchdog groups during a panel discussion on mis- and disinformation in Africa at the recent Jamfest conference.

In 2024, approximately 74 elections will take place around the world, 15 of them in Africa. As these countries face the evolving threat of misinformation (inaccurate information) and disinformation (information spread with the intention to cause harm), fact-checking and media-monitoring groups are finding ways to mitigate the damage.

“The environment of elections is already so tense and volatile, especially in countries like South Africa,” said Thandi Smith, Media Monitoring Africa’s (MMA) head of programmes. “A lot of the voting public is sitting on the fence in terms of where they plan to cast their vote. They might not understand the politics, the voting dynamics, the electoral process and so on, which makes instances of disinformation so dangerous. All those who want to disrupt the election process need to do is sow that little bit of doubt, that little bit of mistrust.”

MMA was established in 1993 to analyse the coverage of SA’s first democratic elections the following year, and ultimately to help ensure that the process was free and fair. Since then, it has grown into an organisation with a range of initiatives and programmes designed to safeguard media freedom and freedom of expression, and to ensure that the country’s media fulfils its obligations and responsibilities in empowering the public to be active citizens within a democracy. This includes fighting back against mis- and disinformation.

Smith was joined by Carina van Wyk, head of education and training at Africa Check, Jibi Mering Moses, associate editor at 211 Check, and panel moderator Pheladi Sethusa, lecturer at the Wits Centre for Journalism. 

“Our goal is to reduce the spread of false information and to provide the public with accurate information so they can make well-informed decisions, especially in the lead-up to elections,” said Van Wyk.

Africa Check was established in South Africa in 2012 as the continent’s first independent, non-profit fact-checking organisation. It monitors high-profile and far-reaching public statements on social media and WhatsApp, interrogates the best available evidence, debunks false information, and publishes fact-checking reports under a Creative Commons licence to guide public debate.

“We see a big increase in incidents of disinformation during major events, such as the run-up to elections, xenophobic incidents, and outbreaks of infectious diseases and the like,” said Van Wyk. Since 2015, Africa Check has trained over 10,000 people in fact-checking and verification methods, and has introduced lesson plans for schools around the country that teachers can use to educate learners on the dangers of disinformation and how to spot it.

211 Check is South Sudan’s first and only independent flagship fact-checking and information-verification project, established in March 2020 to counter Covid-19 mis- and disinformation.

“The ongoing conflict in South Sudan is a result of ethnic division, which aggravates mistrust between people in the country, especially when information is shared online,” said Moses. “We are a small organisation tasked with fact-checking information for a country of over 10 million people. It is challenging, but we have managed to provide training for thousands of people, and introduced a fact-check-for-pay programme that encourages citizens to fight mis- and disinformation, and instils a fact-checking culture in the country.”

 

How to spot mis- and disinformation

 

AI-generated disinformation is becoming more sophisticated and increasingly difficult to spot. While Van Wyk explained that it has not yet become a major problem in Africa, fact-checking and media-monitoring groups remain vigilant as the technology improves.

“People can safeguard themselves against disinformation by asking themselves a few questions when engaging with online information,” she said.

  • If it sounds too good to be true, shocking or unlikely, question it, pause and verify the information before you share it.
  • If it triggers your emotions, if it makes you angry, or scared, or if it gives you hope, then pause, reflect, and verify before you share it.
  • If information is shared and looks like it’s going viral, check credible news sources to verify it.
  • Ask yourself: who is the source of the information?
  • With AI-generated images, look at the details such as fingers, ears, backgrounds and patterns, as AI still often doesn’t get all the details right.

 

Smith warned, “Don’t believe nearly anything you read on the internet. This is difficult to say because I believe that we still enjoy a vibrant news media and information-sharing community, but on social media, and with the high dissemination and consumption of information, always be sceptical of what you read or see. We need to understand that news media, as much as we have this utopian idea that everyone must be objective, without bias and completely independent, everyone has some kind of subjective angle, and it’s about the transparency and accountability of that information.”

Smith explained that a good, credible news article will lay out the basic facts, draw on multiple sources offering different angles on an issue, and go a step further in unpacking context and depth, whereas a mis- or disinformation “news article” will not. She suggested that digital media literacy programmes are needed for children in primary schools, and that the ability to discern credible information is a life skill everyone needs.

MMA runs various programmes such as the Spotters Network Programme, which aims to curb digital offences and ultimately create an online space free from hate speech, harassment and incitement, while promoting truthful, accurate and credible news.

Moses emphasised that what separates misinformation from disinformation is intent, which makes it important to consider the motives behind potentially harmful information spread online.

“Some do it for money. For example, a person may spread false information about a health-related incident or a disease to sell or promote their own medicine,” he said. “It varies from economic and political, to social gain, especially during an election period, where politicians spread information to de-campaign their opponents. Governments will invest heavily in social media platforms to reach a wide audience. When people see information over and over again, they begin to believe it.”

 

Mistrust in the media and the responsibility of those with influence

 

“People with influence have a responsibility to ensure that the content they share is credible, accurate, and can be trusted,” said Smith, explaining that politicians often share content on social media that is xenophobic, borders on hate speech or amounts to some form of incitement, but that they are savvy about which lines they can push without it constituting direct hate speech. “It is still harmful content.”

She explained that the more transparency and accountability there is, the easier it is to build trust. “The media do not always get it right, but at least there are accountability mechanisms in place where the public can file complaints with organisations like the Broadcasting Complaints Commission and the Press Council of South Africa.”

Added Van Wyk: “We know that the journalism and media industry is under a lot of pressure and is dealing with various challenges, such as the juniorisation and shrinking of newsrooms, but accuracy is paramount, and I encourage journalists to focus on being right rather than being first. When we were moving from traditional media into the online space, there was huge competition to always be first. I may be overly optimistic, but there now seems to be a shift in journalists moving to being right rather than being first.”

Sethusa questioned how journalists or fact-checking organisations can respond to social media users who monetise responses and reactions by purposely sharing false information to trend or make money, particularly on X (formerly Twitter), and what personal responsibility social media users bear.

“We want people to understand how to use social media, and if you are on social media, you should be able to understand your own responsibility to fight disinformation and engage responsibly on these platforms,” said Smith.