Kenya’s National Cohesion and Integration Commission (NCIC), a government agency created to foster ethnic harmony among the country’s 45 tribes, has given Facebook seven days to tackle hate speech and incitement on the platform tied to the upcoming elections. Otherwise, Facebook will face suspension in the East African country.
Kenya’s presidential, legislative and local authorities elections will take place on August 9.
The NCIC warning follows a report by Global Witness and Foxglove detailing how the social media platform approved over a dozen political ads, written in both English and Swahili, designed to instigate ethnic violence.
Global Witness and Foxglove, two NGOs, joined forces to conduct a study testing Facebook’s ability to detect hate speech and calls for ethnic-based violence ahead of the elections.
The study was essential in light of Kenya’s history of ethnically driven political violence. After the 2007 elections, for example, over a thousand people were killed and hundreds of thousands had to desert their homes for safety. The number of social media users today is far higher than it was in 2007: more than 12 million Kenyans – over 20% of the population – are on Facebook, where hate speech and misinformation are rampant. It is the second most widely used social media platform in Kenya, after WhatsApp.
Global Witness said it chose to use ads because they undergo a stricter review and moderation process than ordinary posts do. Ava Lee, the leader of the organisation’s Digital Threats to Democracy Campaign, spoke on Facebook’s recent history of approving similar ads in other countries.
“Facebook has the power to make or break democracies and yet time and time again we’ve seen the company prioritize profits over people… This isn’t a one-off. We’ve seen the same inability to function properly in Myanmar and Ethiopia in the last few months as well. The possible consequences of Facebook’s inaction around the election in Kenya, and in other upcoming elections around the world, from Brazil to the U.S. midterms, are terrifying,” said Lee.
The two NGOs opted not to publish the ads in question because they were extremely offensive, but they used real examples of common hate speech in Kenya, including comparisons of some ethnic groups to animals and calls for sexual and physical violence against their members.
The NCIC corroborated their findings.
"Facebook is in violation of the laws of our country. They have allowed themselves to be a vector of hate speech and incitement, misinformation and disinformation," said Danvas Makori, an NCIC commissioner.
Makori accused Meta, Facebook’s parent company, of violating Kenya’s laws concerning hate speech and the use of social media platforms. He expressed the commission’s resolve not to allow “Facebook, or any social media company” to jeopardise national security.
The NCIC does not itself have the power to suspend Facebook; it has, however, held talks with the Communications Authority of Kenya (CAK), which regulates social media companies.
In response, Meta said it has hired dedicated teams of Swahili speakers and adopted advanced technology to “remove harmful content quickly and at scale”.
Global Witness and Foxglove then tested Facebook’s claim by resubmitting the test ads, which were approved yet again.
In a later statement, Meta said it had taken “extensive steps” to “catch hate speech and inflammatory content” in Kenya and would amp up efforts in preparation for the election. However, it also added that it was bound to miss things sometimes, “as both machines and people make mistakes”.
The general hope is that none of these mistakes cost Kenyan lives.
Sources: Reuters, TechCrunch, Yahoo! News