In June, Global Witness and Foxglove found that Meta was continuing to approve ads in Amharic targeting Ethiopian users, including hate speech and incitement to violence. Facebook has been implicated in spreading hate speech and fueling ethnic violence in Ethiopia’s ongoing conflict.
Crider argues that Facebook should invest more in its moderation practices and in protections for democracy. She worries that even the threat of a ban could let the company off the hook for the problems it has yet to address.
“I think ultimately, the moment a regulator looks at Facebook and it looks like they’re actually going to make them do something that could cost them some money, they start crying about censorship and presenting a false choice: that it’s either an essentially unmoderated and unregulated Facebook or no Facebook at all,” she says.
And Crider says there are things the company can do, including “break glass” measures such as deprioritizing its heavily promoted live videos, limiting the reach of incendiary content, and banning election-related ads in the run-up to the vote.
Mercy Ndegwa, Meta’s public policy director for East Africa and the Horn of Africa, told WIRED that the company has “taken extensive steps to help us deal with hate speech and incendiary content in Kenya, and we are stepping up these efforts ahead of the elections.” However, she acknowledged that “despite these efforts, we know there will be examples of things we miss or delete accidentally because both machines and humans make mistakes.” Meta did not answer specific questions about how many of its content moderators speak Swahili or other Kenyan languages, or about the nature of its conversations with the Kenyan government.
“What the researchers did was stress-test Facebook’s systems and prove that what the company was saying was bullshit,” Madung says. The fact that Meta allowed the ads onto the platform despite a review process “raises questions about their ability to deal with other forms of hate speech,” he says, including the sheer volume of user-generated content that doesn’t require pre-approval.
But banning Meta’s platforms, Madung says, won’t resolve disinformation or ethnic tensions, because it doesn’t address their root causes. “This is not a mutually exclusive question,” he says. “We need to find a middle ground between heavy-handed approaches and true platform responsibility.”
Joseph Mucheru, Kenya’s cabinet secretary for internet and communications technology (ICT), tweeted: “Media, including social media, will continue to enjoy PRESS FREEDOM in Kenya. It’s not clear what legal framework NCIC plans to use to suspend Facebook. The government is on record. We are NOT shutting down the internet.” There is currently no legal framework under which the NCIC could order Facebook’s suspension, agrees Bridget Andere, Africa policy analyst at Access Now, a digital rights nonprofit.
“Platforms like Meta have completely failed to deal with misinformation, disinformation, and hate speech in Tigray and Myanmar,” Andere said. “The danger is that governments will use that as an excuse for shutting down the internet and blocking apps, when instead it should push companies to invest more in human content moderation in an ethical and rights-respecting way.”
Madung also worries that whether or not the government chooses to suspend Facebook and Instagram now, the damage may already have been done. “The effects will be seen at another time,” he says. “The problem is that the precedent now officially exists and can be referenced at any time.”