Doughty wrote: “Defendants have ‘substantially encouraged’ the social media companies to the extent that the decisions [of the companies] should be considered the decisions of the government.”
Doughty’s ban, now on hold as the White House appeals, seeks to set the limits of acceptable behavior for government IRUs. It provides an exemption for officials to continue to update social media companies on illegal activity or national security issues. Emma Llansó, director of the Free Expression Project at the Center for Democracy & Technology in Washington, D.C., says this is a cause for concern because the line between thoughtful public safety protections and unfair suppression of critics can be thin.
The EU’s new approach to IRUs also looks flawed to some activists. The Digital Services Act (DSA) requires each EU member state to designate a national regulator by February to consider applications from government agencies, nonprofits, industry associations, or companies seeking to become trusted flaggers that can report illegal content directly to Meta and other medium-to-large platforms. Reports from trusted flaggers must be reviewed “without undue delay,” under penalty of fines of up to 6 percent of a company’s annual global revenue.
The law aims to make IRU requests more accurate by designating a limited number of trusted flagger organizations with expertise in particular areas of illegal content, such as racist hate speech, counterfeit goods, or copyright violations. And those organizations will have to publish annually how many reports they made, to whom, and with what results.
But the disclosures will have significant loopholes: they cover only requests related to content that is illegal in an EU country, leaving reports of content flagged solely for violating a platform’s terms of service out of view. And while tech companies aren’t required to prioritize those terms-of-service reports, nothing stops them from doing so. Platforms can also keep working with unregistered trusted flaggers, essentially preserving today’s opaque practices. The DSA does require companies to publish all of their content moderation decisions without “undue delay” in an EU database, but the identity of the flagger can be omitted.
“The DSA creates a new, parallel structure for trusted flaggers without directly addressing the lingering problems with actual existing flaggers such as IRUs,” said Paddy Leerssen, a postdoctoral researcher at the University of Amsterdam involved in a project providing an ongoing analysis of the DSA.
Speaking on condition of anonymity because they are not authorized to talk to the media, two EU officials working to enforce the DSA say the new law aims to ensure that all 450 million EU residents benefit from trusted flaggers’ ability to send expedited notices to companies that might not otherwise work with them. While the new trusted flagger designation isn’t designed for government and law enforcement agencies, nothing stops them from applying, and the DSA specifically names internet referral units as possible candidates.
Rights groups are concerned that if governments participate in the trusted flagger program, it could be used to suppress legitimate speech under some of the bloc’s more draconian laws, such as Hungary’s ban (currently contested) on promoting same-sex relationships in educational materials. Eliška Pírková, global freedom of expression lead at Access Now, says it will be difficult for tech companies to resist the pressure, even though national coordinators can suspend trusted flaggers found to be acting improperly. “It’s the total lack of independent safeguards,” she says. “It’s quite worrisome.”
Twitter banned at least one human rights organization from its top-priority reporting queue a few years ago because it filed too many false reports, the former Twitter employee says. But dropping a government would likely prove more difficult. The Hungarian embassy in Washington, DC, did not respond to a request for comment.
Tamás Berecz, managing director of INACH, a global coalition of nongovernmental groups fighting hate online, says some of the coalition’s 24 EU-based members are considering applying for official trusted flagger status. But they have concerns, including whether coordinators in some countries will approve applications from organizations whose values don’t align with the government’s, such as a group that monitors anti-gay hate speech in a country like Hungary, where same-sex marriage is banned. “We don’t really know what’s going to happen,” Berecz says, though he leaves room for some optimism. “For now, they’re happy to be in an unofficial trusted program.”