Ten days after Russia’s invasion of Ukraine, TikTok announced it had suspended new posts from Russian accounts due to the country’s new “fake news” law. But the company has been quieter about a second policy shift — one that prevented TikTok users in Russia from seeing content posted by accounts outside the country.
Findings from the social media research collective Tracking Exposed suggest TikTok has wrapped its Russian users in a massive echo chamber designed to pacify President Vladimir Putin’s government. Within that digital enclave, a network of Russian accounts somehow remained active, posting pro-invasion content. “There was obvious manipulation of the information ecosystem on TikTok,” said Salvatore Romano, head of research at Tracking Exposed.
TikTok spokesperson Jamie Favazza declined to comment on Tracking Exposed’s findings, reiterating a previous statement that the company had blocked new uploads from Russia. But the platform, owned by Chinese company ByteDance, was less critical of Russia than its US rivals and was treated less harshly by the Russian government. TikTok complied with EU sanctions that forced platforms to block access to Russian state-backed media within Europe. Meta, Google, and Twitter also tweaked their algorithms to make content from, or links to, those outlets less visible. In retaliation, Facebook and Twitter were both blocked by Russian internet censors. On March 21, a Moscow court banned Facebook and Instagram from Russia, accusing parent company Meta of “extremist activities.”
TikTok’s actions in Russia and its pivotal role in spreading video and rumors about the war in Ukraine add urgency to questions about how truth and falsehood circulate on the platform, Romano and other researchers say. TikTok’s geopolitical moment also highlights the challenges facing researchers trying to answer such questions. Launched in 2017, the app surpassed 1 billion monthly users in September 2021, but it is less well studied — and harder to study — than its older rivals.
Most of the work on the dynamics and downsides of social media has focused on Facebook and Twitter. Tools and techniques developed for those platforms have shed light on the spread of misinformation about Covid-19 and exposed online manipulation campaigns linked to governments including Russia, China and Mexico. Meta and Twitter provide APIs to help researchers see what’s circulating on their platforms.
TikTok does not provide a research API, making it difficult to answer questions about its role in spreading accurate or inaccurate information about the war in Ukraine or other topics. And while researchers might like to see Meta and Twitter provide wider access to data, at least those platforms offer something, says Shelby Grossman, a researcher who has tracked pro-Russian claims about Ukraine at Stanford’s Internet Observatory. “It’s hard to systematically look at what’s happening on TikTok,” she says. Researchers have also struggled to track content about Ukraine on the Telegram messaging app, which likewise lacks a research API and is far less studied than US networks.
TikTok spokesperson Favazza says that while the company doesn’t currently offer a research API, “we strongly support independent research,” citing a program that educates lawmakers and online-harm experts about its moderation and recommendation systems. TikTok has previously said that the war in Ukraine prompted it to increase moderation and accelerate a pilot project involving state-controlled media accounts, but it has not specified exactly how its operations have changed. On March 24, two TikTok moderators filed a lawsuit against the company for psychological harm from “exposure to highly toxic and extremely disturbing images.”
One of the biggest challenges for outside researchers interested in what’s circulating on TikTok comes from the power and influence of the recommendation algorithm, which plays an outsized role compared to older social networks. The app’s rapid growth is built on the For You page, which showcases an endless series of videos curated by the TikTok algorithm, largely from accounts that a user does not follow. As a result, different people see completely different videos, with each feed shaped by past views and other signals.