Telegram did not respond to a request for comment.
Telegram was heavily used by the organizers of the January 6 riots in the US. The platform is largely unmoderated, with narrow exceptions for pornographic and terrorist content, making it a hub for conspiracy theories and misinformation that would otherwise be removed from platforms such as Facebook, Instagram, and Twitter.
Many of these Telegram channels are publicly searchable and have thousands of members who share tens of thousands of pieces of content per month. Members often refer to Bolsonaro’s opponent Luiz Inácio Lula da Silva as a communist and claim that any outcome unfavorable to Bolsonaro will be the result of a corrupt electoral process.
But Telegram does not operate in a vacuum. “In Brazil, the center of disinformation is not Telegram itself, but YouTube,” says Leonardo Nascimento, a professor at the Universidade Federal da Bahia and researcher at the Internet Lab. Telegram, he says, is often used as a channel to distribute links to YouTube videos. According to Nascimento’s research, the most popular videos are often clips of or interviews with Bolsonaro himself, shared hundreds of times across multiple groups. Bolsonaro has repeatedly questioned the validity of the country’s elections, even prompting a federal police investigation into his claims about its voting systems.
“On the one hand, you have honest soldiers without any charges of corruption. On the other side you have two thieves. Which one would you invite into your house?” asked a video from the YouTube channel PodVoxx that was recently shared in a Telegram group of more than 15,000 users. Nascimento’s research showed that in just 90 days, more than 300,000 YouTube links were shared in the Brazilian right-wing groups he follows.
According to research by the Internet Lab, the most common misinformation links on Telegram lead users to unlisted YouTube videos, meaning they cannot be found by searching the platform and can only be accessed by those who have the URL. That makes such links difficult for outsiders to find, but not for YouTube itself, says Nascimento. “[YouTube] know that these links are shared,” he says. He also claims that the platform is usually slower than Meta or Twitter when it comes to removing hateful and extremist content.
YouTube spokesperson Cauã Taborda says there is no difference in moderation practices for public and unlisted videos. But Nascimento says that because platforms enforce policies differently, if at all, malicious content continues to circulate. “The problem isn’t Twitter itself, or YouTube itself, or other platforms,” Nascimento says. “The problem is the whole system.”
Additional reporting by Priscila Bellini.