
YouTube will now allow videos falsely claiming Trump won the 2020 election

    A Trump campaign event in Waco, Texas, on Saturday, March 25, 2023.

    Getty Images | Bloomberg

    YouTube on Friday announced a major change in its approach to misinformation about the US election, saying it will no longer remove videos that make false claims about the 2020 election or previous presidential elections. As of today, “we will stop removing content that advances false claims that widespread fraud, error or glitches occurred during the 2020 presidential election and other past U.S. presidential elections,” YouTube’s announcement said.

    This is a reversal of YouTube’s December 2020 announcement that it would ban videos falsely claiming that Donald Trump defeated Joe Biden. YouTube said at the time that it would “remove any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 US presidential election, in line with our approach towards historical US presidential elections. For example, we will remove videos claiming that a presidential candidate won the election due to widespread software glitches or counting errors.”

    Google’s subsidiary YouTube made its announcement in December 2020 as Trump spread an unfounded conspiracy theory that the election was stolen from him. Trump’s false claims helped fuel the January 6, 2021 attack on the Capitol.

    YouTube today said it had “deliberated carefully” before deciding to drop the policy:

    We first instituted a provision of our elections misinformation policy focused on the integrity of past US presidential elections in December 2020, once the states’ safe harbor date for certification had passed. Two years, tens of thousands of video removals, and one election cycle later, we recognized it was time to reevaluate the effects of this policy in today’s changed landscape. In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.

    YouTube’s policy against false claims still applies to certain elections in other countries, notably the 2021 German federal election and the 2014, 2018, and 2022 Brazilian presidential elections.

    YouTube overturned the Trump ban in March

    YouTube said other policies that counter the spread of misinformation about US elections will not be changed. The platform said it will continue to ban “content intended to mislead voters about the time, place, means or eligibility requirements to vote; false claims that could materially discourage voting, including those that challenge the validity of mail-in votes; and content that encourages others to disrupt democratic processes.”

    YouTube also said it will continue to promote “authoritative” content about elections. “We’re making sure that when people come to YouTube looking for election news and information, they see content from authoritative sources prominently in search results and recommendations,” YouTube said.

    Like other social networks, YouTube suspended Trump’s account after the attack on the US Capitol. YouTube allowed Trump back in March this year, saying, “We carefully evaluated the continued risk of real-world violence, while balancing the chance for voters to hear equally from major national candidates in the run up to an election. This channel remains subject to our policies, just like any other channel on YouTube.”

    Twitter reversed the ban on Trump in November 2022, shortly after Elon Musk purchased the company. Meta announced it would allow Trump back on Facebook in January 2023.