
The YouTube rabbit hole is nuanced

    Perhaps you have an image in your head of people being brainwashed by YouTube.

    Imagine your cousin, who loves videos of cuddly animals. Then YouTube’s algorithm surfaces a terrorist recruitment video out of nowhere at the top of the app and suggests increasingly extreme videos until he’s persuaded to take up arms.

    A new analysis adds nuance to our understanding of YouTube’s role in spreading beliefs far beyond the mainstream.

    A group of academics found that YouTube rarely suggests videos of conspiracy theories, extreme bigotry or quackery to people who have shown little interest in such material. And those people are unlikely to follow such automated recommendations when they’re offered. The kitten-to-terrorist pipeline is extremely unusual.

    That does not mean that YouTube plays no role in radicalization. The paper also found that survey volunteers who already held bigoted views or followed YouTube channels that often featured fringe beliefs were much more likely to seek out, or be recommended, more videos along the same lines.

    The findings suggest that policymakers, internet administrators and the public should focus less on the potential risk of leading an unwitting person to extremist ideologies on YouTube, and more on the ways YouTube can help validate and harden the views of people who tend to hold such beliefs.

    “We’ve underestimated the way social media facilitates demand meeting the supply of extreme views,” said Brendan Nyhan, one of the paper’s co-authors and a Dartmouth College professor who studies misconceptions about politics and health care. “Even a few people with extreme views can do serious damage in the world.”

    People watch over a billion hours of YouTube videos every day. There are perennial concerns that the Google-owned site could amplify extremist voices, silence legitimate expression, or both, similar to the concerns surrounding Facebook.

    This is just one study, and I list some limits of the analysis below. But what’s intriguing is that it challenges the binary idea that either YouTube’s algorithm risks turning any of us into monsters or that crazy things on the internet do little harm. Neither is true.

    (You can read the research paper here. A version of it was also previously published by the Anti-Defamation League.)

    Going deeper into the details, about 0.6 percent of the survey participants accounted for about 80 percent of the total watch time for YouTube channels classified as “extremist,” such as those of far-right figures David Duke and Mike Cernovich. (YouTube banned Duke’s channel in 2020.)

    Most of those people didn’t find the videos by accident, but by following web links, clicking videos from YouTube channels they subscribed to, or following YouTube’s recommendations. About one in four videos that YouTube recommended to people watching an extreme YouTube channel was another video like it.

    Only 108 times during the study — about 0.02 percent of all video visits the researchers observed — did someone watching a relatively conventional YouTube channel follow an automated suggestion to a channel outside the mainstream that they weren’t already subscribed to.

    The analysis suggests that the bulk of the audience for YouTube videos promoting fringe beliefs is people who want to watch them, and that YouTube then feeds them more of the same. The researchers found that these viewers were much more likely than other volunteers to exhibit high levels of gender or racial resentment, as measured by their responses to surveys.

    “Our results show that YouTube continues to provide a platform to distribute alternative and extreme content to vulnerable audiences,” the researchers wrote.

    Like any study, this analysis has caveats. The study was conducted in 2020, after YouTube made sweeping changes to curb the recommendation of videos that maliciously misinform people. That makes it difficult to know whether the patterns researchers found in YouTube recommendations would have been different in previous years.

    Independent experts have not yet thoroughly reviewed the data and analysis, and the study did not examine in detail the relationship between watching YouTubers whom the researchers described as “alternative” channels, such as Laura Loomer and Candace Owens, and viewership of extreme videos.

    More studies are needed, but these findings suggest two things. For one, YouTube may deserve credit for the changes it made to reduce the ways the site pushed people toward views outside the mainstream that they weren’t intentionally seeking out.

    Second, there needs to be more discussion of how much further YouTube should go to reduce the exposure of potentially extreme or dangerous ideas to people who are inclined to believe them. Even the small minority of YouTube’s audience that regularly watches extreme videos amounts to many millions of people.

    For example, should YouTube make it harder for people to link to fringe videos – something it has considered? Should the site make it harder for people who subscribe to extremist channels to automatically see those videos or get similar videos recommended? Or is the status quo fine?

    This research reminds us to keep grappling with the complicated ways in which social media can both mirror and amplify the misery in our world, and to resist easy explanations. There are none.


    Tip of the week

    Brian X. Chen, the consumer technology columnist for The New York Times, is here to outline what you need to know about online tracking.

    Last week, listeners of the radio program KQED Forum asked me questions about internet privacy. Our conversation made it clear how concerned many people were about their digital activities being monitored and how confused they were about what to do.

    Here’s an overview that I hope will help On Tech readers.

    There are two broad types of digital tracking. “Third-party” tracking is the kind we often find scary. If you visit a shoe website, it can register what you’ve looked at, and you may then see ads for those shoes everywhere else online. Repeated across many websites and apps, this lets marketers compile a record of your activity to target ads at you.

    If you’re concerned about this, you can try a web browser like Firefox or Brave that automatically blocks this type of tracking. Google says its Chrome web browser will do the same in 2023. Last year, Apple gave iPhone owners the ability to say no to this kind of online surveillance in apps, and Android phone owners will have a similar option at some point.

    If you want to take it a step further, you can download tracker blockers like uBlock Origin or an app called 1Blocker.

    The crackdown on third-party tracking has shifted attention to “first-party” data collection: what a website or app itself monitors when you use the product.

    If you search for directions to a Chinese restaurant in a maps app, the app may assume you like Chinese food and let other Chinese restaurants advertise to you. Many people find this less scary and potentially helpful.

    If you want to avoid first-party tracking, you don’t have much choice other than not using the website or app at all. You can also use it without logging in to minimize the information collected, although that may limit what you can do there.

    • Barack Obama Crusades Against Disinformation: The former president begins to spread the word about the risks of online falsehoods. He is wading into a “fierce but inconclusive debate about the best way to restore trust online,” my colleagues Steven Lee Myers and Cecilia Kang reported.

    • Elon Musk’s funding is apparently secured: The chief executive of Tesla and SpaceX has laid out the loans and other financing commitments for his approximately $46.5 billion offer to buy Twitter. Twitter’s board must decide whether to accept, and Musk has suggested he wants Twitter shareholders to decide for themselves instead.

    • Three ways to save on your tech expenses: Brian Chen has tips for deciding which online subscription plans to cut, saving money on your cellphone bill, and figuring out when you do (and don’t) need a new phone.

    Welcome to a penguin chick’s first dive.


    We want to hear from you. Tell us what you think of this newsletter and what else you would like us to discover. You can reach us at ontech@CBNewz

    If you have not yet received this newsletter in your inbox, sign up here. You can also read previous On Tech columns.