Google's Non-Consensual Explicit Image Problem Gets Worse

    In early 2022, two Google policymakers met with a trio of women who had fallen victim to a scam that resulted in explicit videos of them circulating online, including in Google search results. The women were among hundreds of young adults who responded to ads seeking swimsuit models but were then coerced into participating in sex videos distributed by the website GirlsDoPorn. The site was shut down in 2020, and a producer, an accountant and a cameraman subsequently pleaded guilty to sex trafficking, but the videos continued to appear in Google searches faster than the women could request their removal.

    The women, accompanied by a lawyer and a security expert, presented a wealth of ideas about how Google could better keep the criminal and degrading clips hidden, according to five people who attended or were briefed on the virtual meeting. They wanted Google Search to ban websites dedicated to GirlsDoPorn and its watermarked videos. They proposed that Google borrow the 25-terabyte hard drive on which the women’s cybersecurity consultant, Charles DeBarber, had stored every GirlsDoPorn episode, take a mathematical fingerprint, or “hash,” of each clip, and block those clips from ever appearing in search results again.
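
    The hashing proposal is, at bottom, a blocklist lookup. The sketch below is a minimal illustration in Python, with hypothetical paths and names: it fingerprints an archive of known clips and checks new files against the resulting set. Real systems would more likely rely on perceptual hashes such as PhotoDNA or PDQ, which survive re-encoding and cropping; a cryptographic hash like SHA-256, used here for simplicity, matches only byte-identical copies.

```python
import hashlib
from pathlib import Path

def file_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical: fingerprint every clip in an archive like the one
# DeBarber assembled, then retain only the hashes as a blocklist.
ARCHIVE = Path("/mnt/gdp_archive")  # placeholder path, not a real mount
blocklist = {file_hash(p) for p in ARCHIVE.rglob("*.mp4")}

def should_suppress(candidate: Path) -> bool:
    # A search pipeline could drop any result whose file matches.
    return file_hash(candidate) in blocklist
```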

    The two Google employees in the meeting hoped they could use what they had learned to get more resources from higher up. But the victims’ attorney, Brian Holm, left feeling dubious. The policy team was in a “difficult position” and “had no authority to make changes within Google,” he says.

    His gut feeling was right. Two years later, none of the ideas that emerged from the meeting have been implemented, and the videos still show up in search results.

    WIRED spoke with five former Google employees and 10 victim advocates who have contacted the company. All say they appreciate that recent changes Google has made help survivors of image-based sexual abuse, like the GirlsDoPorn scam, remove unwanted search results more easily and more reliably. But they’re frustrated that the search giant’s leadership hasn’t approved proposals like the hard drive idea, which they believe would more fully restore and protect the privacy of millions of victims around the world, most of whom are women.

    The sources detail previously unreported internal deliberations, including Google's reasoning for not using an industry tool called StopNCII that shares information about nonconsensual intimate images (NCII) and the company's failure to require porn websites to verify consent in order to be considered for search traffic. Google's own research team has published steps that tech companies can take against NCII, including using StopNCII.

    The sources believe such efforts could better address a growing problem, one fueled in part by widening access to AI tools that create explicit deepfakes, including deepfakes of GirlsDoPorn survivors. Total reports to the U.K.’s Revenge Porn Helpline more than doubled last year, to about 19,000, as did cases involving synthetic content. Half of the more than 2,000 Britons in a recent survey said they were concerned about falling victim to deepfakes. In May, the White House urged lawmakers and industry to move more quickly to address NCII. In June, Google, along with seven other companies and nine organizations, announced a working group to coordinate responses.

    Currently, victims can pursue prosecution of their abusers or file legal claims against the websites hosting the content, but neither route is guaranteed to succeed and both can carry steep legal fees. Having Google remove results may be the most practical tactic, and it serves the ultimate goal: keeping the infringing content out of view of friends, hiring managers, potential landlords, or dates, all of whom are likely to turn to Google to look someone up.

    A Google spokesperson, who requested anonymity to avoid harassment by perpetrators, declined to comment on the conversation with GirlsDoPorn victims. She said that combating what the company describes as nonconsensual explicit imagery (NCEI) remains a priority and that Google's actions go well beyond what is legally required. “Over the years, we've invested heavily in industry-leading policies and protections to protect people impacted by this harmful content,” she said. “Teams across Google continue to work diligently to strengthen our protections and thoughtfully address emerging challenges to better protect people.”