
How to get help if you are a victim of sexually explicit deepfakes

Sexually explicit deepfakes have shot up in recent years, with almost one in 10 people in the UK becoming the victim of this digital crime, according to ESET.

A deepfake is a digitally altered image or video, made or manipulated by AI, that replaces the face of one person with another's. They can be extremely distressing for victims whose likeness has been stolen, and concern is growing, with women (61%), men (45%) and 18-year-olds (57%) worried about this form of abuse.

It is a subject that former Geordie Shore star Vicky Pattison explores in a controversial new documentary broadcast on Channel 4 tonight. Vicky Pattison: My Deepfake Sex Tape digs into deepfake abuse and its effect on women, as a staggering 99% of sexually explicit deepfakes depict women.

The 37-year-old British television personality wanted to show how “vulnerable all of us are” to this digital abuse, so she created her own deepfake sex tape and released it online. Pattison said it gave her a “glimpse into the powerlessness” experienced by victims.

Watch: Vicky Pattison defends controversial deepfake documentary

Sexually explicit deepfakes of celebrities have also spread, with global pop star Taylor Swift among the victims of this AI-generated abuse last year.

It comes after Baroness Charlotte Owen urged the government to speed up proposed legislation that would make it a criminal offence to create or share sexually explicit deepfake images without a person's consent, as the largest site dedicated to this abuse receives more than 13.4 million hits per month. Under the proposed law, anyone found guilty of one of the offences would face a fine or a prison sentence of up to six months.

In the UK, it has been a sexual offence since 2015 to share revenge porn, meaning intimate, private photos or videos of another person shared without their consent.

Professor Clare McGlynn, an expert in the legal regulation of image-based sexual abuse at Durham University, says: “It is now so easy to create and share sexually explicit deepfakes. Platforms such as Instagram profit from deepfake abuse by advertising the apps used to create it.

“Deepfake abuse is normalised by platforms and search engines that facilitate it. Google makes it easy to find 'how to' tutorials, and it ranks the deepfake apps and websites highly.”

If you have been the victim of a sexually explicit deepfake, there are several things you can do.

A shocking 99% of sexually explicit deepfakes depict women. (Getty Images)

What to do if you are the victim of a sexually explicit deepfake

Collect evidence of the deepfake abuse

Start keeping a record of the deepfake abuse; if doing so is too distressing, consider asking a friend or someone you trust to help. Copy the URLs, take screenshots or save the video files so you have evidence of this AI image abuse.

AI can generate these images in several ways: either by using an existing image of someone found online to create an entirely fake explicit video or image, or by superimposing that person's face onto an existing explicit image.

If you can identify which image of you was taken from your social media or online presence, include that in your evidence too, as it will all help when reporting the abuse.

Report it on the social media platforms

If the deepfake appears on a social media app such as Instagram, X or TikTok, you can report the content immediately for violating the app's community guidelines. If you feel comfortable asking friends, encourage them to report the content too, as this can lead to the account being suspended or the content being deleted. McGlynn adds: “They can report it to the platforms where the material is shared.”

Report the deepfakes to search engines such as Google

You can submit a request to Google and Bing to remove non-consensual deepfake images, as well as reporting it on the social media platform. McGlynn adds: “Victims can report a website to Google if it is shared there. Google says it will only downrank deepfake apps and websites if it receives many reports of abuse.”

Speak to a helpline to have the content removed

There are various helplines you can call for help getting the content removed online, such as SWGFL, the Revenge Porn Helpline or Victim Support. McGlynn explains that these organisations can “help to have material removed online. They can contact the websites or platforms where material is shared.” They can also offer support and guidance on what to do next and how to cope mentally with this experience of deepfake sexual abuse.

    Report it to the police

The Met Police states that it is “illegal to share or threaten to share intimate photos or videos of someone without their permission, and this includes deepfake images.” You can report this to the police online and begin a criminal case, says McGlynn.
