

24 Nov 2025

Police warn of rising threat from sexual deepfakes

A new police-commissioned survey has found that one in four people feel there is nothing wrong with, or feel neutral about, creating and sharing sexual deepfakes, even when the person depicted has not consented.

The survey, published today, looked at the public’s attitudes towards deepfakes and, more specifically, those that are sexual or intimate in nature and disproportionately target women and girls. This type of content is thought to have increased in prevalence by 1,780% between 2019 and 2024.

Of the 1,700 respondents to the nationally representative survey, led by Crest Advisory, almost one in six said that they had created a deepfake of some kind (humorous, political, sexual/intimate) or would do so in the future, a figure that peaked at a third among people aged 25-34.

Younger people were also more likely to find it morally acceptable to create or share non-consensual sexual or intimate deepfakes compared to older people.

The survey found that:

  • Three in five people stated that they are very or somewhat worried about being a victim of a deepfake.
  • 5% of respondents had created a deepfake in the past, and of those, 34% had created a sexual/intimate deepfake of someone they know, and 14% had created a sexual/intimate deepfake of someone they do not know.
  • Social media was the most common platform on which people had seen deepfakes. Among respondents who had seen a sexual deepfake of someone they did not know, 41% of men and 17% of women saw them on porn sites.

Public attitudes

The survey helps to build a picture of how the public perceive the harm caused by non-consensual sexual deepfakes.

When asked to rank the harm of various offences, such as phone theft or scams, most respondents considered being a victim of sexual or intimate deepfakes to be less harmful than the other offences listed.

However, previous studies have found that many of the psychological and emotional impacts of deepfake violence against women and girls (VAWG) described by victims and practitioners mirror the impacts reported by victims of sexual harassment and contact VAWG offences, like sexual assault and rape.

This gap suggests that the wider public may not understand the impact of deepfake sexual image abuse, partly because the images are not perceived as ‘real’ and partly because of a limited understanding of the legislation and of the consequences of creating this type of material.

The survey also asked respondents to judge whether different scenarios that involved the creation, sharing or consumption of non-consensual deepfake sexual/intimate images should be legally and morally acceptable.

One scenario included:

An individual creates an intimate deepfake of their partner and tells them about it. After an argument, the individual shares this intimate deepfake with other people.

For this specific scenario, a sizeable minority of respondents, 13%, said it should be both legally and morally acceptable, with a further 9% stating that they felt ‘neutral’ about it.

Perpetrators of ‘deepfake’ violence against women and girls

Whilst there is no extensive research into perpetrators who create or share non-consensual sexual deepfakes, the survey and existing studies have found a relationship between misogyny and offending.

Within the survey, those who considered it morally and legally acceptable to create, view, share and sell non-consensual sexual deepfakes were more likely to be men under the age of 45, to actively consume pornography and to agree with beliefs that would commonly be regarded as misogynistic.

Previous studies have also found a positive relationship between perpetration and endorsing rape myths. Other evidence suggests that some deepfake creators are prolific - one creator alone is known to have posted over 1,800 videos.

The police response

The survey was commissioned by the Office of the Police Chief Scientific Advisor to help inform the next steps of the police response to tackle online violence against women and girls.

Policing is working in tandem with the Home Office, academics and industry to find solutions to help detect deepfakes.

Intimate image abuse, including non-consensual sexual deepfakes, is vastly under-reported: data from the Revenge Porn Helpline shows that only 4% of people who reported their abuse to the helpline also reported it to the police.

Police are appealing to victims to report intimate image abuse, as chiefs and other agencies voice growing concern that online sexual abuse towards women and girls is rising exponentially.

A new project led by the Revenge Porn Helpline, the National Centre for VAWG and Public Protection (NCVPP) and Digital Public Contact aims to improve the reporting and investigative process to encourage victims to come forward.

This includes police exploring the use of ‘image hashing’, a process that allows police and prosecutors to investigate a crime using a description of the image, rather than sharing the image itself. This means that a victim could have the option to limit who sees the image and avoid the distress of having the image shown in court.
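The press release does not specify which hashing scheme policing is exploring, but the general idea behind image hashing can be illustrated with a minimal perceptual ‘average hash’ sketch in Python. Everything below (the file names, the 8x8 hash size and the match threshold) is an illustrative assumption, not a description of any police system:

    from PIL import Image  # assumes the Pillow library is installed

    def average_hash(path, hash_size=8):
        """Illustrative perceptual 'average hash' of an image.

        The image is shrunk to a small greyscale grid, and each bit of the
        hash records whether a pixel is brighter than the grid's mean, so
        visually similar images produce similar hashes.
        """
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for value in pixels:
            bits = (bits << 1) | (1 if value > mean else 0)
        return bits  # a 64-bit integer fingerprint at the default size

    def hamming_distance(h1, h2):
        """Count differing bits; a small distance suggests near-duplicate images."""
        return bin(h1 ^ h2).count("1")

    # Hypothetical usage: only the fingerprints are exchanged, never the
    # images themselves, so a match can be checked without re-sharing the
    # abusive material.
    # reported = average_hash("reported_image.png")
    # candidate = average_hash("candidate_image.png")
    # if hamming_distance(reported, candidate) <= 5:  # threshold is an assumption
    #     print("Likely the same or a near-duplicate image")

Production systems rely on far more robust perceptual or cryptographic hashing than this sketch, but the principle is the same: the fingerprint, not the image, is what travels between the parties investigating the crime.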

Police and the Revenge Porn Helpline are also working with the Foreign, Commonwealth and Development Office to ensure international borders don’t hinder efforts to take down imagery and tackle perpetrators.

Detective Chief Superintendent Claire Hammond from the National Centre for VAWG and Public Protection, said: “Sharing intimate images of someone without their consent, whether they are real images or not, is deeply violating.

“The rise of AI technology is accelerating the epidemic of violence against women and girls across the world. Technology companies are complicit in this abuse and have made creating and sharing abusive material as simple as clicking a button, and they have to act now to stop it.

“However, taking away the technology is only part of the solution. Until we address the deeply engrained drivers of misogyny and harmful attitudes towards women and girls across society, we will not make progress.

“If someone has shared or threatened to share intimate images of you without your consent, please come forward. This is a serious crime, and we will support you. No one should suffer in silence or shame.”

 

Cally-Jane Beech, an award-winning activist and influencer, has been campaigning for better protection for victims of deepfake abuse. She said: “We live in very worrying times, the futures of our daughters (and sons) are at stake if we don’t start to take decisive action in the digital space soon.

“It’s an encouraging start to hear about the changes being made to the reporting system with regards to deepfakes, and I truly hope the additional training and guidance enables victims to feel supported from the moment they pick up the phone - but the conversations need to begin earlier. At home. Parent to child.

“We are looking at a whole generation of kids who grew up with no safeguards, laws or rules in place about this, and now seeing the dark ripple effect of that freedom.

“Stopping this starts at home. Education and open conversation need to be reinforced every day if we ever stand a chance of stamping this out.”

To read the full survey findings, please click here. To read the full literature review referenced in this press release, click here.

For more information on how to report deepfake image abuse visit the Police UK website.

If you aren't ready to speak to the police, help and support is available via the Revenge Porn Helpline. If you are under the age of 18, please contact the NSPCC for advice and support.

Contact information

Communications office
By phone: 0800 538 5058
By email: press.office@npcc.police.uk

