Law Commission recommends reform around non-consensual deepfake porn and intimate image abuse


According to the Law Commission, the UK’s current patchwork of criminal offences has failed to keep pace with technological advances, particularly around AI and deepfakes, leaving many forms of abuse outside the scope of the law. As a result, victims are being subjected to abuse without being able to access justice.

The Law Commission has subsequently made a number of recommendations to bring current laws into line with the ‘smartphone age’ and ensure that it is “easier to prosecute” those who take or share sexual, nude or other intimate images of people without their consent.

Speaking about the impact the Law Commission hopes the recommendations will have, Professor Penney Lewis, law commissioner for criminal law, said: “Our new reforms for government will broaden the scope of the criminal law to ensure that no perpetrators of these deeply damaging acts can evade prosecution, and that victims are given effective protection.”

The legal gap

Although revenge porn was made illegal in 2015 under Section 33 of the Criminal Justice and Courts Act 2015, with the offence punishable by up to two years in prison, technology has advanced significantly since then. As a result, there are gaps in the legislation, and its scope does not currently extend to deepfakes and other newer forms of intimate image abuse.

Deepfakes are created using a form of AI known as deep learning, in which neural networks loosely modelled on the brain learn from the videos and images fed into them in order to generate lifelike footage. While the term was coined in 2017 by a Reddit user, the technology itself dates back to the late 90s, with the Video Rewrite program in 1997. While AI-generated synthetic media can have positive applications, for example within education, the fashion retail industry, and medicine, the technology has predominantly been used to create deepfake porn, which is largely violent and, in some cases, depicts women being raped.

According to research, this is a predominantly gendered issue, with 90-95% of deepfakes being non-consensual and targeting women, and the effects can be devastating. Speaking about the impact that sharing intimate images of a person without their consent can have, Professor Lewis said it can be “incredibly distressing and harmful for victims, with the experience often scarring them for life,” adding: “Current laws on taking or sharing sexual or nude images of someone without their consent are inconsistent, based on a narrow set of motivations and do not go far enough to cover disturbing and abusive new behaviours born in the smartphone era.”

Likewise, Dean Fido, a senior lecturer in forensic psychology at the University of Derby, commented that although “great strides” have been made by practitioners and lawmakers to make “behaviors associated with image-based sexual abuse illegal (e.g., ‘revenge pornography’ and upskirting),” there is more to be done for what Fido called “equally impactful and damaging behaviors.”

The rise in the distribution and commercialisation of non-consensual images

The distribution and commercialisation of non-consensual images has been on the rise over the last few years. One investigation into Pornhub and its parent company MindGeek, carried out between 2019 and 2022 by journalists from The Globe and Mail, Tortoise Media, the New Yorker, and The New York Times, found that the company had “hosted sexually explicit nonconsensual videos — including those with children — for years.”

A 2019 report by Sensity, which monitors and detects deepfakes, found that 96% of deepfakes involved placing people’s faces onto the bodies of porn actors, and that the number of deepfake porn videos was doubling every six months. The problem has only worsened over the course of the pandemic, to the extent that it has been called a “pandemic within a pandemic.” Revenge porn has also increased: the BBC reported that one government-funded helpline in the UK saw a 22% rise in reports of nonconsensual pornography.

That said, Henry Ajder, head of policy and partnerships at Metaphysic and a leading expert on synthetic media, who co-authored the Sensity report, said the growth of deepfakes is now hard to track because the material is so widespread: “Image based abuse has gone a lot more global. It’s much harder to map unless you have an extensive knowledge of the landscape,” he said, adding: “It would be unlikely that the 2019 report could be replicated today, due to the scale of content.” Speaking about how the abuse has evolved, Ajder said: “Deepfakes in the image abuse context first started off targeting celebrities. Now, it’s no longer about just celebrities. Many more private women have been targeted. It fundamentally changes the way that women feel safe on digital spaces.”

Scarlett Johansson is one of the celebrities who have been targeted by this kind of online image abuse and, like other victims, has expressed a kind of defeatism about the ability to regulate and prosecute the crime: “I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself,” she commented, adding: “It’s a fruitless pursuit for me but a different situation than someone who loses a job over their image being used like that. The Internet is just another place where sex sells and vulnerable people are preyed upon. And any low-level hacker can steal a password and steal an identity. It’s just a matter of time before any one person is targeted.”

Kate Isaacs, the founder of Not Your Porn, a campaign group which is seeking better regulation of the global porn industry, and fighting for accountability over the distribution and commercialisation of non-consensual material, commented: “The thing that concerns me the most is it doesn’t look like there’s anything in terms of profiting from [image-based sexual abuse] or platform responsibility here, which is really worrying. Addressing the installation of equipment such as a hidden camera is all very well and good, but I’ve worked on a number of cases with women who have been secretly recorded without their consent and then that footage has been uploaded to a platform that has allowed someone to profit from that content.”

The Law Commission’s reform recommendations

The Law Commission said the raft of recommendations it has proposed to the government is “long overdue” and that, under the proposals, all perpetrators of these acts would face prosecution.

As per the reform recommendations, a “base” offence for intimate image abuse would be introduced and supplemented by “three additional offences” for what the Commission called “more serious conduct,” along with a further offence for installing equipment such as a hidden camera, for example in an Airbnb, a practice that has been on the rise in recent years.

As outlined by the Law Commission, the new base offence would make it a crime to “intentionally take or share an intimate image of a person” without that person’s consent, where the perpetrator does not “reasonably believe that they consent,” and would apply regardless of the perpetrator’s motivation. The Law Commission stated that while current intimate image offences are restricted to “one or two narrow motivations,” the new offence widens this to include “all motivations,” such as “sharing intimate images for financial gain, social status or as a joke,” or “where there is no motivation at all.” The base offence would carry a maximum sentence of six months’ imprisonment.

For more “serious conduct,” three additional offences are proposed, covering situations where a perpetrator has taken or shared an intimate image without consent with the motivation to “obtain sexual gratification or to cause humiliation, alarm or distress,” or where the perpetrator has “threatened to share an intimate image.” These offences would carry sentences of two to three years’ imprisonment.

In line with the reform proposals, installing equipment such as a hidden camera to facilitate taking an intimate image of a person without their consent would also be a crime, and the sharing of altered intimate images of people, including pornographic deepfakes and so-called ‘nudified’ images, would be covered under these offences.

While there has been support from campaign groups and charities working in this area, the CEO of Cease (Centre to End All Sexual Exploitation), Vanessa Morse, argues that more needs to be done to hold the pornography industry to account. Morse commented: “The law must also clamp down on the high volume of nonconsensual content appearing on pornography websites. We know that the pornography industry cannot be trusted to self-regulate and it has facilitated and profited from this horrific practice for years.”

Further to this, she explained: “Crucially, pornography platforms must be made, by law, to verify the age and consent of those featured in uploads. This is the only way that nonconsensual material will be prevented from being uploaded in the first place, and it has already been adopted as a policy by Mastercard. The government has the opportunity to impose these changes on the pornography industry through the online safety bill, but it is currently choosing not to. This is a grave mistake.”

Speaking about the issue to the BBC, a government spokesman said: “Nearly 1,000 abusers have been convicted since we outlawed ‘revenge porn’. With the Online Safety Bill, we will force internet firms to protect people better from a range of image-based abuse – including deepfakes. But we asked the Commission to explore whether the law could be strengthened further to keep the public safer. We will carefully consider its recommendations and respond in due course.”

Article Created By Madaline Dunn
