“The law is failing”: How victims of image-based abuse are falling through the gaps

In 2015, a Yorkshire man became the first to be convicted under “revenge porn” legislation. Six years on, however, loopholes and omissions in the law are leaving victims of image-based abuse in limbo.
Helen has called for more comprehensive laws around the various forms of image-based abuse.

Late last year, when an acquaintance of Helen Mort’s said they had some bad news to deliver, the Sheffield-based poet envisioned all manner of worst-case scenarios:

“I got really panicky... I remember thinking, ‘maybe something’s happened to my son, maybe something really dreadful has happened’,” she recalls.

The revelation, however, turned out to be something Helen could never have imagined: the acquaintance had seen what looked like explicit images of Helen on a porn site.

“My first reaction was, ‘that’s impossible’”, she explains. “I have never ever shared any kind of intimate photo of myself anywhere or with anyone - so how could I possibly have ended up on a porn site?”

When she finally plucked up the courage to look for herself, Helen discovered that someone posing as her “boyfriend” had taken images of her face from various social media accounts and superimposed them onto explicit images, saying he wanted to see Helen “abused and humiliated” and inviting others to make further “fakes”. Some dated as far back as 2017.

“Some of them were almost comically bad”, she recalls. “But there were a few which looked much more realistic - those were the most disturbing of all”.

Helen had unknowingly become a victim of “deepfake porn”: an emerging form of image-based abuse in which a person’s photograph is manipulated into explicit, sometimes violent, videos or stills.

The discovery led to extreme anxiety, stress and paranoia, with Helen suffering recurring nightmares about the images and feeling temporarily unable to leave the house.

Yet when she approached the police about the incident, they were unable to take any action: currently, no law exists against making or sharing deepfake images, no matter how realistic.

Helen’s story is just one example of the myriad ways in which current legislation is failing victims of image-based abuse, a behaviour now so endemic that the UK’s Revenge Porn Helpline recorded an average of nine reports per day in 2020 - its busiest year on record.

So-called “revenge porn”, which entails the sharing of private sexual photos or videos without consent, was outlawed in England and Wales in 2015.

The very first conviction, however, exposed glaring holes in the new legislation: namely that revenge porn is classed as a communications offence, not a sexual offence, meaning that victims are not granted an automatic right to anonymity.

The first man to be convicted of “revenge porn” in England was charged in Yorkshire; his victim thus saw her name, and details of the case, emerge in the press:

“Her name and situation was spread all over the national media, causing significant distress and anxiety”, explains Julia Mulligan, North Yorkshire’s Police, Fire and Crime Commissioner, who subsequently spearheaded the #NoMoreNaming campaign calling for anonymity for victims.

In some cases, says Dr Kelly Johnson, an Assistant Professor at the University of Durham and expert in image-based abuse, this lack of anonymity can re-traumatise victims if cases go to court:

“There have been examples where cases have gone to court and a victim’s intimate images have gone viral again, because all you need to know is the victim’s name and you can search and access them online”.

Some victims of “revenge porn” are unable to even get this far, says Dr Johnson, thanks to the law’s focus on the motivations of the perpetrator in sharing the intimate images or videos:

“The law has a motivation requirement attached to it, meaning you have to prove that [the perpetrator] was motivated by causing the victim distress”, she explains.

“Police have been unable to take cases forward because [the perpetrator] will say ‘I didn’t do it to upset her’ or ‘I did it for lad points’ and that’s a viable defence - because of the motivation requirement”.

For people like Kitty Wenham, this focus on the perpetrator’s motivations, rather than the victim’s distress, has prevented any form of criminal justice for image-based abuse.

Earlier this year, Kitty’s mental health was left in tatters after intimate images of hers were shared across social media platforms without her consent.

When she reported the incident to the police, however, she was told there was little they could do:

“I went to the police and they basically said it didn’t count as revenge porn because the person posting them wasn’t asking for anything, wasn’t blackmailing me”, she explains.

Aside from being unable to escalate her case, Kitty was further discouraged by the fact that the police seemed inadequately trained for dealing with victims like herself, with officers “not seeming to understand how all the social media sites worked - I had to explain Instagram at one point”.

She adds that, after supplying screenshots showing the non-consensually shared images to the police, she suspected that they were “looked at in a room with other people in it...that made me really uncomfortable”.

Victims of image-based abuse report a “mixed bag” of experiences in reporting to the police, says Dr Johnson, with negative experiences often leading, as in Kitty’s instance, to victims dropping their case. Just 24 per cent of victims even report incidents in the first place.

Negative experiences are often driven by a lack of training among police, says Dr Johnson, with a 2017 survey revealing that 95 per cent of police officers had no training on “revenge porn” legislation whatsoever.

In some cases, there simply isn’t legislation to be trained on: no law is in place against the “deepfake porn” Helen fell victim to, nor against threats to share intimate images online, something one in seven British women aged 18 to 34 has experienced.

“The law simply isn’t fit for purpose”, Julia explains, citing the loopholes and omissions of current legislation as the reason why, in spite of rising cases, prosecutions under revenge porn law have fallen dramatically.

The Law Commission’s review - and imminent consultation - on the laws around image-based abuse in England and Wales is “a really positive step forward”, says Dr Johnson, but the pace of change is “frustrating”:

“Change is happening, I just think it needs to be happening more quickly to stop the law failing victims, as it is currently”.

While there are some upcoming opportunities to pass useful legislation in the Domestic Violence Bill and the Online Harms Bill, says Dr Johnson, the law needs “wholesale change” to keep up with emerging forms of abuse brought about by advancing technology. It’s a process she fears could take “years”.

Until then, says Dr Johnson, a huge cultural shift is now required to stop these incidents occurring in the first place:

“We have to invest in good education programmes and preventative programmes, particularly for things like sexual ethics, victim blaming...too much education out there focuses on the taking of the images rather than the sharing without consent”.

It’s a sentiment that Helen echoes, saying that old-school misogyny plays a significant role both in driving these incidents and in attaching shame to their victims:

“My first instinct [after finding the pictures] was to hide, to think: ‘maybe I shouldn’t have had a presence on social media’ - but that’s exactly the same logic as telling rape victims ‘you shouldn’t have gone out dressed like that’.”

“That’s why I wanted to put my name to this experience - I want to say, actually, this did happen to me, and I shouldn’t feel ashamed about it - the person who did it should feel ashamed”.

If you have been a victim of image-based abuse, you can contact the Revenge Porn Helpline for advice and assistance via email (as the phone line is currently closed) at [email protected].