New changes to the law will force tech firms to clamp down on the sharing of non-consensual intimate images on their platforms as part of a drive to tackle online sexual offending.
Under the Online Safety Act, the offence of sharing intimate images without consent will be classified among the most serious types of online offence, meaning platforms will now have to take proactive steps to remove this material and to prevent it from appearing in the first place. If they fail to do so, they could face fines under the new law.
The strengthening of the law forms part of the government’s commitment to ensure new and existing technologies are developed safely and help keep people safer online, particularly women and girls, with more than one in three women in the UK having experienced abuse online. Technology Secretary Peter Kyle said:
“The rise in intimate image abuse online is utterly intolerable. As well as being devastating for victims these crimes have also contributed to the creation of a misogynistic culture on social media that can spread into potentially dangerous relationships offline. We must tackle these crimes from every angle, including their origins online, ensuring tech companies step up and play their part.
That is why we will classify these vile and cowardly offences as the most severe types of crime under the Online Safety Act. Social media firms will face extra legal obligations – backed up by big fines – to uproot this content from their sites, helping to stop its normalisation and prevent generations becoming desensitised to its damaging effects.”
Safeguarding Minister Jess Phillips said that intimate image abuse is an “appalling, invasive crime” and technology companies “must do much more” to tackle it. She continued:
“We will use every tool available to achieve our unprecedented mission of halving violence against women and girls within a decade, and this is an important step forward.
The scale of violence against women and girls in all its forms is a national emergency, whether in person or online. We must overhaul every aspect of society’s response to stop this abuse from happening in the first place. Platforms must take responsibility for the content they host and we must ensure victims receive the support they deserve.”
The Online Safety Act will require social media firms and search services to protect their users from illegal material on their sites, with protections due to come into force from spring next year. The most serious forms of illegal content are classed as ‘priority offences’, meaning regulated online platforms will have additional duties to proactively remove this content and stop it from appearing on their sites.
The move will mean intimate image offences are treated as priority offences under the Act, putting them on the same footing as public order offences and the sale of weapons and drugs online.
If firms fail to comply with their duties, the regulator Ofcom will have robust enforcement powers, including the ability to impose fines of up to 10% of qualifying worldwide revenue.
Sophie Francis-Cansfield, Head of Policy at Women’s Aid, commented:
“Women’s Aid welcomes the changes to the Online Safety Act announced today, which will see the sharing of intimate images without consent become a priority offence. Intimate image-based abuse, along with other forms of abuse that happen predominantly online, is sadly not taken as seriously as abuse that happens ‘offline’, but it is our hope that legislative changes like this will improve the urgency and seriousness with which these cases are dealt with by police and social media companies.
While we welcome today’s announcement, this change must come alongside proper police training on handling these cases and collecting evidence in them. Charging rates are pitifully low, largely because this is a relatively new and complex crime and police are still not investigating these cases properly or gathering the evidence needed.
Training is desperately needed if this change is to have a meaningful impact and women and girls are to receive justice for this deeply violating form of abuse.
As the scale of intimate image-based abuse and other forms of online violence against women and girls is so large, there needs to be increased funding for the vital specialist services that support survivors of online abuse. Women’s Aid would like to see the proceeds of fines, along with those from the Digital Services Tax, going towards the sustainable funding of support services, so that survivors of online abuse, who often have nowhere else to turn, are able to receive the support they need to heal.
Intimate image-based abuse is a complex issue and can manifest in many ways. The sharing of AI-generated intimate images, or ‘deepfakes’, was criminalised earlier this year, but it is unclear whether this is covered by the new priority offence. The Government needs to clarify this, as deepfakes continue to be a deeply harmful and prevalent issue.
The changes that the Government announced today were all recommendations made by the VAWG sector when the Bill was progressing through parliament. They remain urgent and must be implemented and monitored properly. Making the sharing of intimate images without consent a priority offence is a step in the right direction towards tackling sexual offending and the normalisation of misogynistic content online, but a lot more needs to be done if the Government is to achieve its mission of halving violence against women and girls in the next decade.”