Research from the University of Liverpool, conducted in partnership with the Revenge Porn Helpline, has revealed that a significant number of the websites analysed that host non-consensual intimate images (NCIIs – colloquially known as ‘revenge porn’) may not be legally required to remove those images.
The research found that many websites being used to host NCIIs are unlikely to be regulated by Ofcom under the Online Safety Act (OSA).
The non-consensual sharing of private sexual material was criminalised in 2015 under the Criminal Justice and Courts Act. The Online Safety Act, passed in 2023, extended this to cover areas not addressed by the 2015 legislation.
The OSA also creates different categories of service, so that larger companies are subject to higher levels of regulation. Ofcom is currently drawing up a register setting out which level of regulation each company will need to adhere to.
It is currently unclear how Ofcom will determine which companies fall under the legislation, beyond the Act’s indication that user-to-user services with a ‘significant’ number of UK users, or services for which the UK is a target market, will be more highly regulated (for example, it is anticipated that Facebook and Twitter will fall within scope of the regulation).
This means that the OSA’s powers could be limited when it comes to ensuring that NCII is removed from smaller platforms and platforms with a limited number of UK users. The OSA may therefore not prevent the non-consensual sharing of intimate images on a large proportion of the platforms being used to host such content.
Research from the University of Liverpool’s Department of Sociology, Social Policy and Criminology examined how reports made to the Revenge Porn Helpline were distributed across the internet, including public URLs. The researchers drew a random sample of 200 cases from over 2,600 cases of NCII distribution reported between 2015 (when the Helpline opened) and early 2022. Analysis was conducted by Zara Ward, formerly a Senior Helpline Practitioner at the Revenge Porn Helpline.
Analysis of these reports, recently published in the European Journal of Criminology, showed that many of the URLs hosting NCII would be unlikely to be covered by the legislation if regulation were focused on larger platforms. The research found that, whilst the largest social media platforms hosted the greatest proportion of images, many other platforms were also being used to host content, including pornography platforms, message boards, file-sharing sites, escort sites, and dedicated ‘revenge porn’ platforms.
This means that victim-survivors who have images shared on smaller platforms and platforms with a limited number of UK users could continue to struggle to have their images removed, if the highest level of regulation in the OSA is mostly targeted at the biggest social media platforms, search engines, and pornography websites.
Dr Antoinette Huber, who led the research, said: “Our research is the first of its kind in this complex area where we have worked with reporting data from a key victim-survivor support service to further understand the nature of NCII image distribution and removal.
“We have found that, while the Online Safety Act is an extremely important legislative step, it may not quite stretch far enough in helping to effectively prevent the distribution of NCIIs, depending on how Ofcom compiles the levels of regulation.
“Although it appears that the Act will be effective in targeting the bigger platforms with a large number of UK users, there are significant gaps in the legislation. This means that images removed from larger sites could still be circulating on smaller sites and sites without a large UK user base, continuing to cause considerable distress to victim-survivors. The internet is a borderless space, and victim-survivors are unlikely to be concerned with the country users are based in, if it means that their images can still be circulated.
“We know that, unfortunately, the longer NCII remain online, the greater the likelihood of further circulation, and increased susceptibility to further forms of abuse. That’s why it’s really important for Ofcom to ensure that the scope of the OSA is broad enough to tackle NCII distribution in a way that is meaningful for all victim-survivors.”
Sophie Mortimer, Helpline Manager at the Revenge Porn Helpline, said: “A large part of the work of the Revenge Porn Helpline is the reporting for removal of intimate images that have been shared of our clients without their consent.
“As the research shows, this happens across all types of sites, in a landscape that is largely unregulated. The removal of images is the first thing our clients ask for when they get in touch, and the thing we will talk to them about for the longest. The nature of the internet means that a single outstanding image can cause huge distress and anxiety for someone who knows it is only a matter of time before it starts to recirculate widely all over again. The impact can be devastating, affecting personal and professional relationships, emotional and physical health and mental wellbeing.
“It is vital that Ofcom works, through international, cross-industry co-operation, to build a landscape that gives swift remedies for the harms of intimate image abuse.”
Read the article entitled ‘Non-consensual intimate image distribution: Nature, removal, and implications for the Online Safety Act’ in full here.