London: The UK government has unveiled new legal requirements that will compel major technology companies to remove intimate images shared without consent within 48 hours of being flagged to them. This move is part of broader efforts to strengthen protections for women and girls against online harassment and abuse.
Under proposed changes to the Crime and Policing Bill, firms that fail to comply could face fines of up to 10 per cent of their worldwide revenue or potentially have their services blocked in the UK.
The rules are designed so that victims only need to report harmful content once, with platforms expected to take action across multiple services and prevent repeated uploads.
Sean McConnell, GovTech Lead at Datactics, said: “The proposed 48-hour takedown requirement reflects a reality that harm from non-consensual intimate imagery escalates quickly. But timelines alone will not solve the problem; effectiveness will depend on the strength of the data infrastructure that enables platforms to detect, match, and prevent the redistribution of this content at scale.
“For this legislation to deliver meaningful protection, technology providers must implement robust image-hashing, cross-platform data-sharing, and automated re-upload prevention supported by clear audit trails.
“That means building systems with evidential integrity and strong data governance, so law enforcement can verify compliance while preserving material needed for prosecutions. When designed correctly, these data frameworks not only accelerate removal but also help identify behavioural patterns that allow faster intervention.”
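The hashing and re-upload prevention McConnell describes can be sketched in outline. The toy Python below is purely illustrative and not any platform's actual system: production services use robust perceptual hashes such as PDQ or PhotoDNA rather than this simple average hash, and the `TakedownRegistry` class and its threshold are hypothetical.

```python
def average_hash(pixels):
    """Toy perceptual hash: 64-bit 'average hash' of an 8x8 grayscale grid (0-255).

    Each bit records whether a pixel is at or above the image's mean brightness,
    so lightly edited copies of an image tend to produce similar bit patterns.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (0 = identical)."""
    return bin(a ^ b).count("1")

class TakedownRegistry:
    """Hypothetical store of hashes of flagged images, used to block re-uploads."""

    def __init__(self, threshold=5):
        self.hashes = set()        # hashes of images flagged for removal
        self.threshold = threshold # max bit distance still treated as a match

    def flag(self, pixels):
        """Record a flagged image so near-duplicates can be caught later."""
        self.hashes.add(average_hash(pixels))

    def is_blocked(self, pixels):
        """True if an upload is a near-duplicate of any flagged image."""
        h = average_hash(pixels)
        return any(hamming(h, known) <= self.threshold for known in self.hashes)

# A flagged toy "image" and a slightly brightened re-upload of it:
flagged = [[10 * i + j for j in range(8)] for i in range(8)]
near_copy = [[v + 2 for v in row] for row in flagged]

registry = TakedownRegistry()
registry.flag(flagged)
print(registry.is_blocked(near_copy))  # the near-duplicate re-upload is caught
```

Cross-platform sharing, as described in the quote, amounts to distributing entries from such a registry between services so an image removed on one platform can be matched on another without re-sharing the image itself.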
The creation or sharing of non-consensual intimate images will be designated a ‘priority offence’ under the Online Safety Act, placing it in the same high-harm category as terrorism and child sexual exploitation content. Officials have signalled that future measures could include digital marking of such material to automate takedowns.
Guidance for internet providers on blocking sites outside the scope of the Online Safety Act is also planned.
The proposals mark a significant shift in regulatory expectations, placing greater responsibility on online platforms to respond swiftly to abuse and to actively prevent repeat harm. Crucially, they also hand greater control back to victims, who will no longer be left navigating repeated reporting processes or enduring prolonged exposure to abuse.