Ofcom has proposed concrete measures that tech firms should take to tackle online harms against women and girls, setting a new and ambitious standard for their online safety.
With insights from victims, survivors, women’s advocacy groups and safety experts, the draft guidance sets out practical, ambitious but achievable measures that providers can implement to improve women’s and girls’ safety. It focuses on four issues:
- Online misogyny: content that actively encourages or cements misogynistic ideas or behaviours, including through the normalisation of sexual violence.
- Pile-ons and online harassment: when a woman, or a group of women, is targeted with abuse and threats of violence. Women in public life, including journalists and politicians, are often affected.
- Online domestic abuse: the use of technology for coercive and controlling behaviour within an intimate relationship.
- Intimate image abuse: the non-consensual sharing of intimate images, including those created with AI, as well as cyberflashing (sending explicit images to someone without their consent).
This guidance will be crucial because, under the UK’s Online Safety Act 2023, services such as social media, gaming, dating apps, discussion forums and search engines have new responsibilities to protect people in the UK from illegal content, and children from harmful content – including harms that disproportionately affect women and girls. This means companies must assess the risk of gender-based illegal harms on their services, such as controlling or coercive behaviour, stalking and harassment, and intimate image abuse. They must then take action to protect users from this material, including by taking it down once they become aware of it. Sites and apps must also protect children from harmful material, such as abusive, hateful, violent and pornographic content.
Ofcom has already published final Codes and risk assessment guidance on how they expect platforms to tackle illegal content, and will shortly publish their final Codes and guidance on the protection of children. Once these duties come into force, Ofcom’s role will be to hold tech companies to account.
Call for Views
In line with this, Ofcom is also required to produce guidance setting out how providers can take action against harmful content and activity that disproportionately affects women and girls, in recognition of the unique risks they face. Their draft guidance identifies nine areas where technology firms should do more to improve women’s and girls’ online safety by taking responsibility, designing their services to prevent harm, and supporting their users.
As such, they are now inviting feedback on this draft guidance, as well as further evidence on any additional measures that could be included to address harms that disproportionately affect women and girls. Once they have examined all responses, they will publish a statement and final guidance later this year.
To share your views, please complete their Consultation Response Form (linked for your convenience) by 5pm on Friday 23 May 2025.
For further information, please visit the Ofcom website.