Latest News

Misogynistic image-based abuse recognised as new offence within draft Online Safety Bill

Cris Pikes, CEO of visual content moderation technology company Image Analyzer, has commented on the inclusion of a section within the draft Online Safety Bill which specifically focuses on image-based harms and recognises the disproportionate harm experienced by women and girls who are subjected to image-based abuse.

“For over a decade, online platform operators have been side-stepping responsibility for the harms perpetrated on their platforms by stating that they have not had sufficiently clear guidance from the relevant authorities on the specific harms that they need to address,” says Cris Pikes, CEO of Image Analyzer. “Meanwhile, children, women and minority groups have been bearing the brunt of the delays.”

“By defining precisely which online actions are deemed ‘harmful’ and illegal within the Online Safety Bill, online service providers will be left in absolutely no doubt about their legal responsibilities. The End Violence Against Women Coalition petitioned for the draft Bill to include specific reference to the fact that online harassment and image-based abuse disproportionately affect women and girls, and called for all forms of image-based abuse to be classed as harmful under the Online Safety Bill. We agree with this proposal as it brings into scope non-consensual sharing of intimate images, cyber-flashing, up-skirting, deepfakes and other image-based abuse and harassment. It places a clear legal responsibility on online service operators to detect, swiftly block and remove offending images, preventing them from being shared and the harm from being perpetuated. These activities are recognised offences when they take place offline and should not be permitted online. It’s important to specifically include these offences within any new law that is designed to make online spaces safer, particularly for children and women.”

Following scrutiny by the joint parliamentary committee, it has been recommended that the Online Safety Bill include the following sections addressing misogynistic abuse online:

“25. Women are disproportionately affected by online abuse and harassment. They are 27 times more likely to be harassed online than men. 36 per cent of women report having been a victim of online abuse and harassment, with this rising to 62 per cent in women aged 18–34. Abuse and harassment are not only directed towards adults: in 2020–21, half of 11–16 year old girls experienced hate speech online and a quarter were harassed or threatened.

26. Violence against women and girls (VAWG) “is increasingly perpetrated online” and online VAWG “should be understood as part of a continuum of abuse which is often taking place offline too.” Professor Clare McGlynn QC, Durham Law School, described an “epidemic of online violence against women and girls”. Online VAWG “includes but is not limited to, intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive ‘sexting’, and the creation and sharing of ‘deepfake’ pornography.”

27. Cyberflashing, the unsolicited sending of images of genitalia, is a particularly prevalent form of online VAWG. 76 per cent of girls aged 12–18 and 41 per cent of all women reported having been sent unsolicited penis images. Regardless of the intention(s) behind it, cyberflashing can violate, humiliate, and frighten victims, and limit women’s participation in online spaces. The use of deepfake pornography in online VAWG is also becoming increasingly prevalent and is of great concern, having been debated in the House of Commons on 2nd December 2021.

The UK government has until February 2022 to respond to the committee’s report before the amended Bill is presented to Parliament in early 2022.

About Image Analyzer

Image Analyzer provides artificial intelligence-based content moderation technology for image, video and streaming media, including live-streamed footage uploaded by users. Its technology helps organizations minimize their corporate legal risk exposure caused by employees or users abusing their digital platform access to share harmful visual material. Image Analyzer’s technology has been designed to identify visual risks in milliseconds, including illegal content, and images and videos that are deemed harmful to users, especially children and vulnerable adults.

The company is a member of the Online Safety Tech Industry Association (OSTIA) and was recently selected as a recipient of the UK government’s Safety Tech Challenge Fund to develop AI-based technology, in collaboration with Galaxkey and Yoti, which automatically detects CSAM in end-to-end encrypted environments without impacting the privacy of law-abiding users.

Image Analyzer holds various patents across multiple countries under the Patent Cooperation Treaty. Its worldwide customers typically include large technology and cybersecurity vendors, digital platform providers, digital forensic solution vendors, online community operators, and education technology providers, which integrate its AI technology into their own solutions.

For further information please visit:

Sources:

- UK Gov
- BBC News, ‘Online Safety Bill – new offences and tighter rules,’ 14th December 2021
- ‘End Violence Against Women and Girls – Principles for the Online Safety Bill’
- The Independent, ‘Rights groups call for Online Safety Bill to include protections for women and girls,’ 2nd December 2021