
March 10, 2025, 06:48


“Nudification apps” are a growing threat to women and girls. Will the Online Safety Act be enough?

Image: Getty



By Katie Mehew

Taylor Swift, Natalie Portman, Cathy Newman – three exceptional women with something deeply disturbing in common: all have been victims of sexually explicit deepfakes.

Sexually explicit deepfakes are AI-generated or AI-altered images or videos that depict individuals in intimate or sexual situations without their consent. Technological advances mean deepfakes are now alarmingly lifelike, with devastating impacts on victims.

From 17 March 2025, new provisions under the Online Safety Act will require internet service providers – including social media platforms and search engines – to proactively remove illegal content, including intimate image abuse such as explicit deepfakes, and to stop it appearing in the first place. Sites that host such content could face fines of up to 10% of their global revenue from Ofcom, the communications regulator.

The government also plans to criminalise the creation of sexually explicit deepfakes without consent. The Online Safety Act has already made it an offence to share, or threaten to share, intimate images without consent, including deepfakes. These measures align with the government’s commitment to halve violence against women and girls within the next decade.

These legal reforms cannot come soon enough. Research shows that 98% of deepfakes are pornographic, and 99% of those depict women. While celebrities are often targeted, ordinary women face the greatest risk. A 2024 study by My Image My Choice found that most deepfake victims are not public figures, but everyday women.

Nudification apps, which digitally strip clothing from photographs, fuel this abuse. My Image My Choice reported that one nudifier app processed 600,000 women’s photos within 21 days of launch. These apps are inherently misogynistic – they often do not work on photos of men – and, while banned from app stores, they remain accessible via search engines and social media advertising. Their continued existence enables, and profits from, the exploitation of women and girls.

Children at risk

According to Internet Matters, 13% of children – around half a million teenagers in the UK – have encountered sexual deepfakes, whether by sending, receiving, creating or viewing them online. There is a growing number of child-on-child abuse incidents in schools involving these apps. Whether driven by bullying or curiosity, young users may not realise that generating deepfakes of other children is not only harmful, but illegal.

The experience of being deepfaked as an adult is horrific and deeply violating. For a child, it could be even more traumatic, with long-term consequences.

The need for urgent action

English law already criminalises the creation or distribution of sexual images of under-18s, including AI-generated images. While the new legislation is a critical step, enforcement and education – especially in schools – are just as vital. Without urgent action, the unchecked growth of deepfake technology will continue to fuel online misogyny, exploitation and long-term harm.

________________

Katie Mehew is an associate in the Reputation Protection and Crisis Management team at Mishcon de Reya.

LBC Opinion provides a platform for diverse views on current affairs and matters of public interest.

The opinions expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us, email [email protected]