Some crime stories are shocking because they are violent. Others are shocking because they are quiet. Britain’s worsening sextortion crisis falls squarely into the second category.
New figures reported this week show that children in the UK are reporting online sextortion attempts in record numbers, with the Report Remove service receiving 394 blackmail cases in 2025, a 34% increase on the previous year. The overwhelming majority of victims were boys aged 14 to 17. At the same time, the service handled a total of 1,894 reports from under-18s seeking help with intimate images of themselves online, a year-on-year rise that campaigners say points to a growing and deeply gendered form of abuse.
The mechanics of sextortion are cruelly efficient. A young person is contacted online, often through social media, gaming platforms, or messaging apps. Trust is built quickly. An intimate image is requested or exchanged. Then the tone changes. The child is told the image will be sent to friends, family or classmates unless more images are provided or money is paid. It is blackmail, but with humiliation as the weapon.
What makes the latest figures particularly alarming is who is being targeted. Public discussion of child sexual exploitation has often focused on girls, yet this wave of sextortion is falling overwhelmingly on teenage boys. According to the recent reporting, boys accounted for almost all the blackmail reports made to the service last year. That matters, because boys may be less likely to recognise themselves as victims of sexual exploitation, and less likely to seek help until panic has set in.
A legal and regulatory dimension is also growing around the crisis. Campaigners including the Molly Rose Foundation, the Internet Watch Foundation and the NSPCC are pressing for stronger safety measures from technology companies and more assertive intervention by regulators. One proposal now receiving renewed attention is nudity-detection technology on phones and platforms, intended to stop intimate images from being shared, or at least to interrupt the moment before they are sent. Critics say the tech sector's response has been too slow and too reliant on voluntary measures.
Report Remove itself is an example of how the response has had to evolve. The service, run by Childline and the Internet Watch Foundation, allows children to report intimate images of themselves. Digital fingerprints, or "hashes", are then created from those images and used to help prevent them being shared further, without requiring the child to repeatedly re-expose themselves to the abuse. It is a smart tool. But the numbers suggest it is operating downstream of a much larger problem.
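To make the idea of hashing concrete: in rough terms, a platform can compare a fingerprint of an uploaded file against a list of fingerprints derived from reported images, and block a match without ever storing or viewing the image itself. The sketch below is a minimal illustration of that principle using a plain cryptographic hash; the actual schemes used by Report Remove and its industry partners (often perceptual hashes that survive resizing or re-compression) are not detailed in this piece, so the names and blocklist here are purely hypothetical.

```python
# Minimal sketch of hash-based image blocking, assuming a simple
# exact-match blocklist. Real services use more sophisticated,
# perceptual hashing schemes whose details are not public.
import hashlib
from pathlib import Path

# Hypothetical blocklist of hex digests derived from reported images.
KNOWN_IMAGE_HASHES: set[str] = set()


def image_fingerprint(path: Path) -> str:
    """Return a SHA-256 hex digest of the file's bytes (exact match only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def should_block_upload(path: Path) -> bool:
    """True if the upload's fingerprint matches a known reported image."""
    return image_fingerprint(path) in KNOWN_IMAGE_HASHES
```

The point of the design is privacy as much as speed: only the fingerprint is shared with platforms, never the image itself, which is why the child does not have to hand the picture around to have it blocked.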
Perhaps that is why this story has landed so heavily. It is not one case, one defendant or one sentencing hearing. It is a pattern, and patterns are harder to dismiss. Behind every statistic is a teenager being told that one bad decision, one moment of misplaced trust, one private image, could be turned into social ruin.
The law has not ignored sextortion. But this week’s figures suggest society is still chasing the crime rather than getting ahead of it. And for a generation growing up online, that is a dangerous place to be.