Internet platforms such as X (formerly Twitter), Reddit and Facebook are hotbeds of dangerous radical ideologies.
Since the assassination of Charlie Kirk on September 10, the Internet has been ablaze with political dialogue and calls for Christian nationalist rallies across the Western world.
However, becoming a radicalised left or right-winger is only one of the many dangers that internet use presents for children. Parents need to monitor their children’s online activity, as the internet is full of both psychological and physical dangers.
The latest data from the child protection non-profit Thorn shows that roughly 1 in 3 boys aged 9 to 12 has had some form of sexual interaction online. Thorn’s Youth Perspectives Report, based on a survey of thousands of young people, found troubling increases in online sexual grooming and solicitation in 2024.
A recent post on Thorn’s website states: “In 2024, one in three (33 per cent) boys aged 9 to 12 reported having an online sexual interaction. This is the highest rate we’ve recorded since starting to gather data five years ago. But the risks extend far beyond sexual content. Nearly three in five (59 per cent) younger boys reported feeling bullied or made uncomfortable online. These aren’t isolated incidents—they represent a concerning pattern of increased targeting and harassment.
What makes this data particularly concerning is that it represents a shift in predatory tactics, not changes in boys’ behaviour. The data shows that 9 to 12-year-old boys are not engaging in riskier online activities—in fact, their rates of sharing intimate images remained virtually unchanged from 2023 (-3 per cent for sharing their own content, -3 per cent for sharing others’ content).
Instead, perpetrators appear to be increasingly intentional in how they identify and target younger male victims. They’re casting wider nets across platforms where boys feel safest, including gaming environments, social platforms, and messaging apps, where they can build private relationships.
The data reveals exactly how these harmful interactions are unfolding. Two large reporting increases among younger boys were in “being sent sexual messages” (+12%) and “being asked to send a nude photo or video” (+9%). Crucially, both of these experiences involve being contacted and solicited by someone else, not initiating risky behaviour themselves. In fact, 1 in 3 (30 per cent) 9 to 12-year-olds reported having an online sexual interaction with someone they believed to be an adult, trending up nine percentage points since 2020.”
The report further highlights that 1 in 5 minors who experienced a harmful interaction online did not ask for help. This may be due to the shame surrounding the interaction, or to threats from the online groomer that harm will come to the child if he tells anyone.
The US case of Doe v Twitter alleged that the social media platform Twitter (now known as X) aided individuals who sexually exploited two young boys over social media. One of the boys was blackmailed and coerced into producing sexual content with another boy. The content was shared on Twitter, which failed to remove it promptly.
The rise of AI also allows predators to take innocent photos of a child and digitally manipulate them into images that can be used to embarrass and scandalise the child. While AI companies that provide cloud-based services try to moderate their products, open-source AI tools can be downloaded by criminals, modified to run on a home computer, and used to produce pornographic content.
Young boys may be playing an online game like Fortnite or Roblox and befriend strangers who pretend to be children. These strangers may move interactions to Snapchat, Instagram or WhatsApp and then groom children into sending them photos. Even if no compromising nude photos are sent, the face of a young boy may be manipulated by AI to create sexual content used for blackmail.
This sort of sexual exploitation can lead to the death of a child, as seen in the recently filed case of Rebecca Dallas v Roblox Corporation. Rebecca’s son, Ethan Dallas, died by suicide after being groomed online from age 12 to 15. The interaction started on Roblox but was moved to Discord, where Ethan was encouraged to send sexual content to a child predator.
Rebecca Dallas is now suing Roblox and Discord for wrongful death, fraudulent concealment and misrepresentation, negligent misrepresentation, and strict liability. She alleges that Roblox Corporation misrepresented its platform as a safe space while lacking sufficient safeguards to protect children from paedophiles.
Parents in T&T should communicate with their children regularly about the dangers of meeting people online. Parents should ensure their children have age-appropriate sex education at an early age to prevent the confusion that may result from being sexually groomed online.
Ideally, children on platforms like Roblox should only be playing with children known to the parent. There is always a risk in allowing a child to interact with strangers online, even in settings that seem innocent and playful.
In closing, it should be stressed that both girls and boys are at high risk of being sexually groomed online.