“Stranger danger” has always been the go-to parenting mantra for warning kids to be careful around people they don’t know, especially if the “stranger” behaves inappropriately.
The stereotype was the bad man in a van picking up vulnerable children off the street.
But, as disturbing new research conducted at the Victorian Forensic Paediatric Medical Service (VFPMS) in collaboration with Monash University will show, there’s a dark new frontier of child sexual abuse quickly taking hold.
It’s shifting and changing, hard to pin down, hard to prosecute, and hard to investigate. It’s online adult dating apps, gaming sites and consoles, and social media.
Dr Jo Tully and Dr Janine Rowse are both forensic doctors for the VFPMS, with centres at Melbourne’s Royal Children’s Hospital and Monash Children’s Hospital.
Dr Tully, a paediatrician, is deputy director of VFPMS; Dr Rowse is also a forensic doctor for the Victorian Institute of Forensic Medicine (VIFM), and is a PhD student at Monash University’s Department of Forensic Medicine.
They investigated 14 years of cases involving Victorian children aged 12 to 17 who were seen after an allegation of, or police investigation into, sexual assault or abuse.
Their focus was on face-to-face contact sexual offending after a period of grooming online on any number of sites and apps, including Tinder, Instagram, Bumble, and gamer hubs. The majority of rapes and assaults happened at the first face-to-face contact.
The research is due to be published.
The research into what is called technology-facilitated sexual assault (TFSA) found it is associated with high rates of penetrative assaults. Long periods of online grooming and the sharing of sexual images and video may create a higher sense of expectation by the perpetrator and a reduced sense of stranger danger in the victim, the researchers say.
In the first seven years of the 14-year study, from 2007 to 2013, only 4% of sexual assaults occurred after the victim and perpetrator connected online. In the second half, from 2014 to 2020, the figure had risen to 14%. In 2019 alone, it was 19%.
Almost all the assaults occurred at the first face-to-face meeting, and 75% of them occurred in either the alleged offender’s home or in a public place such as a park or a toilet.
The online platforms used by the children to connect keep changing. Between 2007 and 2013, the majority of connections were on Facebook, but between 2014 and 2016, adult dating apps increasingly featured – apps that should be restricted to those over 18. Then, between 2017 and 2020, almost half involved Snapchat.
There are more places for children to be groomed, and the grooming can lead to even more horrific physical crimes. One thing to bear in mind is that the two people involved – the victim and the perpetrator – don’t know each other before they begin communicating online; the research excludes cases where they already knew each other.
“The reason that we conducted this research was that we were seeing sexual assaults on children that were technology-facilitated,” says Dr Tully. “At the beginning of the study, the internet and social media were not such a big part of children’s lives, like they are now.
“It’s taken us all by surprise.”
Research by the University of New South Wales and its Gendered Violence Research Network shows the same thing. More children are online, more are being groomed, and more are eventually assaulted or raped, often after being blackmailed or tricked into being filmed, meeting up, or sending sexualised images.
Data from the US this year showed that one in four boys and one in seven girls aged 9-12 had been on an adult dating site. A very high number of children who connect on dating apps go on to meet the match in person.
The emerging issue – and Monash’s role in understanding it – has been comprehensively covered in The Age/Sydney Morning Herald this year.
Stark findings in the longest TFSA study
The researchers believe their forthcoming study is the longest retrospective study conducted into technology-facilitated child sexual abuse.
Among many stark findings, they’ll show that 96% of all the cases examined involved penetrative sexual assault, and 75% occurred without a condom.
“These are high-risk assaults. You have to be concerned about what’s happening,” says Dr Tully. “The more I work in this area, the more concerned I become, both as a forensic doctor and paediatrician.
“What we’ve found is really important, but it’s almost certainly the tip of the iceberg. It’s a perfect storm.”
Within the storm are the tech companies that own the platforms hosting the grooming, according to Professor Richard Bassed, the head of Monash’s Department of Forensic Medicine, and the Deputy Director of VIFM.
It’s “all too easy” for children to access adult dating sites and apps, he says, despite “large swathes of content that is completely unsuitable for children to view”, and sexual predators waiting to start communicating.
He says the owner companies are “reluctant or unable, or have not considered dealing with issues of inappropriate access to their platforms by minors”.
“At a bare minimum, they should have a robust age-verification system in place to prevent minors from joining these online adult communities. Games that children play online such as Minecraft, Fortnite and Roblox are also open to abuse by adults who can engage with children anonymously, and can pretend to be children.”
An ‘ethical and moral failing’ by platforms
He says the tech and social media corporations have a “corporate responsibility” to make sure minors can’t access material that could expose them to criminal exploitation.
“It is an ethical and moral failing to have unsuitable and dangerous material, and intimate online contact with effective strangers available for minors to access relatively easily.
“I would also think that there should be, if there is not already, a legal responsibility in place. Corporations should bear much of the legal responsibility for appropriate access and use of their platforms.”
A parent himself, he says parents have trouble keeping up. “Parents cannot be expected to be the sole gatekeepers of technology access for children – the technology simply moves too fast and leaves the older generation rapidly behind.”
Dr Tully and Dr Rowse agree.
“It’s difficult for parents to believe this might be happening, and the tech companies don’t have sufficient accountability,” Dr Tully says. “Parents are struggling to keep up. They don’t fully understand what is happening.
“It’s become normalised for young people to receive sexually explicit pictures and advances from strangers. Sexualisation of online communication is now normalised.
“Teenagers are generally well-versed in blocking, but may not be reporting concerning communications to parents or other adults. We’re seeing children being given devices at a young age, suggesting that online safety education needs to start early.
“If an older male approached a 14-year-old girl at a bus stop asking her to go to his house, she would say ‘No’,” she says. “But that same danger is not recognised online. In many cases the child may perceive the online contact to be a friend or romantic partner, despite having never met in real life. In other cases, the child may be blackmailed or tricked or convinced to do something they would not ordinarily do.
“That’s the message we have to get across. There need to be simple and clear messages for parents and schools around reframing stranger danger in the online world.”
Dr Rowse uses the analogy of the motor car. Like the internet, it was invented and welcomed as a useful thing. But it was also sometimes dangerous.
“We built the car,” she says, “then we had to build the seatbelts and the airbags to address emerging safety issues. This is the way we do life now. We are online. For the sake of children and other vulnerable groups, we need to work out, ‘What are the seatbelts? What are the airbags?’”
This article was first published on Monash Lens. Read the original article.