Online risks, child exploitation & grooming
Technology-facilitated abuse
Two types of online child sexual victimisation are prevalent in Australia: non-consensual sharing of a sexual image of the child by any perpetrator (prevalence of 7.6%) and online sexual solicitation by an adult perpetrator (17.7%) (Walsh et al., 2025).
Girls were around three times as likely as boys to experience both forms of online sexual victimisation before age 18: non-consensual sharing of sexual images of the participant by any perpetrator (10.9% of girls vs 3.8% of boys) and online sexual solicitation by adults (26.3% vs 7.6%) (Walsh et al., 2025).
Finkelhor and colleagues (2023) have proposed a categorisation of technology-facilitated sexual image crimes and abuse against children, which they refer to as Image-Based Sexual Exploitation and Abuse of Children (IBSEAC). The categorisation includes five incident types: 1) adult-made images (child sexual abuse images), 2) images non-consensually made by other youth, 3) voluntarily provided self-made images non-consensually shared by other youth, 4) voluntarily provided self-made images non-consensually shared by adults, and 5) voluntarily provided self-made images to adults that entailed an illegal age difference or were part of a commercial transaction.
Youth online: activity and privacy
95% of 13- to 15-year-olds surveyed used social media in 2024. The most popular services were YouTube (73%), Snapchat (63%), TikTok (62%) and Instagram (56%). Facebook and Discord saw moderate use among children aged 13 to 15 (41% and 27%, respectively), with lower engagement on Twitch (12%) and Reddit (8%) (eSafety Commissioner, 2025).
80% of the surveyed children aged 8 to 12 used one or more of the eight social media services in 2024, despite policies prohibiting users under 13. Applied to Australia’s population of 1,596,302 children aged 8 to 12, this suggests that about 1.3 million children in this age group may be using social media, indicating potentially widespread breaches of minimum-age policies. 84% of children aged 8 to 12 with accounts reported that their parents or carers knew about their account(s) (eSafety Commissioner, 2025).
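A rough sketch of the arithmetic behind the 1.3 million estimate, assuming the reported 80% usage rate applies uniformly across the 8-12 age group:

\[ 0.80 \times 1{,}596{,}302 \approx 1{,}277{,}000 \approx 1.3 \text{ million children} \]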
Services reported the following average numbers of Australian monthly active end-users aged 13 to 17 (inclusive): Instagram – 1,088,980; YouTube – 643,670; Facebook – 455,054; Snapchat – 1,034,07; TikTok – 522,863; Discord – 222,189; Twitch – 24,466; Reddit – did not know (eSafety Commissioner, 2025).
Online sexual interactions (Thorn, 2024):
- 1 in 3 minors aged 9-17 reported they have had a sexual interaction online, including 1 in 5 9-12-year-olds.
- 28% of minors have had an online sexual interaction with another minor.
- 28% of minors have had an online sexual interaction with someone they believed to be an adult.
Responding to online risks (Thorn, 2024):
- 1 in 6 minors who experienced an online sexual interaction did not disclose their experience to anyone.
- Minors who have had an online sexual interaction were nearly twice as likely to use online reporting tools as to seek offline help.
Children who have experienced neglect are 3.5 times more likely to engage in online risk behaviour, which in turn is associated with 2.7 times higher risk of online invasive exploitation (Emery, Wong, Haden-Pawlowski, et al., 2024).
In 2023, the top platforms where the most minors reported having had an online sexual experience were Snapchat (16%), Instagram (14%), Messenger (13%), Facebook (12%), TikTok (11%), X (8%), and WhatsApp (8%). However, when looking exclusively at minor platform users, the platforms with the highest rates of online sexual interactions were different: minor platform users were most likely to indicate they had experienced an online sexual interaction on Omegle (36%), Kik (23%), Snapchat (23%), Telegram (22%), Instagram (20%), and Marco Polo (20%) (Thorn, 2024).
A survey of children aged 8-17 years and their parents found that six in ten children have communicated with someone they first met online, one in eight children have sent a photo or video of themselves to someone they first met online, and one in eight children have met someone face-to-face after first getting to know them online. Seven in ten young people aged 14-17 have seen sexual images online in the past year, while close to half have received sexual messages from someone online in the past year. Parents’ awareness of their children’s exposure to sexual material online is much lower than their actual rate of exposure. This suggests that children may not be telling their parents about experiences that are embarrassing, sensitive or stigmatising (eSafety Commissioner, 2022).
A recent survey of Australian youth aged 12-17 years has found that teens spend an average of 14.4 hours per week online, with males spending more time online (15 hours) than females (13.8 hours). Just over four in 10 teens had at least one negative online experience (e.g., being contacted by a stranger; receiving inappropriate, unwanted content) in the six months to September 2020, with this increasing to over 50% of those aged 14-17 years. Nine in 10 teens, meanwhile, sought to build positive relationships online, and having negative online experiences made teens more aware of the impact of their online actions and motivated them to engage in more positive behaviour online (eSafety Commissioner, 2021).
Research published by the Australian Centre to Counter Child Exploitation (2020) found that 87% of Australian children aged 4-7 years, 98% of children aged 9-11 years, and 100% of children aged 12-17 years use the internet, and that 16% of children aged 4-7, 40% of children aged 8-11, 73% of children aged 12-15 and 91% of children aged 16-17 years do so without supervision. Almost one-third (30%) of 4-year-olds were found to have access to their own personal device (most often a tablet).
Among children aged 4-7 years, around one in five engaged in activities which had the potential to increase their risk of being exploited online, such as playing interactive games (17%) and chatting with friends and family via video call (24%). Among those aged 8-11 years, half reportedly played interactive games (54%), and 18% used messaging apps. The proportion using messaging apps increased significantly among 12-15 year olds (54%), while by this age 42% were also using social media to view or post content (Australian Centre to Counter Child Exploitation, 2020).
A 2018 survey of 3,520 parents of children aged 2–17 years found that 81% of parents with pre-schoolers say their children use the internet, and that of these parents, 94% say that their child was using the internet by the age of 4 years (eSafety Commissioner, 2019a).
The same 2018 survey found that parents’ three most common concerns regarding their children online were: exposure to inappropriate content other than pornography (38%), contact with strangers (37%) and being bullied online (34%). The majority of parents, meanwhile, indicated that they were not confident in their ability to deal with negative online experiences of their children (eSafety Commissioner, 2019b).
The Australian Communications and Media Authority (ACMA) and the Office of the Children’s eSafety Commissioner released a research snapshot which provided updated information on the online activity of Australian children and youth (eSafety Commissioner, 2018a). This snapshot showed that:
- At June 2015, over 935,000 Australian teens had gone online in the previous four weeks. That equates to 82% of all teens, up from 74% four years earlier.
- Of teen internet users, 88% went online more than once per day.
- At June 2011, smartphones were used by less than a quarter of teens (aged 14 – 17 years). Four years later, at June 2015, 80% of all Australian teens used a smartphone.
- Tablets were used by 27% of teens in June 2014; this figure rose to 39% just twelve months later.
- At June 2011, 39% of teens (aged 14 – 17 years) used the internet for social networking. Four years later, at June 2015, 54% of teens used the internet for this purpose.
A 2017 Australian Child Health Poll published by the Royal Children’s Hospital Melbourne found that:
- Almost all (94%) of teens aged 13-17 years, two-thirds (67%) of primary school children aged 6-12 years, and over a third (36%) of pre-schoolers aged 3-6 years have their own personal mobile screen device (e.g. smartphone or tablet).
- 71% of teens, 17% of primary school children and 13% of pre-schoolers report using a smartphone every day.
- Three in four (78%) teenagers and one in six (16%) primary school-aged children have their own social media accounts.
- Two-thirds (66%) of Australian children use screen-based devices without adult supervision at least once a week, with one-third doing so every day. Parents reported that 50% of children aged less than 6 years use screen-based devices without adult supervision (Rhodes, 2017).
Youth online: exposure to pornography
A recent study of young people found that among young men (n = 755), 86% had seen pornography, 10.5% had not seen it, and 3.6% preferred not to answer. Among young women (n = 1204), 69.0% had seen pornography, 28.7% had not, and 2.3% preferred not to answer (Crabbe, Flood, & Adams, 2024).
Among participants who identified as nonbinary or other gender (n = 26), 84.6% had seen pornography, 11.5% had not, and 3.8% preferred not to answer (Crabbe, Flood, & Adams, 2024).
Among all young people, 5.7% of young men and 4.4% of young women reported viewing pornography by age 10, 9.9% and 8.1% by age 11, 25.2% and 14.9% by age 12, 38.8% and 23.5% by age 13, 52.2% and 32.5% by age 14, and 61.2% and 42.7% by age 15, respectively (Crabbe, Flood, & Adams, 2024).
Among young people who had seen pornography and provided an age of first exposure (n = 1270), the average age of ‘first porn exposure’ was 13.2 years for boys and young men (median 13 years) and 14.1 years for girls and young women (median 14 years) (Crabbe, Flood, & Adams, 2024).
Among participants who had seen porn (n = 1502), 50.1% of young men and 40.3% of young women reported deliberately seeking pornography the first time they viewed it, 46.2% of young men and 55.7% of young women reported that their first exposure was unintentional, and 3.7% of young men and 4.0% of young women responded “prefer not to answer.” (Crabbe, Flood, & Adams, 2024).
Of the young people who had seen pornography, most were first exposed to it years before their first sexual experience with another person. The average gap between these events for boys and young men was a year longer than for girls and young women (3.2 years compared with 2.0 years) (Crabbe, Flood, & Adams, 2024).
Both boys and young men and girls and young women were most likely to view pornography at home (89%) and on an electronic device (94%) (Crabbe, Flood, & Adams, 2024).
Among the 50.1% of young men and 40.3% of young women whose first exposure was intentional, curiosity was the most frequently cited motivation (Crabbe, Flood, & Adams, 2024).
A UK Children’s Commissioner report into young people’s pornography use found that the average age at which children first see pornography is 13. By age 9, 10% of a sample of 1,000 young people had seen pornography, 27% had seen it by age 11, and 50% of children had seen it by age 13. Additionally, 79% had encountered violent pornography before the age of 18 years. Twitter was the online platform where young people were most likely to have seen pornography, and other mainstream platforms including Instagram and Snapchat ranked closely after dedicated pornography websites. Further, 21% of males aged 16-21 years had viewed content at least once per day in the 2 weeks prior to the survey, and boys who first viewed pornography at age 11 or younger were significantly more likely to become frequent users of pornography (Children’s Commissioner, 2023).
An online survey of young adults aged 18-25 years in the US found that most (82%) young adults indicated they had viewed pornography online. These emerging adults were found to endorse both positive and negative attitudes towards online pornography. Having positive or neutral attitudes towards pornography was associated with higher rape myth acceptance, while having more accurate knowledge of pornography was associated with reduced rape myth acceptance (Noll, Harsey, & Freyd, 2022).
The Our Watch survey with young people aged 15-20 years found that a high proportion of participants reported using pornography as a source of information to learn about sex and sexual relationships in the past 12 months (60% of young men and 41% of young women) (Our Watch, 2020).
A study with 463 college males in the US found that more frequent pornography use was significantly related to a higher likelihood of committing both verbally and physically coercive sexual acts; however, when both frequency of use and number of modalities used to access pornography (internet as well as, for example, books, magazines, movies) were considered, the number of modalities was significant in predicting sexually coercive behaviours, while frequency of pornography use was not. A threshold analysis revealed that use of two or more pornography modalities was the most significant predictor of likelihood of both verbal and physical coercion, meaning that if an individual used two or more modalities to access pornography, as opposed to just one, they were at the highest risk of sexually coercive behaviours (Marshall, Miller & Bouffard, 2021).
Research conducted by Our Watch identified that nearly half (48%) of young men have seen pornography by the age of 13 and nearly half (48%) of young women by the age of 15. The survey of nearly 2000 young Australians, aged 15-20 years, found that over half (56%) of young men indicated that they viewed pornography at least once per week over the past 12 months, while 15% of young women reported at least weekly usage (Our Watch, 2020).
Results of Wave 7 of the Longitudinal Study of Australian Children, conducted in 2016, revealed that at age 16–17, significantly more boys than girls had intentionally viewed pornography in the past 12 months: almost three quarters of boys but only one in three girls said they had viewed pornography in that time. Boys also reported viewing pornography far more frequently than girls: one in 10 boys (11%) and less than one in 100 girls (0.004%) said they watched pornography daily (Warren & Swami, 2019).
Teens and young adults, both girls and boys, have a cavalier attitude toward porn:
- When they talk about pornography with friends, 90% of teens, and 96% of young adults say they do so in a neutral, accepting, or encouraging way.
- Only 1 in 20 young adults and 1 in 10 teens say their friends think viewing pornography is a bad thing. (Enough is Enough, 2017)
Teens are watching more porn and seeking it out more than any other generation:
- Among those aged 13-17 years: 8% daily; 18% weekly; 17% once or twice a month.
- Among those aged 18-24 years: 12% daily; 26% weekly; 19% once or twice a month.
- 83% of boys and 57% of girls have seen group sex online; 32% of boys and 18% of girls have viewed bestiality online.
- 88.2% of top-rated porn scenes contain physical aggression (spanking, gagging, slapping, etc.) and 48.7% contain verbal aggression (name calling). Perpetrators were usually male, and 94% of the targets were women. (Enough is Enough, 2017)
Sexting and self-generated content
1 in 13 children under the age of 18 have faced the unauthorised sharing of sexual images of themselves, known as image-based abuse (Walsh et al., 2025).
Australian youths reported one-year prevalence rates of 6% for non-consensual resharing of images and 11% for sexual solicitation online. Those who had sexual images solicited online had about 10 times the odds of having their private images non-consensually reshared online. Having sent a naked picture or video of oneself in the past year was also significantly associated with having one’s image non-consensually reshared: those who shared their own images had approximately 10 times the odds of having their private images non-consensually reshared online (32% compared with 4%). Girls had over twice the odds of being solicited for images compared with boys (Seto et al., 2024).
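For readers unfamiliar with odds ratios, a brief illustrative reconstruction using the two proportions reported above (32% versus 4%) shows how a figure of roughly 10 times the odds arises; this worked example is not taken directly from Seto et al. (2024):

\[ \mathrm{OR} = \frac{0.32/(1-0.32)}{0.04/(1-0.04)} \approx \frac{0.47}{0.042} \approx 11 \]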
Self-generated images (Thorn, 2024):
- 1 in 7 (13%) minors reported they had shared a nude image or video of themselves with someone else.
- Among minors who have shared their own images, 1 in 3 reported having done so with an adult.
- 1 in 4 minors agree that it is normal for people their age to share nudes with each other.
- Roughly 1 in 8 minors know a friend who has received money or gifts in exchange for images.
Nonconsensual resharing of images (Thorn, 2024):
- Fewer than 1 in 10 minors reported having nonconsensually reshared images.
- 1 in 5 minors reported having seen nonconsensually reshared images.
- 1 in 8 minors, aged 9-12, reported having seen nonconsensually reshared images.
- Roughly 1 in 3 minors exclusively or predominantly blame the person in the image rather than the resharer.
Youths who identified as LGBTIQ+ had over three times the odds of having their images solicited online and over twice the odds of having their images non-consensually reshared online. The proportion of LGBTIQ+ teens being sexually solicited may reflect their higher-than-average likelihood of conducting sexual and romantic relationships online (Seto et al., 2024).
Similarly, youths with at least one disability had nearly twice the odds of having their images solicited and non-consensually reshared online (Seto et al., 2024).
In a study categorising Image-Based Sexual Exploitation and Abuse of Children, Finkelhor and colleagues (2023) found that just 12% of image episodes qualified as adult-produced child sexual abuse images, while the majority of image episodes (74%) were self-generated and voluntarily shared by youth (with, in many cases, further non-consensual sharing by another youth or adult). Children under age 13 were victims in only 9.8% of image episodes, and in only 11% of episodes involving images taken by adults.
In the US since 2019, there has been a sustained increase in minors self-reporting that they have shared sexual images: in 2021, approximately 1 in 6 minors reported sharing sexual images, including 1 in 7 of those aged 9-12 and 1 in 5 of those aged 13-17 years. Compared with 2019, the number of younger boys (aged 9-12) reporting that they had shared sexual images more than doubled, while reporting among older boys (13-17) nearly tripled (Thorn, 2022a).
A survey with young people in the US found that 1 in 8 (13%) 9-12-year-olds have shared a nude image with an online-only contact (Thorn, 2022b).
A study with 1370 Spanish university students found that 37.1% of participants had created and sent their nude images or sexual content to someone, with no gender differences found. Out of all the participants, 6.4% had engaged in sexting coercion perpetration, and 32.7% had been victims of sexting coercion. Males were 7.56 times more likely to pressure someone to sext than females. Examination of the link between sexting behaviours and mental health showed that among males, none of the sexting behaviours (sexting, sexting coercion perpetration or sexting coercion victimisation) were associated with poorer mental health, while among females, each of the three forms of sexting behaviours was found to be associated with poorer mental health (Gasso, Mueller-Johnson, Agustina, & Gomez-Duran, 2021).
A study of Irish adolescents revealed that while more girls (29%) than boys (15%) had been asked to send a sexually explicit image, the prevalence for having sent a sexual picture was comparable (9% of girls and 8% of boys). The rate of having received unwanted sexually explicit images was much higher for girls (22%) than for boys (8%) (Foody, Mazzone, Laffan, Loftsson, & O’Higgins Norman, 2021).
The Cybersurvey, conducted with 11-16-year-olds across the UK, found that among those aged 13 and older who had shared sexual images of themselves, 18% reported having been pressured or blackmailed to do so (Katz & El Asam, 2020).
The results of the 2017 Youth Risk Behaviour Survey in the US showed that, among young people in grades 9-12 from four urban school districts, 5.7% of boys and 4.8% of girls reported having had a sexual photo of themselves shared without their permission (Pampati, Lowry, Moreno, Rasberry, & Steiner, 2020).
In the first six months of 2020, 44% of all the child sexual abuse content dealt with by the Internet Watch Foundation (IWF) involved self-generated material; this was an increase from 15% in 2019 (UK Safer Internet Centre, 2020).
A report released by the Independent Inquiry into Child Sexual Abuse (IICSA, 2020) cited research published by the Internet Watch Foundation in 2018, which found that the majority of images and videos of live-streamed child sexual abuse analysed by the IWF depicted children assessed as being 11-13 years old. In the first quarter of 2019, 81% of self-generated content analysed by the IWF was of children aged 11-13, predominantly girls. IICSA (2020) quotes the Chief Executive of IWF as saying “we are extremely worried about girls, young girls, 11 to 13, in their bedroom with a camera-enabled device and an internet connection”.
In the first six months of 2019, the IWF dealt with 22,484 reports of self-generated child sexual abuse material. Just over a sixth of these images were categorised as the highest severity, involving penetrative sexual activity and/or sexual activity with an animal or sadism (WeProtect Global Alliance, 2019).
The results of the 2017 National Community Attitudes towards Violence against Women Survey showed that many young people lack understanding of consent in regards to image sharing. More than one in four young people (aged 16-24 years) reported they believe that “if a woman sends a nude image to her partner, then she is partly responsible if he shares it without her permission” (Politoff, Crabbe, Honey, et al., 2019).
Results of the 6th National Survey of Australian Secondary Students and Sexual Health, completed by over 6,000 Australian students in Grades 10-12, showed that 44% of students reported having received a sexually explicit nude or nearly nude photo or video of someone else, and that 32% reported having sent a sexually explicit nude or nearly nude photo or video of themselves. Just 6% reported sending a nude or nearly nude photo or video of someone else (Fisher, Waling, Kerr, et al., 2019).
The Netclean 2018 report, for which an online survey was conducted with 272 police officers across 30 countries, showed that self-produced material is common in their investigations of child sexual abuse and exploitation, with 91% of officers saying that voluntary self-produced material is common or very common in their investigations, and 89% reporting that the amount of voluntary self-produced material they see is increasing. More than three quarters of officers, meanwhile, reported that it is common or very common to see images produced as a result of grooming, and 65% reported that it is common or very common to see images produced as a result of sexual extortion. The majority of officers also indicated that the amount of images that they see produced as a result of grooming and sexual extortion is increasing. Officers did report, however, that it is frequently difficult to determine the exact circumstances in which an image was produced (Baines, 2019).
The Netclean 2019 report, for which an online survey was conducted with 450 police officers across 41 countries, showed that officers’ opinions were split regarding the frequency with which they came across live-streaming content in their investigations of child sexual abuse and exploitation. While nearly four in ten (38%) of the surveyed police officers reported that the material is common or very common in their investigations, 42% reported that it is uncommon or very uncommon. Of the three types of live-streaming (voluntarily self-produced, induced self-produced and distance live-streaming), voluntarily self-produced content was seen as most common. The majority of respondents also indicated that live-streaming content is increasing (Netclean, 2019).
A study of young Australian adults, with a mean age of 21 years, found that sharing of sexts was relatively common, with approximately 1 in 5 participants having shown or shared a sext with another person for whom it was not originally intended (Clancy, Klettke & Hallford, 2019).
A meta-analysis of studies examining the prevalence of multiple forms of sexting behaviour among youth showed that the mean prevalence for sending and receiving sexts was 14.8% and 27.4% respectively. The prevalence of forwarding a sext without consent was 12%, and the prevalence of having a sext forwarded without consent was 8.4% (Madigan, Ly, Rash, Van Ouytsel & Temple, 2018).
A Swedish study which surveyed adolescents aged 12 – 16 years found that 20 – 32% reported having received sexts (“images or videos that contain nudity or are sexual in nature”), while 4 – 16% reported having sent them (Burén & Lunde, 2018).
Although 86% of a sample of 1,560 US youth viewed underage sexting as a crime, this knowledge did not appear to affect the prevalence of the practice (Gewirtz-Meydan, Mitchell & Rothman, 2018).
A collaborative report of youth and sexting behaviour has described the results of recent research conducted in the UK (Safer Internet Centre), Australia (Office of the eSafety Commissioner), and New Zealand (Netsafe). In the UK, 19% of youth aged 12-16 years said they were aware of a “few” incidents in the past year involving peers sharing self-generated images, with 12% reporting that it happens “all the time”. In Australia, around 5% of youth aged 14-17 reported having sent a nude or nearly nude image or video of themselves in the past year, while 15% reported having been asked for an image or video (52% of which requests came from an unknown person). Similarly in New Zealand, approximately 4% of youth aged 14-17 reported having shared a nude or nearly nude of themselves in the past 12 months, while 1 in 5 had been asked for a nude of themselves during the same period (UK Safer Internet Centre, Netsafe, & Office of the eSafety Commissioner, 2017).
Online solicitation and grooming
1 in 6 children reported experiences of sexual grooming by adults, with the acknowledgment that not all abuse is disclosed (Walsh et al., 2025).
1 in 17 minors reported having personally experienced sextortion (Thorn, 2024).
In a survey of individuals searching for child sexual abuse material (CSAM) on dark web search engines, 70% of respondents who had sought contact with a child said that they tried to establish contact with a child online, the majority using social media platforms, online games, or messaging apps; some respondents mentioned other methods, including anonymous online video chat. The most common method reported for contacting children was social media: 48% of respondents who had contacted a child said that they used social media to establish the first contact. Instagram was by far the most mentioned platform, selected by 45% of respondents as a social media platform they had used to contact children, followed by Facebook (30%), Discord (26%), TikTok (25%), Snapchat (23%), X (Twitter) (22%), YouTube (13%), and Reddit (11%). 41% of respondents who had sought contact with a child said that they tried to establish contact through an online game, and 37% said that they established the first contact with a child via a messenger, mostly via the end-to-end encrypted messengers Telegram (45%) and WhatsApp (41%); the next most mentioned apps were Signal (28%), WickrMe (25%), Session (21%), Viber (16%), and Wire (12%). Other platforms mentioned by respondents in response to the option “Other, what?” included Discord, Omegle, Snapchat, and Likee (Suojellaan Lapsia Protect Children, 2024).
An online survey with young people in the US found that 40% of those aged 9-17 years have experienced a cold solicitation online for explicit imagery from an online-only contact. Boys aged 9-12 years were more likely than girls aged 9-12 years to have been solicited for explicit imagery, while this gender difference was reversed for teens (Thorn, 2022b).
A prospective study of 12-15 year old Spanish students found that over a period of 13 months, almost 23% reported being sexually solicited by adults and almost 14% reported having interacted sexually with adults online. A higher prevalence was found among girls, and victims of online solicitation had significantly lower levels of health-related quality of life (Ortega-Baron, Machimbarrena, Calvete, Orue, Perdea, & Gonzales-Cabrera, 2022).
The same US survey of young people found that nearly 2 in 3 (65%) of 9-17-year-olds have experienced an online-only contact inviting them to move from a public forum into a private messaging platform, with half of all minors (52%) and 46% of 9-12-year-olds having used a private messaging app to interact with an online-only connection (Thorn, 2022b).
LGBTQ+ minors have more online-only connections and are more responsive to online messages received from unfamiliar contacts than are non-LGBTQ+ youth. A survey of 9-17 year old youth in the US found that 15% of LGBTQ+ minors reported that at least half of their online contacts are only known to them online, compared with 10% of non-LGBTQ+ youth, and that 1 in 5 (19%) LGBTQ+ minors reported they respond to most messages they receive from people they don’t know online, compared to 1 in 10 (10%) non-LGBTQ+ minors (Thorn, 2022b).
A study of chat log communications between 38 adult males and children who were accessed via social media for sexually exploitative purposes found that offenders’ requests for sexual activity often occurred early in the communication exchange, and that frequent shifts in discourse strategy were used to elicit compliance, as opposed to the gradual grooming methods observed with offline offending (Powell, Casey & Rouse, 2021).
A Finnish study of 1,762 children and adolescents aged 11-17 years found that 62% reported being contacted online by a person they knew, or suspected, to be an adult or at least five years older than themselves. Among those contacted by adults, 17% reported receiving messages with sexual content weekly, and 29% reported receiving these messages at least once a month. Just under one third (32%) of children contacted by adults had been offered a reward (including money or cigarettes) for sexual acts. The majority (72%) of the children felt their grooming experiences did not have any particular consequences. More than half (67%) disclosed their grooming experiences to someone: the majority of these (93%) told a friend, while 19% told their mother and 14% a sibling. Just 4% reported their grooming experience to police (Juusola, Simola, Tasa, Karhu, & Sillfors, 2021).
A study of Swedish court judgements involving 50 offenders and 122 child victims identified two main strategies that offenders used when inciting children to engage in online sexual activity: pressure (threats, bribes, or nagging) and sweet-talk (flattery, acting as a friend, or expressing love). Overall, this research showed that the offenders who used pressure were younger and targeted older children than the offenders who used sweet-talk (Joleby, Lunde, Landström & Jonsson, 2021).
A study with more than 1,000 undergraduate college students in the United States found that one-quarter of participants conversed with adult strangers online as minors, and that 65% of those participants (17% of the total sample) experienced sexual solicitation. Slightly less than a quarter of the total sample (23%) reported engaging in an intimate online relationship with an adult stranger, although 38% of those youth met the adult in person. A large majority of those who did meet in person (68%) reported physical sexual intercourse (Greene-Colozzi, Winters, Blasko, & Jeglic, 2020).
Over a three-month period in 2018, the National Crime Agency (NCA) received over 1,500 reports of grooming in respect of 12 internet platforms (IICSA, 2020).
Although no study has specifically examined the proportion of adults holding online sexualised conversations with young people in England and Wales, the Independent Inquiry into Child Sexual Abuse (IICSA) reports that it is unlikely that figures would be below the lowest estimate of 1 in 10 adults (IICSA, 2020).
A German study of adolescents aged 14 – 17 years found that 22% reported online sexual interaction with an adult, with just 10% perceiving this interaction as a negative experience (Sklenarova, Schulz, Schuhmann, & Osterheider, 2018).
A recent Spanish study found that 15.6% of girls and 9.3% of boys aged 12-15 reported online sexual solicitations from adults (De Santisteban & Gámez-Guadix, 2018).
An Australian survey of online safety experiences conducted with more than 3,000 young people aged 8 to 17 years showed that 25% of youth had been contacted by a stranger in the previous 12 months, and 10% had been sent inappropriate content online (eSafety Commissioner, 2018b).
A study of transcripts of adults who sexually groomed decoy victims online found that the large majority of offenders (89%) introduced sexual content in the first conversation with the decoy victim. Results also showed that in 96% of cases, the offender and decoy victim arranged an in-person meeting – most commonly (89% of cases) the offender introduced the idea of this meeting. On average, the idea of meeting in person was introduced after 3.4 days (Winters, Kaylor & Jeglic, 2017).
Image-based sexual abuse
Reports of image-based sexual abuse (IBSA) are increasing: in 2022-2023, eSafety handled 9,060 reports of IBSA, which represented a 117% increase on the 4,169 reports received in the previous period (eSafety Commissioner, 2023).
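As a quick arithmetic check of the stated increase, using the two report counts above:

\[ \frac{9{,}060 - 4{,}169}{4{,}169} = \frac{4{,}891}{4{,}169} \approx 1.17, \text{ i.e. a } 117\% \text{ increase} \]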
A survey of 245 Australian adults across four jurisdictions found that 64% reported ever witnessing, or becoming aware of, someone engaging in image-based abuse, with respondents most frequently reporting witnessing the sharing of an intimate image without permission (29%), and threats of sharing an intimate image of someone (29%). Just under half (46%) of these respondents reported having taken action to intervene during their most recent experience of witnessing image-based abuse (Flynn, Cama, & Scott, 2022).
A survey with more than 6,000 respondents aged 16-64 years from across Australia, New Zealand and the United Kingdom, found that one in three (37.7%) reported at least one form of IBSA. Most commonly, participants reported that someone had taken a nude or sexual image of them without their consent (33.2%); one in five (20.9%) reported having had a nude or sexual image shared without consent; and almost one in five (18.7%) reported that someone had threatened to share a nude or sexual image of them. Notably, one in seven (14.1%) were found to have experienced all three forms of IBSA (Henry et al., 2020).
In regard to IBSA perpetration, the same survey with more than 6,000 respondents across three countries (Australia, NZ and the UK) found that more than one in six (17.5%) of all respondents reported engaging in at least one form of IBSA perpetration across their lifetime against a person over the age of 16 years. Young men aged 20–29 years were the most likely to self-disclose engaging in at least one form of IBSA overall (33.5%), closely followed by men aged 30–39 years (29.9%) and young men aged 16–19 years (28.8%). Young women aged 20–29 years were fourth most likely to self-disclose engaging in any IBSA (20.4%). Differences also existed by sexuality, with 28.6% of respondents identifying as LGB+ having engaged in one or more forms of IBSA perpetration, compared with 16.1% of heterosexual respondents. One common motivation reported by respondents was for “fun” or to be “sexy” (61.2% for taking; 58% for sharing; 55.8% for threatening). Other motivations included wanting to “impress friends” and/or “trade the images” (37.8% for taking; 54.9% for sharing; 54.9% for threatening); and wanting to “control the person in the image” (45% for taking; 57.1% for sharing; 63.2% for threatening). Finally, respondents reported being motivated to “embarrass” and/or “get back at the person” depicted in the image (38% for taking; 51.7% for sharing; 61.4% for threatening) (Henry et al., 2020).
Image-based sexual abuse (IBSA) has been defined as the “non-consensual creation, distribution or threatened distribution of nude or sexual images” (Henry, Flynn & Powell, 2019). A survey with over 4,000 Australians aged 16-49 years found that 23% reported being a victim of at least one form of IBSA. The most common forms of victimisation were nude or sexual images being taken without consent, with 20% reporting these experiences. Also common was nude or sexual images being distributed without consent, with one in 10 (11%) reporting these experiences. This is likely to be an underestimate, since many victims will never discover that images of them have been either created or distributed (Henry et al., 2019). Furthermore, 50% of Aboriginal and Torres Strait Islander people, and 56% of respondents disclosing a disability reported ever experiencing at least one form of IBSA (Henry et al., 2019).
The Crimes against Children Research Center, in partnership with Thorn, conducted an online survey of over 1,500 young people ages 18 to 25 who had been targets of sextortion. The results of this research showed that sextortion in face-to-face relationships tended to be perpetrated by men (89% of cases) and targeted against women (87%), although men were victims in 11% of cases and perpetrators were female in 9%. As with incidents in face-to-face relationships, victims of sextortion in online relationships were predominantly women (77%) although men were victims in 20% of cases (Wolak & Finkelhor, 2016). This survey also found that 45% of perpetrators carried out their threats (Thorn, 2019).
A second wave of the Thorn sextortion survey was conducted in 2017, with 2,097 participants aged 13 to 25 who had been targets of sextortion. Nearly 1 in 4 participants were aged 13 or younger when the sextortion began. Younger victims were more likely to be threatened by an online offender (approximately 60% of participants who were ages 13 and younger when threatened, and slightly more than 50% of participants aged 14, did not know their offender offline). While more than 2 in 3 participants disclosed to someone about the sextortion, only 17% of those who disclosed reported to law enforcement, while 54% told a family member or friend and 26% reported to a platform or website (Thorn, 2019).
A study of the prevalence of sextortion behaviours among a nationally representative sample of U.S. middle and high school students (aged 12-17 years) found that approximately 5% reported being a victim of sextortion, while about 3% admitted to threatening others. Males and non-heterosexual youth were more likely to be targeted, and males were more likely to target others. Youth who threatened others with sextortion were also more likely to have been victims themselves (Patchin & Hinduja, 2018).
In an analysis of 78 sextortion cases conducted by the Center for Technology Innovation at the Brookings Institution, 71% of the cases were found to involve a victim under the age of 18. Sextortion of children is believed to be underreported, however, so the true number of children who are victims of sextortion is unknown (International Centre for Missing and Exploited Children, 2018). Sextortion cases have also often been found to have “more minor victims per offender than all other child sexual exploitation offenses”, as offenders commonly communicate with multiple, and sometimes hundreds of, potential victims at one time (International Centre for Missing and Exploited Children, 2018).
Child exploitation material
In 2023, there were a staggering 36.2 million global reports documenting suspected instances of child sexual abuse material online. Moreover, a comprehensive global study conducted by Economist Impact on behalf of WeProtect Global Alliance revealed that 54% of respondents, aged 18–20, had encountered online sexual harms during childhood (Suojellaan Lapsia Protect Children, 2024).
In a survey of individuals searching for CSAM on dark web search engines, 77% of respondents reported that they had encountered CSAM or links to CSAM somewhere on the surface web. The majority said that they had encountered the material on a pornography website or on a social media platform, and regular websites and messaging apps were each cited by more than one in ten respondents as platforms where they had encountered CSAM. 32% of respondents reported that they had encountered CSAM on a pornography website; when asked on which platform they had encountered CSAM, Pornhub was the most cited. 29% of respondents had encountered CSAM or links to CSAM on social media platforms. 42% of respondents learnt how to access CSAM on the dark web via a regular browser search (Suojellaan Lapsia Protect Children, 2024).
Police data does not generally distinguish between “online” and “offline” child sexual offences. Across England and Wales in 2021/22, however, one-third (34%) of the 103,055 sexual offences against children recorded by the police were imagery offences (Karsna & Bromley, 2023).
In 2020, the ACCCE Child Protection Triage Unit received more than 21,000 reports of online child sexual exploitation. The AFP charged a total of 191 people with 1,847 alleged child abuse-related offences in 2020 (Australian Centre to Counter Child Exploitation, 2021).
Reports of child exploitation material are growing exponentially. Of the 23.4 million reports of images received through the National Center for Missing and Exploited Children’s CyberTipline since 1998, 9.6 million (40%) occurred in 2017 alone – nearly 1 million per month. New child exploitation material (CEM) is also constantly emerging – 84% of detected images and 91% of videos are reported only once, showing the need for complex detection algorithms that recognise the nature of CEM content (Bursztein, DeLaune, Eliff, et al., 2019). In 2015, British Telecom conducted a one-off exercise to try and establish the number of times that they blocked access to child sexual abuse imagery in the UK. Between January and November 2015, “the average number of attempts to retrieve the CSA image was 36,738 every 24 hours” (IICSA, 2020).
A content analysis of a sample of 729 indecent images of children from 26 offenders has shown that victims were most often White females aged approximately 9.5 years, and that most offenders were White males aged 18-24 years. Most of the analysed images showed erotic posing with no sexual activity. Just over one in ten (13%) images showed sexual activity by an adult on a child and an additional 13% showed sexual activity by a child on an adult (Tejeiro, Alison, Hendricks, Giles, Long & Shipley, 2020).
In 2018, US technology companies (with global users) reported over 45 million online photos and videos of children being sexually abused – more than double what they found the previous year (WeProtect Global Alliance, 2019).
The growth of online CEM is a global threat requiring coordinated action – ten years ago, 70% of reports of online CEM received through the National Center for Missing and Exploited Children’s CyberTipline reflected abuse in the Americas. Today, 68% of reports relate to abuse in Asia, while 19% of reports relate to abuse in the Americas, 6% in Europe and 7% in Africa (Bursztein, DeLaune, Eliff et al., 2019).
Child exploitation material (CEM, sometimes referred to as child pornography), is sexually abusive images of children (Krone & Smith, 2017). Online CEM is now the predominant form and a focus of law enforcement activity (Australian Criminal Intelligence Commission, 2017).
According to the Internet Watch Foundation, which reviews reports of CEM in the UK, 53% of CEM images depict abuse of children under 10 years of age, and 28% of images involve rape and torture (Internet Watch Foundation, 2017).
An online survey of survivors of child sexual abuse and CEM production found that the vast majority of respondents were abused before the age of 12 (87%) and over half of the respondents reported that their abuse began at or before the age of four. Nearly all (95%) respondents reported that still images/photographs of the abuse had been taken, while 72% reported that videos were taken and 14% reported that their abuse had been live-streamed online. More than half (58%) of respondents reported having been abused by more than one person (some by multiple family members), and 49% of survivors appeared to have been victims of “organised sexual abuse” (Canadian Centre for Child Protection, 2017).
During 2016–17 the Australian Federal Police (AFP) received more than 10,000 reports about child sexual exploitation (not all of which involved CEM) and arrested 70 offenders on 118 charges (AFP, 2017).
Technology-facilitated abuse: Offenders
Of the seven types of child sexual exploitation (CSE) encountered by investigators (Mitchell et al., 2025):
- 3% involved traffickers/third-party exploiters
- 5% of third-party exploiter cases involved a family member as the trafficker.
- 5% involved purchasers of sex/sex acts
- 3% involved no known third-party exploiter
Nine out of 20 predictors were significantly associated with ever viewing CSAM: age; gender; country of residence; early exposure to adult pornography (<14 years); having experienced childhood physical abuse or neglect; self-reporting a likelihood of having sexual contact with a child under 16; having ever viewed pornography featuring adults and animals; having ever viewed BDSM pornography featuring adults; and having recently visited chat forums where people talk about adult/child sexual relations (Napier et al., 2024).
Data from a subset of 459 CSAM viewers showed that just under half (47.5%) reported intentionally viewing CSAM again after first exposure (onset) (Napier et al., 2024).
CSAM offenders reported that involuntary exposure to CSAM in childhood was prevalent: 70% were first exposed to CSAM when they were under the age of 18; 50% were first exposed to CSAM accidentally (Suojellaan Lapsia Protect Children, 2024).
Use of CSAM is strongly correlated with seeking contact with children: 40% have sought contact with a child after viewing CSAM (Suojellaan Lapsia Protect Children, 2024).
CSAM offenders predominantly search for CSAM depicting girls aged 4-13: 45% search for CSAM depicting girls aged 4-13, while 18% search for CSAM depicting boys aged 4-13 (Suojellaan Lapsia Protect Children, 2024).
Many CSAM offenders want to stop viewing CSAM, but very few have sought help: 50% want to stop using CSAM, yet only 28% have sought help to stop (Suojellaan Lapsia Protect Children, 2024).
A meta-analysis examining the identity of perpetrators in internet sex crimes against children found that a large proportion of perpetrators are juveniles and that most perpetrators of online crimes are known to their victims: 44% of offenders among the 32 included studies were aged under 18 years, and 68% of offenders were known to the victim (Sutton & Finkelhor, 2023).
Among an online panel survey of 13,302 Australian adults, just under 1% reported having intentionally viewed child sexual abuse material (CSAM) in the past year. Survey respondents who were aged 18-34 years, were living with a disability, were currently serving or had previously served in the military, or spoke a language other than English at home, were more likely than others to have viewed CSAM (Brown, 2023).
A review of research on parental production of CSAM found that parents are a significant group of CSAM producers, and that parentally produced CSAM is more likely to involve the victimisation of pre-pubescent children and more severe abuse (Salter & Wong, 2023).
A recent Australian study, looking at 82 cases of CSAM produced and distributed by parental figures, found that offending parents are most often the male parental figures of the victims, and victims are predominantly girls under nine years of age (Salter, Wong, Breckenridge, Scott, Cooper & Peleg, 2021). Over three-quarters (78%) of identified CSAM cases perpetrated by parental figures involved single perpetrators, while the remaining cases (22%) involved multiple perpetrators. A male perpetrator was involved in 90% of the cases: 72% of cases involved a single male perpetrator, 10% involved a single female perpetrator, and 18% involved both male and female perpetrators (Salter et al., 2021). This study revealed three distinct profiles of CSAM offenders in parental roles: the first was a male offender who forms adult relationships and has children of his own to exploit; the second was a male offender who forms a relationship with a woman and exploits her children or seeks to obtain children by some other means (e.g. through surrogacy); and the third was a biological mother who produces CSAM of her children at the behest of men she knows in person or online (Salter et al., 2021).
A study of the collecting behaviours of CEM offenders, involving an online survey completed by adults previously convicted of CEM offences in the United States and a comparison non-offending population of adults, found that offenders viewed more diverse categories of adult sexual exploitation material (SEM) than non-offenders, and that no offenders viewed CEM exclusively. In fact 74% of CEM offenders viewed more adult SEM than CEM. The results of this study did not support highly preferential viewing among most CEM offenders, but rather general novelty seeking, indicating that paedophilic interests are not necessarily the sole or even primary motivator for CEM behaviour (Steel, Newman, O’Rourke, & Quayle, 2021).
A study of financial transactions made by a cohort of Australians who paid known facilitators of CSA live streaming in the Philippines, found that offenders were likely to be aged in their 50s or 60s, and the majority (55%) had no criminal record. Most CSA live streaming transactions involved a small proportion of offenders: just 3% accounted for half of all transactions, while 25% made just 3% of the transactions (Brown, Napier & Smith, 2020).
Research on CEM users demonstrates that 50-85% admit to having undetected child victims, and that the average number of undetected victims per offender is 8 (Johnson, 2020).
A recent review of the literature on those who view or collect child exploitation material (CEM) found that CEM offenders are typically male and white, with an average age of between 35 and 45, and often single. They also tend to be better educated and are more likely to work in professional occupations than other sexual offenders. CEM offenders also tend to be less assertive and less socially confident than other sexual offenders, and show higher levels of sexual deviancy. CEM-only offenders also do not typically have previous offending histories for contact sexual offences (Brown & Bricknell, 2018).
Gewirtz-Meydan and colleagues (2018) found that in more than half (52%) of the CSAM production cases they analysed, the offender was a family member of the victim, and in 41% of cases the offender was an acquaintance. In this same study, perpetrators of female victims were found to be more likely to be family members, whereas perpetrators of male victims were more likely to be acquaintances (Gewirtz-Meydan et al., 2018).
Research suggests that a significant proportion of child sexual abuse material (CSAM) is produced and distributed by parents who victimise their children. An online convenience sample of 150 adult survivors of CSAM found that, of those abused by a single perpetrator, 42% identified their biological or adoptive father or stepfather as the offender; and of those abused by multiple perpetrators 67% identified their biological or adoptive parents or step-parents as the primary perpetrators (Canadian Centre for Child Protection (CCCP), 2017).
Krone and Smith (2017) report a study of 152 federal offenders investigated by the Australian Federal Police for online child sexual exploitation offences. All offenders were men, most were described as Caucasian, and most were aged between 46 and 55 years. Of the 152 offenders, 85 (66%) had no prior criminal history. Eleven offenders had previously been convicted of a CEM offence and 10 had been convicted of a contact child sexual exploitation offence. The study also found that characteristics related to having a record of contact offending included: low SES, a conviction for producing CEM, undertaking a networking role in CEM offending, providing CEM, and having a criminal history of charges for producing CEM (Krone & Smith, 2017).
A review of the literature on online CEM offenders shows that on average, offenders are almost exclusively male and Caucasian in ethnicity, tend to be in their late-30s to mid-40s, employed and well educated – in contrast to the profile of the general offending population, high proportions of whom are of ethnic minority status and have limited educational backgrounds. Additionally, studies have shown typically low rates of historical and prospective offending among CEM offenders (Henshaw, Ogloff & Clough, 2017).
Artificial intelligence and child sexual abuse
Analysis of a dark web child sexual abuse forum found a total of 3,512 AI-generated CSAM images. 90% of the images assessed by IWF analysts were realistic enough to be assessed under the same law as real CSAM, and 32% of criminal pseudo-photographs were Category A, indicating that perpetrators are experiencing more success generating complex ‘hardcore’ scenarios (Internet Watch Foundation, 2024).
Live streaming of child sexual abuse
A scoping review of live streaming of child sexual abuse found that the average livestream offender was older than the average online child sexual abuse offender (Drajer, Riegler, Halvorsen, Johnson, & Baugerud, 2023).
An Australian survey of almost 10,000 people who had used mobile dating apps and/or dating websites in the previous five years found that 12.4% of respondents reported receiving requests to facilitate the sexual exploitation of their own children or children they had access to. Requests included asking for sexual information about children or for sexual images or videos of children, asking to meet children in person or asking for children to perform sex acts over webcam (Teunissen, Boxall, Napier, & Brown, 2022).
Europol (2020), in monitoring darknet sites, reported an increase in sharing of child sexual abuse material (CSAM) captured through webcam early in the COVID-19 pandemic (from March to May 2020). This included a category listed on forums as “live streams”. Europol attributed this increase to offenders moving from contact offending to online offending due to travel restrictions (Napier, Teunissen & Boxall, 2021a).
The Internet Watch Foundation (2018) conducted an international analysis of over 2,000 image and video captures from CSA live streaming from August to October 2017. The study found that 98% of CSA live streaming captures showed children 13 years or younger, and 28% showed children aged 10 years or younger. Forty percent of the captures were classified by the Internet Watch Foundation as containing “serious” sexual abuse, with 18% involving the rape and sexual torture of children (IWF, 2018, cited in Napier, Teunissen & Boxall, 2021a).
A study analysing chat logs from seven offenders who watched and directed the sexual abuse of 74 children (mostly in the Philippines) via live stream found that offenders paid facilitators and victims very small amounts of money (median A$51) to view the sexual abuse of children, and used mainstream messaging and video platforms (e.g. Facebook) to communicate and transmit the abusive materials (Napier, Teunissen & Boxall, 2021a).
Further analysis of chat logs from seven offenders who watched and directed the sexual abuse of 74 children via live stream found that offenders accessed victims via three primary methods: establishing relationships and contact with Filipino locals; proactively contacting potential victims and facilitators through social media or dating sites; and facilitators proactively contacting potential offenders through social media and dating sites. Facilitators were found to be involved in 35% of the 145 CSA live streaming offences included in the dataset. At least 15 of the 20 facilitators were female, and most commonly they were a relative of the victim (Napier, Teunissen & Boxall, 2021b).
Live streaming of child sexual abuse (CSA) involves broadcasting acts of sexual abuse of children live via webcam to people anywhere in the world (ECPAT International, 2017).
Online child exploitation material: impact on victims
A majority of respondents (84%) shared that they had been subjected to abuse on more than one occasion. In total, 83% of those who experienced online abuse said it had led to long-term consequences, including depression, difficulties in forming and maintaining close relationships, anxiety disorder/panic attacks, and PTSD/PTS symptoms (Suojellaan Lapsia Protect Children, 2024).
An online survey with survivors of CEM production found that younger survivors suffered higher levels of psychopathology in adulthood, and that specific reactions to the crime, guilt about the crime, as well as embarrassment related to authorities seeing the images, were predictive of adult trauma symptoms (Gewirtz-Meydan, Lahav, Walsh, et al., 2019).
Gewirtz-Meydan and colleagues (2018) conducted a survey study with survivors of CEM production, and found that many participants reported they had a number of negative reactions “all of the time”: 74% felt ashamed, guilty, or humiliated all of the time, 54% always worried that people who saw the images would think they were a willing participant, 51% always felt it was their fault the images were created, 48% always worried about friends or other people they knew seeing the images, and 48% worried all the time that people who saw the images would recognize them.
Often, due to the permanence of online materials and lack of control over who views them, initial feelings of shame and anxiety can increase over time, negatively impacting a victim’s psychological state (Gewirtz-Meydan et al., 2018).
Because abuse materials are shared online, often no single perpetrator can be identified; this can create feelings of re-victimisation and prevent closure for the individual (Leonard, 2010). The permanence and public accessibility of the materials can be one of the most difficult aspects for survivors to overcome (Gewirtz-Meydan et al., 2018).
An online survey of survivors of child sexual abuse and CEM production found that 68% of survivors of organised sexual abuse reported receiving a diagnosis of, or made reference to, dissociative disorders or experiencing dissociation, compared with 25% of remaining respondents. Many survivors of organised sexual abuse reported that, as a result of their dissociative identity disorder, they had difficulty with both memory recall and providing accurate accounts of the abuse they experienced (Canadian Centre for Child Protection, 2017).
In the same online survey of survivors of child sexual abuse and CEM production, nearly 70% of respondents reported worrying about being recognised by someone who had seen images of their abuse (n=103). Thirty respondents (30%) reported having been identified by a person who had viewed the child sexual abuse imagery (Canadian Centre for Child Protection, 2017).
References
Australian Centre to Counter Child Exploitation (2021). ACCCE Statistics 2020. Retrieved from https://www.accce.gov.au/resources/research-and-statistics/2020statistics
Australian Centre to Counter Child Exploitation (2020). Online child sexual exploitation: Understanding community awareness, perceptions, attitudes and preventative behaviours. Canberra: ACCCE.
Australian Communications and Media Authority & Office of the Children’s eSafety Commissioner (2017). Annual reports 2016–17. Canberra: ACMA & Office of the Children’s eSafety Commissioner.
Australian Criminal Intelligence Commission (2017). Organised crime in Australia 2017. Canberra: ACIC.
Australian Federal Police (2017). Annual report 2016–17. Canberra: AFP.
Baines, V. (2019). Netclean report 2018. Gothenburg, Sweden: Netclean.
Brown, R. (2023). Prevalence of viewing online child sexual abuse material among Australian adults. Trends & Issues in Crime and Criminal Justice No. 682. Canberra: Australian Institute of Criminology.
Brown, R., & Bricknell, S. (2018). What is the profile of child exploitation material offenders? Canberra: Australian Institute of Criminology.
Brown, R., Napier, S., & Smith, R.G. (2020). Australians who view live streaming of child sexual abuse: An analysis of financial transactions. Trends & Issues in Crime and Criminal Justice No. 589. Canberra: Australian Institute of Criminology.
Burén, J., & Lunde, C. (2018). Sexting among adolescents: A nuanced and gendered online challenge for young people. Computers in Human Behavior, 85, 210-217.
Bursztein, E., Bright, T., DeLaune, M., Eliff, D.M., Hsu, N., et al. (2019). Rethinking the detection of child sexual abuse imagery on the internet. https://doi.org/10.1145/3308558.3313482
Canadian Centre for Child Protection (2017). Survivors’ survey: Full report 2017. Winnipeg, Canada: Canadian Centre for Child Protection.
Children’s Commissioner (2023). ‘A lot of it is actually just abuse’: Young people and pornography. London, UK: Children’s Commissioner.
Clancy, E.M., Klettke, B., & Hallford, D.J. (2019). The dark side of sexting – factors predicting the dissemination of sexts. Computers in Human Behavior, 92, 266-272.
Crabbe, M., Flood, M., & Adams, K. (2024). Pornography exposure and access among young Australians: A cross-sectional study. Australian and New Zealand Journal of Public Health, 100135. PMID: 38508985.
De Santisteban, P., & Gámez-Guadix, M. (2018). Prevalence and risk factors among minors for online sexual solicitations and interactions with adults. Journal of Sex Research, 55(7), 939-950.
Drajer, C., Riegler, M.A., Halvorsen, P., Johnson, M.S., & Baugerud, G.A. (2023). Livestreaming technology and online child sexual exploitation and abuse: A scoping review. Trauma, Violence, and Abuse, doi: 10.1177/15248380221147564.
ECPAT International (2017). Online child sexual exploitation: An analysis of emerging and selected issues. ECPAT International Journal; 12, 1–63.
Emery, C.R., Wong, P.W.C., Haden-Pawlowski, V., Poi, C., Wong, G., Kwok, S., et al. (2024). Neglect, online invasive exploitation, and childhood sexual abuse in Hong Kong: Breaking the links. Child Abuse & Neglect, 147, doi: 10.1016/j.chiabu.2023.106591.
Enough is Enough (2017). Youth view porn as just “harmless fun”. Enough is Enough Newsletter, June 13, 2017. http://enough.org/news/ADCO704ZKR7
eSafety Commissioner (2025). Behind the screen: The reality of age assurance and social media access for young Australians. Sydney, NSW: eSafety Commissioner.
eSafety Commissioner (2023). Technology-facilitated abuse: Family, domestic and sexual violence literature scan. Canberra: Australian Government.
eSafety Commissioner (2022). Mind the gap: Parental awareness of children’s exposure to risks online. Canberra: Australian Government.
eSafety Commissioner (2021). The digital lives of Aussie teens. Canberra: Australian Government.
eSafety Commissioner (2019a). Digital parenting: Supervising pre-schoolers online. Canberra: Australian Government. Accessed from https://www.esafety.gov.au/about-the-office/research-library/digital-parenting-supervising-pre-schoolers-online
eSafety Commissioner (2019b). Parenting in the digital age. Canberra: Australian Government.
eSafety Commissioner (2018a). Aussie teens and kids online. Canberra: Australian Government. Accessed from https://www.esafety.org.au/about-the-office/research-library/aussie-teens-and-kids-online
eSafety Commissioner (2018b). State of play: Youth, kids and digital dangers. Canberra: Australian Government.
Europol (2020). Exploiting isolation: Offenders and victims of online child sexual abuse during the COVID-19 pandemic. The Hague: Europol.
Finkelhor, D., Turner, H., Colburn, D., Mitchell, K., & Mathews, B. (2023). Child sexual abuse images and youth produced images: The varieties of Image-based Sexual Exploitation and Abuse of Children. Child Abuse & Neglect, 143, doi: 10.1016/j.chiabu.2023.106296.
Fisher, C. M., Waling, A., Kerr, L., et al. (2019). 6th National Survey of Australian Secondary Students and Sexual Health 2018, (ARCSHS Monograph Series No. 113). Bundoora: Australian Research Centre in Sex, Health & Society, La Trobe University.
Flynn, A., Cama, E., & Scott, A.J. (2022). Image-based abuse: Gender differences in bystander experiences and responses. Trends & Issues in Crime and Criminal Justice No. 656. Canberra: Australian Institute of Criminology.
Foody, M., Mazzone, A., Laffan, D.A., Loftsson, M., & O’Higgins Norman, J. (2021). “It’s not just sexy pics”: An investigation into sexting behaviour and behavioural problems in adolescents. Computers in Human Behavior, 117, doi: 10.1016/j.chb.2020.106662
Gasso, A.M., Mueller-Johnson, K., Agustina, J.R., & Gomez-Duran, E. (2021). Mental health correlates of sexting coercion perpetration and victimisation in university students by gender. Journal of Sexual Aggression, 10.1080/13552600.2021.1894493.
Gewirtz-Meydan, A., Lahav, Y., Walsh, W., & Finkelhor, D. (2019). Psychopathology among adult survivors of child pornography. Child Abuse & Neglect, https://doi.org/10.1016/j.chiabu.2019.104189.
Gewirtz-Meydan, A., Mitchell, K.J., & Rothman, E.F. (2018). What do kids think about sexting? Computers in Human Behavior, 86, 256-265.
Gewirtz-Meydan, A., Walsh, W., Wolak, J., & Finkelhor, D. (2018). The complex experience of child pornography survivors. Child Abuse & Neglect, 80, 238-248.
Greene-Colozzi, E.A., Winters, G.M., Blasko, B., & Jeglic, E.L. (2020). Experiences and perceptions of online sexual solicitation and grooming of minors: A retrospective report. Journal of Child Sexual Abuse, 29(7), 836-854, DOI: 10.1080/10538712.2020.1801938
Henry, N., McGlynn, C., Flynn, A., Johnson, K., Powell, A., & Scott, A.J. (2020). Image-based sexual abuse: A study on the causes and consequences of non-consensual nude or sexual imagery. NY: Routledge.
Henry, N., Flynn, A., & Powell, A. (2019). Image-based sexual abuse: Victims and perpetrators. Trends & Issues in Crime and Criminal Justice No. 572. Canberra: Australian Institute of Criminology.
Henshaw, M., Ogloff, J.R.P., & Clough, J.A. (2017). Looking beyond the screen: A critical review of the literature on the online child pornography offender. Sexual Abuse, 29(5), 416-445.
Independent Inquiry into Child Sexual Abuse (2020). The internet: Investigation report. IICSA, UK.
International Centre for Missing and Exploited Children (2018). Studies in child protection: Sexual extortion and non-consensual pornography. Alexandria, Virginia: International Centre for Missing and Exploited Children.
Internet Watch Foundation (2024). What has changed in the AI CSAM landscape? UK: Internet Watch Foundation.
Internet Watch Foundation (2018). Trends in child sexual exploitation: Examining the distribution of captures of live-streamed child sexual abuse. Cambridge, UK: Internet Watch Foundation.
Internet Watch Foundation (2017). IWF annual report 2016. https://www.iwf.org.uk/sites/default/files/reports/2017-04/iwf_report_2016.pdf
Johnson, S. (2020). Child porn users & risk for engaging in contact offenses: Fault data minimizes offender’s risk & puts more children at risk for sexual abuse. Forensic Research & Criminology International Journal, 8(2): 93-99
Joleby, M., Lunde, C., Landström, S., & Jonsson, L.S. (2021). Offender strategies for engaging children in online sexual activity. Child Abuse & Neglect, 120, https://doi.org/10.1016/j.chiabu.2021.105214.
Juusola, A., Simola, T., Tasa, J., Karhu, E., & Sillfors, P. (2021). Grooming in the eyes of a child: A report on the experiences of children on online grooming. Finland: Save the Children Fund.
Karsna, K., & Bromley, P. (2023). Child sexual abuse in 2021/22: Trends in official data. Barkingside: Centre of expertise on child sexual abuse.
Katz, A. & El Asam, A. (2020). Look at me – Teens, sexting, and risks. London: Internet Matters.
Krone, T., & Smith, R. (2017). Trajectories in online child sexual exploitation offending in Australia. Trends & Issues in Crime and Criminal Justice, 524, 1-13.
Leonard, M. M. (2010). “I did what I was directed to do but he “didn’t touch me”: The impact of being a victim of internet offending. Journal of Sexual Aggression, 16(2), 249–256.
Madigan, S., Ly, A., Rash, C.L., Van Ouytsel, J., & Temple, J.R. (2018). Prevalence of multiple forms of sexting behavior among youth: A systematic review and meta-analysis. JAMA Pediatrics, 172(4), 327-335.
Marshall, E.A., Miller, H.A., & Bouffard, J.A. (2021). Crossing the threshold from porn use to porn problem: Frequency and modality of porn use as predictors of sexually coercive behaviors. Journal of Interpersonal Violence, 36(3-4), 1472-1497.
Mitchell, K., Jones, L., O’Brien, J., & Puchlopek-Adams, A. (2025). Data capture: Types of commercial sexual exploitation of children cases investigated by law enforcement. Durham, New Hampshire: Crimes Against Children Research Centre.
Napier, S., Teunissen, C., & Boxall, H. (2021a). Live streaming of child sexual abuse: An analysis of offender chat logs. Trends & Issues in Crime and Criminal Justice no. 639. Canberra: AIC.
Napier, S., Teunissen, C., & Boxall, H. (2021b). How do child sexual abuse live streaming offenders access victims? Trends & Issues in Crime and Criminal Justice no. 642. Canberra: AIC.
Napier, S., Seto, M., Cashmore, C., & Shackel, R. (2024). Characteristics that predict exposure to and subsequent intentional viewing of child sexual abuse material among a community sample of Internet users. Child Abuse & Neglect, 156, doi.org/10.1016/j.chiabu.2024.106977
Netclean (2019). Netclean Report 2019: A report about child sexual abuse crime. Gothenburg, Sweden: Netclean.
Noll, L.K., Harsey, S.J., & Freyd, J.J. (2022). Assessment of attitudes toward internet pornography in emerging adults using the Internet Pornography Questionnaire. Computers in Human Behavior, doi: doi.org/10.1016/j.chb.2022.107231.
Ortega-Barón, J., Machimbarrena, J.M., Calvete, E., Pereda, N., & González-Cabrera, J. (2022). Epidemiology of online sexual solicitation and interaction of minors and adults: A longitudinal study. Child Abuse & Neglect, 131, DOI: 10.1016/j.chiabu.2022.105759.
Our Watch (2020). Pornography, young people and preventing violence against women. Melbourne: Our Watch.
Pampati, S., Lowry, R., Moreno, M.A., Rasberry, C.N., & Steiner, R.J. (2020). Having a sexual photo shared without permission and associated health risks: A snapshot of nonconsensual sexting. JAMA Pediatrics, 174(6), 618-619.
Patchin, J.W., & Hinduja, S. (2018). Sextortion among adolescents: Results from a national survey of U.S. youth. Sexual Abuse, https://doi.org/10.1177/1079063218800469
Politoff, V., Crabbe, M., Honey, N., et al. (2019). Young Australians’ attitudes to violence against women and gender equality: Findings from the 2017 National Community Attitudes towards Violence against Women Survey (NCAS) (ANROWS Insights, Issue 01/2019). Sydney: ANROWS.
Powell, M.B., Casey, S., & Rouse, J. (2021). Online child sexual offenders’ language use in real-time chats. Trends & Issues in Crime and Criminal Justice no. 643. Canberra: AIC.
Rhodes, A. (2017). Screen time and kids: What’s happening in our homes? Australian Child Health Poll, Poll 7, June 2017. Melbourne: Royal Children’s Hospital Melbourne.
Salter, M., & Wong, T. (2023). Parental production of child sexual abuse material: A critical review. Trauma, Violence and Abuse, doi: 10.1177/15248380231195891.
Salter, M., Wong, W.K.T., Breckenridge, J., Scott, S., Cooper, S., & Peleg, N. (2021). Production and distribution of child sexual abuse material by parental figures. Trends & Issues in Crime and Criminal Justice No. 616. Canberra: Australian Institute of Criminology.
Seto, M., Roche, K., Nicholas, M., & Newton, J. (2024). Predictors of online child sexual exploitation through image-sharing. Child Protection and Practice, 2, https://doi.org/10.1016/j.chipro.2024.100045
Sklenarova, H., Schulz, A., Schuhmann, P., & Osterheider, M. (2018). Online sexual solicitation by adults and peers – results from a population based German sample. Child Abuse and Neglect, 76, 225-236.
Steel, C.M.S., Newman, E., O’Rourke, S., & Quayle, E. (2021). Collecting and viewing behaviors of child sexual exploitation material offenders. Child Abuse & Neglect, doi: 10.1016/j.chiabu.2021.105133
Suojellaan Lapsia Protect Children (2024). Tech platforms used by online child sexual abuse offenders: Research report with actionable recommendations for the tech industry.
Sutton, S., & Finkelhor, D. (2023). Perpetrators’ identity in online crimes against children: A meta-analysis. Trauma, Violence & Abuse, doi: 10.1177/15248380231194072
Tejeiro, R., Alison, L., Hendricks, E., Giles, S., Long, M., & Shipley, D. (2020). Sexual behaviours in indecent images of children: A content analysis. International Journal of Cyber Criminology, 14(1), 121-138.
Teunissen, C., Boxall, H., Napier, S., & Brown, R. (2022). The sexual exploitation of Australian children on dating apps and websites. Trends & Issues in Crime and Criminal Justice No. 658. Canberra: Australian Institute of Criminology.
Thorn (2024). Youth Perspectives on Online Safety, 2023. Available at: https://www.thorn.org/research/library/2023-youth-perspectives-on-online-safety
Thorn (2022a). Self-generated child sexual abuse material: Youth attitudes and experiences in 2021. Retrieved from https://info.thorn.org/hubfs/Research/Thorn_SG-CSAM_Monitoring_2021.pdf
Thorn (2022b). Online grooming: Examining risky encounters amid everyday digital socialization. Retrieved from https://info.thorn.org/hubfs/Research/2022_Online_Grooming_Report.pdf
Thorn (2019). Sextortion: Summary findings from a 2017 survey of 2,097 survivors. Retrieved from https://www.thorn.org/wp-content/uploads/2019/12/Sextortion_Wave2Report_121919.pdf
UK Safer Internet Centre (2020). ‘Disturbing’ rise in videos of children who have been groomed into filming their own abuse. Available from https://www.saferinternet.org.uk/blog/%E2%80%98disturbing%E2%80%99-rise-videos-children-who-have-been-groomed-filming-their-own-abuse
UK Safer Internet Centre, Netsafe, & Office of the eSafety Commissioner (2017). Young people and sexting – attitudes and behaviours: Research findings from the United Kingdom, New Zealand and Australia. Retrieved from: https://www.esafety.gov.au/-/media/cesc/documents/corporate-office/young_people_and_sexting_attitudes_and_behaviours_pdf.pdf
Walsh, K., Mathews, B., Parvin, K., Smith, R., Burton, M., Nicholas, M., Napier, S., Cubitt, T., Erskine, H., Thomas, H.J., Finkelhor, D., Higgins, D.J., Scott, J.G., Flynn, A., Noll, J., Malacova, E., Le, H., & Tran, N. (2025). Prevalence and characteristics of online child sexual victimization: Findings from the Australian Child Maltreatment Study. Child Abuse & Neglect, 160, doi: 10.1016/j.chiabu.2024.107186.
Warren, D., & Swami, N. (2019). Teenagers and sex. In, Longitudinal Study of Australian Children Annual Statistical Report 2018. Melbourne: Australian Institute of Family Studies.
WeProtect Global Alliance (2019). Global threat assessment 2019: Working together to end the sexual exploitation of children online. London: WeProtect.
Winters, G.M., Kaylor, L.E., & Jeglic, E.L. (2017). Sexual offenders contacting children online: An examination of transcripts of sexual grooming. Journal of Sexual Aggression, 23(1), 62-76.