
'Pure evil': How the pandemic has given rise to online child exploitation, livestreamed abuse

Two adolescents look at a smartphone in this undated stock image (Pexels/Mary Taylor)

WARNING: This story contains details and content that may disturb some readers

The COVID-19 pandemic, with its lockdowns and travel restrictions, has seen millions of Canadian children staying at home and spending more time than usual online, where they are more vulnerable to predators and exploitation. Now, the RCMP and national tip line Cybertip.ca are warning about the rise in online child exploitation they are seeing during the pandemic.

The RCMP’s National Child Exploitation Crime Centre is reporting up to 500 new files a day, while Cybertip.ca, which is run by the Canadian Centre for Child Protection, has seen a 120 per cent increase in reports of children being victimized online compared to pre-pandemic rates.

It’s a disturbing trend that has been seen globally, with U.K.-based organization Internet Watch Foundation reporting that 2021 was the worst year on record for online child abuse.

While “child pornography” is the term used in the Criminal Code of Canada, most charities, advocates and law enforcement use the term “child sexual abuse material” (CSAM), which better describes assaults against children, as the word pornography can imply consent between the parties who create or participate in the material.

Online CSAM generally refers to a myriad of offences, including written depictions of child sexual abuse; audio, video and images of the abuse; sextortion, where coercion and threats are used to extort child sexual exploitation images or videos from youth; and grooming and luring via apps and online platforms.

Rosiane Racine, a sergeant in the NCECC, told CTVNews.ca this week that the centre, which tracks its intake of reports from the U.S.-based National Center for Missing and Exploited Children (NCMEC), saw a 31 per cent increase in reports in the 2020-21 fiscal year compared to 2019-20. 

“Nothing surprises me anymore. This is the worst that humanity has to offer, this is pure evil,” Racine said of the nature of her work. “What is happening to these children, the level to which offenders go to exploit children, sharing the material – it’s just non-stop.”

The most recent data available, Statistics Canada's 2020 police-reported crime data, which cover the first nine months of the COVID-19 pandemic, show a disturbing trend in violence against children, especially when compared with the previous five-year average.

Incidents of making or distributing child pornography have increased by 27 per cent compared to 2019, and by 89 per cent when compared to the previous five-year average.

Possession of, or accessing, child pornography has increased by 19 per cent compared to 2019, and represents a 48 per cent increase compared to the previous five-year average. Incidents of luring a child via a computer have gone up 15 per cent compared to 2019, a 37 per cent increase from the previous five years.

Incidents of non-consensual distribution of intimate images have increased by 11 per cent compared to 2019, an 80 per cent increase compared to the previous five-year average.

The data included in the Statistics Canada report only encompasses incidents that are reported to the police, which means the true scale of CSAM in Canada is difficult to gauge.

“It’s definitely continuing its upward trend,” Racine said of online CSAM during the pandemic. “There is definitely some luring offences, some sharing of images without consent, self exploitation and what definitely stands out to me is the livestreaming.”

“With the pandemic we have seen an uptick in livestreaming with overseas victims, because people that would traditionally be travelling to those countries can’t do it because of the pandemic,” she explained of travel restrictions affecting what is known as sex tourism. “So they resort to online methods to pursue abuse.”

Racine said that “self exploitation” incidents are also often livestreamed, and can be the result of an offender directing the child to do certain things on camera, which are then livestreamed to others. “Self exploitation” incidents can also involve minors who believe they are speaking to, or streaming to, a peer or a friend online; that content is then distributed by the offender without consent.

Stephen Sauer, director of Cybertip.ca, also noted an “uptick” in reports of online abuse and exploitation of children in the pandemic.

“When I look at the statistics for the last 20 months, compared to the pre-pandemic period, we’re looking at a 120 per cent increase in reports of children being victimized online,” Sauer said in a telephone interview with CTVNews.ca Monday.

Sauer also mentioned livestreaming offences involving self-exploitation or parents abusing their children, but said many of the tips received were on the “luring side,” where offenders record children without their knowledge and use the footage to blackmail them, either for money or to create more CSAM.

In September 2021, Cybertip.ca warned parents about a trend affecting mostly males between 15 and 17 years of age who were being fooled, manipulated or coerced into various livestreaming situations. Some of them were tricked by a pre-recorded “bait video” that led them to believe they were engaging in reciprocal sexual acts with a peer, but instead, they were being recorded by an offender.

In that particular scheme, some of the offenders asked for money in exchange for not posting or sharing the teens’ intimate images or videos, and demands ranged from $70 to $700. The money is typically requested over online payment providers such as PayPal, but in a few instances offenders requested Google Play and Apple gift cards as payment, according to Cybertip.ca.

Sauer said some of the most common ways predators lure children online is by connecting with them on sites or apps such as Omegle, TikTok, Instagram, Discord and Facebook Messenger and then redirecting their victims to continue talking to them on encrypted services such as WhatsApp and Kik.

To Sauer, the platforms in question are not doing enough to safeguard children using them or to prevent the uploading and distribution of CSAM.

“They’re not paying enough attention to what's being posted to their services. They are allowing for anonymous uploads from Tor IP addresses, right? Tor is basically the biggest dark web system out there,” he said. “As a service provider, you can actually block Tor exit nodes through your IP addresses so that you don't have people uploading from a Tor IP address - so you can really eliminate a big section of content… There are tools and techniques out there that these providers could be taking advantage of.”
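For readers unfamiliar with the technique Sauer describes, the sketch below shows, in Python, one way a file-hosting service could screen uploads against the Tor Project's published list of exit-node IP addresses. It is a minimal illustration, not a description of any particular platform's systems: the list URL, refresh approach and function names are assumptions made for the example, and production services typically rely on dedicated IP-reputation feeds refreshed on a schedule.

```python
# Minimal sketch: reject anonymous uploads that originate from known Tor exit
# nodes, as Sauer describes. The Tor Project publishes a plain-text list of
# current exit-node IPs; the URL below is that public bulk exit list (treat
# the exact endpoint and the lack of caching/refresh logic as simplifications
# for this illustration).

import urllib.request

TOR_EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"


def load_tor_exit_nodes(url: str = TOR_EXIT_LIST_URL) -> set[str]:
    """Download the published list of Tor exit-node IP addresses."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read().decode("utf-8")
    # One IP address per line; ignore blank lines and comments.
    return {line.strip() for line in body.splitlines()
            if line.strip() and not line.startswith("#")}


def is_upload_allowed(client_ip: str, tor_exit_nodes: set[str]) -> bool:
    """Block uploads whose source IP is a known Tor exit node."""
    return client_ip not in tor_exit_nodes


if __name__ == "__main__":
    exits = load_tor_exit_nodes()
    # 203.0.113.7 is a documentation/test address, used here only as an example.
    print(is_upload_allowed("203.0.113.7", exits))
```

In a real deployment the exit-node list would be refreshed periodically and cached, since it changes continuously; the check itself is a simple set membership test performed before an upload is accepted.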

CTVNews.ca reached out to all of the companies mentioned by Sauer.

In an emailed statement to CTVNews.ca Tuesday, a Discord spokesperson said the company “works relentlessly to keep bad actors off our service and we take the safety of all Discord users, especially our younger users, incredibly seriously. We have zero-tolerance for child exploitation and take immediate action when we become aware of it, including banning users, shutting down servers, and when appropriate, engaging with the proper authorities.” The emailed statement also pointed CTVNews.ca to Discord’s online Terms of Service and Community Guidelines.

Omegle and Kik did not respond to CTVNews.ca by time of publication.

Instagram, Facebook and WhatsApp are owned by parent company Meta. In response to CTVNews.ca, a Meta spokesperson said in an email Tuesday that the company reports all apparent incidents of child exploitation appearing on its platforms from anywhere in the world to NCMEC, and that the company uses “sophisticated technology” such as “PhotoDNA,” which scans all images on Facebook, Instagram and Messenger to detect, remove and prevent the sharing of images and videos that exploit children.
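PhotoDNA itself is a proprietary Microsoft technology, but the general approach Meta describes, comparing a perceptual hash of each uploaded image against a database of hashes of known abusive material, can be illustrated with open-source tools. The sketch below is an assumption-laden stand-in: it uses the third-party imagehash library rather than PhotoDNA, and the hash database and distance threshold are invented for the example.

```python
# Generic illustration of hash-based image matching, the approach Meta
# describes. PhotoDNA is proprietary, so this sketch substitutes an
# open-source perceptual hash (pip install pillow imagehash); the hash
# database and the match threshold are assumptions made for the example.

from PIL import Image
import imagehash

# In practice this would be a large database of hashes of known, verified
# material supplied by organizations such as NCMEC; here it is just a set.
KNOWN_HASHES: set[imagehash.ImageHash] = set()

MATCH_THRESHOLD = 5  # max Hamming distance at which two hashes count as a match


def hash_image(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash that stays stable under small edits or resizing."""
    return imagehash.phash(Image.open(path))


def matches_known_material(path: str) -> bool:
    """Flag an upload if its hash is close to any hash in the database."""
    candidate = hash_image(path)
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)
```

The key property of a perceptual hash, unlike an ordinary file checksum, is that cropped, resized or re-compressed copies of the same image still produce nearby hashes, which is what lets platforms match previously identified material even after it has been altered.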

In a statement emailed to CTVNews.ca Thursday, a TikTok spokesperson said the organization has "zero tolerance for child sexual abuse material which violates our Community Guidelines and would be promptly removed and reported to the National Center for Missing and Exploited Children. To further protect teens, accounts under 16 are set to private by default and direct messaging is disabled. We remain vigilant and will continue to take the appropriate steps to protect our community."

Racine and Sauer both said the nature of distributing online CSAM has changed, with Racine citing the dark web as a place where the dissemination of material happens frequently, but also private, members-only chatrooms, as in the case of South Korea’s “Nth Room” and “Baksa room.”

But Sauer said Canadians may be surprised that much of the CSAM he sees is hosted on public sites.

“There's a lot of discussion on the dark web, a lot of links posted to the dark web. But the majority of the material we were seeing was actually hosted on the open web,” he said. “Offending communities on the dark web would provide links to file hosting services, you know, available on the open web, and those links would be openly accessible, easily downloadable, making it easy to share that type of thing.”

KEEPING YOUR KIDS SAFE ONLINE

Both Racine and Sauer said it is important for parents to keep an eye out for any sudden changes in their kids’ behaviour, such as appearing stressed or distraught but not wanting to disclose why, or seeming disengaged from daily life, hobbies, friends and family.

“Especially with younger kids, if they're having conversations with people that you don't know, that could be a warning sign,” Sauer said. “Know what they're engaged in, learn a little bit more about the platforms that they are intersecting with, [and] the privacy settings… check out the terms of service.”

The Canadian Centre for Child Protection recommends having regular conversations with children about online safety, and protectkidsonline.ca has tips on how to get the discussion started. The centre also recommends setting expectations that a parent or caregiver will monitor the child’s online activities and to work together to establish guidelines around things like texting, social media, live streaming and gaming.

If you have younger children, the centre recommends helping them create their login, password and profile information and ensuring that it is set to private. For teens and tweens, parents or caregivers can help set up privacy settings in the apps, games and social media accounts they use. The centre reiterates that it is important for a child to know that if they come across someone or something while online that makes them feel uncomfortable, they can tell a trusted adult without fear of getting in trouble or losing digital privileges.

Racine echoed that one of the most important things is for parents or caregivers to keep an open, honest dialogue with their children. It is important for children to feel that if something bad happens, they are not in trouble and can come forward and tell a trusted adult in their lives what is going on, she said.

“You can’t have your head in the sand. Going and educating yourself on online safety is so important and critical, to not have that false sense of security that if your child is in the living room they're safe, because they may not be,” Racine said. “It can happen to anybody, literally anybody can be victimized.”

----
Edited by CTVNews.ca producer Sonja Puzic
