TORONTO -- A new report suggests that many internet companies are failing to curb the spread of child sexual abuse material online and are slow to remove such content, with delays of more than 42 days in some cases.

For the report, researchers at the Canadian Centre for Child Protection (C3P) analyzed more than 5.4 million verified images of child sexual abuse material (CSAM) linked to more than 760 electronic service providers (ESPs) worldwide over the course of three years.

To do this, the researchers used a tool called Project Arachnid, which crawls the web in search of CSAM and sends a removal request to the ESP once such material is detected.

Over the three years it took to compile the report, C3P said the tool issued notices on more than 18,000 archive files, collectively containing nearly 1.1 million verified image or video files assessed as CSAM or as content harmful or abusive to minors.

According to the report, the vast majority (97 per cent) of this content is physically hosted on the clear web – the portion that is readily available to the general public and search engines. However, the dark web – encrypted online content that isn’t available on traditional search engines – plays a prominent role in directing users on how to find CSAM on the clear web.

Despite all of those removal requests, the C3P report said there were long delays in removal times; in 10 per cent of cases, the content remained accessible for more than 42 days.

“This report is worrisome,” a group of survivors whose child sexual abuse was recorded, and who call themselves the Phoenix 11, said in a statement. “42+ days to remove content is 42+ days these ESPs are enabling crimes against children, and 42+ days that these children will suffer again and again as their abuse continues.”

Generally, C3P found that images showing older adolescents (post-pubescent) took longer to remove than those showing younger victims (pre-pubescent) and were more likely to reappear online.

Troublingly, nearly half (48 per cent) of all content for which Project Arachnid issued a removal request had already been flagged to the service provider, the report said.

What’s more, certain ESPs had recidivism rates higher than 80 per cent – meaning images that had been removed repeatedly resurfaced on their systems.

“The findings in our report support what those who work on the frontlines of child protection have intuitively known for a long time — relying on internet companies to voluntarily take action to stop these abuses is not working,” Lianna McDonald, the executive director for C3P, said in a release.

The C3P report suggests that many ESPs are failing to use available safeguards, such as widely available CSAM-blocking technology and human moderation, to prevent the spread of this content on their platforms.

That’s why C3P and the Phoenix 11 survivors are calling for swift government regulation and policies to impose accountability requirements on these companies, especially those that allow user-generated content.

“Children and survivors are paying the price for our collective failure to prioritize their protection and put guardrails around the internet,” McDonald said.