TORONTO -- The RCMP has admitted to using controversial facial recognition software developed by Clearview AI, a company currently under investigation by the federal privacy commissioner.

According to a statement released Thursday, the National Child Exploitation Crime Centre (NCECC) has been using the software for four months in a “limited capacity” to assist in online child sexual exploitation investigations.

“While the RCMP generally does not disclose specific tools and technologies used in the course of its investigations, in the interest of transparency, we can confirm that we recently started to use and explore Clearview AI's facial recognition technology in a limited capacity,” read the statement.

“Only trained victim identification specialists in the NCECC use the software primarily to help identify, locate and rescue children who have been or are victims of online sexual abuse.”

Clearview AI’s technology allows police agencies to trawl a vast number of online sources to help identify people.

The company came under scrutiny in early January after The New York Times published a report about its work with law enforcement agencies. The report alleged that the company scraped three billion images from online sources, including Facebook and YouTube.

The Times report included quotes from an anonymous Canadian law enforcement official who said the software was the “biggest breakthrough in the last decade” for identifying young victims of sexual abuse.

Several social media giants, including Twitter and Google, have since sent the company cease-and-desist letters.

Last week, federal privacy commissioner Daniel Therrien announced that he and three provincial counterparts will jointly investigate the use of the technology in Canada to examine whether the company’s data collection practices comply with Canadian law.

The RCMP previously refused to confirm whether it had used the technology.

In a statement posted to Twitter late Thursday, the Office of the Privacy Commissioner announced it would launch an investigation into the RCMP’s use of the software.

“In light of the RCMP’s acknowledgement of their use of Clearview’s facial recognition technology, we are launching an investigation,” read the statement. “Given we are now investigating, no further details are available at this time.”

According to the RCMP’s statement, the NCECC has two licences for the software and has used it in 15 cases, which have led to the identification and rescue of two children.

However, the RCMP notes there has been “limited use of Clearview AI on a trial basis by a few units in the RCMP to determine its utility to enhance criminal investigations.”

“The RCMP will be engaging with the Privacy Commissioner to work in partnership with him to develop guidelines and policies that conform to legislation and regulations,” the statement continues.

“The Internet has changed the way child sexual exploitation offences are committed, investigated and prosecuted and Clearview AI is only one of many tools/techniques that are used in the identification of victims of online child sexual abuse.”

On Wednesday, Clearview AI confirmed it had also suffered a data breach that allowed an intruder to gain unauthorized access to its list of customers, the number of user accounts those customers had set up, and the number of searches those customers had conducted.

In a statement issued to CTV Toronto, the company's lawyer Tor Ekeland said: "Security is Clearview's top priority. Unfortunately, data breaches are part of life in the 21st century. Our servers were never accessed. We patched the flaw and continue to work to strengthen our security."