
Police in the U.K. are warning parents about online videos that direct children to harm themselves and threaten them with severe consequences if their instructions are not followed.

The warning was issued Monday by the Police Service of Northern Ireland (PSNI), which said it was working with other U.K. policing agencies to determine “the extent of the problem” posed by the Momo Challenge.

The challenge, as described by the PSNI, involves messages being hidden in seemingly child-friendly games and videos. The messages urge children to contact “Momo” via Snapchat, WhatsApp or another popular messaging program.

The Momo accounts often feature an image of a doll with large eyes and long, black hair. A Momo-like character has also been spotted in the popular online game Minecraft.

Once messaged, police say, Momo may respond with disturbing images and instructions, including directions for self-harm. Momo may also threaten consequences if the directions are not followed, such as harming the child’s family or placing a curse on them.

“While the threat of a curse may sound silly to an adult, it could be a very frightening prospect for a young child and they may feel under pressure to carry out acts to protect themselves or family from further harm,” Det.-Sgt. Elaine McCormill said in a statement.

One PSNI officer reported seeing a Momo Challenge video from the U.S. in which a “creepy” voice urged children to “take a knife to their own throat.”

The officer wrote that while the Momo Challenge was prompting this alert, any sort of online interaction in which children feel pressured into performing actions for a stranger represents potential danger.

“Don’t focus only on Momo, but make sure you know what your child has online access to,” the officer wrote.

The Canada Safety Council provides advice for how children can stay safe online, suggesting that they talk to their parents before giving out any personal information and alert an adult if anything they see makes them feel uncomfortable.

The so-called challenge made headlines in Canada last summer, with some police agencies in Ontario and Quebec warning parents to talk to their children about the risks of following a stranger’s instructions.

Elsewhere in the U.K., one school in Hull warned parents Tuesday that their children could encounter the “very disturbing” images while watching seemingly benign videos of Fortnite or Peppa Pig.

There have been no confirmed deaths or injuries linked to the Momo Challenge.

Mother says YouTube has ‘huge problem’ keeping kids safe

While the Momo Challenge has reportedly been spotted on several platforms, a mother from Florida is singling out YouTube for failing to prevent inappropriate videos from making their way to children.

Dr. Free N. Hess has already had two videos pulled off the popular platform after she discovered that a message instructing children on how to slice their wrists had been inserted into the otherwise innocent cartoons.

One of those two videos was accessible through YouTube Kids – a designed-for-children platform described by YouTube as offering a “safer and simpler” experience.

Hess, a pediatrician, told CNN that she was seeing “more and more kids” coming into her office after attempting self-harm or suicide.

“I don’t doubt that social media and things such as this [are] contributing,” she said.

In a Feb. 22 blog post, Hess spotlighted several other videos she found on YouTube Kids and considers inappropriate for young children.

The videos contain gun violence, animated killings and suicide-related themes, as well as shooting scenes and foul language. One of the videos features an animated girl attempting to kill herself. Another shows zombie cartoon characters killing each other.

“This is supposed to be a safe place for our children,” Hess wrote.

“Unlike YouTube itself, YouTube Kids is supposed to be specifically FOR kids. There has to be a better way to assure this type of content is not being seen by our children.”

YouTube boasts that YouTube Kids contains a number of safety-minded features, including the ability for parents to handpick which videos their children are able to watch, as well as human and automatic filters to keep out content deemed not to be family-friendly.

Google, which owns YouTube, also maintains a database of suggestions of ways parents can help keep their children safe on its platform. Its recommendations generally boil down to monitoring which videos children are consuming and reporting anything that could be considered inappropriate.

However, it is not difficult for many children to access the main YouTube platform, where the filters do not exist and where inappropriate videos are only caught after they have been published and potentially viewed.

Hess says she wants YouTube to do a better job of reviewing content before it is made visible to the public.

YouTube: ‘More work to do’

A YouTube spokesperson said the company “work(s) to ensure videos in YouTube Kids are family-friendly and take feedback very seriously.”

“We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed,” the spokesperson added.

“We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app,” the spokesperson went on. “We are making constant improvements to our systems and recognize there’s more work to do.”