Florida Pediatrician Discovers Tips for Children to Commit Suicide Hidden in YouTube Kids Videos

A Florida doctor is warning parents about hidden messages in videos targeting children on YouTube and YouTube Kids, including ones that offer tips on how to commit suicide.

Pediatrician Free Hess told The Washington Post that a mother found a video in which a man appears several minutes into a clip from a children’s video game to offer instructions on how to commit suicide.

Hess said the message had been edited into several other videos on YouTube and YouTube Kids about the Nintendo game Splatoon.

One of the videos shows a man holding a pretend blade toward the inside of his arm.

“Remember kids,” he says, “Sideways for attention, longways for results.”

Hess is also warning parents on her blog about other children’s videos depicting school shootings, human trafficking, and suicide.

YouTube under fire for putting kids at risk:

“I was shocked,” Hess told The Post of the videos. “I think it’s extremely dangerous for our kids. I think our kids are facing a whole new world with social media and Internet access. It’s changing the way they’re growing, and it’s changing the way they’re developing. I think videos like this put them at risk.”

Nadine Kaslow, the former president of the American Psychological Association, said it was “tragic” that “trolls are targeting kids and encouraging kids to kill themselves.” She said there should be “serious consequences” for people who target children to urge them to harm themselves.

YouTube’s rules not enough, experts say:

YouTube spokeswoman Andrea Faville told The Post that the company relies on flags from users and its algorithm to ensure the platform is “not used to encourage dangerous behavior and we have strict policies that prohibit videos which promote self-harm.”

“We rely on both user flagging and smart detection technology to flag this content for our reviewers,” Faville added. “Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report [transparencyreport.google.com] and give users a dashboard showing the status of videos they’ve flagged to us.”

But Kaslow said removing the videos is not enough once they have been viewed by hundreds or thousands of children.

“I don’t think you can just take them down,” she said. “For children who have been exposed, they’ve been exposed. There needs to be messaging — this is why it’s not okay.”

YouTube changing its algorithms:

YouTube is also under fire for actively promoting conspiracy videos about school shootings, vaccinations, and other divisive issues.

The Washington Post reported last month that YouTube is tweaking its algorithm in an attempt to curb the spread of false information.

“Advocates say those policies don’t go far enough to prevent people from being exposed to misleading information, and that the company’s own software often pushes people to the political fringes by feeding them extremist content that they did not seek out,” The Post reported. “From a mainstream video, the algorithm often takes a sharp turn to suggest extremist ideas…. The Post reported in December that YouTube continues to recommend hateful and conspiratorial videos that fuel racist and anti-Semitic content.”