Horrified mom discovers suicide instructions in video on YouTube and YouTube Kids
BY SOPHIE LEWIS FEBRUARY 23, 2019 / 2:53 PM / CBS NEWS
Warning: This article features disturbing content and
mentions of suicide.
A video promoting self-harm tips, spliced between clips of a popular video game, has surfaced at least twice on YouTube and YouTube Kids since July, according to a pediatrician and mom who discovered it.
The suicide instructions are sandwiched between clips
from the popular Nintendo game Splatoon and delivered by a man speaking in
front of what appears to be a green screen — an apparent effort to have him
blend in with the rest of the animated video.
"Remember kids, sideways for attention, longways for
results," the man says, miming cutting motions on his forearm. "End
it."
The man featured is YouTuber Filthy Frank, who has over
6.2 million subscribers and calls himself "the embodiment of everything a
person should not be," although there is no evidence that Frank, whose
real name is George Miller, was involved in creating the doctored video. He did
not immediately respond to CBS News' request for comment.
When Free Hess found the video on YouTube last week, she posted it on her blog, warning other parents to take control of what their kids may be watching.
"Looking at the comments, it had been up for a
while, and people had even reported it eight months prior," Hess told CBS
News on Friday.
Shortly after she published her blog post, YouTube took
the video down, saying it violated the site's community guidelines, according
to Hess.
Hess said she spotted another version of the same video
on YouTube Kids in July last year. She said she and many other parents from
Facebook groups came together to report it, and the video was eventually taken
down after one parent directly contacted an employee at Google. Google has not
responded to CBS News' inquiry about the steps that led to the video's removal.
Hess said that after seeing higher rates of suicide among children in her own emergency room over the last few years, she made it her mission to raise awareness of disturbing and violent content being consumed by children
on social media. She said she's reported hundreds of unsettling videos to
YouTube, with some success. On Friday, she found and reported seven more
disturbing videos on YouTube Kids, and said they were just the tip of the
iceberg.
"I had to stop, but I could have kept going,"
Hess said. "Once you start looking into it, things get darker and weirder.
I don't understand how it's not getting caught."
YouTube Kids is meant to be a kid-friendly version of the YouTube site for children who are 8 years old and under, but trolls have found ways around YouTube's algorithm and are posting potentially harmful videos.
"They're awful. Absolutely awful," Hess said
about some of the content on the YouTube Kids app.
She said she logs onto the app posing as a child, rather
than an adult, so that she can see exactly what kids around the world are
seeing. The videos Hess has found contain mentions or visuals of self-harm, suicide, sexual exploitation, trafficking, domestic violence, sexual abuse and gun violence, including a simulated school shooting. She said many of the kids she treats in the ER cite YouTube videos as the way they learned destructive behaviors and self-harm techniques.
A YouTube spokesperson told CBS News on Friday the site
works hard "to ensure YouTube is not used to encourage dangerous
behavior." The spokesperson also said YouTube has "strict
policies" that prohibit videos that promote self-harm.
"We rely on both user flagging and smart detection
technology to flag this content for our reviewers," the spokesperson said.
"Every quarter we remove millions of videos and channels that violate our
policies and we remove the majority of these videos before they have any views.
We are always working to improve our systems and to remove violative content
more quickly, which is why we report our progress in a quarterly report and
give users a dashboard showing the status of videos they've flagged to
us."
However, YouTube Kids has a history of letting disturbing and violent videos slip past its algorithms. In 2017, searching the word
"gun" on the app surfaced a video on how to build a coil gun,
Mashable reported. Other videos at the time featured Mickey Mouse in a pool of
blood and Spider-Man urinating on Elsa, the princess from "Frozen,"
prompting backlash.
"The YouTube Kids team is made up of parents who
care deeply about this, so it's extremely important for us to get this right,
and we act quickly when videos are brought to our attention," a YouTube
spokeswoman told CNET at the time. "We agree this content is unacceptable
and are committed to making the app better every day."
Since the backlash in 2017, YouTube has outlined steps it
is taking to improve safety on its Kids app. In November 2017, the company
outlined a new set of guidelines, including "faster enforcement" of
community guidelines and "blocking inappropriate comments." In April
last year, YouTube announced three new parental control features to give
parents the ability to curate what their child is seeing on the app. There are also
a number of other ways for parents to make the app safer, but none of them are
automatic.
This week, new cases of inappropriate content prompted
high-profile responses, including from Disney and Nestle, which pulled
advertising from YouTube after a blogger described "a wormhole into a
soft-core pedophilia ring" on the site.
YouTube said Thursday it was taking aggressive action: terminating more than 400 channels, taking down dozens of videos, disabling comments on tens of millions of videos and reporting illegal comments to law enforcement.
Critics say its approach to safety across platforms just
isn't working.
Parents remain concerned about safety both on YouTube and
YouTube Kids. "We should start by educating ourselves, educating our
children, and speaking up when we see something that is dangerous for our
children," Hess wrote on her blog. "We also need to fight to have the
developers of social media platforms held responsible when they do not assure
that age restrictions are followed and when they do not remove inappropriate
and/or dangerous material when reported."
© 2019 CBS Interactive Inc. All Rights Reserved.