Amid heightened awareness of the Momo challenge hoax, YouTube reaffirmed and updated its content moderation policies, making the sweeping decision to turn off comments on any videos featuring children.
Recent news (and social media) has been abuzz with fear and concern around the so-called Momo challenge, which reportedly urged kids to do dangerous things, up to and including harming themselves. It first rose to prominence in 2018, when news outlets reported that it targeted teens through WhatsApp. It has since returned, with new reports claiming it targets kids by splicing dangerous messages into children’s videos on YouTube.
“Contrary to press reports, we have not received any evidence of videos showing or promoting the Momo challenge on YouTube,” a YouTube spokesperson told Kidscreen. “Content of this kind would be in violation of our policies and removed immediately when flagged.”
Now, in response to the controversy, YouTube stated that the image representing the challenge (pictured) would not be allowed on the YouTube Kids app.
This comes amid a separate child safety controversy, in which it came to light that predators were commenting on videos featuring youth. In response, YouTube removed thousands of inappropriate comments and terminated the accounts of users who violated its policies. In a broader, more sweeping action to address the issue, the platform has now disabled all comments on videos featuring minors. The move has already affected tens of millions of videos that YouTube says “could be subject to predatory behavior.” The platform said in a blog post yesterday that a small number of creators will be allowed to keep comments enabled, but those channels will be expected to actively moderate comments, beyond using YouTube’s own moderation tools. The Google-owned platform did not clarify which channels will be affected by the change, or whether it will impact major family vloggers like Daily Bumps or kid video creators like Ryan ToysReview.
There’s no word yet on whether the changes will bring back advertisers like Disney and Fortnite-maker Epic Games, which pulled ad funding from the platform in the wake of the controversy.
YouTube has made other recent efforts to protect its youngest viewers from inappropriate content, including bolstering its content moderation team, with the goal of bringing the total number of people across Google working on the issue to more than 10,000 in 2018. It also spent three months removing more than eight million videos containing inappropriate content. The online video giant maintains that users need to flag inappropriate content so that its human reviewers can see it; if a video is found to violate YouTube’s community guidelines, it is removed.
While YouTube is stepping up content moderation on its end, a survey commissioned by Common Sense Media found that many parents did not use, or even know about, the parental control features the site offers. Among more than 1,000 US parents whose children watch YouTube videos, only 10% said the responsibility for moderating the content that kids see lies with YouTube, with most seeing it as a parent’s job to ensure a child doesn’t see anything they shouldn’t on the platform. However, a recent study from research and consulting firm Smarty Pants, published in Kidscreen’s February/March issue, found that while almost every kid uses YouTube on a weekly basis (98% of those surveyed), parents’ trust in the general app has been declining, with inappropriate content and cyber safety issues leading fewer parents to consider it an age-appropriate brand for kids.