Following recent criticism that it’s failing to protect children from adult content online, YouTube has announced that it will step up its enforcement of policies for videos targeted at children. The move comes amid growing concern that children are being exposed to violent or otherwise inappropriate material on the platform.
As the world’s largest video streaming service, YouTube, critics argue, has a responsibility to better protect children from inappropriate content. The site has already removed over 50 channels, and according to vice president Johanna Wright, the online video service has removed ads from over 3.5 million videos since June.
She said in a recent blog post that “Across the board we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies. These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge.”
Regulators, as well as parents and advertisers, have become increasingly worried about the site’s open model, which places very few limits on video content. Google, which owns YouTube, has been advised that it needs to do more to shield viewers from potentially harmful or distressing content.
Matan Uziel, a producer and activist who leads Real Women, Real Stories, wrote to YouTube CEO Susan Wojcicki expressing his concern about “tens of thousands of videos available on YouTube that we know are crafted to serve as eye candy for perverted, creepy adults, online predators to indulge in their child fantasies.”
In a company statement, YouTube pledged to address the issues raised and to close any gaps in the enforcement of its policies. The site is removing accounts that show child endangerment, as well as videos that depict children’s characters in adult or violent situations.
The site has also pledged to disable comments on these types of videos and to provide better guidance for creators of content aimed at children. YouTube says it is working alongside child safety experts to keep children safe online, and that the website’s verification processes are sufficient to deal with the ongoing problem.
Johanna Wright has also said that YouTube moderators will be instructed to remove all videos “featuring minors that may be endangering a child, even if that was not the uploader’s intent”, and that videos with children’s characters “containing mature themes or adult humour” will be age-restricted and viewable by adults only.