YouTube will remove any ‘mature’ or ‘violent’ content directed toward children amid mounting pressure to make its platform safer for minors.
According to The Verge, YouTube says it will weed out unsafe content by monitoring video titles, descriptions, and tags and will begin banning offenders following a grace period.
Targeted content will include any material that touches on sex, violence, death or other topics deemed inappropriate for young audiences.
While it may seem odd that a platform of YouTube’s size hadn’t already been moderating ‘violent’ and ‘mature’ content directed towards kids, The Verge notes that up to this point YouTube has only age-restricted access to such videos rather than removing them.
YouTube reportedly announced the change quietly two days ago through a post on its YouTube Help community forum.
The platform said it will remove content if and when it’s found, but won’t give out any ‘strikes’ to creators until after a 30-day notice period meant to familiarize users with the policy.
Videos uploaded prior to the rule change, however, will not be given strikes, though they can still be removed.
As part of the push, YouTube will also age-restrict other types of content it fears could be misconstrued as being for kids, such as adult cartoons.
One example, the platform said, would be a cartoon that appears to be aimed at children but depicts inappropriate subject matter, such as a character ‘injecting needles.’
YouTube has also taken a harder line against channels live streaming when children are involved, and the company is now disabling comment sections on certain videos featuring children.

The platform is still undecided, however, on whether to turn off the recommendation algorithm for such content, out of concern it could reduce engagement.

Earlier this week, Bloomberg reported that YouTube is finalizing plans to remove targeted advertising from videos featuring children on its main site, a move intended to help it avoid potential fines under the Children’s Online Privacy Protection Act.