YouTube is taking “an even more aggressive stance” to keep inappropriate content away from children after top global brands refused to run Black Friday advertisements on videos carrying explicit comments from pedophiles. Adidas, Mars, HP, Diageo, Cadbury, Deutsche Bank, Lidl and Now TV are among the brands that pulled their campaigns.
The strategy consists of five guidelines:
- Tougher application of Community Guidelines and faster enforcement through technology.
- Removing ads from inappropriate videos targeting families.
- Blocking inappropriate comments on videos featuring minors.
- Providing guidance for creators who make family-friendly content.
- Engaging and learning from experts.
In the past week, YouTube removed more than 50 channels and thousands of videos whose content endangered children. The new policy places age restrictions on videos with mature themes or adult humor, and YouTube will deploy machine learning algorithms to detect content that violates the guidelines. Since June, advertising has been removed from 3 million videos, and 500,000 more will lose ads under the newly released guidelines. In addition, when inappropriately sexual or predatory comments are detected on videos featuring minors, the comment section will automatically be turned off. Family-oriented content will be available on the dedicated YouTube Kids platform.
“We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors,” writes Johanna Wright, Vice President of Product Management at YouTube.
“Comments of this nature are abhorrent and we work … to report illegal behavior to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
An investigation by BBC News and the Times found that tens of thousands of pedophile accounts leave predatory, sexually explicit comments on videos featuring minors. Somewhere “between 50,000 and 100,000 active predatory accounts are still on the platform,” said a moderator.
“Across the board we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies,” Wright said.
*** This is a Security Bloggers Network syndicated blog from HOTforSecurity authored by Luana Pascu. Read the original post at: https://hotforsecurity.bitdefender.com/blog/youtube-tightens-content-policy-to-remove-predatory-accounts-some-still-active-19265.html