
YouTube Pulled 120,000-Plus Sexually Explicit Videos With Children in First Half of 2021

2024-03-01 | Variety | Todd Spangler

YouTube removed more than 120,000 videos with sexually explicit content featuring kids, or content that sexually exploits minors, in the first half of 2021, according to the internet-video giant, which noted that the majority of the content it removes is pulled before it receives even 10 views.

That’s according to written testimony submitted by Leslie Miller, YouTube’s VP of government affairs and public policy, for the Oct. 26 Senate subcommittee hearing “Protecting Kids Online,” during which execs from Snap and TikTok also appeared.

“There is no issue more important than the safety and wellbeing of our kids online, and we are committed to working closely with you to address these challenges,” Miller said at the hearing.

But the senators remained skeptical about the claims by YouTube, TikTok and Snap execs. “I just want folks watching to know [that] we’re not taking at face value what you’ve told us,” Sen. Richard Blumenthal (D-Conn.), chair of the Subcommittee on Consumer Protection, Product Safety, and Data Security, said at Tuesday’s hearing.

YouTube reports removals of sexually explicit content with minors to the National Center for Missing and Exploited Children. Those videos represented less than 1% of the 15.8 million videos YouTube removed for violating its rules in the first six months of 2021, which in turn is just a fraction of the 500-plus hours of video that YouTube says is uploaded every minute.

YouTube has “invested extensively in industry-leading machine learning technologies that identify potential harms quickly and at scale,” according to Miller’s prepared remarks. She also said, “Some speculate that we hesitate to address problematic content or ignore the well-being of youth online because it benefits our business; this is simply not true.”

In the second quarter of 2021, YouTube removed 1.87 million videos for violations of its child safety policies, of which approximately 85% were removed before they had 10 views, according to Miller. Overall in Q2, YouTube’s Violative View Rate (VVR), an estimate of the proportion of video views of content that violates the site’s Community Guidelines (excluding spam), was 0.19%-0.21%, meaning that out of every 10,000 views on YouTube, 19 to 21 came from violative content.

Miller also said that YouTube has “long had policies that prohibit content that endangers the emotional and physical well-being of minors.” That includes bans on videos “promoting or glorifying suicide, content providing instructions on how to self-harm or die by suicide and content containing graphic images of self-harm posted to shock or disgust viewers,” she said, as well as content showing minors participating in dangerous activities and videos that involve cyberbullying or harassment involving minors.

Other YouTube initiatives that Miller called out included YouTube Kids, an app the platform introduced in 2015 with content curated for kids 4-12 and featuring multiple parental controls, and YouTube’s “supervised” account option, launched in March 2021, which lets parents set various restrictions for kids 13-18.

Children under 13 are not allowed access to the regular YouTube service. In 2019, Google and YouTube agreed to pay $170 million to settle allegations by the FTC that YouTube illegally collected personal information from children. According to Miller, YouTube shut down more than 7 million accounts in the first nine months of 2021 “when we learned they may belong to a user under the age of 13.”

Miller also pointed to a change announced earlier this month to YouTube’s monetization policies for channels that primarily create kids and family content. Under the new policy, YouTube channels primarily targeting young audiences that have “predominantly low-quality kids content” may be suspended from the platform’s ad-revenue sharing program.

In addition, YouTube collaborates with others in the industry “to better protect children,” Miller said. In 2015, YouTube introduced a system to identify and remove child sexual abuse imagery (CSAI). The platform shares access to its CSAI Match system with other internet companies at no charge to prevent the distribution of those videos on their platforms as well.

Meanwhile, in August, Google announced updates to provide additional protections for kids younger than 18. One of the biggest changes was that for users ages 13-17, YouTube will gradually start adjusting the default upload setting to the most private option available.

(By Todd Spangler)
 
 