It is a never-ending battle for YouTube.
Every minute, YouTube is bombarded with videos that run afoul of its many guidelines, whether through pornography, copyrighted material, violent extremism or dangerous misinformation. The company has refined its artificially intelligent computer systems in recent years to stop most of these videos from being uploaded to the site, but it keeps coming under scrutiny for its failure to curb the spread of hazardous material.
In an effort to demonstrate its effectiveness at finding and removing rule-breaking videos, YouTube on Tuesday revealed a new metric: the violative view rate. It is the percentage of total views on YouTube that come from videos that do not meet its guidelines, counted before those videos are removed.
In a blog post, YouTube said that violative videos accounted for 0.16 percent to 0.18 percent of all views on the platform in the fourth quarter of 2020. Put another way, 16 to 18 out of every 10,000 views on YouTube were of content that broke YouTube’s rules and was eventually removed.
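The arithmetic behind the metric is simple percentage math. Below is a minimal sketch of how such a rate could be computed; the function name and the sample view counts are hypothetical, chosen only to match the 16-to-18-per-10,000 figure the company reported.

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Percentage of total views that came from rule-breaking videos."""
    return 100.0 * violative_views / total_views

# Hypothetical example: 17 violative views out of 10,000 total views,
# consistent with the 0.16-0.18 percent range YouTube reported.
rate = violative_view_rate(17, 10_000)
print(f"{rate:.2f}%")  # → 0.17%
```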
“We’ve made a ton of progress, and it’s a very small number, but we definitely want to reduce it,” said Jennifer O’Connor, a director on YouTube’s trust and safety team.
The company said its violative view rate had improved from three years earlier: in the fourth quarter of 2017, it was 0.63 percent to 0.72 percent.
YouTube declined to reveal the total number of views that problematic videos received before they were removed. That reluctance highlights a challenge facing platforms such as YouTube and Facebook, which rely on user-generated content. Even as YouTube gets better at catching and deleting restricted content — 94 percent of problematic videos are first detected by its computers, the company said — the total views such videos receive remain an eye-popping figure because the platform is so large.
Ms. O’Connor said that YouTube decided to disclose a percentage rather than a total number because it helps convey how significant problematic content is relative to the overall platform.
YouTube released the metric, which the company has tracked for years and expects to fluctuate over time, as part of a quarterly report on how it enforces its guidelines. In the report, YouTube offered running totals for the number of objectionable videos (83 million) and comments (seven billion) removed since 2018.
Although YouTube points to such reports as evidence of accountability, the underlying data rests on YouTube’s own decisions about which videos violate its guidelines. If YouTube judges fewer videos to be in violation — and therefore removes fewer of them — the violative view rate may decrease. And none of the data is subject to an independent audit, although the company did not rule that out for the future.
“We are just getting started by publishing these numbers, and we provide a lot of data,” Ms. O’Connor said. “But I wouldn’t take that off the table yet.”
YouTube also said that it counts views liberally. For example, if a user stops watching a video before reaching the objectionable part, the view still counts, the company said.