Professionalism/YouTube Censorship

YouTube allows users to upload and share videos. With over 1 billion active users in 2013[1], YouTube has become the primary source of income for content creators and a secondary platform for mainstream media companies[2][3]. In contrast to mainstream television, YouTube supports a disproportionate number of minority personalities[4]. With projects like It Gets Better, YouTube is also a vehicle for social change. Despite its widespread use, various reports have concluded that YouTube's ad revenue does not allow it to profit due to the high cost of bandwidth[5][6]. This article focuses on how YouTube censors content through demonetization and filtering in order to retain advertisers.

YouTube Monetization Process

As of 2019, in order to make money from YouTube videos, content creators have to be accepted into the YouTube Partner Program (YPP) and have AdSense linked and enabled[7]. Applying to join the YPP is similar to applying for a job: creators must meet subscriber and view-count requirements, much as a job applicant submits a résumé to show sufficient experience. Creators who have not monetized their accounts through this process will not receive any money from YouTube, regardless of the number of views on their videos.
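
The eligibility gate described above can be pictured in a few lines of code. The following Python snippet is purely illustrative: the function and parameter names are invented (there is no such public YouTube API), and the thresholds are the 2018–2019 requirements listed in the history section below.

```python
# Illustrative sketch of the YPP eligibility gate, not a real YouTube API.
# Thresholds are the 2018-2019 requirements cited later in this article.

def is_monetization_eligible(subscribers: int,
                             watch_hours_past_year: float,
                             adsense_linked: bool) -> bool:
    """Return True if a channel could qualify for the Partner Program."""
    meets_subscribers = subscribers >= 1_000
    meets_watch_time = watch_hours_past_year >= 4_000
    return meets_subscribers and meets_watch_time and adsense_linked

# A channel below any threshold earns nothing, regardless of view counts.
print(is_monetization_eligible(1_200, 4_500.0, True))   # True
print(is_monetization_eligible(50_000, 3_000.0, True))  # False: watch time
```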

YouTube Functionalities

Flagging

YouTube relies on users to report inappropriate content[8]. Reported videos are not immediately taken down. Instead, they are reviewed to determine whether they violate YouTube's community guidelines; videos that do are taken down.

Strikes

If a video is found to violate YouTube's community guidelines, the video is removed and the channel that uploaded it receives a strike. A channel that receives three strikes in a 90-day period will be permanently removed from YouTube[9]. There is a process to appeal the strikes. After appealing a strike, three outcomes are possible:

  • If YouTube finds that the content follows Community Guidelines, the strike will be removed and the content will be reinstated. If the user appeals a warning and the appeal is granted, the next violation will again result only in a warning rather than a strike.
  • If YouTube finds that the content follows Community Guidelines, but is not appropriate for all audiences, YouTube will apply an age restriction. If the content is a video, it will be hidden from users who are logged out, under the age of 18, or have Restricted Mode activated. If it is a custom thumbnail, it will be removed.
  • If YouTube finds that the content was in violation of their Community Guidelines, the strike will stay and the video will remain down from the site. There is no additional penalty for appeals that are rejected.

Users may appeal each strike only once[10].
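
The strike lifecycle above can be modeled as a sliding 90-day window. The sketch below is a hypothetical model, not YouTube's implementation; it checks only the accrual rule (in reality a termination is permanent, and appeals can alter the outcome).

```python
# Hypothetical model of the strike policy: a strike ages out after 90
# days, and three active strikes mean the channel is terminated.

from datetime import date, timedelta

STRIKE_LIFETIME = timedelta(days=90)

class Channel:
    def __init__(self):
        self.strikes = []  # dates on which strikes were issued

    def add_strike(self, issued):
        self.strikes.append(issued)

    def active_strikes(self, today):
        return sum(1 for d in self.strikes if today - d < STRIKE_LIFETIME)

    def is_terminated(self, today):
        return self.active_strikes(today) >= 3

channel = Channel()
for offset in (0, 30, 60):  # three strikes within one 90-day window
    channel.add_strike(date(2019, 1, 1) + timedelta(days=offset))
print(channel.is_terminated(date(2019, 3, 15)))  # True: 3 active strikes
print(channel.is_terminated(date(2019, 7, 1)))   # False: strikes aged out
```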

History of YouTube Monetization Policies

2007: Launch of original YouTube Partner Program (YPP) for channels to earn ad revenue

2011: YouTube invests $100 million in content creators

2012: YouTube opens the YPP[11] to everyone; creators monetize individual videos rather than entire channels

2014: YouTube launches "Google Preferred", a program that allows advertisers to pay more for their ads to appear specifically on high-performing creators' videos

2015: Launch of YouTube RED, a paid content subscription service

2017: The "Adpocalypse"[12][13] begins; policy changes require a channel to have 10,000 lifetime views in order to be monetized[14]

2018: YouTube changes YPP requirements to 4,000 watch-hours and over 1,000 subscribers

2019: YouTube requires YPP in addition to the account being linked to an AdSense account for monetization

Types of Censorship

Demonetization

Demonetization, with respect to YouTube, is when a video that would normally have ads played with it (from which the creator would receive revenue) does not have ads. Video demonetization is determined by an algorithm based on advertiser-friendly guidelines. Many YouTubers have claimed that their content has been unjustifiably demonetized, which is detrimental to their careers. Nasim Aghdam, one such YouTuber, attacked YouTube's San Bruno, California headquarters in April 2018, presumably motivated by her displeasure at YouTube's monetization policies[15]. Her channel, which featured content ranging from animal rights and veganism to exercise demonstrations, was affected by changes related to age-restricted video criteria and monetization status. She posted a video complaining that one of her videos was demonetized “after new close-minded YouTube employees, got control of [her] Farsi YouTube channel last year, 2016, and began filtering [her] videos to reduce views & suppress & discourage [her] from making videos[16]." Her ultimate decision to act violently on the basis of this perceived injustice necessitates closer consideration of YouTube's monetization policies and their implications for content creators and the community.
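
YouTube has not disclosed how its demonetization algorithm works. As a purely hypothetical illustration of what guideline-based classification might look like, one can imagine a keyword screen over a video's metadata; every term and name below is invented for the example.

```python
# Purely hypothetical stand-in for an unpublished algorithm: flag a
# video as not advertiser-friendly if its metadata matches a term list.

SENSITIVE_TERMS = {"tragedy", "violence", "weapons"}  # invented examples

def is_ad_friendly(title, tags):
    """Return False (demonetize) if any sensitive term appears."""
    text = " ".join([title.lower()] + [t.lower() for t in tags])
    return not any(term in text for term in SENSITIVE_TERMS)

print(is_ad_friendly("Morning workout routine", ["fitness"]))   # True
print(is_ad_friendly("Reporting from the tragedy", ["news"]))   # False
```

A screen this blunt inevitably produces false positives, which is precisely the kind of misclassification creators complain about.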

Adpocalypse and PewDiePie

YouTube's "Adpocalypse" refers to the phenomena of widespread demonetization of content due to new advertiser-friendly guidelines. It allegedly began with controversy related to an anti-Semitic video posted by the popular YouTuber Felix Arvid Ulf Kjellberg, better known as PewDiePie. PewDiePie was the most subscribed channel on YouTube in 2017. On January 11, 2017, PewDiePie became the center of controversy due to a video he posted in which he hired people on Fiverr to hold up a sign that read "Death to all Jews". This video was heavily criticized in a series of Wall-Street Journal articles which emphasized the consequences that advertisers might face due to unwitting association with similar content. Following these articles, advertisers realized their commercials were being played alongside terrorist and other hate-inciting videos. Many large advertisers such as Coca-Cola and Amazon[17] pulled their ads off of YouTube, causing the company to make drastic changes in order to re-establish the revenue it received from these advertisers[18]. Changes included sweeping reforms to automatic-censorship policies, and the expansion of the "Not Advertiser Friendly" (NAF) tag on videos[19]. After the Adpocalypse, PewDiePie was dropped by Maker Studios, YouTube ended his YouTube RED series in its second season, and he was removed from the Google Preferred list of content creators. As of 2019, he is still making gaming and satire videos on YouTube through standard AdSense revenue[20].

Logan Paul

Logan Paul, originally a famous Viner, successfully transitioned to YouTube after Vine shut down. Paul faced backlash for a video he posted in December 2017 in which he walked through Aokigahara, a notorious "suicide forest" in Japan. He was criticized for showing the corpse of a suicide victim and making insensitive comments about it. The video quickly rose to Trending status, and many channels posted reaction videos, generally expressing outrage and disgust at Paul's video. While these reaction videos were swiftly demonetized, the original video was neither demonetized nor taken down. Eventually Paul removed the video himself and issued apologies[21][22]. Much like PewDiePie, he faced repercussions from YouTube itself: he lost his Google Preferred status and was cut from a YouTube RED series, though he retained base AdSense revenue from videos on his channel. Shortly after the fiasco, YouTube changed the requirements for the YouTube Partner Program in an effort to "prevent bad actors from harming the inspiring and original creators[23]." In order to have a monetized video, a YouTuber must have 4,000 watch hours, 1,000 subscribers, and a connected AdSense account to collect the revenue[24].

Content Creators

Some YouTubers aspire to make YouTube their full-time career, even quitting their current jobs to expand their channels[25]. However, the race to earn subscribers and quick views encourages low-quality content: creators turn to making clickbait videos for shock value, following a recent trend, or both[26]. Additionally, YouTube withholds monetization from videos involving “Controversial issues and sensitive events[27]." However, some content creators have noted that this policy is applied inconsistently. Casey Neistat posted a video on the Las Vegas mass shooting, pledging to donate all proceeds from the video to relevant charities[27]. YouTube demonetized his video, claiming that “no matter the intent, our policy is to not run ads on videos about tragedies[27].” Philip DeFranco, another content creator who made a video about the tragedy, pointed out YouTube's hypocrisy: while YouTube demonetized Neistat's and DeFranco's videos, it continued to run commercials on Jimmy Kimmel's videos about the same tragedy[27].

Filtering

User Flagging

Another technique YouTube uses to identify extremist or inappropriate content is user flagging. By clicking "flag" under a video, a user can report it. To combat offensive content, YouTube took flagging a step further by introducing the YouTube Trusted Flagger Program[28]. Under this program, individuals, government agencies, and non-governmental organizations who “flag frequently and with a high rate of accuracy” can use advanced tools to flag videos on a larger scale[29]. If a flagged video does not violate policy but still contains what YouTube considers “inappropriate” content, it is put into a limited state[30]. Oftentimes when YouTube removes a viral video, it is re-uploaded or posted to another website, drawing even more attention to it[30]. In a limited state, users can still view the video, but it is demonetized and cannot be commented on. With this method, YouTube can quiet offensive content without fully censoring it[30].
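
The three review outcomes described in this section (removal with a strike, the limited state, or no action) can be summarized in a small decision function. This is an illustrative sketch with invented names, not YouTube's actual moderation code.

```python
# Illustrative decision function for a reviewed flag, not YouTube's code.

from dataclasses import dataclass

@dataclass
class ReviewResult:
    removed: bool
    strike_issued: bool
    monetized: bool
    comments_enabled: bool

def review_flagged_video(violates_policy, deemed_inappropriate):
    if violates_policy:
        # Guideline violation: the video comes down, the channel is struck.
        return ReviewResult(True, True, False, False)
    if deemed_inappropriate:
        # Limited state: still viewable, but demonetized with comments off.
        return ReviewResult(False, False, False, False)
    # No action: normal monetization and comments remain.
    return ReviewResult(False, False, True, True)

print(review_flagged_video(False, True))  # the "limited state" outcome
```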

Automatic Filtering

In response to the growing number of users on YouTube Kids, the platform's kid-friendly offshoot, YouTube has introduced new filtering protocols intended to prevent young children from viewing age-inappropriate content. However, as with automatic demonetization, the algorithm sometimes misclassifies content. These misclassifications have led to much criticism and outrage from parents, a controversy that has been termed Elsagate. Many concerned parents have posted on Facebook warning others of the types of content that bypass the YouTube Kids auto-filtering algorithm[31], and one critic claimed that “The system is complicit in the abuse[32]." Others have voiced concerns that content creators who make deceptively innocuous-looking parody videos are exploiting the fact that children cannot distinguish between legitimate content and spoofs created to generate ad revenue[33]. Experts have taken issue with the betrayal of trust in beloved characters that results from children watching these videos.

YouTube's response to its algorithm's fallibility has been to provide a disclaimer in the YouTube Kids Parental Guide that states “While our automated filters try to keep out content that is not appropriate for kids, … it's possible your kid may find something you don't want them to watch” and “no automated system is perfect and your kid may come across content with nudity, highly offensive language, and extreme violence[34]." YouTube also gives parents an alternative to automatic filtering: they can restrict their kids to content they have approved manually, or block specific content. However, given the extensive collection of videos on the platform, Malik Ducard, YouTube's global head of family and learning content, claims that these disturbing parodies are “the extreme needle in the haystack[35]."
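
The manual-approval alternative mentioned above amounts to an allowlist that overrides the imperfect automatic filter. The sketch below is hypothetical (the class and its methods are invented) and illustrates only the precedence: a parental block beats everything, and in manual mode only approved videos play.

```python
# Hypothetical sketch of the parental controls described above.

class ParentalFilter:
    def __init__(self, manual_mode=False):
        self.manual_mode = manual_mode  # True: only approved content plays
        self.approved = set()
        self.blocked = set()

    def approve(self, video_id):
        self.approved.add(video_id)

    def block(self, video_id):
        self.blocked.add(video_id)

    def can_watch(self, video_id, passes_auto_filter):
        if video_id in self.blocked:
            return False                 # explicit block always wins
        if self.manual_mode:
            return video_id in self.approved
        return passes_auto_filter        # fall back to the imperfect filter

controls = ParentalFilter(manual_mode=True)
controls.approve("abc123")
print(controls.can_watch("abc123", passes_auto_filter=False))  # True
print(controls.can_watch("xyz789", passes_auto_filter=True))   # False
```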

Professionalism and Ethics

YouTube has a large impact on the world. It supports the livelihoods of many content creators and is one of the most visited sites on the internet. YouTube has justified its monetization and acceptable-content policies by claiming its priority is to protect advertisers and viewers from inappropriate content. Nevertheless, its policies have received backlash from critics.

This case supports a couple of solid conclusions. One is that algorithms will always make mistakes. Illustrated above are several cases where automated systems did not work as intended: in the case of Logan Paul, a video that should have been demonetized and removed was not; with Casey Neistat, the stated rules were not applied fairly. YouTube recognizes this fallibility and offers an appeals process through which such decisions can be re-examined. Another conclusion is that accountability is a key component of responsiveness. YouTube responds most quickly and decisively to vital threats to its interests, namely advertisers pulling out; in the case of the Adpocalypse, it acted almost immediately. Complaints from individual content creators, by contrast, often go unaddressed.

There are many open questions regarding this case. Whose responsibility is it to police content? What content should be acceptable on a large, public platform like YouTube? There are many definitions of acceptable, especially in a global context. Another question is what commitment YouTube must make to freedom of speech. Freedom of speech is an often-lauded principle, especially in an American context, but many of the discussions around extremism and misinformation online call that principle (or at least the necessity of private companies' commitment to it) into question.

References

  1. https://www.huffingtonpost.com/2013/03/21/youtube-stats_n_2922543.html
  2. https://www.youtube.com/user/CNN
  3. https://www.youtube.com/channel/UCXIJgqnII2ZOINSWNOGFThA
  4. https://www.washingtonpost.com/business/economy/in-online-video-minorities-find-an-audience/2012/04/20/gIQAdhliWT_story.html
  5. https://www.wsj.com/articles/viewers-dont-add-up-to-profit-for-youtube-1424897967
  6. http://www.slate.com/articles/technology/technology/2009/04/do_you_think_bandwidth_grows_on_trees.html
  7. https://support.google.com/youtube/answer/72857?hl=en
  8. https://support.google.com/youtube/answer/2802027?co=GENIE.Platform%3DAndroid&hl=en-GB
  9. https://support.google.com/youtube/answer/2802032?hl=en
  10. https://support.google.com/youtube/answer/185111?hl=en
  11. https://support.google.com/youtube/answer/72851?hl=en
  12. https://www.reddit.com/r/OutOfTheLoop/comments/6cyuva/what_is_the_youtube_adpocalypse/
  13. https://www.tubefilter.com/2017/05/04/how-youtube-adpocalypse-affected-top-creators/
  14. https://youtube-creators.googleblog.com/2017/04/introducing-expanded-youtube-partner.html
  15. https://www.reuters.com/article/us-california-youtube-shooting/woman-wounds-three-at-youtube-headquarters-in-california-then-kills-herself-idUSKCN1HA2LA
  16. https://www.wired.com/story/police-say-youtube-policies-motivated-shooter/
  17. http://nymag.com/selectall/2017/12/can-youtube-survive-the-adpocalypse.html
  18. https://creatoracademy.youtube.com/page/lesson/advertiser-friendly#strategies-zippy-link-4
  19. https://www.theguardian.com/technology/2017/mar/21/youtube-google-advertising-policies-controversial-content
  20. https://www.youtube.com/channel/UC-lHJZR3Gqxm24_Vd_AJ5Yw
  21. https://twitter.com/LoganPaul/status/948026294066864128?tfw_site=nytimes&ref_src=twsrc%5Etfw&ref_url=https%3A%2F%2Fwww.nytimes.com%2F2018%2F01%2F02%2Fbusiness%2Fmedia%2Flogan-paul-youtube.html
  22. https://www.nytimes.com/2018/01/02/business/media/logan-paul-youtube.html
  23. https://youtube-creators.googleblog.com/2018/01/additional-changes-to-youtube-partner.html
  24. https://support.google.com/youtube/answer/72851?hl=en
  25. https://www.cnbc.com/2017/07/28/college-students-turning-youtube-channels-into-real-money.html
  26. https://www.polygon.com/2018/4/13/17231470/fortnite-strip-clickbait-touchdalight-ricegum-youtube
  27. https://gizmodo.com/youtube-videos-about-las-vegas-massacre-blocked-from-ma-1819219891
  28. https://www.engadget.com/2014/03/17/youtube-s-trusted-flagger-program-lets-police-call-attention-to/
  29. https://support.google.com/youtube/answer/7554338?hl=en
  30. https://gizmodo.com/youtube-has-a-new-naughty-corner-for-controversial-reli-1797429910
  31. https://www.facebook.com/photo.php?fbid=10155120216728667&set=a.466666843666.248808.673943666&type=3&theater
  32. https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2
  33. https://theoutline.com/post/1239/youtube-has-a-fake-peppa-pig-problem?zd=3&zi=74uk7u3k
  34. https://support.google.com/youtubekids/answer/7348644?hl=en&ref_topic=7348849
  35. https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html