Popular video sharing Web site YouTube adopted a policy banning videos “intended to incite violence or encourage dangerous, illegal activities” on September 11, 2008, several months after Sen. Joe Lieberman (I-Conn.) pressured the site and its owner, Google, to remove content he said was “designed to incite violence against America and Americans or that show graphic violence against American troops and others.”
In a May 19, 2008 letter to Google Chairman and CEO Eric Schmidt, Lieberman, Chairman of the Senate Homeland Security and Governmental Affairs Committee, said that, according to a staff report by the committee, “Islamist terrorist organizations use YouTube to disseminate their propaganda, enlist followers, and provide weapons training.” Lieberman wrote that YouTube searches turned up “dozens” of videos bearing the logos of designated terrorist organizations documenting attacks on American troops in Iraq or Afghanistan, providing weapons training, featuring speeches by al-Qaida leadership, and “general material intended to radicalize potential recruits.”
Lieberman’s letter called on the Web site “to immediately remove content produced by Islamist terrorist organizations” and to “explain what changes Google plans to make to the YouTube community guidelines to address violent extremist material.” The letter is available on Lieberman’s Web site at http://tinyurl.com/Liebermanletter.
Because more videos are uploaded to YouTube than the Web site could possibly monitor individually, it relies on a set of “community guidelines” that encourage users to “flag” inappropriate content, which is then reviewed by the Web site and considered for removal.
According to a May 25 New York Times editorial, YouTube initially removed 80 videos at Lieberman’s request, but refused to take down more videos because they did not violate the site’s guidelines against graphic violence or hate speech. A May 19 post on the Google Public Policy blog said, “While we respect and understand [Lieberman’s] views, YouTube encourages free speech and defends everyone’s right to express unpopular points of view.” The blog post is available at http://googlepublicpolicy.blogspot.com/2008/05/dialogue-with-sen-lieberman-on.html.
Some criticized Lieberman’s demands that YouTube more closely police the site. The May 25 New York Times editorial said “it is profoundly disturbing that an influential senator would even consider telling a media company to shut down constitutionally protected speech.”
In a June 6 commentary, the non-profit advocacy group California First Amendment Coalition (CFAC) said Lieberman’s letter demonstrated “a failure to understand how free speech works in this medium.”
However, the new community guidelines appear to back down from Google’s initial position.
The relevant YouTube community guidelines state, “we draw the line at content that’s intended to incite violence or encourage dangerous, illegal activities that have an inherent risk of serious physical harm or death.” The guidelines specifically cite “instructional bomb making, sniper attacks,” and “videos that train terrorists,” as examples of those that should be removed, and say “[a]ny depictions like these should be educational or documentary and shouldn’t be designed to help or encourage others to imitate them.” The guidelines are available online at http://www.youtube.com/t/community_guidelines.
YouTube spokesman Ricardo Reyes told The Washington Post for a September 12 story that “YouTube reviews its content guidelines a few times a year, and we take the community’s input seriously,” adding, “The senator made some good points.”
According to The Associated Press (AP) on September 13, YouTube did not identify specific videos that led to the new guidelines, nor did it say exactly how it would choose those that are purged.
In a September 11 press release, Lieberman praised the new guidelines, which the press release called “a move taken in direct response to the Senator’s complaints.”
– Patrick File
Silha Fellow and Bulletin Editor