Center for Strategic Communication

by Monika Maslikowski

Online PSYOP campaigns are a cheap and easy way for extremists to infiltrate U.S. public discourse about the fight against terrorism. The campaigns attempt to break the political will of U.S. policymakers and to persuade the public to doubt the purpose and effectiveness of its government’s policies.

Does extremist propaganda have enough breadth and resonance with Western audiences to make them re-think their government’s actions? The extremists must think so, because these efforts remain a critical component of their broader strategic communication.

On August 4th, Steve Corman reported on a new trend in online extremist PSYOP campaigns: deliberately misrepresenting the content of YouTube videos in order to lure pro-American viewers into watching violent attacks or extremist messages.

Extremists also encourage their followers to wage a “media jihad” against Western audiences to promote anti-war sentiment. Last year, MEMRI reported that the “Al Mohajroon” website gave these virtual warriors specific instructions to “break [Americans’] spirits” by posting on forums and sites popular with Westerners.

Specifically, they suggested posting videos and images of American soldiers committing “crimes” such as killing unarmed civilians, women, or children. They also encouraged fabricating stories about disaffected American soldiers who have turned against the war:

Obviously, you should post your contribution…as an American…You should correspond with visitors to this forum bringing to their attention the frustrating situation of their troops in Iraq…You should invent stories about American soldiers you have [allegedly] personally known…who were drafted to Iraq and then committed suicide while in service…Also, write using a sad tone, and tell them that you feel sorry for your [female] neighbor or co-worker who became addicted to alcohol or drugs…because her poor fiancé, a former soldier in Iraq, was paralyzed or [because] his legs were amputated…

Over the past several years, extremist media organizations, forums, and bloggers have called on their multilingual readers to translate texts, videos, magazines, and statements into English. One of the more recent suggestions was posted on the al-Ekhlaas forum on Aug. 18; it called on media organizations and individuals to provide translations of popular jihadist e-journals like Sawt al-Jihad.

There is also a growing number of extremist websites in English, created both by the official media organizations of extremist groups and by individuals who adhere to their ideology. Countless sites, blogs, and discussion forums seek to engage the U.S. population and manipulate its opinion. The language barrier between the United States and its adversaries in the global extremist network has been rendered practically irrelevant by the prevalence of English-language Islamist media.

So what, if anything, should be done about extremist Internet content? The Internet is famously difficult to control or regulate. Proposals to do so are almost always controversial.

Extremist videos posted on YouTube caught the attention of U.S. lawmakers earlier this summer. On May 19, 2008, Senator Joe Lieberman wrote a letter to the chairman of Google, Inc. (which owns YouTube) asking the company to develop a method to systematically remove extremist videos from YouTube. The senator noted, as Steve did, that “this should be a straightforward task since so many of the Islamist terrorist organizations brand their material with logos or icons identifying their provenance.”

The YouTube Team responded on their blog that same afternoon. They promptly removed 80 videos that featured explicit violence. However, they declared that in order to encourage a “healthy debate”, most of the videos identified by Sen. Lieberman would remain online. Despite their obvious affiliation with extremist groups, the videos were not considered in violation of YouTube’s policies.

Sen. Lieberman’s request sought to counteract the threat of self-radicalization of individuals within U.S. borders, which may be fueled by readily available extremist propaganda. In his response to Google’s actions, he stated that “no matter what their content, videos produced by terrorist organizations…should not be tolerated.”

One comment on the YouTube blog, by a user named “northshore83”, points out that the issue revolves around recruitment:

If our country has designated al-Qaeda a terrorist group then how much of a difference is there between permitting them to recruit on this privately owned forum and recruiting at a privately owned building? Would you object to the al-Qaeda representatives being prevented from staging a fundraiser or recruitment meeting in your home town?

Removing videos that simply propagate an ideology walks a fine constitutional line. However, YouTube is a private organization that is not bound by constitutional restrictions, and it is unlikely that those posting the material in question are even U.S. citizens. Since the promoters of the ideology advocate violence against YouTube’s home country, Sen. Lieberman’s request seems to many to be simple common sense.

Finding extremists online and removing their websites and postings is another contentious issue. Some go as far as accusing U.S.-based extremist bloggers of treason, claiming that they are actively propagating an ideology that promotes the destruction of America. Yet the hosts of many of these extremist websites are either unaware of their content or unwilling to remove them, given the sheer volume of sites they support and legal protections of free speech.

U.S. government officials have been lukewarm to the idea of trying to stamp out extremist messages online. Following the Lieberman–YouTube exchange, a Washington Post article quoted a senior U.S. counterterrorism official as saying, “Yes, we could go around shutting down Web sites, but it doesn’t really work as a strategic weapon against al-Qaeda,” because as soon as one site is shut down, another pops up.

So, are Americans falling for the bait? The effectiveness of these extremist PSYOP campaigns is difficult to assess. The number of hits on a particular YouTube video may be an indication of popularity, yet it does not distinguish American viewers from others, and it says nothing about the impact of the message. Although minimizing the presence of extremist PSYOP campaigns online might be a good idea, it should not displace another priority: creating a counter-media strategy that can deconstruct extremist ideology and reduce demand for it on the Internet.

UPDATE:

On September 11, 2008, Sen. Joe Lieberman’s office issued a press release announcing Google’s decision to strengthen its policies regarding YouTube videos that incite violence. The new community guidelines will now target videos that include “predatory behavior, stalking, threats, harassment, intimidation, invading privacy, revealing other people’s personal information, and inciting others to commit violent acts…” Although Google has removed hundreds of videos from YouTube since Sen. Lieberman’s initial request, extremist videos depicting and inciting violence remain prevalent on the website.