Terror Web 2.0
The Net-Centric Operations of Terrorist Groups Today
By Guest Contributor Jeffrey Carr
The latest phase of the Internet revolution, which has been widely referred to as Web 2.0, has not been overlooked by web-based terror networks. A recent study by the Artificial Intelligence Lab of the University of Arizona details precisely how these net-savvy terrorists are using the Web for fund-raising, recruitment, propaganda, logistical support, communications, training, and even cyber warfare.
The following table breaks down terrorist objectives and how they are supported by web sites and web-based features:
Table 1: How Websites Support Objectives of Terrorist/Extremist Groups

| Terrorist objectives | Tasks supported by web sites | Web features |
| --- | --- | --- |
| Increase fund raising | | |
| Overcome obstacles from law enforcement and the military | | |
| Provide recruitment and training | | |
The Pentagon recently announced that it monitors more than 5,000 jihadist sites and keeps a close watch on the 100 most active and hostile. The European Union launched its "Check the Web" portal in May 2007, a Europol (European Police Office) resource to which all 27 member states can contribute intelligence. In spite of these efforts, and those of the U.S. Intelligence Community, a number of obstacles confound our ability to find, capture, and evaluate this data.
For one, conventional search engines like Google crawl and index only a tiny fraction of the data on the Web, typically just the first 101 KB of a web page. Keywords entered into Google's search window are run against the indexed data in Google's massive data stores, not against the Web itself. For another, terrorist websites may use other means to make themselves invisible to web crawlers, including (but not limited to):
- Password-protected pages
- Noindex meta tags
- Relational databases
- Spider traps
- Real-time content
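The noindex technique is the simplest of these to illustrate: a well-behaved crawler parses each fetched page and drops from its index any page whose robots meta directive contains "noindex". The sketch below shows this check using only Python's standard library; the function names are illustrative, not taken from any real crawler.

```python
# Minimal sketch: how a crawler honors the noindex meta tag.
# Names like is_indexable are illustrative, not from a real crawler.
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                content = attrs.get("content") or ""
                self.directives.extend(
                    d.strip().lower() for d in content.split(","))


def is_indexable(html: str) -> bool:
    """Return False if the page asks crawlers not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" not in parser.directives


hidden = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
normal = '<html><head><title>Open page</title></head></html>'
print(is_indexable(hidden))  # False: the page opts out of indexing
print(is_indexable(normal))  # True: no robots directive present
```

The key point for researchers is that this opt-out is purely voluntary: the page is still publicly reachable, it simply never appears in search results, which is exactly why manual collection remains necessary.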
Most researchers involved in the study of the Terror Web understand the limitations of public search engines and resort to manual collection, storage, and analysis of web content. Qin et al. (2007) point out that manual collection and analysis is very limiting, and that as of November 2006 almost no studies had analyzed these groups' level of technical sophistication compared to that of mainstream organizations.
The Terror Web's capability for cyber warfare was recently demonstrated by the Denial-of-Service attack on the government of Estonia, a collective worldwide effort by Russian nationalists to disrupt and cripple Estonia's Internet resources. The attack succeeded, and it required no sophisticated equipment or specialized knowledge. The number and size of bot networks is hard to measure, but recent FBI activity, such as Operation Bot Roast, suggests that potential victims of botnet activity number in the millions. And these are just the networks that law enforcement can identify.
It is important to understand that Western governments are fighting a desperate battle to get a handle on these developments. While both service-specific and joint doctrine on how to fight in cyberspace exists, the institutions, policies and procedures necessary to overcome cyber-based terrorist attacks face numerous challenges. Many of these are simply bureaucratic in nature while others are clearly linked to the infrastructure limitations and security measures levied on defense and intelligence agencies. The sooner such limitations can be overcome, the sooner we can effectively counter terrorism in cyberspace.
Thanks to an increase in terrorism research funding made available by various government agencies, there is a growing body of work from institutions such as RAND, the Centre for the Study of Terrorism and Political Violence at St. Andrews University, Scotland, the Center for Strategic and International Studies (Washington, D.C.), and the Dark Web Project at the University of Arizona, which published "Mapping the contemporary terrorism research domain" in October 2006.
Jeffrey Carr participated in law enforcement and intelligence gathering activities with the U.S. Coast Guard until 1980. Today he is an information architect for analyst software, and writes about Data Fusion and Geospatial Intelligence at his blog www.IntelFusion.net.