2008 41st Hawaii International Conference on System Sciences (HICSS 2008)
Waikoloa, Big Island, Hawaii
Jan. 7, 2008 to Jan. 10, 2008
The 9/11 Commission Report and the National Intelligence Reform Act both state that the development of terrorist network database collection processes is an immediate and pressing requirement. This paper is a study and comparison of two complementary approaches to developing a terror network dataset: Automap, a Network Text Analysis (NTA) tool; and Intelligence Analyst coding, a human process. NTA tools are an emerging class of software that supports the analysis of quantitative characteristics of large-scale textual data, as well as the extraction of meaning from texts. Intelligence Analyst coding is the traditional method, in which a human reads and cognitively processes each raw field report. In this study, both approaches were applied to the same one hundred eighty-three open-source texts on the Al Qaeda organization. Each approach's process, dataset product, and analytics are compared qualitatively and quantitatively. In terms of process, the Automap-assisted system required less manpower and time. In terms of dataset product, each approach identified unique nodes and relationships that the other missed. Lastly, the differences between the datasets significantly impacted threat analytics and potential course-of-action selection. These results suggest an integrated, human-centered automation support approach to intelligence dataset development.
John M. Graham, Kathleen M. Carley, Drew Cukor, "Intelligence Database Creation and Analysis: Network-Based Text Analysis versus Human Cognition", Proceedings of the 41st Hawaii International Conference on System Sciences (HICSS 2008), p. 76, 2008, doi:10.1109/HICSS.2008.213