San Francisco, CA
May 23-24, 2013
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/SPW.2013.12
The manual forensic investigation of security incidents is an opaque process that involves the collection and correlation of diverse evidence. In this work we conduct a complex experiment to expand our understanding of forensic analysis processes. Over a period of four weeks we systematically investigated 200 detected security incidents involving compromised hosts within a large operational network. We used data from four commonly-used security sources, namely Snort alerts, reconnaissance and vulnerability scanners, blacklists, and a search engine, to manually investigate these incidents. Based on our experiment, we first evaluate the (complementary) utility of the four security data sources and, surprisingly, find that the search engine provided useful evidence for diagnosing many more incidents than the more traditional security sources, i.e., blacklists and reconnaissance and vulnerability reports. Based on our validation, we then identify and make available a list of 138 good Snort signatures, i.e., signatures that were effective in identifying validated malware without producing false positives. In addition, we compare the characteristics of good and regular signatures and highlight a number of differences. For example, we observe that good signatures check on average 2.14 times more bytes and 2.3 times more fields than regular signatures. Our analysis of Snort signatures is essential not only for configuring Snort, but also for establishing best practices and for teaching how to write new IDS signatures.
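To illustrate what "bytes and fields checked" means in this context, the following is a hypothetical Snort rule (written for illustration; it is not one of the 138 signatures identified in the paper). A signature of this kind constrains several rule fields (flow, http_method, http_uri, dsize) and matches multiple content byte patterns, in line with the observation that good signatures check more bytes and more fields than regular ones:

```
alert tcp $HOME_NET any -> $EXTERNAL_NET 8080 (msg:"Hypothetical C&C check-in"; \
    flow:established,to_server; \
    content:"POST"; http_method; \
    content:"/gate.php"; http_uri; \
    content:"User-Agent|3a| Mozilla/4.0"; \
    dsize:<500; sid:1000001; rev:1;)
```

A "regular" signature, by contrast, might match only a single short content string with no flow or size constraints, making it more prone to false positives.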
Index Terms: Infections, Network forensics, IDS, Malware
Elias Raftopoulos, Xenofontas Dimitropoulos, "Understanding Network Forensics Analysis in an Operational Environment", 2013 IEEE CS Security and Privacy Workshops (SPW 2013), 2013, pp. 111-118, doi:10.1109/SPW.2013.12