A Large-Scale Study of the Time Required to Compromise a Computer System
Jan.-Feb. 2014 (vol. 11 no. 1)
pp. 2-15
Hannes Holm, The Royal Institute of Technology, Stockholm
A frequent assumption in the domain of cybersecurity is that cyberintrusions follow the properties of a Poisson process, i.e., that the number of intrusions is well modeled by a Poisson distribution and that the time between intrusions is exponentially distributed. This paper studies this assumption by analyzing all cyberintrusions detected across more than 260,000 computer systems over a period of almost three years. The results show that the Poisson process model might be suboptimal: the log-normal distribution is a significantly better fit for modeling both the number of detected intrusions and the time between intrusions, and the Pareto distribution is a significantly better fit for modeling the time to first intrusion. The paper also analyzes whether the time to compromise (TTC) increases with each successful intrusion of a computer system. The results suggest that TTC instead decreases with the number of intrusions of a system.
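For a concrete sense of the distribution comparison the abstract describes, the sketch below (not taken from the paper) fits exponential, log-normal, and Pareto distributions to a sample of inter-intrusion times using scipy, and ranks the fits by Kolmogorov-Smirnov statistic and AIC. The sample data and all parameter values are synthetic placeholders; the paper's actual data set and goodness-of-fit procedure may differ.

    # Hypothetical sketch: compare candidate distributions for times
    # between intrusions. The data here are synthetic; the paper analyzes
    # detected intrusions across more than 260,000 systems.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    times = rng.lognormal(mean=2.0, sigma=1.0, size=5000)  # placeholder sample

    candidates = {
        "exponential": stats.expon,
        "log-normal": stats.lognorm,
        "Pareto": stats.pareto,
    }

    for label, dist in candidates.items():
        params = dist.fit(times)                     # maximum-likelihood fit
        ks_stat, p_value = stats.kstest(times, dist.name, args=params)
        log_lik = np.sum(dist.logpdf(times, *params))
        aic = 2 * len(params) - 2 * log_lik          # lower AIC = better fit
        print(f"{label:12s} KS={ks_stat:.3f} AIC={aic:.1f}")

On data like the paper's, such a comparison would favor the log-normal over the exponential for inter-intrusion times, which is the kind of result the abstract reports.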
Index Terms:
Malware, Computational modeling, Statistical distributions, Workstations, Network management, Invasive software (viruses, worms, Trojan horses), Risk management
Citation:
Hannes Holm, "A Large-Scale Study of the Time Required to Compromise a Computer System," IEEE Transactions on Dependable and Secure Computing, vol. 11, no. 1, pp. 2-15, Jan.-Feb. 2014, doi:10.1109/TDSC.2013.21