Issue No. 5 - Sept.-Oct. 2013 (vol. 30), pp. 30-37
Tao Xie, Univ. of Illinois at Urbana-Champaign, Urbana, IL, USA
ABSTRACT
With software analytics, software practitioners explore and analyze data to obtain insightful, actionable information for tasks regarding software development, systems, and users. The StackMine project produced a software analytics system for Microsoft product teams. The project provided lessons on applying software analytics technologies to positively impact software development practice. The lessons include focusing on problems that practitioners care about, using domain knowledge for correct data understanding and problem modeling, building prototypes early to get practitioners' feedback, taking into account scalability and customizability, and evaluating analysis results using criteria related to real tasks.
INDEX TERMS
Algorithm design and analysis, Software systems, Data mining, Performance analysis, Debugging, Software analytics, Software engineering, actionable information, mining software repositories, technology transfer, StackMine, data exploration, software artifacts, insightful information
CITATION
Dongmei Zhang, Shi Han, Yingnong Dang, Jian-Guang Lou, Haidong Zhang, and Tao Xie, "Software Analytics in Practice," IEEE Software, vol. 30, no. 5, pp. 30-37, Sept.-Oct. 2013, doi:10.1109/MS.2013.94
REFERENCES
1. J. Czerwonka et al., “CRANE: Failure Prediction, Change Analysis and Test Prioritization in Practice—Experiences from Windows,” Proc. IEEE 4th Int'l Conf. Software Testing, Verification and Validation (ICST 11), IEEE CS, 2011, pp. 357-366.
2. E. Shihab et al., “An Industrial Study on the Risk of Software Changes,” Proc. ACM SIGSOFT 20th Int'l Symp. Foundations Software Eng. (FSE 12), ACM, 2012, article 62.
3. K. Glerum et al., “Debugging in the (Very) Large: Ten Years of Implementation and Experience,” Proc. ACM SIGOPS 22nd Symp. Operating Systems Principles (SOSP 09), ACM, 2009, pp. 103-116.
4. T. Gorschek et al., “A Model for Technology Transfer in Practice,” IEEE Software, vol. 23, no. 6, 2006, pp. 88-95.
5. A. Sandberg, L. Pareto, and T. Arts, “Agile Collaborative Research: Action Principles for Industry-Academia Collaboration,” IEEE Software, vol. 28, no. 4, 2011, pp. 74-83.
6. C. Wohlin et al., “The Success Factors Powering Industry-Academia Collaboration,” IEEE Software, vol. 29, no. 2, 2012, pp. 67-73.
7. S. Han et al., “Performance Debugging in the Large via Mining Millions of Stack Traces,” Proc. Int'l Conf. Software Eng. (ICSE 12), IEEE, 2012, pp. 145-155.
8. Y. Dang et al., “XIAO: Tuning Code Clones at Hands of Engineers in Practice,” Proc. 28th Ann. Computer Security Applications Conf. (ACSAC 12), ACM, 2012, pp. 369-378.
9. R. Ding et al., “Healing Online Service Systems via Mining Historical Issue Repositories,” Proc. 27th IEEE/ACM Int'l Conf. Automated Software Eng. (ASE 12), ACM, 2012, pp. 318-321.
10. Q. Fu et al., “Performance Issue Diagnosis for Online Service Systems,” Proc. 31st IEEE Symp. Reliable Distributed Systems (SRDS 12), IEEE CS, 2012, pp. 273-278.
11. D. Zhang et al., “Software Analytics as a Learning Case in Practice: Approaches and Experiences,” Proc. Int'l Workshop Machine Learning Technologies in Software Eng. (MALETS 11), ACM, 2011, pp. 55-58.
12. D. Zhang and T. Xie, “Software Analytics in Practice: Mini Tutorial,” Proc. Int'l Conf. Software Eng. (ICSE 12), IEEE, 2012, p. 997.
13. J.-G. Lou et al., “Software Analytics for Incident Management of Online Services: An Experience Report,” to appear in Proc. Int'l Conf. Automated Software Eng. (ASE 13), IEEE, 2013.