Issue No. 4, July/August 2010 (vol. 27), pp. 44-50
Eric Bouwers , Software Improvement Group, Amsterdam
Arie van Deursen , Delft University of Technology, Delft
Architecture evaluations offer many benefits, including the early detection of problems and a better understanding of a system's possibilities. Although many methods for evaluating architectures are available, studies have shown that industry adoption of architecture evaluations is low. One reason for this low adoption is the limited out-of-the-box process and tool support for starting architecture reviews. This article introduces the Lightweight Sanity Check for Implemented Architectures (LiSCIA), an evaluation method that can be used out of the box to perform a first architectural evaluation of a system. The check is based on years of experience in evaluating the maintainability of software systems. By performing this check periodically, developers and project managers can control the erosion of the implemented architecture as the system and its requirements evolve over time.
software architectures, software architecture evaluation, architecture erosion, software quality
Eric Bouwers, Arie van Deursen, "A Lightweight Sanity Check for Implemented Architectures", IEEE Software, vol.27, no. 4, pp. 44-50, July/August 2010, doi:10.1109/MS.2010.60
