February 2008 (Vol. 9, No. 2) p. 2
1541-4922/08/$31.00 © 2008 IEEE
Published by the IEEE Computer Society
Computing Curricula for the 21st Century
Graduates of computer science (CS) and software engineering (SE) programs are typically employed to develop industry-strength software. Computer engineering (CE) programs focus primarily on computing-system design, often with significant software components. These three programs have different emphases: development of new algorithms versus development of large, complex software systems versus development of small embedded software and device drivers. All three areas require good SE practices.
Regardless of which of the three flavors of computing programs you consider, the common denominator is the development of dependable software. Modern systems are intricate combinations of hardware and software, and the industry of the future needs employees who can understand both sides of the system and appreciate how they must seamlessly communicate and interface.
For example, modern aircraft use increasing numbers of computers and dedicated hardware to process the growing amounts of data needed to monitor their systems' status. Software-intensive control systems are distributed across an airframe, connected by an arcane bus structure that supports such components as flight controls, displays, weather radar, data links, propulsion control, actuation terminals, power systems, and fuel and stores management. Hardware and software for coordinating these various functions have become larger and more complex to meet the increasing onboard computational demands. To create such real-world software, developers must understand the system implications of the SE activity. This observation relates to nearly all areas of modern computing application, from home appliances to banking, from toys to nuclear reactor controls, and from entertainment gadgets to medical equipment.
The roles of software engineering and real-time in computing taxonomy
Most computing-education programs focus on theoretical foundations and programming skills. Frequently, the background and interests of CS faculty, who often prefer theoretical treatments of topics such as scheduling and concurrency, keep them from teaching the practical aspects of hardware-software interaction and the real-time dependable-systems development techniques that are so much in demand by industry. In addition, conventional computing laboratories limit the potential for student experiments.
Incorporating SE practices into undergraduate computing programs, complementing the teaching of conventional CS courses, is important for the software industry.1,2 Introducing process scripts, requirements, design and code reviews, and metrics familiarizes students with the SE discipline. A rigorous, repeatable process helps create an environment in which software products are developed more efficiently and with fewer defects.
Although SE is considered critical to the development of almost all modern systems, it is often misunderstood and mischaracterized as a subdiscipline of CS. The 2004 ACM/IEEE-CS curriculum guidelines for SE programs discuss the common misconception that SE is primarily about process-oriented activities (that is, quality assurance, software process management, and project management).3 Such a view misrepresents the nature and challenges of SE.
Because of SE's special nature and its relative youth, there have been several attempts in the past decade to provide guidance for the development of undergraduate SE programs.4 Because SE is concerned with the development of large, complex systems, software development practice requires more than the underlying CS principles alone. Students need the analytical and descriptive tools developed in CS, and they also need the rigor that the engineering discipline brings to product design, which ensures the reliability and trustworthiness of the artifacts produced.
Real-time behavior is the common characteristic of dependable systems, so it is instructive to see how real-time is treated in the ACM/IEEE Computer Society taxonomy, which categorizes the computing discipline's body of knowledge into subject areas, knowledge units, and lecture topics.3 The taxonomy treats real-time only marginally, as an elective topic in the operating systems subject area. Another relevant taxonomy, Computing Curricula 2005: The Overview Report, uses the term "real-time" only once, in the glossary, when it defines computer system engineering as integrating aspects of CE, SE, and CS.5
Changes in the computing discipline
The Proceedings of the 2007 Conference on the Future of Software Engineering presented the state of the art in programming environments, empirical methods, architectural challenges, performance and reliability, testing and analysis, mechatronics, complex systems, academia/industry collaboration, globalization, and educational challenges.6 The editors write,
Software engineering is a rapidly evolving field of research and practice. It is a highly diverse and vast realm of knowledge, spanning from management and process issues in software development to system issues such as safety, quality, and deployment. … As a result, it is difficult for anyone to follow, even at a high level, how the various elements of software engineering research are evolving and what to expect in the future.
However, recent discussion involving computing faculty from the University of California, Berkeley; Cornell University; Stanford University; and Princeton University identifies new directions in computing that might impact education.7 In the short term, computing innovations might include high-quality machine translation; reliable speech understanding; lightweight, high-capacity e-books; theft-proof electronic wallets; self-healing software, including adaptive networks that reconfigure for reliability; robotics for mine safety and planetary exploration; prosthetics for medical care; manufacturing; and so on.
Nanotechnology and quantum computing could well be fundamental ingredients in the next computing revolution. Massively parallel computation based on swarms of conventional chips underlies another potential revolution. Trustworthy computing will finally overcome its historical market-failure problems and become a commonplace requirement. These innovations often require better tools, extended programming languages, and new processor architectures.
The broad discipline of computing has changed radically in the past 10 years, not only affecting other fields but also being affected by other fields. The principal issue is that modern computing must involve understanding complex interactions with the real world and the integration of systems. In addition to keeping pace with the rapid progress of technology, the critical issue is enforcing engineering discipline when developing, verifying, and validating complex software-intensive systems. An additional significant issue is the need to understand the multidisciplinary nature of the real world, which requires a combination of expertise ranging from control to electrical to computer to software engineering. However, most computing curricula don't address an integrative view of the discipline and haven't matched industry needs and the challenges posed by ever-expanding and increasingly complex applications. The applications that best fit this category are in aviation and aerospace, medicine, transportation, and nuclear energy, where software plays a critical role and its dependability is of paramount importance.
Thomas Henzinger and Joseph Sifakis write about the need to renew the CS curriculum.8 Software and hardware designers employ different design principles and approaches. Software designers see the system in terms of dynamic objects and threads constituting sequential building blocks or virtual machines with their semantic interpretation of a computational model. Hardware designers compose the system with parallel building blocks representing physical entities with appropriate data flows between them. The blocks have formal transfer function semantics described by a set of equations forming an analytical model. These two models—that is, computational versus analytical—support different design processes.
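The contrast can be made concrete with a small hypothetical example (not from the article): a first-order low-pass filter written both as an analytical model—a difference equation relating whole signals—and as a computational model—a stateful object advanced step by step, as a thread or virtual machine would be. The names and coefficient here are illustrative assumptions.

```python
# Illustrative sketch: the same system, two model styles.

# Analytical model: a difference equation with transfer-function
# semantics, y[n] = a*y[n-1] + (1-a)*x[n], evaluated over a signal.
def filter_analytical(xs, a=0.5):
    ys, prev = [], 0.0
    for x in xs:
        prev = a * prev + (1 - a) * x
        ys.append(prev)
    return ys

# Computational model: a dynamic object with internal state,
# advanced one step at a time by a caller or scheduler.
class FilterProcess:
    def __init__(self, a=0.5):
        self.a, self.y = a, 0.0

    def step(self, x):
        self.y = self.a * self.y + (1 - self.a) * x
        return self.y

signal = [1.0, 1.0, 1.0, 1.0]
proc = FilterProcess()
# Both views compute the same behavior, but support different
# design processes: equation manipulation versus stepwise execution.
assert filter_analytical(signal) == [proc.step(x) for x in signal]
```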
SE curricula must account for the close relationship between what used to be considered separate: software versus hardware. For example, software application designers focusing on programs that run on microprocessors often overlook programmable-logic devices. The most popular such device is the field-programmable gate array. An FPGA is a prefabricated integrated circuit that you can configure to implement a particular design by downloading a sequence of bits. In that sense, a circuit implemented on an FPGA is literally software. However, circuit designers are still considered hardware specialists, and the algorithms ported to circuits are still considered hardware algorithms. Treating circuits as hardware poses problems in computing-system development, especially for embedded systems. This is because most computing-oriented engineers—software designers—aren't interested in learning hardware design.
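The claim that a circuit implemented on an FPGA is "literally software" can be illustrated with a hedged sketch: a half-adder described as a netlist—a plain data structure, loosely analogous to the configuration a bitstream encodes—and "executed" by a generic evaluator. The netlist format and gate names here are invented for illustration only.

```python
# Hypothetical sketch: a circuit as data plus an interpreter.
# Each entry is (output signal, gate type, input signals).
NETLIST = [
    ("sum",   "xor", ("a", "b")),
    ("carry", "and", ("a", "b")),
]

GATES = {"xor": lambda p, q: p ^ q, "and": lambda p, q: p & q}

def evaluate(netlist, inputs):
    # Propagate values through the gate network in listed order.
    signals = dict(inputs)
    for out, gate, (p, q) in netlist:
        signals[out] = GATES[gate](signals[p], signals[q])
    return signals

result = evaluate(NETLIST, {"a": 1, "b": 1})
# A half-adder with a=1, b=1 yields sum=0, carry=1.
```

The point of the sketch is that nothing here is physically "hardware": the circuit description is data, and configuring an FPGA amounts to loading such a description.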
The issue extends beyond circuits and hardware and into the concept of two computation models:9
• temporal models based on state machines, communicating processes, and sequences of instructions ordering tasks in time and
• spatial models with data-flow graphs and logic circuits executing concurrently in a parallel or pipelined fashion.
Owing to software developers' educational bias, they're accustomed to defining algorithms and subroutines, but they're typically weaker at creating models that also involve some amount of spatial orientation, like parallel processes, data-flow graphs, or circuits, largely because computing education in universities tends to emphasize the former with little attention given to the latter. Yet with embedded systems continuing to grow in importance, such imbalance can't persist much longer.9
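To make the two models tangible, here is a minimal hypothetical sketch (the function names and graph encoding are assumptions, not from the cited work): the same computation written temporally, as a sequence of instructions, and spatially, as a data-flow graph whose independent nodes could in principle evaluate concurrently, as circuit blocks do.

```python
# Temporal model: operations ordered in time as instructions.
def f_temporal(x):
    t1 = x + 1       # step 1
    t2 = x * 2       # step 2, after step 1 in time
    return t1 * t2   # step 3

# Spatial model: the same computation as a data-flow graph.
# Each node maps to (function over the environment, dependencies);
# t1 and t2 have no mutual dependency, so a parallel or pipelined
# scheduler could fire them simultaneously.
GRAPH = {
    "t1":  (lambda env: env["x"] + 1, ()),
    "t2":  (lambda env: env["x"] * 2, ()),
    "out": (lambda env: env["t1"] * env["t2"], ("t1", "t2")),
}

def f_spatial(x):
    env = {"x": x}
    # This sequential walk merely respects the dependency edges;
    # the graph itself imposes no total ordering on t1 and t2.
    for node in ("t1", "t2", "out"):
        fn, _deps = GRAPH[node]
        env[node] = fn(env)
    return env["out"]

assert f_temporal(3) == f_spatial(3) == 24
```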
The computing community's recent work supported by the US National Science Foundation CPATH program (CISE Pathways to Revitalized Undergraduate Computing Education, http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=500025) clearly confirms a dire need to rejuvenate computing curricula and inject computational thinking.
A statement attributed to Edsger Dijkstra, "Computer science is no more about computers than astronomy is about telescopes," sparked an early debate on teaching computing science, pointing out a fundamental problem. Even if a student can understand separate areas such as operating systems, compilers, programming languages, and database systems, does that mean he or she can understand how a computing system functions as a whole? Incidents of safety violations and security attacks attributed to computer systems show that the interaction of various components is the primary issue. As Dijkstra wrote, computer specialists need to apply a more systems-based approach that emphasizes the whole system's functionality and the interdependence of its components.10 Such ideas should be topics in future SE curricula.
In light of decreasing computing enrollment and the outsourcing gloom, questions such as "Will proficiency in both computer science and communications give students a global edge?" have been asked.11 Similarly, Thomas Hilburn and Watts Humphrey wrote,
Because of the growing impact of software and its historically poor performance in meeting society's needs, the practice of software engineering is in need of substantial changes. One challenge concerns preparing software professionals for their careers; the field must drastically change its approach to software engineering education if it hopes to consistently provide safe, secure, and reliable systems.12
It's time to undertake more comprehensive analysis of computing education.
With the rapid progress of microelectronic technology, we can expect further expansion of dedicated and programmable hardware that will be developed and verified using complex software tools. The software consists of not only the system and application programs but also the complex software used to develop and verify programmable-logic circuits. SE principles and approaches might need to be applied to the hardware domain. On the other hand, concepts accepted by hardware designers, such as concurrent execution of spatial circuits, might influence future design of massively concurrent software. Hardware-software codesign, the system-based approach, and the related necessity of understanding both sides of the embedded-system spectrum (that is, hardware and software) are the basic tenets of the education of future dependable-system developers.
Software developers must understand the basic real-time concepts of timing, concurrency, interprocess communication, resource sharing, hardware-interrupt handling, and external-device interfaces. Industry needs computing graduates with knowledge of dependable time-critical reactive systems and an understanding of how the software interacts with the operating system and the environment. In addition, they must be able to work as part of a multidisciplinary team and meet rigorous engineering process and certification standards. They might also need to be able to function in multinational companies. These issues must be integrated into computing curricula, potentially becoming a part of several courses. Each course can contribute to the overall objective of understanding real-time dependable software-intensive systems.
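As one small illustration of several of the concepts just listed—concurrency, interprocess communication, and resource sharing—here is a hedged sketch (not from the article) of a bounded buffer coordinated between a producer and a consumer with a lock and condition variable; the buffer size and thread structure are illustrative choices.

```python
import threading
from collections import deque

# Illustrative sketch: a bounded buffer shared by two threads.
BUF_SIZE = 4
buffer = deque()
cond = threading.Condition()  # mutex plus condition variable
results = []

def producer(items):
    for item in items:
        with cond:                            # acquire the shared lock
            while len(buffer) >= BUF_SIZE:    # shared resource is full
                cond.wait()                   # block until space frees up
            buffer.append(item)
            cond.notify_all()                 # wake the waiting consumer

def consumer(n):
    for _ in range(n):
        with cond:
            while not buffer:                 # nothing to consume yet
                cond.wait()
            results.append(buffer.popleft())
            cond.notify_all()                 # wake the waiting producer

p = threading.Thread(target=producer, args=(range(10),))
c = threading.Thread(target=consumer, args=(10,))
p.start(); c.start(); p.join(); c.join()
# results now holds 0..9 in order.
```

Even this toy example exercises the failure modes the curriculum must teach: omit the `while` re-check after `wait()` or a `notify_all()`, and the program can lose data or deadlock.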
Andrew J. Kornecki is a professor in the Department of Computer and Software Engineering at Embry-Riddle Aeronautical University. Contact him at firstname.lastname@example.org.