Automating Development
By David Alan Grier

One of my former students, a young man named Devin, surprised me recently when he told me he had spent about $3,000 to take a course on Ruby on Rails, a Web development framework. Almost immediately, I asked him why he needed to take such a course. I try to educate my students in first principles. I teach the ideas on which our field is based and show how to reason from them to solve problems.

My approach, the strategy of beginning with fundamental concepts, has worked well for me, but I am a teacher. An approach that works for a teacher does not always work for a student. Hence, I was not surprised by his answer. “I need to make a living in Web programming,” he said, “and that means that I need to think like a Web designer.”

Devin’s reply identified a fundamental tension in computing. From one perspective, computing can be viewed as a science, or at least as a field like mathematics. The basic ideas of this field are built upon a core set of principles. In theory, anyone who masters these principles should be able to solve any kind of computing problem or create any kind of system.

Yet, from a different perspective, computing is also an engineering discipline, which requires practical problem solving. In such a field, computing professionals need to worry about the best use of resources, about cost and efficiency, and ultimately about engaging the work of the past—computer systems written by others that have been operating for years.

The first computers were built by engineers, but computing, especially computer software, became an engineering discipline only after computers had been operating for nearly 35 years. Although many early computer programmers glimpsed the aspects that would be part of software engineering, they did not start to build a formal discipline until the 1970s.

Before this conversation with Devin, I described software engineering by claiming that those in the field had to learn that they actually engineer system quality rather than any physical or symbolic entity. In that account, I said that the goal of software engineering is to identify a set of requirements for a system and then produce a system that satisfies those requirements as fully as possible. To reach this goal, early software engineers borrowed their ideas from the field of quality control. In particular, they borrowed heavily from the work of Walter Shewhart, a manufacturing engineer with AT&T during the 1930s. From his ideas, they created the software life cycle, which remains at the center of software engineering practice.

Yet, as Devin reminded me, the software life cycle is not all of software engineering. Software engineering has other elements that developed independently of the software life cycle. These elements include formal specification languages, programming tools, development practices, testing and debugging methods, and configuration tools. Many of these elements were first tested in an approach called computer-aided software engineering, or CASE.

The CASE approach flourished during the 1970s and 1980s. Like other branches of computer science, it borrowed its ideas from sources outside computing. Notably, it borrowed from the computer-aided design (CAD) movement.

CAD is now so common that we forget it was once new. We associate it with graphical design packages, such as AutoCAD, but it actually has a much broader basis. During the 1970s, it was shaped not only by the rise of computer graphics but also by analytical systems that modeled or tested designs. It used finite element analysis, for example, to test structural designs, and linear programming to optimize resource use.

CASE followed the path established by CAD, but it faced a very different set of problems. The pioneers in this field divided their work into three distinct sectors: framing, programming, and environmental.

The framing sector encompassed the problems of defining software requirements and testing basic ideas. It created formal specification languages to record system requirements and rapid prototyping systems that could test ideas on a realistic scale.

During the 1980s, the programming sector was often mistakenly identified as the entire CASE methodology. However, it included only those activities involved in actually producing code for the system. It created tools such as sophisticated editors, libraries, databases and other modules, and high-level languages. In the late 1970s, AT&T created an early version of these tools for the Unix system. AT&T called these tools “the programmer’s workbench.” They included things still familiar to any Unix programmer, such as the vi editor, the C compiler, grep, awk, lex, and yacc.
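The power of these workbench tools came less from any single program than from composing them in pipelines. As an illustration only (the data here is hypothetical, not anything from the original workbench), this is the kind of one-line job that grep, awk, and sort made routine: tallying how often each error code appears in a log.

```shell
# Hypothetical log lines, piped through the classic Unix filters:
# grep selects matching lines, awk tallies them, sort orders the result.
printf 'E42 disk full\nE17 net down\nE42 disk full\n' \
  | grep '^E' \
  | awk '{count[$1]++} END {for (c in count) print c, count[c]}' \
  | sort
# prints:
# E17 1
# E42 2
```

Each tool does one small job, and the pipe combines them; that design philosophy is what later development environments absorbed.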

The last sector of CASE, environmental, included tools to debug, test, and configure software. It included symbolic debuggers, automatic test generators, and virtual environments that could be configured to simulate specific hardware.

Integrated CASE software systems appeared on the market in about 1983 and were readily available through the late 1990s. These systems included tools from all three sectors of CASE development. Yet they were never very popular or widely used. This lack of popularity was discussed in many articles published by the IEEE Computer Society and the ACM. Most of these articles identified two reasons for the low usage of CASE. First, CASE tools were expensive: few companies saw enough benefit in them to invest in the tools and train their staffs to use them.

Second, these articles noted that CASE tools complicated the work of software developers, requiring them to learn a second system on top of the one they were building. Most developers preferred to concentrate on their work rather than add the additional layer of technical knowledge that CASE required.

Most histories claim that the CASE movement was associated with mainframe computers and died when we replaced mainframes with distributed servers. This statement has some truth, but it misses a key point. CASE was absorbed by software engineering. As vendors ceased to make independent CASE tools, software development environments began to include symbolic debuggers, test generators, and operational environments. As technical journals stopped publishing articles on CASE, training firms started teaching formal specifications and testing procedures to developers like my former student Devin. By the early 21st century, CASE had vanished as an independent field but could be found in every form of software development.

Ultimately, the story of CASE illustrates a key challenge of software engineering: balancing general principles against the needs of specific problems. We would like to think that software engineering consists of a set of broad ideas that can be applied to all programming circumstances. However, software engineering applications are so diverse that we need to modify our general principles to meet the needs of each circumstance. Hence, we no longer have general CASE systems, but we find CASE's ideas in every development system, from Ruby on Rails for simple websites to the toolchains that support the most complex systems in the most sophisticated environments.

So, Devin was probably right in spending his money to be trained on a specific development tool. Although software engineering is supported by general principles, it is practiced by narrow communities with specific tools.


About David Alan Grier

David Alan Grier is a writer and scholar on computing technologies and was President of the IEEE Computer Society in 2013. He writes for Computer magazine. He has served as editor in chief of IEEE Annals of the History of Computing, as chair of the Magazine Operations Committee, and as an editorial board member of Computer. Grier formerly wrote the monthly column “The Known World.” He is an associate professor of science and technology policy at George Washington University in Washington, DC, with a particular interest in policy regarding digital technology and professional societies.