Past as Prolog(ue): Humans, Machines, and 20 Years of Internet Computing

William Regli, DARPA

Pages: 8–10

Abstract—To celebrate IEEE Internet Computing's 20th anniversary, William Regli reflects on writing the lead technical article of the inaugural issue in 1997. Some of the observations he made two decades ago, as this discipline arose, were temporal in nature, and indicative of the times. Others, however, carried a certain prescience while distilling enduring themes.

Keywords—Internet/Web technologies; human-machine interfaces; artificial intelligence; intelligent systems; Internet computing; security and privacy

My dear Kepler, what would you say of the learned here, who, replete with the pertinacity of the asp, have steadfastly refused to cast a glance through the telescope? What shall we make of this? Shall we laugh, or shall we cry? —Letter from Galileo Galilei to Johannes Kepler (1610) as quoted in The Crime of Galileo (1955) by Giorgio De Santillana

Twenty years ago I had the privilege of authoring the lead technical article of the inaugural issue of IEEE Internet Computing.1 The article's topic was “Internet-Enabled Computer-Aided Design,” and I reflected on the emerging, pre-Web 2.0 confluence of Internet technology and the practices for design, manufacturing, and industry lifecycle activities. Thankfully, I made no Bob-Metcalfe-esque predictions that will require that I eat my words, or anything else,2 but I thought I'd take a look at what I wrote back in 1997 and see how I scored. It's interesting to see not only what I got right and wrong, but also what this reveals about the true nature of the fundamental problems that remain. Technology changes, but often the scientific questions and issues remain constant. Or, as Mark Twain is reported to have said poetically, “History doesn't repeat itself, but it often rhymes.”

The Vision of “Internet-Enabled Computer-Aided Design”

The vision outlined in that article saw a highly connected world of people and software services, in which designers could orchestrate the specification and fabrication of products of incredible complexity through their design tools. Many of the concepts I presented were central to DARPA's Manufacturing Automation and Design Engineering (MADE) Program that I was a part of at that time. It's quaint to consider what we thought the world would look like from a vantage point before Google, iPhones, and social media when most connections were still dial-up. The following were among the article's specific observations and prognostications:

  • an information marketplace for engineering services would emerge, enabling seamless connection between producers and consumers of engineering information;
  • this marketplace would create a “long tail” that included small- to medium-sized manufacturers, designers, and other content producers;
  • activities would be increasingly integrated within a “browser”-like approach — for example, invoking applications, searching catalogs, and creating embedded programs; and
  • the Internet would become the preferred method to deliver training materials, catalogs, and software.

How's that for my attempts as a soothsayer?

What I Got Right

Many of the ideas in the original article have come to pass. With the rise of the maker movement and ubiquitous 3D printing, and abetted by industrial policy under the Obama administration, an entire service industry for manufacturing is emerging. Parts are available on demand, design activities can be crowdsourced, and online catalogs can be accessed and searched.

Although many of the specific technologies mentioned in my 1997 piece are now considered relics, they were certainly early indicators of the kind of off-the-desktop pervasive computing that has become common. In 1997, the Java programming language and object technologies seemed really important. Indeed, they have proven to be, though Java has had staying power while many of the object technologies have faded from memory. Even so, these object frameworks and their descendants are central to everyday software development, as well as the foundation for current services-oriented software enterprises.

One issue I noted was that Internet-enabled engineering would give rise to challenges for security and trust, in particular the protection of IP and balance between information sharing and confidentiality. We're seeing these issues play out today in the tension between traditional views of data (as proprietary or corporate assets) and the new emphasis on open data, research reproducibility, and data commons. Hardly a day goes by when we don't hear about something related to cybersecurity, cyberespionage, or cybercrime.

What I Got Wrong

What hasn't changed in the broader design and manufacturing marketplace are the business processes. Although computing and Internet technologies have infiltrated nearly all aspects of industrial life, the fundamental processes of design and manufacturing remain those established after World War II. Computing has made things faster, better, more accurate, and customer-centric, but the level of disruption has been minimal compared to what the music, journalism, and media industries witnessed over the last decade. Today's thinking around Industrie 4.0 in Europe and the Second Industrial Revolution suggests that we have yet to really see the acceleration in innovation and manufacturing from the digital age.

The original article analyzed object technologies and standards that, at the time, seemed important. Today, many are eclipsed by newer standards or relegated to the dustbin by the constant churn of technology. As ephemeral as most of these standards were, each can be viewed in the context of the long-lasting problems they aimed to address: chiefly, representation and interoperability.

In 1997, the vision was of the browser as “portal” or even “operating system.” After some detours around ideas such as “thin clients,” perhaps we have arrived. The “Web” browser is no longer king, having been displaced by “the cloud,” virtualization, off-the-desktop computing, and other concepts (most notably wearable and smartphone technologies) which augur that computing is now a nearly limitless fabric to be orchestrated.

I was clearly wrong to worry that Internet performance and capacity would not rise to meet the needs of engineering data exchange. In 1997, high-capacity connectivity was a premium service outside the reach of most organizations; now, our personal smartphones have roughly 1,000 times more bandwidth capacity than many organizations had at an institutional level.

Lastly, the role of standards was also a main theme, and I listed numerous standards at various levels of abstraction (data exchange standards for design information, programming interface standards, and Internet Protocol standards among them). Although standards certainly did play, and continue to play, a critical role in integrating engineering and the Internet, I'm struck by how little has actually changed in this regard. Innovation and adoption are still impeded by the lack of standards or the choice of a poor one. Legacy approaches hang around because of the stickiness of standards, and companies still vie to control standards to their advantage.

What Remains

I now find myself directing the DARPA office, the Defense Sciences Office, that sponsored the MADE Program and other efforts two decades ago. Looking back on that piece from my current position, with two decades of intervening experience, I find I've become drawn to the problems that are perpetual and evergreen. Technology changes and evolves, but often the fundamental questions remain. Internet computing, as broadly defined and covered by this journal, has been one of the transforming technologies for human endeavor. With technology comes new capacity for building tools with which we can study the fundamental questions. These same two decades provide hindsight on the original article's ideas and a basis for extrapolating the fundamental issues that endure.

The Value of Time and Human Attention

The late James Gray noted in his 1999 Turing Award Lecture that the fundamental limiting resource for computing wasn't CPU power or memory storage, but human attention. Considering that observation, we can see its reflection in all major Internet innovations since 1997. Tools for search, access to better simulations, and improved tools for integration and interoperability all drive toward applications that enable engineers to more deeply and accurately explore their design and manufacturing options in less time. Optimizing the allocation of human attention, in a world where the amount and complexity of information is overwhelming, will continue to create opportunities for innovation.

Models, Representation, and Integration

In the past two decades, scientific areas such as astronomy, life sciences, ocean science, and others have seen fundamentally new scientific methods emerge based on open data and data-driven discovery. Often the core enablers of these transformations have been about information capture and knowledge representation, the creation of computational models, and ensuring that these new data and computational methods can interoperate. Representation remains an issue in many areas where we have a surfeit of data but a paucity of models. I would assert that the coming decades will see accelerated opportunities for new forms of computational representation and modeling, often driven by specific end-user problems.

Integrating Human and Machine

While it was not as evident in 1997, I increasingly see the opportunity for the Internet and Internet computing technology to achieve human-machine integration. The Internet is both our human integration fabric (enabling new forms of communication, collaboration, and organization) as well as the mechanism that gives individuals access to the totality of human knowledge. Things bubbling in 1997 could be classified broadly as either technical building blocks (that is, the languages and standards) or systems for human insight (search, catalogs, and so on). Perhaps from this viewpoint of enabling the human-machine team, we might see new frontiers for Internet-based information systems.

Optimistically, I look forward to the next 20 years of Internet-enabled innovation and discovery. As more of the world's population and creative enterprise become accelerated by Internet technologies, we can expect further disruptions and innovations. My hope is that, on balance, these innovations continue enhancing our collective capabilities and enable us to more rapidly, and creatively, design our way out of societal problems and toward our full planetary potential.


William Regli is the Acting Director of the Defense Sciences Office for DARPA, on leave from his position as a professor of computer and information science for Drexel University's College of Computing and Informatics. His current interests include computational tools to exploit the properties of advanced materials, additive manufacturing systems, and enabling new paradigms for design and production. Regli has a PhD in computer science from the University of Maryland at College Park. He's a Fellow of the IEEE Computer Society and a Senior Member of both ACM and the Association for the Advancement of Artificial Intelligence (AAAI). Contact him at