by Maurizio Morisio



Software development as an engineering discipline has benefited from varying levels of automation and software support throughout its history. From the first compilers and editors of the early 1950s to today's highly sophisticated integrated development environments, software engineers have always tried to improve and accelerate their development processes. Today's software development tools do not just compile higher-level languages into assembly or machine language; they also support software testing, configuration management, design-time modeling, code generation, software life-cycle management, feature and bug tracking, distributed build management, and many other tasks in software engineering, software management, and software development.

Some of these tools have become commodities: they are widespread, freely available in the open-source community, and have often reached a high level of de facto standardization and industry acceptance. Others are still striving for dominance, offer state-of-the-art functionality, or can only be acquired through substantial investment. Either way, all these tools share one goal: to improve and accelerate the software development process, enabling both large and small businesses to reduce their software development and maintenance costs. Establishing a good software development workbench has become an indispensable skill for today's software developers and architects alike. The articles selected for this theme section touch on the history of software tools, practical concerns, and specific examples. This special issue is therefore relevant not only to developers but also to software managers keen on optimizing their software development processes.

Maurizio Morisio is an associate professor in the Department of Automation and Computer Science, Politecnico di Torino. He's IEEE Software's associate editor in chief for online initiatives. Contact him at maurizio dot morisio at polito dot it.



Guest Editor's Introduction: Where's My Jetpack?

By Simon Helsen, Arthur Ryman, and Diomidis Spinellis
From the September/October 2008 issue of IEEE Software

Look at the cover of a science fiction novel written 30 years ago, and you'll invariably notice that everyone has a jetpack on their back whose rockets let them fly around effortlessly wherever they choose. In our age of skyrocketing oil prices and chronic traffic jams, this vision seems like a cruel joke. Have software development tools gone through a similar hype-and-bust cycle?

Inventive Tool Use to Comprehend Big Code

By Sukanya Ratanotayanon and Susan Elliott Sim
From the September/October 2008 issue of IEEE Software

Software developers often need to understand a large body of unfamiliar code with little or no documentation, no experts to consult, and not much time to do it. The prevalence of this problem is evident in the following query posted on Slashdot and the hundreds of passionate responses to it:

Having just recently taken a new job, I find myself confronted with an enormous pile of existing, unfamiliar code written for a (somewhat) unfamiliar platform—and an implicit expectation that I'll grok it all Real Soon Now. ... What sorts of tools do you use for effectively analyzing and understanding a large code base?

Tool Support for Continuous Quality Control

By Florian Deissenboeck, Elmar Juergens, Benjamin Hummel, Stefan Wagner, Benedikt Mas y Parareda, and Markus Pizka
From the September/October 2008 issue of IEEE Software

If organizations fail to take countermeasures, their long-lived software systems undergo gradual quality decay. Without exception, this degradation affects all of the ISO 9126 quality attributes: reliability, functionality, efficiency, portability, usability, and, above all, maintainability. Continuous quality control lets software engineers identify and resolve quality defects early in the development process, when implementing countermeasures is still inexpensive. Quality control consists of three key elements:

  • clearly defined quality goals;
  • techniques, tools, and processes to analyze a system's current quality state; and
  • appropriate measures to react to identified quality deficits.
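The control loop these three elements describe can be sketched in a few lines. Everything below (metric names, thresholds, function names) is an illustrative assumption, not taken from the article:

```python
# Hypothetical continuous-quality-control check: compare measured
# indicators against declared goals and report any deficits.

quality_goals = {           # element 1: clearly defined quality goals
    "clone_coverage": 0.10,  # at most 10% duplicated code
    "comment_ratio": 0.20,   # at least 20% commented lines
}

def find_deficits(measurements: dict) -> list:
    """Element 3: identify quality deficits so the team can react."""
    deficits = []
    if measurements["clone_coverage"] > quality_goals["clone_coverage"]:
        deficits.append("clone_coverage")
    if measurements["comment_ratio"] < quality_goals["comment_ratio"]:
        deficits.append("comment_ratio")
    return deficits

# Element 2: in practice, the measurements would come from analysis tools.
print(find_deficits({"clone_coverage": 0.17, "comment_ratio": 0.25}))
# ['clone_coverage']
```

Run continuously (for example, on every build), such a check surfaces decay early, when countermeasures are still inexpensive.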

Full Life-Cycle Support for End-to-End Processes

By Bernhard Steffen and Prakash Narayan
From the November 2007 issue of Computer

Globalization is a general and inevitable trend that started with enterprises and politics and now increasingly characterizes the process landscape. Worldwide operations require global process modeling, coordination, and—at least since the Sarbanes-Oxley Act and Basel II—transparency. This puts enormous pressure on process management and its efficiency, compliance, reliability, and agility.

The CKC Challenge: Exploring Tools for Collaborative Knowledge Construction

By Natalya F. Noy, Abhita Chugh, and Harith Alani
From the January/February 2008 issue of IEEE Intelligent Systems

Web 2.0's great success is fueled mainly by an infrastructure that lets users easily create, share, tag, and connect content and knowledge. In general, the knowledge created in today's applications for collaborative tagging, folksonomies, wikis, and so on is mostly unstructured: tags or wiki pages don't have semantic links between them and usually aren't related in any structured form. By contrast, ontologies, database schemas, and taxonomies usually contain explicit definitions of, and links between, their components, often with well-defined semantics.
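The contrast the authors draw (flat tags versus explicitly linked components with defined semantics) can be illustrated with a toy sketch; all names and the `is_a` relation here are hypothetical, not from the article:

```python
# Unstructured Web 2.0 tagging: flat labels with no relations between them.
tags = {"jaguar", "cat", "animal"}

# A structured ontology: components linked by explicitly defined relations,
# here as subject-predicate-object triples with a toy "is_a" semantics.
ontology = {
    ("jaguar", "is_a", "cat"),
    ("cat", "is_a", "animal"),
}

def is_a(ontology, a, b):
    """True if a is transitively related to b via is_a links."""
    if (a, "is_a", b) in ontology:
        return True
    return any(is_a(ontology, obj, b)
               for (subj, pred, obj) in ontology
               if subj == a and pred == "is_a")

print(is_a(ontology, "jaguar", "animal"))  # True: the explicit links compose
```

With flat tags, nothing tells a machine that a "jaguar" is an "animal"; with explicit links, that conclusion can be computed.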


What's New


New Wi-Fi Technology Racing Past Standards Process

By Greg Goth
From the October 2008 issue of IEEE Distributed Systems Online

While the IEEE Task Group for the 802.11n network standard stalls on ratification, successful network deployments proceed apace, especially in universities.

Coming Soon: Research in a Cloud

By Pam Frost Gorder
From the November/December 2008 issue of Computing in Science & Engineering

A trend is taking shape in the computing industry that could significantly change the way academic research is done. A few years from now, researchers who work with massive data sets might stop processing their data locally and find themselves outsourcing the job to massive commercial data clusters. The US National Science Foundation (NSF) is working with Google, IBM, HP, Intel, and Yahoo to promote the development of technologies that will make these super-sized clusters—called computing clouds—amenable to research.

The Biological Half-Life of Software Engineering Ideas

By Philippe Kruchten
From the September/October 2008 issue of IEEE Software

A product's biological half-life is the time it takes the body to eliminate half of the product taken in by natural biological means. For example, caffeine's half-life is roughly three and a half hours. Of all the molecules of caffeine in the cup of coffee I just finished, my body will have eliminated—or broken down into simpler compounds—half in three and a half hours, three quarters in seven hours, and so on.
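The decay Kruchten describes is ordinary exponential decay: after time t, the fraction remaining is 0.5 raised to the power t divided by the half-life. A minimal sketch (the function name and the 3.5-hour figure are illustrative assumptions):

```python
def remaining_fraction(t_hours: float, half_life_hours: float) -> float:
    """Fraction of a substance still present after t_hours of exponential decay."""
    return 0.5 ** (t_hours / half_life_hours)

# Caffeine, assuming a half-life of 3.5 hours:
print(remaining_fraction(3.5, 3.5))   # 0.5  -> half remains after one half-life
print(remaining_fraction(7.0, 3.5))   # 0.25 -> a quarter remains after two
```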

Uniting the Paper and Digital Worlds

By Keri Schreiner
From the November/December 2008 issue of IEEE Computer Graphics and Applications

For several decades now, coupling pens and computing has typically meant yoking a stylus to an electronic tablet. Although quite successful in vertical markets, such efforts have yet to achieve widespread adoption, due in part to the average tablet's clunkiness. It's nearly as easy, after all, to simply haul around a laptop. Also, pairing a stylus and a powered piece of plastic lacks the familiarity and relatively low profile of the classic pen-and-paper combination—a relationship that standalone digital pens seek to both preserve and revolutionize.

Managing Household Wind-Energy Generation

By Geoff James, Wei Peng, and Ke Deng
From the September/October 2008 issue of IEEE Intelligent Systems

The introduction of distributed energy is changing electricity generation and distribution systems worldwide. Local generation close to load centers and the intelligent management of consumption have both increased. The power industry is finding that such approaches can be cost-effective ways to reinforce local distribution networks, thus deferring infrastructure investment. They provide flexibility for demand response to price signals in deregulated markets, and they improve the electricity grid's robustness by decreasing its dependence on long-distance transmission. Furthermore, these approaches can increase the penetration of renewable-energy technologies in the electricity grid and provide reliable energy in countries where the electricity grid cannot grow fast enough to meet emerging industries' needs.

Practitioners or Academics: Which Are We?

By Sorel Reisman
From the September/October 2008 issue of IT Professional

Many people view the IEEE Computer Society's membership as if it were bifurcated—split between practitioners and academics. Yet, from the standpoint of our profession, there's really no such division. In truth, these are merely labels on a continuum. Viewing the profession in this way lets us more readily understand the products and services the Society might offer in order to help computing professionals around the world self-actualize in their careers.

The Search for Interoperability

By John R. Smith
From the July–September 2008 issue of IEEE Multimedia

Today's online users are consuming tremendous volumes of digital content. Digital video and other types of rich media are quickly growing in an online environment characterized by users with short attention spans and a tremendous number of choices. As a consequence, the Internet is giving way to the next evolution of digital content, as shown in Table 1. For example, short-form video is gaining prevalence because it's more conducive to viral exchange among today's social networks and compatible with a diverse field of networked devices, portable media players, and multimedia-enabled mobile phones. Online users are also developing more of an interest in long-tail content, which is becoming more prevalent as digital content consumers increasingly become digital content creators. As content authoring and editing tools continue to improve, digital content creation will become as easy as word processing, and the amount of user-generated content online will continue to grow.

Accelerator Architectures

By Sanjay Patel and Wen-mei W. Hwu
From the July/August 2008 issue of IEEE Micro

We are entering the golden age of the computational accelerator. The commercial accelerator space is vibrant with activity from semiconductor vendors, large and small, that are designing accelerators for graphics, physics, network processing, and a variety of other applications. System vendors are introducing tools and programming systems to lower the barriers to entry for software development for their platforms. We are already seeing the initial stream of applications that benefit from these accelerators, and there are definite signs that more are yet to come. The research space is blossoming with very broad, multidisciplinary activity in advanced research and development for new classes of accelerator architecture and applications to tap into their power.

A Life-or-Death InfoSec Subversion

By Camilo Viecco and Jean Camp
From the September/October 2008 issue of IEEE Security & Privacy

Details about failures of complex and well-implemented information-based attacks on systems are extremely difficult to obtain. Attackers don't announce their crimes, and defenders hide their losses. However, here we can look at a real-life analogue—an information attack on a highly complex security system, that of the Colombian guerrilla group FARC (Fuerzas Armadas Revolucionarias de Colombia, or the Revolutionary Armed Forces of Colombia). This operation included a man-in-the-middle attack, targeted denial of service (DoS), and authentication subversion. The attack on FARC's communications structure is interesting not only because of its electronic and analog components, but also because it was a life-or-death matter.

Toward the Semantic Deep Web

By James Geller, Soon Ae Chun, and Yoo Jung An
From the September 2008 issue of Computer

The World Wide Web is arguably the greatest technological success in history. Starting from zero in 1990, it grew to 16 million pages by the end of 1995. In 2008, more than 1.4 billion webpages are accessible to anybody with an Internet connection and a computer.

With the explosion of the Web, indexing and searching the content of all Web documents has become a perpetual challenge, in spite of the continuous technological advancements of Web search engines. However, this challenge is even greater when considering the Web data not accessible by search engines.

Modeling Healthcare Logistics in a Virtual World

By Craig W. Thompson and Fran Hagstrom
From the September/October 2008 issue of IEEE Internet Computing

This is the final Architectural Perspectives column. One of the main themes of these columns has been pervasive computing — exploring a collection of technologies that could work together to make it easier to usher in the Internet of Things. In this last installment, we review enabling technologies, then take a look at how we can use virtual worlds (in particular, Second Life) to accelerate pervasive computing's development.

Designing Micro- and Nanosystems for a Safer and Healthier Tomorrow

By Giovanni De Micheli
From the September/October 2008 issue of IEEE Design & Test of Computers

In this article, I provide an assessment of the current status and needs of electronic systems, their design, and their evolution. To put this analysis into perspective and to motivate it, I consider the progress of electronics over the past 50 years, from the invention of the transistor to the microprocessor, to the design of complex multiprocessors that we see today within gaming consoles and other appliances. Looking forward to the next 50 years, I want to address how we have affected and will affect society with our inventions and products, from personal computers and communicators to the upcoming networked systems for health and environmental monitoring.

OpenStreetMap: User-Generated Street Maps

By Mordechai (Muki) Haklay and Patrick Weber
From the October–December 2008 issue of IEEE Pervasive Computing

The process of mapping the Earth accurately was, until recently, the preserve of highly skilled, well-equipped, and organized individuals and groups. For many years, it was usually the role of surveyors, cartographers, and geographers to map the world and transcribe it on paper or, since the 1960s, into the computer. Lewis and Clark's expedition to map North America's west, and Lambton and Everest's Great Arc expedition to measure India, are just two famous episodes in the history of maps and map making. Each country has an established national mapping agency charged with keeping the national maps accurate and current (for example, the US Geological Survey and the UK Ordnance Survey).