Guest Editor's Introduction

By Anup Ghosh



E-voting entered the popular lexicon after the 2000 US presidential election, when irregularities in both paper ballots (remember hanging chads?) and e-voting machines cast widespread doubt on the integrity of the system for recording and counting ballots. Since then, the Help America Vote Act (HAVA) and its associated funding have led to widespread deployment of e-voting machines throughout the US. However, this deployment only fueled more controversy about the machines' accuracy and security in subsequent elections, given the wide range of motivated bad actors: hackers, political parties, election officials, and even the machines' manufacturers.


With the upcoming US presidential election in November in mind, I've assembled several recent articles on e-voting from IEEE Security & Privacy and Computer. "The Dynamics of Counting and Recounting Votes" highlights the complexity of the problem. "An Optical Scan E-Voting System Based on N-Version Programming" demonstrates how N-version programming, together with stronger voter authentication and more secure data transmission, could further enhance the security of the electoral process. "A Three-Ballot-Based Secure Electronic Voting System" covers a proposal from three Brazilian authors for a secure electronic voting system based on the three-ballot scheme developed by Ron Rivest and Warren Smith. And finally, "Secure and Easy Internet Voting" presents a case study of an e-voting system used in several cantons in Switzerland that lets users vote over the Internet and via mobile devices using SMS.


In addition, links to other articles and resources will let you delve further into e-voting systems and into security and privacy in general. No doubt, this topic will grow in importance and reach a fever pitch this November. In the interim, take the time to get smart about the issues and the solutions proposed by computer professionals like you.


Anup Ghosh is a research professor and chief scientist at the Center for Secure Information Systems at George Mason University. He's also a member of IEEE Security & Privacy's editorial board. Contact him at anup dot ghosh at computer dot org.



Theme — E-VOTING

   

The Dynamics of Counting and Recounting Votes

By Alec Yasinsac and Matt Bishop
From the May/June 2008 issue of IEEE Security & Privacy

Accuracy is a key component of a fair election. A reliable voting mechanism must accurately capture the vote, with a ballot that correctly reflects the voter's choice, and accurately count and tabulate the votes. The system must also accurately report the result, correctly declaring the winner. A system that fails at any of these steps can introduce error into the overall election results.

Given how much a representative government depends on election results, voting mechanisms should be scrupulously crafted to ensure each election's accuracy. However, the 2000 US presidential election was an exercise in confusion: in some jurisdictions, voters found the ballots confusing, while others saw controversies over the accuracy of tallies. This controversy, coupled with the desire to eliminate paper storage and provide better access, led to the US Help America Vote Act (HAVA). That act, and its ancillary funding, led to the widespread deployment of electronic voting (e-voting) systems.
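
Many US jurisdictions make a recount automatic when the margin of victory falls below a fixed threshold, one of the counting dynamics at play. As a minimal sketch of that trigger (the 0.5 percent threshold and the ballot data are invented, not drawn from the article):

    # Hypothetical margin-triggered recount check; threshold and data invented.
    def tally(ballots):
        """Count votes per candidate from a list of ballot choices."""
        counts = {}
        for choice in ballots:
            counts[choice] = counts.get(choice, 0) + 1
        return counts

    def needs_recount(counts, threshold=0.005):
        """Trigger a recount when the top-two margin falls below the threshold."""
        ranked = sorted(counts.values(), reverse=True)
        margin = (ranked[0] - ranked[1]) / sum(ranked)
        return margin < threshold

    ballots = ["Adams"] * 5012 + ["Baker"] * 4998
    counts = tally(ballots)
    print(counts, "recount needed:", needs_recount(counts))   # recount needed: True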


A Three-Ballot-Based Secure Electronic Voting System

By Altair O. Santin, Regivaldo G. Costa, and Carlos A. Maziero
From the May/June 2008 issue of IEEE Security & Privacy

Today, there's broad agreement that traditional voting systems should be computerized to reduce vote-counting time, provide evidence that each vote is correctly counted, reduce fraud, eliminate errors in filling out ballots, and improve usability for people with special needs. In fact, e-voting systems are increasingly replacing traditional paper-based ones. This raises several security issues, given that democratic principles depend on the electoral process's integrity. Providing security for voting systems isn't trivial: beyond the classic security properties (integrity, confidentiality, and availability), other properties must be ensured. Some e-voting system requirements seem contradictory, such as ensuring voter authenticity while preserving vote anonymity, providing a vote-counting proof while preventing vote trading, allowing voting via the Internet while avoiding voter coercion, guaranteeing each vote's uniqueness in decentralized voting, allowing vote automation while providing vote materialization (a physical record of each vote), and ensuring auditability in a software or hardware environment that could malfunction.
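
In Rivest and Smith's scheme, each voter fills out three ballots: across the triple, the candidate the voter chooses gets exactly two marks, every other candidate gets exactly one, and the voter keeps a copy of just one ballot as a receipt. Because every voter contributes one "extra" mark per candidate, a candidate's true total is their mark count minus the number of voters. A minimal sketch of that counting trick (the candidates and votes are invented; real implementations add ballot IDs, receipts, and a public bulletin board):

    # Simplified ThreeBallot sketch; omits receipts, ballot IDs, and auditing.
    import random

    CANDIDATES = ["A", "B", "C"]

    def fill_three_ballots(choice):
        """Chosen candidate gets two marks across the triple, others get one."""
        ballots = [set(), set(), set()]
        for cand in CANDIDATES:
            marks = 2 if cand == choice else 1
            for i in random.sample(range(3), marks):
                ballots[i].add(cand)
        return ballots

    def tally(all_ballots, n_voters):
        """True count per candidate = total marks minus the number of voters."""
        return {c: sum(c in b for b in all_ballots) - n_voters for c in CANDIDATES}

    votes = ["A", "A", "B"]
    pool = [b for v in votes for b in fill_three_ballots(v)]
    random.shuffle(pool)               # published ballots aren't linked to voters
    print(tally(pool, len(votes)))     # {'A': 2, 'B': 1, 'C': 0}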


An Optical Scan E-Voting System Based on N-Version Programming

By Iñaki Goirizelaia, Maider Huarte, Juanjo Unzilla, and Ted Selker
From the May/June 2008 issue of IEEE Security & Privacy

Researchers are working hard to ensure that the e-voting systems used today—and the ones that will be used in the near future—are secure, democratic, and well designed, especially given ongoing concerns about e-voting systems' possible dangers.

For example, when the Irish government looked into e-voting for the 2004 European elections, it decided not to use e-voting systems. Ireland's Commission on Electronic Voting concluded that, because of public concerns about transparency, it couldn't recommend using the proposed system in the local and European elections. Its reports state that testability and the ability to audit ballots would help maximize trust in voting systems. But if the audit trail is paper and requires conventional counting, it can't achieve the accuracy of electronic counting. Counting paper ballots by hand is very difficult when a ballot poses many different questions and many different people do the counting.
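
The N-version idea in the article's title is a classic fault-tolerance technique: run several independently developed implementations of the same function (here, ballot counting) and accept a result only when a majority of the versions agree. A minimal sketch, with three stand-in tally routines rather than the authors' actual versions:

    # N-version tallying sketch; the three versions are illustrative stand-ins.
    from collections import Counter

    def tally_v1(ballots):
        return dict(Counter(ballots))

    def tally_v2(ballots):
        counts = {}
        for b in ballots:
            counts[b] = counts.get(b, 0) + 1
        return counts

    def tally_v3(ballots):
        return {c: ballots.count(c) for c in set(ballots)}

    def n_version_tally(ballots, versions):
        """Run every version and accept the result only on majority agreement."""
        results = [frozenset(v(ballots).items()) for v in versions]
        answer, agreeing = Counter(results).most_common(1)[0]
        if agreeing <= len(versions) // 2:
            raise RuntimeError("no majority agreement among versions")
        return dict(answer)

    print(n_version_tally(["A", "B", "A"], [tally_v1, tally_v2, tally_v3]))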


Secure and Easy Internet Voting

By Giampiero E.G. Beroggi
From the February 2008 issue of Computer

Although modern societies rely heavily on information and communication technology for business, work, and leisure time activities, they have thus far seemed hesitant to use ICT for democratic decision-making activities such as voting. Meanwhile, the lost and uncounted votes associated with current paper ballots could very well be contributing to biased political decisions.

One reason for the delay in implementing more technologically sophisticated voting methods is the computer science community's almost unanimous wariness of Internet-based elections. Many governments have simply dismissed e-voting as too risky. Others are not fully aware of e-voting's strong advantages over paper ballots: reliable and secure vote casting, precise vote counting, the option to conduct voting in either a centralized or a decentralized manner, and the rapid availability of results.


 


What's New

   

Whither Bluetooth?

By Franklin Reynolds
From the July–September 2008 issue of IEEE Pervasive Computing

Almost all smart phones sold today include Bluetooth—a short-range, low-power radio technology first introduced in the late 1990s. Bluetooth-enabled accessories for phones—in particular, hands-free headsets—are also quite popular. In the less than 10 years since version 1.0 of the Bluetooth specification was published, nearly 2 billion Bluetooth products have shipped. Here, I take a look at some of the technology improvements and new applications on the horizon.


Using Process Tailoring to Manage Software Development Challenges

By Peng Xu and Balasubramaniam Ramesh
From the July/August 2008 issue of IT Professional

In today's turbulent business environment, software development organizations must continuously tailor their processes to meet evolving project goals and business requirements at Internet speed. A software process defines the practices or activities to be performed, each activity's critical characteristics (such as inputs and outputs, entry and exit criteria, and stakeholder roles), and the relationships among the activities.
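
As a rough illustration (the field names and sample activities below are invented, not the authors' notation), such a process description might be modeled like this:

    # Invented data model for a software process activity and its links.
    from dataclasses import dataclass, field

    @dataclass
    class Activity:
        name: str
        inputs: list = field(default_factory=list)
        outputs: list = field(default_factory=list)
        entry_criteria: list = field(default_factory=list)
        exit_criteria: list = field(default_factory=list)
        roles: list = field(default_factory=list)
        successors: list = field(default_factory=list)   # inter-activity links

    design = Activity("design", inputs=["requirements spec"],
                      outputs=["design doc"], roles=["architect"])
    coding = Activity("coding", inputs=["design doc"], outputs=["source code"],
                      entry_criteria=["design review passed"], roles=["developer"])
    design.successors.append(coding)   # tailoring can rewire these links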


Cassandra or False Prophet of Doom: AI Robots and War

By Noel Sharkey
From the July/August 2008 issue of IEEE Intelligent Systems

Artificial intelligence is just as famous for its false prophecies as for its role in science fiction. In the early 1960s, we told the public that robots would do all their household chores and menial tasks within 20 years. That was nearly 50 years ago, and we're just beginning to get the floors vacuumed. We're also infamous for creating superintelligent robots that will take over the world and kill everyone. This idea has been wedged into the human psyche ever since 1921, when Karel Čapek first used the word "robot" in his apocalyptic play R.U.R. (Rossum's Universal Robots).


Semantic Wikis

By Sebastian Schaffert, François Bry, Joachim Baumeister, and Malte Kiesel
From the July/August 2008 issue of IEEE Software

Wikis let users practice lean knowledge management: we can enter text and other data, such as files, and connect the content through hyperlinks. Wikis offer easy setup and a huge variety of editing support in all types of intranet- and Internet-based information sharing. The drawbacks show up when we need to structure data as opposed to just editing text. Many wikis have tons of useful content, but the volume and lack of structure make that content inaccessible over time. This is where semantic wikis enter the picture. In this article, Sebastian Schaffert and his colleagues describe semantic wikis and explain how to model wiki knowledge and content for improved usability.
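
To make the contrast concrete: a semantic wiki lets authors type their links so that machines can query them. The sketch below parses Semantic MediaWiki-style [[property::value]] markup into subject-property-value triples; the page text is invented for illustration:

    # Extract typed links from Semantic MediaWiki-style markup (invented text).
    import re

    TYPED_LINK = re.compile(r"\[\[([^:\]]+)::([^\]]+)\]\]")

    def extract_triples(page_title, wiki_text):
        """Return a (subject, property, value) triple for every typed link."""
        return [(page_title, prop.strip(), value.split("|")[0].strip())
                for prop, value in TYPED_LINK.findall(wiki_text)]

    text = ("Berlin is the [[is capital of::Germany|capital]] and has "
            "[[population::3,500,000]] inhabitants.")
    print(extract_triples("Berlin", text))
    # [('Berlin', 'is capital of', 'Germany'), ('Berlin', 'population', '3,500,000')]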


The Changing Design Landscape

By Ajith Amerasekera
From the July–August 2008 issue of IEEE Design & Test of Computers

In 1965, Gordon Moore introduced his now famous law on the doubling of IC complexity. For many decades, this doubling has been realized through process technology innovation that has enabled semiconductors to achieve their complexity roadmap through scaling of transistor geometries. As we approach the 45-nm technology node of the International Technology Roadmap for Semiconductors (ITRS), this scaling is alive and continuing, but some of the supporting infrastructure is not tracking it. In particular, performance is no longer increasing by significant amounts at every node. In conjunction with this, the equipment capabilities to print these incredibly small geometries are now the main driver for the process technology.


Visualizing Data: Seeker's Affective Interaction

By Ann Finegan, Josephine Starrs, and Leon Cmielewski
From the April–June 2008 issue of IEEE MultiMedia

Seeker is an interactive data-mapping media-art installation, created by Leon Cmielewski and Josephine Starrs, that won an Award of Distinction at Ars Electronica in 2007. Made for three large screens, it explores the themes of diaspora and displacement under late-twentieth and early twenty-first century capitalism. Seeker combines dynamic content with the deep immersion of its cinematic scale and content. The use of cinema is matched by the historical patterning of the work's themes. Blood diamonds, oil, dictatorships, war, terror, and refugee counts are all tracked as data flows of people, money, economics, geopolitics, and power across two large, affective screens (see Figure 1). One screen shows evocative landscapes of vast depeopled spaces (sites of ancient civilizations such as Lake Mungo in outback Australia). The other screen shows teletext news of refugees lost or found layered over aerial images of cities tracked by Google Earth's view of places from or to which people are escaping.


Deploying Rural Community Wireless Mesh Networks

By Johnathan Ishmael, Sara Bury, Dimitrios Pezaros, and Nicholas Race
From the July/August 2008 issue of IEEE Internet Computing

Wireless mesh networks (WMNs) create a resilient infrastructure using a combination of wireless networking technology and ad hoc routing protocols, which together let service providers or communities establish networks in places without prior groundwork. A WMN is a self-managing network in which all nodes act as routers that can route traffic either directly or via a multihop path. The system is dynamic; it can adapt to nodes entering the network or exiting it due to node failure, poor connectivity, and so forth. Mesh networking's robust nature makes it an ideal technology for rural villages in which establishing a wired network would be overly complex.
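
At a WMN's core is multihop routing over whatever radio links currently exist. The sketch below illustrates the idea with a breadth-first path search over an invented topology; real mesh protocols such as AODV or OLSR add route maintenance, link metrics, and control signaling:

    # Multihop path discovery over a changing link set (invented topology).
    from collections import deque

    def shortest_path(links, src, dst):
        """Breadth-first search for a multihop path over current links."""
        frontier, seen = deque([[src]]), {src}
        while frontier:
            path = frontier.popleft()
            if path[-1] == dst:
                return path
            for nxt in links.get(path[-1], ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(path + [nxt])
        return None   # destination unreachable

    links = {"gw": ["a", "b"], "a": ["gw", "c"], "b": ["gw", "c"], "c": ["a", "b"]}
    print(shortest_path(links, "c", "gw"))   # ['c', 'a', 'gw']
    for neighbors in links.values():         # node a fails...
        if "a" in neighbors:
            neighbors.remove("a")
    del links["a"]
    print(shortest_path(links, "c", "gw"))   # ['c', 'b', 'gw']: the mesh self-heals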


Information and Communication Technologies for Development

By Kentaro Toyama and M. Bernardine Dias
From the June 2008 issue of Computer

On a planet with 1.2 billion Internet users, a far less fortunate 1.2 billion people survive on less than a dollar a day. The same technology that has transformed our lives—the lives of the wealthiest people on the planet—remains out of reach and irrelevant for the poorest.

As if in sudden recognition of this stark gap, we have witnessed in the past decade an explosion in projects that apply information technology to support socioeconomic development. What is the value of a PC in a rural school? How do you design an interface that an illiterate migrant worker can use? Can computing technology have a positive impact on a farmer earning a dollar a day? These are just a few questions that scholars and practitioners have begun to explore with increasing creativity and ambition.


System-Level Performance Metrics for Multiprogram Workloads

By Stijn Eyerman and Lieven Eeckhout
From the May/June 2008 issue of IEEE Micro

Performance metrics are the foundation of experimental computer science and engineering. Researchers and engineers use quantitative metrics for assessing their new ideas and engineering progress. Obviously, adequate metrics are of primary importance to research progress, whereas inappropriate metrics can drive research and development in wrong or unfruitful directions.

The recent trend toward multicore and many-core systems makes adequate performance metrics for assessing the performance of multithreaded computer systems running multiple independent programs essential. There are two key reasons for this need. First, as the number of on-chip cores increases exponentially according to Moore's law, more multiprogram workloads will run on the hardware. Second, coexecuting programs affect each other's performance through sharing of resources such as memory, off-chip bandwidth, and (potentially) on-chip caches and interconnection networks. In addition, programs can share resources within a single simultaneous multithreading (SMT) core.
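
The article proposes two such metrics: system throughput (STP), a higher-is-better measure of the aggregate progress co-scheduled programs make, and average normalized turnaround time (ANTT), a lower-is-better measure of per-program slowdown. A minimal sketch with invented cycle counts:

    # STP and ANTT from single-program vs. multiprogram cycle counts (invented).
    def stp(single_cycles, multi_cycles):
        """System throughput: sum of per-program progress rates (higher is better)."""
        return sum(s / m for s, m in zip(single_cycles, multi_cycles))

    def antt(single_cycles, multi_cycles):
        """Average normalized turnaround time: mean slowdown (lower is better)."""
        slowdowns = [m / s for s, m in zip(single_cycles, multi_cycles)]
        return sum(slowdowns) / len(slowdowns)

    single = [1.0e9, 2.0e9]    # cycles each program needs running alone
    multi = [1.5e9, 2.5e9]     # cycles each needs when co-scheduled
    print(f"STP = {stp(single, multi):.2f}, ANTT = {antt(single, multi):.2f}")
    # STP = 1.47, ANTT = 1.38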


Medical Software Has Astronomers Seeing Stars

By Pam Frost Gorder
From the July/August 2008 issue of Computing in Science & Engineering

A project at Harvard University is proving that two very different disciplines have very much in common. The Astronomical Medicine Project is working to convert medical imaging software into tools that fuel discoveries in astronomy. But if the scientists behind the project have their way, any discipline that relies on large, complex data sets will reap the benefits.


Software-as-a-Service: The Spark That Will Change Software Engineering?

By Greg Goth
From the July 2008 issue of IEEE Distributed Systems Online

Software-as-a-Service (SaaS) is receiving a lot of attention in analysts' briefings and technology trade press articles. In the past year, SaaS has emerged from its pioneering group of start-ups and medium-sized vendors to be embraced, albeit awkwardly, by software giants including Oracle and SAP.


Virtual Control Desks for Nuclear Power Plant Simulation

By Maurício Alves C. Aghina, Antônio Carlos A. Mól, Carlos Alexandre F. Jorge, Cláudio M.N.A. Pereira, Thiago F.B. Varela, Gerson G. Cunha, and Luiz Landau
From the July/August 2008 issue of IEEE Computer Graphics and Applications

Operator-training programs for nuclear power plants (NPPs) must take high safety requirements into account. These programs must efficiently train workers to manage normal operational conditions and to respond correctly to abnormal ones through established procedures. The "Nuclear Power Plant Operation" sidebar provides an overview of NPPs and their operational challenges.


Computing Then


The Technical Development of Internet Email

By Craig Partridge
From the April–May 2008 issue of IEEE Annals of the History of Computing

The explosive development of networked electronic mail (email) has been one of the major technical and sociological developments of the past 40 years. A number of authors have already looked at the development of email from various perspectives. The goal of this article is to explore a perspective that, surprisingly, has not been thoroughly examined: namely, how the details of the technology that implements email in the Internet have evolved.


Jack Dongarra: Supercomputing Expert and Mathematical Software Specialist

By Thomas Haigh
From the April–May 2008 issue of IEEE Annals of the History of Computing

Jack J. Dongarra was born in Chicago in 1950 to a family of Sicilian immigrants. He remembers himself as an undistinguished student during his time at a local Catholic elementary school, burdened by undiagnosed dyslexia. Only later, in high school, did he begin to connect material from science classes with his love of taking machines apart and tinkering with them. Inspired by his science teacher, he attended Chicago State University and majored in mathematics, thinking that this would combine well with education courses to equip him for a high school teaching career. The first person in his family to go to college, he lived at home and worked in a pizza restaurant to cover the cost of his education.