Issue No. 9, September 2013 (vol. 46), p. 124
Published by the IEEE Computer Society
David Alan Grier, George Washington University
ABSTRACT
We don't have satisfactory answers to questions that ask what we must share with our neighbor for our common good and the rights we retain in the process. Yet such answers will be the legacy of our age. The Web extra at http://youtu.be/U0piYSt35YU is a video segment in which author David Alan Grier expands on his Errant Hashtag column, in which he discusses how the idea of "smart cities" makes more sense than the idea of "smart homes."
We have to be honest with ourselves. When Big Brother arrived, we did not resist. When we were told that we could purchase portable devices that would monitor our every move, we rushed to the discount stores to see what kind of bargain we could find. Only after making the fatal purchase did we start thinking about security, privacy, and the consequences of our actions. Such questions have become increasingly visible as we start deploying new invasive technologies, such as the Internet of Things, smart cities, and big data. Yet, as we consider such issues, we also need to contemplate a related set that asks what information a society must, by its nature, share.
Over the past 50 years, we’ve generally overlooked the necessity and responsibility of shared assets, even though they’re at the heart of one of the more important classes of successful software systems. In general, software systems have been successful in society if they’ve done one of four things: automated control, replaced human labor with machine labor, expanded markets through unification and customization, and finally, enabled communities to share expensive resources.
Some of the most common shared assets are transportation systems. For example, my neighborhood, like neighborhoods in several other American cities, now has a shared bicycle program managed through an extensive software system that tracks when the bikes are rented, identifies where each trip begins and ends, and bills the riders. With this information, the system lets its participants share two expensive assets: the urban real estate where the bicycles are stored and the fleet of bicycles itself. The software produces a schedule that allows a team of some 20 trucks to redeploy the bikes during the day, clearing the spaces at overfilled stations and replenishing the empty ones.
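The scheduling task at the heart of such a program can be stated simply. The sketch below, in Python, is purely illustrative and assumes nothing about the actual system's software: the station names, target fill levels, and greedy pairing rule are all hypothetical. It only shows how trip data, once aggregated into per-station bike counts, could be turned into a list of truck moves from overfilled stations to empty ones.

```python
# A minimal sketch (not the bike-share program's actual software) of deriving
# a rebalancing schedule from station data: move bikes from stations above
# their target fill level to stations below it. Station names, capacities,
# and the greedy pairing rule are illustrative assumptions.

def rebalancing_moves(stations):
    """stations: dict mapping station name -> (current bikes, target bikes).
    Returns a list of (from_station, to_station, count) truck moves."""
    surplus = {s: cur - tgt for s, (cur, tgt) in stations.items() if cur > tgt}
    deficit = {s: tgt - cur for s, (cur, tgt) in stations.items() if cur < tgt}

    moves = []
    # Empty the most overfilled stations first, sending bikes to the
    # stations with the largest shortfalls.
    for src, extra in sorted(surplus.items(), key=lambda kv: -kv[1]):
        for dst in sorted(deficit, key=lambda s: -deficit[s]):
            if extra == 0:
                break
            if deficit[dst] == 0:
                continue
            count = min(extra, deficit[dst])
            moves.append((src, dst, count))
            extra -= count
            deficit[dst] -= count
    return moves

if __name__ == "__main__":
    # Hypothetical snapshot of three stations: (current bikes, target bikes).
    stations = {
        "Dupont Circle": (18, 10),   # overfilled
        "Union Station": (2, 12),    # nearly empty
        "Foggy Bottom": (9, 9),      # balanced
    }
    for src, dst, n in rebalancing_moves(stations):
        print(f"Move {n} bikes from {src} to {dst}")
```

In practice, of course, such a schedule would also have to weigh truck routes, station capacities, and predicted demand, all of which depend on exactly the kind of detailed trip data the column describes.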
Without shared information, this system would likely be impracticable: if it had to rely on the natural flow of bicycles throughout the city, it would require more bikes and larger stations than the current system. It would also be so expensive that many of its users would have to seek other forms of transportation.
Lurking behind the success of the bicycle program is the now too common specter of personal privacy. The system keeps a database about its customers: the trips they commonly take, the hours they commonly travel, and the places that they visit only occasionally and perhaps with a special purpose in mind, a purpose that the individuals might have good reason for wanting to keep private.
Over the past year, I’ve seen several presentations that might easily be entitled “Better Living through Invasive Surveillance.” In these talks, I’ve heard researchers describe how to monitor our children’s exercise through Doppler radar, our personal health through highly instrumented toilets, our economic judgment through our aggregated purchases, and our psychological well-being through a massive analysis of our physical data. At some point, all of these presentations have been stopped by a member of the audience asking the speaker if he or she would be willing to live in the kind of world implied by that particular presentation. A few have said that they would not, which usually ends the presentation in a moment; the rest have equivocated and said they would if it produced enough benefits for them. In my experience, neither answer is entirely satisfactory to the audience.
But in the end, we really don’t have any satisfactory answers to the questions that ask what we must share with our neighbor for our common good and the rights we retain in the process. Yet such answers will be the legacy of our age and demand a more measured response than a mad rush to purchase or to deploy or even to develop the latest technology.
David Alan Grier is the author of When Computers Were Human, The Company We Keep, Crowdsourcing for Dummies, and other books. Contact him at grier@computer.org.