AUGUST 2004 (VOL. 37, NO. 8) PP. 6-7
0018-9162/04/$31.00 © 2004 IEEE
Published by the IEEE Computer Society
Although I enjoyed Butler Lampson's article in Computer's June 2004 issue ("Computer Security in the Real World," pp. 37-46), I am concerned that part of the article might be misread, perpetuating a viewpoint that leaves our computers vulnerable to viruses.
Figure 1 in this article shows the guard authenticating the principal and making an access decision based on that authentication. Further, the authentication and authorization steps are described using the pronoun "who."
Lampson clearly states that the principal need not be a person, but readers might miss the significance of this point. Some could conclude that Lampson is saying that we can base access decisions on the identity of the person who started the program. Virus attacks show the folly of this approach. Access control based on a user's identity can keep people from doing what they are not allowed to do. Unfortunately, it does nothing to stop a process acting on a user's behalf from doing something the user is allowed to do but doesn't want done. Viruses exploit this flaw in the access control model of today's operating systems.
The solution is to enforce the Principle of Least Privilege at a granularity finer than that of the user without making a system that is secure but hard to use. The research prototype we have built at HP Labs for Microsoft Windows shows this can be done. Manipulating privileges based on the user actions that designate files and programs enhances security without adversely affecting usability.
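Karp's point, that privileges should follow the user's designating actions rather than the user's identity, can be sketched roughly as follows. This is an illustrative invention, not HP Labs' actual design: the Powerbox class and its method names are hypothetical, standing in for whatever mediation layer such a system would use. A process starts with no file authority and gains access only to files the user explicitly designates, for instance through a file-open dialog:

```python
# Hypothetical sketch of designation-based access control: the
# process inherits none of the user's permissions; it can touch
# only what the user deliberately designated.

class Powerbox:
    """Mediates file access on behalf of an untrusted process."""

    def __init__(self):
        self._granted = set()  # files the user has designated

    def user_designates(self, path):
        # Called only as a result of a deliberate user action,
        # such as choosing a file in an open/save dialog.
        self._granted.add(path)

    def open_for_process(self, path):
        # The process may open only what the user designated;
        # everything else the user could access stays off-limits,
        # so a virus running as the user gains nothing extra.
        if path not in self._granted:
            raise PermissionError(f"{path} was not designated by the user")
        return f"handle:{path}"


box = Powerbox()
box.user_designates("report.doc")
print(box.open_for_process("report.doc"))   # allowed: the user chose it
try:
    box.open_for_process("addressbook.db")  # a typical virus target
except PermissionError as e:
    print("denied:", e)
```

The contrast with conventional identity-based control is the point: under the usual model both opens would succeed, because the user is allowed to read both files.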
Alan Karp, Palo Alto, Calif.; firstname.lastname@example.org
The author responds:
Karp is quite right that it is unwise to base access control entirely on the identity of the user who starts a program. I discuss this point at length on p. 44 in the article, but I should have emphasized that treating a program as a principal is the most effective way to control viruses.
In "Computer Security in the Real World," Butler Lampson offers two suggestions: "… the system should classify all programs as trusted or untrusted based on how they are signed" and "… a trusted authority must sign all executable programs." It is important to caution readers against this kind of black-and-white thinking.
No entity can ever be classified as "trusted" in an absolute sense. The word "trusted" has no meaning without an accompanying context that answers two questions: "Trusted by whom?" and "Trusted to do what?"
Software signatures do not guarantee correct behavior. Moreover, "correct behavior" is not even a useful concept without an understanding of the software user's expectations. For example, spyware behaves correctly—just not in the user's interest. A signature is only useful if the user understands the particular assurance the signer intended to make and trusts the signer to make that assertion.
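Yee's distinction can be illustrated with a minimal sketch, using an HMAC as a stand-in for a real code-signing scheme (real schemes use public-key signatures, and the names here are illustrative): verification answers "did this signer sign these bytes?", and says nothing about whether the program acts in the user's interest.

```python
# Minimal sketch, not a real code-signing scheme: an HMAC stands
# in for the signer's key. Verifying answers "who signed this?",
# never "is it safe?".
import hmac
import hashlib

SIGNER_KEY = b"vendor-secret"  # stand-in for a signer's private key

def sign(program: bytes) -> bytes:
    return hmac.new(SIGNER_KEY, program, hashlib.sha256).digest()

def verify(program: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(program), signature)

# Spyware behaves exactly as its author intended, so the signature
# checks out; nothing in the check speaks to the user's interest.
spyware = b"upload_contacts(); show_ads()"
sig = sign(spyware)
print(verify(spyware, sig))  # True
```

The missing context, trusted by whom and trusted to do what, has to come from outside the cryptography: from knowing who the signer is and what assertion the signature was meant to carry.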
Requiring signatures on all programs also leads in a dangerous direction: It needlessly increases the barriers to acceptance of software from smaller companies or open source software teams.
We must acknowledge that users cannot be expected to determine the correctness of their software and place absolute trust in it. Instead, we should provide the means to run software while limiting its ability to do harm.
I encourage all readers to start asking questions whenever they see the words "trust" or "trusted" used without qualification.
Ka-Ping Yee, University of California, Berkeley; email@example.com
Neville Holmes is indeed correct that negative stories about PowerPoint are "not a joke" ("In Defense of PowerPoint," The Profession, July 2004, pp. 100, 98-99). He is also correct in stating that the ultimate responsibility for the quality of a presentation lies with the presenter. However, he is dead wrong in his implication that PowerPoint is not a deeply flawed tool that encourages bad presentations and discourages good ones.
By deploying zillions of defaults and automated "corrections" that are hard to disengage, PowerPoint's producers assume significant responsibility for the stylistic wasteland all too familiar to audiences around the world. Metaphorically, I believe the responsibility of PowerPoint's producers would fall under the legal heading "attractive nuisance."
Articles by and about Edward Tufte, author of the Wired article, "PowerPoint Is Evil" (www.wired.com/wired/archive/11.09/ppt2.html), provide information about the negative aspects of PowerPoint. Oddly, Holmes never names Tufte. Just Google "tufte powerpoint," and you'll find more than you care to read.
A balanced starting point for the interested reader is the Wikipedia entry on PowerPoint: http://en.wikipedia.org/wiki/PowerPoint.
Richard (DJ) L. Waddell Jr., Laurel, Md.; firstname.lastname@example.org
Neville Holmes responds:
I had not intended to imply that I thought PowerPoint a good thing. Indeed, I imagined that stating on the opening page that I found it very difficult to make PowerPoint do what I wanted and offering lengthy suggestions about when computing professionals should not use PowerPoint would have made this plain.
My main points, emphasized in the conclusion, were that computing professionals should be trained in presentation techniques and that they should resist professional and public attempts to blame digital technology for the ills consequent on its misuse.
By this, I do not mean to imply that I disagree with the points Richard Waddell makes about PowerPoint's qualities and effects. However, my strongly held view is that PowerPoint should not be the main target of criticism. Rather, the criticism should be directed at the educators and other professionals who neglect their responsibilities in favor of letting technology do the driving.
On Edward Rolf Tufte, I suggest that readers go straight to http://artenumerica.com/inspiration/tufte.en.html for an introduction to Tufte and links to his more appropriate writings. Computing professionals should ignore the chip Tufte has on his shoulder about PowerPoint, and they should take his splendid work on data presentation to heart.
In his essay about PowerPoint, Neville Holmes offers the shopworn argument that technology is benign, and it is only the use of technology that is for good or ill. The same argument is also used for handguns, and it is no more convincing in PowerPoint's defense than it is there.
It might be useful, instead, to consider the statement (by Edsger Dijkstra, I believe) that the tools we use affect not only the way we think, but our ability to think. Since PowerPoint is primarily a management tool, what is its effect on the way that managers receive, present, and act upon information? At what point does the need to put the argument in a set of bullet points and box-and-line diagrams suppress consideration of details that all too often are key to project success or failure? How does PowerPoint affect a manager's ability to think about detail?
John Boddie, Landenberg, Pa.; email@example.com
Neville Holmes responds:
To use a rather shopworn cliché, John Boddie is putting words into my mouth. My point was not that technology is benign, but that it is neutral—neither benign nor malignant.
Technology—tools and techniques—is something we use, professionally or domestically. If our use of technology results in good or bad, then the credit or blame belongs to us. Although society might perhaps be considered indirectly responsible, the blame for running amok with a handgun belongs directly to the person who used the handgun in that way.
When we use technology professionally, either as computing professionals or as managers, it is our professional responsibility to use it properly. If we don't know how to do that, it is our direct fault for using the technology at all. If it is important technology, failing to use it properly is indirectly the fault of the educators in our professions for not having equipped us with the appropriate skills.
Certainly our tools and techniques can influence the way we solve problems, but we are being unprofessional if we analyze problems to suit the technology we have or if we design our solutions primarily to suit the technology we plan to use.
As professionals, our focus must be on the people whose problems we are tackling, not on our own convenience. If we allow our tools to dictate how we think, then more fools we, domestically or professionally.
I fully agree with Neville Holmes's comments about PowerPoint.
Every user of this useful tool is responsible for the developed slides and the manner in which they are presented. Frequent poor use of PowerPoint doesn't mean that it can't be used intelligently. I recommend using a combination of PowerPoint and Microsoft Equation 3.0 for presentations that include mathematical symbols and equations.
Janusz Kowalik, Seattle, Wash.; firstname.lastname@example.org
Univac Overdrive Modification
I am trying to locate information about a hardware modification called "Overdrive" that was designed for the Univac I. I have (with the bemused permission of Unisys) written a simulator for Univac I and Univac II, and I would like to incorporate "Overdrive" in it as an option if I can find authenticated details of how it worked.
I am not particularly interested in the mechanics of the hardware modification itself, but rather in the changes it made to a Univac I from a programmer's point of view. I believe the modification allowed for three instructions per Univac word, rather than the "normal" two, but I have no idea what it did for instructions that required two characters (most were one character) and no address or for those that required two characters and an address.
I would be delighted to hear from any readers who have any information—or are even merely curious about—Univac I.
Peter Zilahy Ingerman, Willingboro, N.J.; email@example.com