pp. 6-7
I read with interest the discussion about passwords in the Letters department of Computer's February issue. For many years I have used a system that works well for most of the passwords I need. It works best, though, if you have reasonable typing skills.
The idea is simple. Choose an easily remembered word to use as the base for all your passwords. Before typing it, shift your hands so that, instead of resting on "a," the little finger of the left hand is on some other key. For example, if I choose the word "secret," starting on "q" gives "w3d435," starting on "s" gives "drvtry," and so on. Then you can keep a record of your passwords on a Post-it note stuck on your monitor (or your preferred high-tech alternative), but you only need to record the starting key, which is useless to anyone who doesn't know your base word. And you never even need to know your real passwords.
You can of course get more password combinations by moving the starting positions of both hands independently.
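The scheme described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the letter writer's own code; it assumes a simplified US QWERTY layout in which the four key rows are treated as vertically aligned columns, which is enough to reproduce the two "secret" examples above.

```python
# Sketch of the hand-shift password scheme, assuming a simplified US
# QWERTY layout whose four rows are treated as aligned columns.
ROWS = [
    "1234567890",
    "qwertyuiop",
    "asdfghjkl;",
    "zxcvbnm,./",
]

def locate(key):
    """Return the (row, column) of a key on the simplified layout."""
    for r, row in enumerate(ROWS):
        c = row.find(key)
        if c != -1:
            return r, c
    raise ValueError(f"key not on layout: {key!r}")

def shift_password(base, start_key, rest_key="a"):
    """Shift every key of `base` by the same row/column offset that
    moves the left little finger from `rest_key` (normally 'a') onto
    `start_key`."""
    r0, c0 = locate(rest_key)
    r1, c1 = locate(start_key)
    dr, dc = r1 - r0, c1 - c0
    shifted = []
    for ch in base:
        r, c = locate(ch)
        shifted.append(ROWS[r + dr][c + dc])
    return "".join(shifted)

print(shift_password("secret", "q"))  # w3d435
print(shift_password("secret", "s"))  # drvtry
```

A real implementation would have to decide what to do when a shifted key falls off the edge of the layout (wrap around or reject the starting key), and the variant of moving both hands independently could be modeled by applying separate offsets to the left-hand and right-hand columns.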
Wayne Wolf's article in Computer's March 2009 issue (Embedded Computing, "Cyber-physical Systems," pp. 88-89) shows that he understands the true nature of the proposed Smart Grid, as an example of a "continental scale" distributed system (he even uses very similar descriptive terminology).
However, the passage, "The power grid remains woefully primitive in many respects—as one commentator put it, Thomas Edison would feel right at home with the equipment in the average power plant," leaves me grinding my teeth with frustration and annoyance.
The commentator's assumption is that new ideas in the power domain follow the availability of new technologies like GPS, high-bandwidth communications, and massive computing power. On the contrary, visionaries in the power industry have been waiting for years for these technologies to (a) appear and (b) become cost-effective, to enable the implementation of what is now popularly called the Smart Grid.
Also, the power industry has driven technological change in ways that would bedazzle Thomas Edison—for example, SF6 circuit breakers capable of interrupting many tens of kA at 500 kV, and steam turbogenerators capable of sustained output of many hundreds of megawatts. Sexy—perhaps not; mechanical—definitely; high tech—you bet.
With information and communications technologies finally coming to the party, we can now dare to believe that our dreams of the Smart Grid will, at long last, be implementable. What we now need is purposeful collaboration between the ICT and power domains, not the kind of dismissive claptrap quoted in Wolf's article.
As the president of EEMBC, the EDN Embedded Microprocessor Benchmark Consortium, I'm charged with seeing the completion of the SPECpower benchmarks, knowing that this effort faces much political and technical resistance. Klaus-Dieter Lange's article in Computer's March issue (Green IT, "Identifying Shades of Green: The SPECpower Benchmarks," pp. 95-97) does a good job of describing the development and thought processes that go into putting together such a benchmark standard.
No matter how well constructed this industry-standard benchmark is, there will always be those who disagree with various aspects, such as the workloads and duty cycles that were chosen. But this benchmark is not about finding the perfect analysis mechanism for power and energy; instead, it's about leveling the playing field and providing a tool for a solid apples-to-apples comparison. Furthermore, the SPECpower benchmark should be about encouraging all the players involved in the development of servers and PCs to take an active role in reducing power consumption. In other words, the power consumption of any server or PC is the sum of that of its parts. This brings me to the primary reason for my letter.
Although I suppose mine is a somewhat biased perspective, the microprocessor lies at the center of every server and PC—and, for that matter, at the center of practically every embedded system as well. Therefore, industry-standard benchmarks that specifically measure the power and energy of processors are essential. In fact, such a benchmark has been available since 2006: EEMBC has developed EnergyBench, an industry-standard benchmark tool that allows the simultaneous measurement of a processor's performance and power/energy.
While most, if not all, of the systems being analyzed with the SPECpower benchmark have the luxury of being monitored from the wall plug, the challenge for EnergyBench users is that attaching the current probes usually requires dissection of the processor board.
Unfortunately, despite the incredible importance of being able to measure a processor's power under load, few vendors of processor evaluation or development boards provide a means of making these measurements. As a result, EnergyBench, the first industry-standard benchmark for measuring a computer system's power and performance characteristics, has received only limited recognition.
Regardless, we should all continue our efforts to reveal the pertinent benchmark information that will help green the world that we live in.
While "The Design of Government" (N. Holmes, The Profession, May 2009, pp. 104, 102-103) was quite enjoyable, it paints an idealized scenario that does not seem to account for the foibles of human beings. Holmes describes the tendency of people to assemble "superorganisms." It seems to me that the bifurcation of politics, even if initially successful, would eventually succumb to this superorganism effect. That is, self-aggrandizement to maximize resources and rewards to the group, and high-ranking individuals in particular, would eventually win the day.
I wonder if it's possible to ever set up a political system that approaches the democratic ideal.
Larry L. Gadeken
Neville Holmes responds:
In my opinion, we'll never get even close to the democratic ideal unless we get close to the educational ideal. The present education systems of "advanced" countries encourage the selfishness that corrupts our democracies. And redesign of government looks like a prerequisite for achieving the great improvement in education that digital technology has made practical ("Supporting Acquisition of Basic Skills," The Profession, Mar. 2008, pp. 104, 102-103).
Neville Holmes has rediscovered one of the great advances of the Enlightenment in the understanding of government (The Profession, "The Design of Government," May 2009, pp. 104, 102-103). The Enlightenment's concept of separation of powers (written into the US Constitution) is very much like Holmes's separation of government into analytical and operational functions. The details differ, but in both cases government is split along functional lines. The functional divisions are also made independent of each other to provide a series of checks and balances.
Interestingly, this analysis also bears some similarity to what has been learned through the Capability Maturity Model that the US Department of Defense is developing. The CMM has shown that productivity improves when processes are put in place and, most importantly, followed. Collecting data to determine how well the processes are working is also useful, but not quite as important as following them.
The primary problem with implementing these reforms is a personal one. Creating separations of powers or instituting procedures means that the people involved lose some of their personal power. Politicians do not like that, and often their constituents don't like it either: it is harder to get favors from an official who needs the approval or cooperation of other officials. Thus there is a bias against a more rational organization of government.