March 2012 (Vol. 45, No. 3) pp. 11-12
0018-9162/12/$31.00 © 2012 IEEE
Published by the IEEE Computer Society
32 & 16 Years Ago
MARCH 1980
FAULT TOLERANCE (p. 6) "Ten years ago fault-tolerant computing was a narrow specialty. Today, driven by an ever increasing need for reliable, available systems in applications such as spacecraft control and telephone switching, fault-tolerant computing finds itself a full-fledged discipline supported by an extensive body of research."
TEST GENERATION (p. 9) "Microelectronics is making life easier and more pleasant for almost everyone except the individuals responsible for the design and testing of these increasingly complex digital systems—especially those in test generation who must develop routines for exercising these intricate devices … In the past, these workers were reinforced by a battery of engineers armed with probes, oscilloscopes, and sophisticated logic clips. Now, more and more of the diagnostic burden must be borne solely by the test routines which their test generation procedures must produce."
MICROPROCESSOR TESTABILITY (p. 17) "A digital system is tested by applying a sequence of input patterns (tests) which produce erroneous responses when faults are present. Fault detection tests, i.e., go/no-go tests, are intended to determine whether or not a system contains a fault. Fault location tests attempt to isolate a fault to a specific component, preferably an easily replaceable one."
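The go/no-go idea described above can be sketched in a few lines. This is a minimal illustration, not from the original article: the circuit, the hypothetical stuck-at-0 fault, and the test patterns are all invented for the example.

```python
# A toy combinational circuit, out = (a AND b) OR c, with a
# hypothetical stuck-at-0 fault on the AND gate's output.

def good_circuit(a, b, c):
    """Fault-free reference circuit."""
    return (a & b) | c

def faulty_circuit(a, b, c):
    """Same circuit with the AND output stuck at logic 0."""
    return 0 | c

def go_no_go(circuit, tests):
    """Go (True) if every test pattern matches the fault-free
    response; no-go (False) otherwise."""
    return all(circuit(*t) == good_circuit(*t) for t in tests)

# The pattern (1, 1, 0) sensitizes the fault: the good circuit
# outputs 1, the faulty one outputs 0.
tests = [(1, 1, 0), (0, 0, 1), (0, 0, 0)]
print(go_no_go(good_circuit, tests))    # True  (go)
print(go_no_go(faulty_circuit, tests))  # False (no-go)
```

A fault location test would go further, choosing patterns whose pass/fail signature narrows the fault down to one replaceable component.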
ERROR CONTROL (p. 27) "It is not surprising that error-control coding techniques have been used in computers for many years, especially since they have proven effective against both transient and permanent faults. What is surprising is that coding techniques have not found even more extensive use (in commercial computers, for example), in view of their potential for improving the overall reliability of computers. In this article, therefore, we will bring out some of the reasons for the limited acceptance of error-control coding techniques. …"
FAULT DIAGNOSIS (p. 52) "… It must still be determined what types of internal system organizations can most efficiently support the diagnostic procedures implied by a directed graph. Another problem requiring attention is that of modeling, not only of the diagnostic phase in a system, but also of normal system operation. An integrated approach to the modeling of fault-tolerant computing systems must consider both normal and diagnostic operations as well as reconfiguration."
DISTRIBUTED FAULT TOLERANCE (p. 55) "Distributed computers offer new ways of achieving fault-tolerant operation. Their salient feature is the distribution of computing power—when one computer fails, others may be able to aid in recovery. This distributed intelligence supports software recovery algorithms that are much more complex and powerful than those that a fault-tolerant uniprocessor can support. … Distributed computers allow other computers to perform detailed diagnosis of a faulty machine and then to reconfigure it, replace it with a spare, or take over portions of its computations to effect graceful degradation. …"
DESIGN COURSES (p. 67) "The advances of integrated circuit technology in the last decade have been staggering. … New approaches and new tools will be necessary for the design of the next generation of VLSI chips, which will have computational power equivalent to more than 100,000 gates. A modern curriculum needs courses that prepare the students for the task of designing future computing systems in the framework of these VLSI circuits."
COMPUTER ELEMENTS WORKSHOP (p. 75) "Testing will not impose limits on size or complexity, but testing needs will have a very significant impact on designs. While that was the overall consensus of the first session, a minority opinion held that up to 50 percent of chip area will be required for test features—in addition to well-structured designs. Participants agreed that modularized designs which allow adequate controllability and observability are needed, and that a set of design standards to ensure testability is important. Clearly, testability must be considered in early design phases rather than as an afterthought."
ST. PATRICK'S ALMANAC (p. 78) "In opposition to the revised liturgy, which celebrates National Engineers' Week in February, orthodox Celts and Southern Gentlemen hold that the only proper time for this rite is two days after the ides of March, the birthday of that patron saint of engineering, St. Patrick. In observance of this august occasion, we herewith present the Authorized Version of 'St. Patrick's Almanac,' a comprehensive collection of authentic aphorisms and sacred lore, compiled with the help of saintly Patrick Skelly of Honeywell, who diligently examined the ancient manuscripts and printouts."
MICRO FIRMWARE (p. 95) "Complete off-the-shelf applications systems based on firmware modules will be the next and most significant development in the microprocessor world, predicts Microcomputer Software, the latest report in Infotech's state-of-the-art series."
MARCH 1996
READER PARTICIPATION (p. 8) "Meeting readers' needs for timely, relevant information has been the goal of every editor-in-chief, Editorial Board member, reviewer, and staff member throughout the magazine's 29-year history. In addition to relying on our own judgment, we constantly seek reader input on content and direction. We do this by talking with our colleagues, conducting periodic readership surveys, checking Reader Service Card comments, and—more recently—operating the Computer 100, an electronic focus group. To gain wider participation in the Computer 100 and to make it easier to compile and analyze the results, we've converted from e-mail distribution to Web access for the survey form. We encourage—and need—widespread participation to make the Computer 100 an effective evaluative, planning tool."
VIRTUAL REALITY (p. 16) "Vendors are scrambling to come up with the best framework in which to bring three-dimensional images and virtual reality to the World Wide Web. They are battling over which enhancements should be made to the Virtual Reality Modeling Language."
"A battle is brewing over whose technology is the most comprehensive for VRML's upcoming 2.0 version. …"
EARLY LITERACY (p. 19) "Two US universities report excellent results from a computer-game-based program they recently developed to help children overcome language development and reading problems."
NEUROCOMPUTING (p. 24) "Several novel modes of computation have recently emerged that are collectively known as soft computing. The raison d'être of this mode is to exploit the tolerance for imprecision and uncertainty in real-world problems to achieve tractability, robustness, and low cost. Soft computing is usually used to find an approximate solution to a precisely (or an imprecisely) formulated problem."
ARTIFICIAL NEURAL NETWORKS (p. 31) "These massively parallel systems with large numbers of interconnected simple processors may solve a variety of challenging computational problems. This tutorial provides the background and the basics."
NETWORK TRAINING (p. 45) "Supervised learning can be considered an unconstrained nonlinear minimization problem in which the objective function is defined by error function and the search space is defined by weight space. Unfortunately, the terrain modeled by the error function in its weight space can be extremely rugged and have many local minima. …"
RULE-BASED NETWORKS (p. 64) "Instead of using numerical values, the rule-based approach relies on a symbol system to represent the human problem-solving process in the form of procedural or heuristic rules. The chaining of rules under a set of input conditions solves problems. While rich in knowledge representation and reasoning, expert systems are unable to learn and adapt to a new user's needs. …"
SYMBOLIC NETWORKS (p. 71) "Neural networks often surpass decision trees in predicting pattern classifications, but their predictions cannot be explained. This algorithm's symbolic representations make each prediction explicit and understandable."
NETWORK SECURITY (p. 95) "… the relatively new concept of timing attacks against cryptographic systems has generated a lot of attention in the past several months. Encryption companies say they are eliminating any possibility of the attack's becoming a practicable way of sabotaging a company's security. But it remains to be seen if the method will adversely affect the scores of legacy systems in place and if newer, unanticipated strains of the attack will arise."
THE BEGINNING (p. 99) "In 1969, Intel had accepted an assignment from Busicom, a calculator manufacturer, to produce a chip set for a series of programmable calculators. Influenced by the architectural simplicity of the minicomputer, Marcian E. Hoff Jr., Intel's 12th employee, conceived the microprocessor as a flexible, cost-effective substitute for the 12-chip set Busicom had specified. …"
CONTRACT PROGRAMMING (p. 109) "'Why can't software be more like hardware?' has been the software engineer's lament for nearly as long as there have been large software systems. In particular, why isn't there a software components industry to rival the existing hardware components industry?"
RULES OF THUMB (p. 116) "In spite of the plentiful availability of commercial software-estimating tools, I continue to receive e-mail and phone messages requesting simple rules of thumb that can be used with pocket calculators. So, in response, here are 10 simple rules of thumb covering various aspects of software development and maintenance."
PDFs of the articles and departments from Computer's March 1980 and 1996 issues are available through the IEEE Computer Society's website: www.computer.org/computer.