Pages: pp. 9-10
HARDWARE TESTING (p. 7) "The testing of digital systems has grown increasingly complex. LSI circuits, commonplace in today's systems, require thorough testing at the component level. VLSI adds even more complexity. How are these challenges being met? This special issue surveys the state-of-the-art and attempts to answer the question."
LOGIC TESTING (p. 9) "The testing problem has two major facets: (1) test generation and (2) fault simulation. With the vast increase in density, the ability to generate test patterns automatically and conduct fault simulations with them has drastically waned. As a result, some manufacturers are forgoing these more rigorous approaches and are accepting the risk of shipping a defective product."
MEMORY TESTING (p. 23) "Semiconductor memory testing is often thought of as a one-shot operation occurring just prior to product delivery. However, testing plays a much more important role in the life cycle of a product. The three key areas of product production—device design, lithography, and processing—are verified and monitored through the test and characterization of the end product."
SYSTEM TESTING (p. 32) "… Rapid changes in the electronics industry quickly render any rules or procedures obsolete. Therefore, the only way to ensure that products are actually testable is for the designer to clearly understand the testing process, including its needs and capabilities. And he must be able to implement state-of-the-art test techniques as he designs state-of-the-art products. …"
ANALOG TESTING (p. 40) "Because an analog electronic circuit or device may be tested many times during its lifetime, testing has become a significant component of life-cycle cost. Automatic test equipment, or ATE, reduces this cost and improves the quality of testing as well. Today's automatic testers are almost exclusively computer-controlled machines that execute user-created test programs. …"
SELF-TESTING COMPUTERS (p. 49) "… Controllability and observability problems—long the nemeses of digital test engineers—have become particularly difficult as computer IC densities have increased. New testing approaches have become necessary in many computer applications to provide on-line visibility into computer functional processes. A viable approach to increased on-line computer process visibility is built-in-test, which is being used in digital computers as a means of providing continuous on-line performance monitoring, particularly in modular computer systems. …"
PROGRAMMING SMALL (p. 61) "Microdare is a new high-level language system for laboratory automation, signal processing, control, and simulation. Combining high computing speed with direct execution (i.e., no external compiler, linker, or loader is needed), Microdare is embedded in an advanced Basic dialect which serves for interactive program entry, editing, file manipulation, job control, and for programming multi-run experiments. …"
FEATURE LEARNING (p. 75) "The usefulness of computer simulations in learning has long been recognized. Indeed, according to some proponents, this is the only mode in which computers should be used! I, however, maintain that it is only one of many valuable computer-based learning approaches; the classroom environment presents a full spectrum of possibilities."
SCIENCE VERSUS ENGINEERING (p. 88) "… The function of the computer scientist is to know, while that of the software engineer is to do. The computer scientist adds to the store of verified, systemized knowledge of the computer-centered world; the software engineer brings this knowledge to bear on practical problems."
ATHLETICS (p. 97) "A computerized system will be playing a major role in preparing American athletes for the 1980 Winter and Summer Olympic Games.
"The computer, an Eclipse S/250 donated to the US Olympic Committee by Data General, and a Whizzard 7000 vector refresh graphics system, donated by Megatek, will be used by athletes training for international competition, starting with the 1980 Olympics. Dr. Gideon Ariel, a member of the Sports Medicine Committee and developer of the programming that allows the computer to analyze and improve athletic performance, will direct the effort at the committee's new Bio-Mechanics Laboratory."
CONFERENCE (p. 100) "Registration at the ACM SIGGRAPH conference on Computer Graphics and Interactive Techniques, combined this year with the IEEE Computer Society conference on Pattern Recognition and Image Processing, exceeded 2200, double 1978's 1100. Other numbers set records, too: 80 exhibitors, 1400 exhibitor guests, 2000 paying exhibit visitors. Total: almost 6000."
FUTURE SCHOOLS (p. 10) "FutureSchools are in the knowledge business, which means they are content providers just like newspaper, magazine, and software publishers, and movie and video game producers. The Hudson Institute, headquartered in Indianapolis, Indiana, reviewed 20 years of research on computer-based instruction and found that students learn 30 percent more in 40 percent less time and at 30 percent less cost when using computer-aided instruction. Who says automated delivery isn't as good as delivery in the flesh? …"
INTERNET ADDRESSING (p. 12) "As new mobile-computing technologies emerge and Windows 95 brings thousands of new users on line, the need for more Internet protocol address space is becoming crucial. Because of the rapid growth of the Internet, the Internet Engineering Task Force (IETF) is furiously working on an update to its Transmission Control Protocol/Internet Protocol (TCP/IP), the Internet's underlying addressing and routing scheme."
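The address-space pressure described in that excerpt is easy to quantify. A brief sketch using Python's standard `ipaddress` module (a modern convenience, not a tool from the 1995 article) contrasts the 32-bit IPv4 space with the 128-bit space of the IPv6 successor the IETF was then developing:

```python
import ipaddress

# The entire IPv4 address space: 32-bit addresses, roughly 4.3 billion total.
ipv4_space = ipaddress.ip_network("0.0.0.0/0")
print(ipv4_space.num_addresses)  # 4294967296, i.e. 2**32

# The IPv6 address space introduced by the IETF's update: 128-bit addresses.
ipv6_space = ipaddress.ip_network("::/0")
print(ipv6_space.num_addresses)  # 2**128, about 3.4e38
```

The 96 extra address bits are what resolved the exhaustion concern the article describes.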
HARDWARE DESCRIPTION (p. 18) "As the name suggests, VHDL [VHSIC Hardware Description Language] is a language for describing digital hardware. It was developed under the auspices of DOD's Very High Speed Integrated Circuits (VHSIC) program in the 1980s and accepted as an IEEE standard in 1987. VHDL is very similar to a programming language, but the end result is a description of a piece of hardware, not an algorithm to be executed on a processor. A VHDL 'program' is usually called a model because it rarely describes the piece of hardware completely; certain details of the hardware's behavior are always ignored. … If a VHDL model describes the hardware in sufficient detail, a program called a synthesizer can automatically convert it into a gate-level description that can be manufactured."
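A minimal model of the kind the excerpt describes might look like the following VHDL sketch (a hypothetical two-input AND gate; the entity and signal names are illustrative, not taken from the article):

```vhdl
-- Interface: two input bits, one output bit.
entity and_gate is
  port (a, b : in  bit;
        y    : out bit);
end and_gate;

-- Behavior: the output continuously follows the AND of the inputs.
architecture rtl of and_gate is
begin
  y <= a and b;
end rtl;
```

A model at this level of detail is exactly what a synthesizer, as the excerpt notes, can convert into a manufacturable gate-level description.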
SOFTWARE REUSE (p. 36) "In the future, new application requirements and technological advances can be expected to drive domain evolution, generating a continuous process of domain analysis and domain architecture restructuring. Domain and reuse library artifacts will be continually created and restructured to produce new software product inventories for use in expanded or reorganized domains. As new programming languages and platforms are introduced, we will increasingly be concerned with reengineering selected reuse library components rather than reengineering entire legacy software applications."
OBJECT TECHNOLOGY (p. 57) "Although the benefits of object-oriented systems are recognized, OT has been embraced somewhat slowly, because it requires adapting existing structured tools and techniques to support an object-oriented approach, providing a smooth learning curve for people whose experience and expertise are in developing systems using structured methods, and reengineering existing structured software to work with the new object-oriented software. However, we should begin preparing carefully for the inevitable transition to object technology. …"
TEACHING COMPUTING (p. 73) "… To a large extent, computer technology changes far more rapidly than the basic ideas of the science of computing, which center on the notion of an algorithm and its use in computing systems. Based on the sixty years or so since the pioneering days of CS, these ideas have lasting and fundamental value. Thus, although a proposed high school program should enhance a student's ability to exploit computers beneficially, its backbone must be based on science. The program should provide insight, knowledge, and skills independent of specific computers and programming languages. …"
TELEVISION (p. 81) "For Advanced Television to play an important role in the emerging NII [National Information Infrastructure], [Apple's Donald A.] Norman argued, ATV would have to be made fully computer compatible. For that to happen, the computer industry would have to be included in the standards process, and standards would have to be based on quality and long-term flexibility rather than on the short-term cost-minimization strategy that he sees prevailing in the television industry."
INTERNET SECURITY (p. 101) "These activities seem certain to lead the Internet into a new age characterized by the security equivalent of citizens' passports. Every Internet user will have one or several public key pairs and corresponding certificates issued by CAs [certification authorities] who act as trusted third parties. The provision of security services such as authentication, data confidentiality and integrity, access control, and nonrepudiation services will be based on the availability of public key certificates on a global scale."
BENCHMARKING (p. 102) "Function points provide useful metrics on two key components of software quality: measuring defect potentials and calculating defect removal efficiency levels.
"The defect potential of a software application is the total quantity of errors found in requirements, design, source code, user manuals, and 'bad fixes' or secondary defects inserted as an accidental byproduct of repairing other defects. The defect removal efficiency is the percentage of software defects removed prior to delivery."
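The two measures defined in that excerpt reduce to simple arithmetic, sketched below in Python; the function names, category labels, and numbers are illustrative assumptions, not figures from the article:

```python
def defect_potential(defect_counts):
    """Total defects across all sources named in the article:
    requirements, design, source code, user manuals, and bad fixes."""
    return sum(defect_counts.values())

def defect_removal_efficiency(removed_before_delivery, total_defects):
    """Fraction of all defects removed prior to delivery."""
    return removed_before_delivery / total_defects

# Illustrative counts only.
counts = {"requirements": 40, "design": 55, "code": 80,
          "user_manuals": 15, "bad_fixes": 10}

potential = defect_potential(counts)              # 200 defects in total
dre = defect_removal_efficiency(190, potential)   # 0.95, i.e. 95% removed
print(potential, dre)
```

Normalizing the defect potential by an application's function-point count is what lets these metrics be compared across projects of different sizes.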
PDFs of the articles and departments from Computer's October 1979 and 1995 issues are available through the IEEE Computer Society's website: www.computer.org/computer.