Pages: pp. 15-16
NTC 74 KEYNOTE (pp. 10-11). "'By a fortunate coincidence of timing, electronics breakthroughs are now emerging which, when fully applied to business, industry and transportation, will cut costs, helping to curb inflation and provide new and effective products to enhance employment and resist recession,' according to Dr. Simon Ramo, Vice Chairman of TRW Inc."
"'… Economic synthetic "brainpower" can make each human smarter at his job. The potential for increasing the value of every hour of man's time is tremendous. Even though this new information technology is still largely unapplied, the first installations are already cutting costs and rendering better services'."
COMPUTERS IN MEDICINE (p. 19). "The future for the application of computers in medicine is bright. With health care now considered a right rather than a privilege, the demands on physicians offer a unique opportunity to use the computer as a 'physician assistant.' Large computer files on patients will be kept, and decisions made from these files will assist the physician and health care provider. Physician 'peer' review and quality standards for care are already demanding more careful record-keeping and auditing. A computerized health care system is the answer to these new demands."
INTENSIVE CARE (p. 26). "In view of the fact that the impact of patient monitoring systems on patient outcome, ICU length of stay, or manpower utilization has not yet been quantified for comparison with alternat[iv]e methods of providing care, it is hard to justify these systems on the basis of cost or quality of care alone. Even the earliest users of computerized systems are still modifying their systems or adding new features to make them more valuable. Therefore computerized patient monitoring systems will probably still be most successful in the hands of research-oriented physicians who can make contributions to the current state of the art, as well as take advantage of research and teaching capabilities of the systems."
AMBULATORY HEALTH CARE (p. 31). "… A physician is perhaps the only professional who uses the majority of his working time in creating thousands of inter-related data bases (the records for each patient), using the data to make decisions, recording his decisions, and recording the patient's progress so the data can be used in subsequent decisions. He carries out his business in his office, in various examining rooms, in a hospital or rest home while making rounds, and in the hospital emergency room nights and week-ends. The doctor plainly needs data processing assistance, but the problem of entering and retrieving data accurately and efficiently from a variety of locations and sources has not been solved."
THE CHRONICALLY ILL (p. 49). "The ability of the computer-based data bank to provide a quality of information about like patients that is not available in textbooks, journals, and monographs creates a degree of excitement and timeliness in the long-term management of chronically ill patients. The computer provides the physician with a memory extension just as the stethoscope and ophthalmoscope extends his hearing and sight. With the data bank, the notion of diagnosis in chronic disease is lost. It is replaced by the doctor's ability to directly associate the descriptors of a patient with an intervention, if available, leading to a favorable outcome."
MAGNETIC BUBBLES (p. 52). "Semiconductor technology is being pushed also to extend into the file memory area (10^5 to 10^7 bits), now dominated by magnetic recording in the form of disk and tape files; shift registers utilizing charge-coupled devices, as well as dynamic RAMs with density almost as great are being pursued for this application. Here, however, magnetic bubbles provide a competing technology."
MINICOMPUTER COBOL (p. 58). "ANSI Standard 1974 COBOL for minicomputers has been announced by Digital Equipment Corporation. Designed to run on Digital's PDP-11 computer systems that employ the RSX-11D operating system, the COBOL package extends the use of such computer systems to business data processing applications.
"The new PDP-11 COBOL is an ANSI-74 compiler with accept and display features; inspect, string, and unstring verbs; and relative and sequential input-output modules. It also features nested conditionals, a library function, and conditional variables at Data Division Level 88."
A MUSICAL COMPUTER (p. 63). "Composing at Oberlin College Conservatory of Music will take on new dimensions in sound this semester when a high-speed computer is added to its instrumental family.
"The conservatory, which has been experimenting with techniques for sound synthesis, will be able to invent or create a wide range of new sounds with the help of a Xerox computer …"
"To insure ready access by students and faculty to the computer, 40 time-sharing terminals will be installed in strategic locations throughout the campus; more as the demand dictates."
ARTY COMPUTERS (p. 64). "Fourteen artists from across the country participated in an unusual art show recently at the Alice Tully Hall in New York's Lincoln Center cultural complex.
"On exhibit were paintings representing a wide range of styles from landscape to surrealism. All were on computer equipment. The showing is part of a new corporate program to humanize working environments."
OUR 40TH YEAR (p. 7). "… even though our roots extended back to the 1940s, our own Board of Governors decided in 1974 that the official year of our birth had been 1951, the year the IRE's Computer Group held its first official administrative committee … meeting. Had we elected to chart our beginning from the AIEE's Committee on Computing Devices (formally inaugurated on January 29, 1948), we could now claim to be the oldest computing society in the US."
DESIGN DIVERSITY (p. 18). "Fifteen years ago, most large-scale scientific and engineering computations were performed on sequential von Neumann machines. Comparisons among these machines focused on running sets of common benchmarks and on ranking the machines based on the number of instructions executed per second.
"However, as the number of commercial systems increased, so did the diversity of their architectural design. As each new architecture diverged from the classical von Neumann model, new languages and annotated versions of older sequential languages were developed for execution on these new machines. This made it difficult to run a standard benchmark. Not only did each benchmark require translation into each language, but the translation process and newer optimizing compilers obscured the relative merit of the results."
BENCHMARKS (p. 48). "The design of an experimental system is a complex matter with a bewildering number of design choices. Consider the design of an instruction set in which adding a new instruction might improve overall performance by providing a better interface to the hardware. However, this addition might also degrade performance by reducing the clock speed. Consequently, the choice of whether to include a new instruction depends on the system's speed with and without the instruction. The usefulness of the new instruction depends on the programs the system will run. In practice, designers often use benchmark programs and assume they represent user programs …"
"Hence, the system's performance while it runs the benchmarks determines the design of the system. However, there is a problem of putting the cart before the horse. How can a system be designed to run a set of benchmarks efficiently if its benchmark performance cannot be measured until the design is completed?"
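The tradeoff in that 1990 excerpt is the classic "iron law" of performance: execution time is the product of dynamic instruction count, cycles per instruction, and cycle time. A minimal sketch of the reasoning, with invented figures (the article gives none), shows how a new instruction that trims the instruction count can still lose if it slows the clock:

```python
# Hypothetical illustration of the instruction-set tradeoff described above.
# Execution time = instructions x CPI x cycle time (the "iron law").
# All numbers below are invented for illustration, not from the article.

def exec_time(instructions, cpi, cycle_ns):
    """Estimate benchmark run time in nanoseconds."""
    return instructions * cpi * cycle_ns

# Baseline machine running a benchmark.
base = exec_time(instructions=1_000_000, cpi=1.5, cycle_ns=10.0)

# With the proposed instruction: 8% fewer dynamic instructions,
# but the longer critical path forces a 5% slower clock.
with_insn = exec_time(instructions=920_000, cpi=1.5, cycle_ns=10.5)

speedup = base / with_insn
print(f"speedup = {speedup:.3f}")  # > 1.0 means the instruction pays off
```

With these particular figures the instruction is a modest net win; shave the clock a little more and it becomes a loss, which is exactly why the decision cannot be made without benchmark measurements on a design that does not yet exist.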
A MICROSUPERCOMPUTER (p. 57). "As supercomputer performance continues to grow, packaging techniques will remain critical for reducing chip-to-chip delays. In addition, higher integration levels will become increasingly important because they can drastically reduce the number of chip crossings. Microcomputer systems have enjoyed a performance increase of 100 to 200 percent every three years, in part due to the growth in chip integration density. In contrast, mainframe supercomputers have improved by only about 50 percent every three years. It should not be surprising then if the next generation of supercomputers evolves from the microprocessor rather than continuing the mainframe tradition."
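The growth rates quoted above can be compounded to see how quickly the prediction plays out. Assuming a hypothetical 100x starting gap between mainframe supercomputers and microprocessor systems (the gap itself is an assumption, not a figure from the article), the rates of 2x-3x versus 1.5x per three-year generation imply:

```python
# Compound the quoted growth rates: microprocessor systems gain 100-200
# percent (2x-3x) every three years, mainframe supercomputers about 50
# percent (1.5x). The 100x starting gap is an invented illustration.

def generations_to_close(gap, micro_factor, mainframe_factor):
    """Count 3-year generations until micros catch the mainframes."""
    n = 0
    while gap > 1.0:
        gap *= mainframe_factor / micro_factor  # gap shrinks each generation
        n += 1
    return n

# Conservative case: micros at 2x per generation vs. mainframes at 1.5x.
slow = generations_to_close(100.0, 2.0, 1.5)
# Aggressive case: micros at 3x per generation.
fast = generations_to_close(100.0, 3.0, 1.5)
print(slow * 3, "years (conservative);", fast * 3, "years (aggressive)")
```

Even under the conservative assumption the gap eventually closes; under the aggressive one it closes within roughly two decades, which is the arithmetic behind the excerpt's closing prediction.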
NEURAL NETWORKS (p. 107). "… Rosenblatt begat the perceptron (what we now refer to as a single-layer neural network), and Minsky and Papert shredded it in 1969 when they demonstrated that the perceptron could do practically nothing of interest. Consequently, research in connectionist, massively parallel models of intelligence (with the notable exceptions of Grossberg, Kohonen, and a few others) died for about 20 years.
"In the interim, researchers concentrated their resources on the rationalist paradigms of AI that typically involved logic, reasoning, rule-based systems, and the representation of knowledge by the use of symbols. The postperceptron perception that the simple expedient of adding layers to the net could accomplish significant learning and problem solving spread gradually in the early eighties."
MASSIVE PARALLELISM (p. 121). "… One of the driving factors is that massively parallel machines are getting better and faster at a much more rapid pace than vector machines … In 1985, vector machines were still the faster of the two. In 1988, it was a tough call. Now … it is clear that the fastest applications are being done on massively parallel machines. By 1995, massively parallel machines will probably be about 100 times faster than conventional supercomputers."
COMPUTER CHESS (p. 124). "With a victory over Hitech in the final round, Mephisto tied Deep Thought for ACM's 21st Annual North American Computer Chess Championship, held during Supercomputing 90 November 12-14 in New York City. Hitech, which led the competition going into the fifth and final round, finished in a third-place tie with M Chess.
"Mephisto also won recognition as the best small computing system. Ranked as the top commercially available computer-chess player, Mephisto was developed by Richard Lang, Hegener & Glaser, A.G., Munich, Germany.
"Deep Thought and Hitech were both developed in the Department of Computer Science at Carnegie Mellon University in Pittsburgh. Deep Thought has since moved to the IBM T.J. Watson Research Center, Yorktown Heights, New York."