Abstract—A summary of articles published in Computer 32 and 16 years ago.
SPECIAL MESSAGE (p. 4) The Computer Society represents a large, diverse group of interests and disciplines that are advancing at an increasing rate. The pace and importance of this field should produce greater recognition of the contributions of Computer Society members than the 12 to 15 Fellow awards per year we have been granted, especially since we represent 25 percent of the IEEE membership.
INTRODUCTION (p. 10) Some 15 years ago, J. Rodriguez at MIT and D. Adams at Stanford began to work on research that eventually led to the development of concepts still in use today in data flow systems. Important advances have been made since that time, and many researchers are now investigating data flow concepts as an alternative to von Neumann machines and languages. In the pages that follow, the reader will be presented with an overview of the field, especially as it relates to high-speed computing.
DATA FLOW LANGUAGES (p. 15) Like other forms of parallel computers, data flow computers are best programmed in special languages: most data flow designs would be extremely inefficient if programmed in conventional languages such as Fortran or PL/I. However, languages suitable for data flow computers can be very elegant. The language properties that a data flow computer requires are beneficial in and of themselves and are very similar to some of the properties that are known to facilitate understandable and maintainable software.
DATA FLOW GRAPHS (p. 26) Data flow languages form a subclass of the languages which are based primarily upon function application. By data flow language we mean any applicative language based entirely upon the notion of data flowing from one function entity to another or any language that directly supports such flowing. This flow concept gives data flow languages the advantage of allowing program definitions to be represented exclusively by graphs.
GRAPH INTERPRETATION (p. 42) The usual method of interpreting data flow graphs assumes a finite token capacity (usually one) on each arc. This unnecessarily limits the amount of parallelism that can be easily exploited in a program. The U-interpreter is a method for assigning labels to each computational activity as it is dynamically generated during program execution. The U-interpreter assigns and manipulates labels in a totally distributed manner, thus avoiding a sequential controller, which can be a bottleneck in the design of large multiple-processor machines.
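The firing rule behind these interpretations can be illustrated with a toy static dataflow interpreter: a node fires when every input arc holds a token and its output arc is empty, which is exactly the single-token-per-arc limit the U-interpreter was designed to overcome. This is a minimal sketch; the node and arc names are illustrative, not taken from the articles.

```python
# Toy static dataflow interpreter. Each arc holds at most one token;
# a node fires when all its input arcs carry tokens and its output arc
# is free. Fan-out is modeled by placing copies of a value on separate
# arcs (x/x2, y/y2 below). All names here are illustrative.

class Node:
    def __init__(self, name, op, inputs):
        self.name, self.op, self.inputs = name, op, inputs  # inputs: arc names
        self.output = name  # one output arc, named after the node

def run(nodes, arcs):
    """arcs: dict arc-name -> token (or None). Fires ready nodes until quiescent."""
    fired = True
    while fired:
        fired = False
        for n in nodes:
            ready = all(arcs.get(a) is not None for a in n.inputs)
            if ready and arcs.get(n.output) is None:
                args = [arcs[a] for a in n.inputs]
                for a in n.inputs:
                    arcs[a] = None              # consume the input tokens
                arcs[n.output] = n.op(*args)    # place the result token
                fired = True
    return arcs

# The expression (x + y) * (x - y) as a dataflow graph.
nodes = [
    Node("sum",  lambda a, b: a + b, ["x", "y"]),
    Node("diff", lambda a, b: a - b, ["x2", "y2"]),
    Node("prod", lambda a, b: a * b, ["sum", "diff"]),
]
arcs = {"x": 7, "y": 3, "x2": 7, "y2": 3}
result = run(nodes, arcs)["prod"]  # (7 + 3) * (7 - 4... rather (7 - 3)) -> 40
```

Note that "sum" and "diff" can fire in either order (or, on a real machine, in parallel); only "prod" must wait for both, which is the parallelism a dataflow graph exposes without any sequential controller.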
A PROTOTYPE (p. 51) When the construction of a practical data flow machine was started at Manchester University, England, in 1978, all of these [above] problems were considered worthy of further examination, and provisions for their investigation were included in the design of the facilities. The prototype machine described in this article ran its first program in October 1981; it implements a dynamic tagged data flow model.
A SECOND OPINION (p. 59) In this article we undertake two tasks. The first is to sketch the principles and practices of data flow computation and to point out a number of shortcomings of this approach to high-speed computation. The second is to sketch an alternative that leads to high-speed computation through higher-level use of dependence graphs.
LIFE-CYCLE VALIDATION (p. 71) In the integrated approach described in this article, validation is a part of each phase of the life cycle. Two validation activities, analysis and test data generation, take place during each phase. The programming and maintenance phases also include actual execution of program tests.
COMMAND SYNONYMY (p. 96) The ever-increasing propagation of heterogeneous computer networks such as the one we use makes development of a common command language highly desirable, if not mandatory. If we are to effect a solution to the command synonym problem before customer frustration builds to intolerable levels, let's get started now and begin programming our software to process alternative command verbs. We have nothing to lose, only happy, productive customers to gain.
AUTOMATED REASONING (p. 111) Aura, short for "automated reasoning assistant," is a general-purpose reasoning program that can be used to design electronic circuits, detect flaws in other computer programs, and help solve previously unsolved problems in advanced mathematics, according to its developers.
NEW CURES (p. 12) Over the years, the antivirus industry has had to keep pace as virus writers have become more sophisticated. Antivirus products now not only detect and eliminate viruses, they can even delete or repair infected files and remove infections from system memory and disk drives.
FUTURE CHIPS (p. 16) Microprocessors running at 1,500 MHz by 2001 and chip feature sizes of 0.035 microns by 2012 were among the predictions included in a report recently released by the Semiconductor Industry Association (SIA).
OPTICAL COMPUTING (p. 25) Only if we foolishly define optical computing as the attempt to supplant electronics in computing is optical computing dead. Together with electronics, optics points to future computers far more useful than the current purely electronic ones.
HIDDEN DATA (p. 26) Steganography and cryptography are cousins in the spycraft family. Cryptography scrambles a message so it cannot be understood. Steganography hides the message so it cannot be seen. A message in ciphertext, for instance, might arouse suspicion on the part of the recipient while an invisible message created with steganographic methods will not.
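The distinction can be made concrete with a minimal least-significant-bit scheme, one classic way steganography hides data in carrier bytes such as pixel values. This is an illustrative sketch only; the helper names and carrier are invented here, not drawn from the article, which surveys real tools.

```python
# Minimal LSB steganography sketch: hide a message in the low bit of
# each carrier byte (e.g., image pixel values). Illustrative only.

def hide(carrier, message):
    """Return a copy of carrier with message bits in the low bit of each byte."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(7, -1, -1)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    stego = [(c & ~1) | b for c, b in zip(carrier, bits)]
    return stego + list(carrier[len(bits):])  # untouched tail of the carrier

def reveal(carrier, length):
    """Recover a length-character message from the carrier's low bits."""
    bits = [c & 1 for c in carrier[:length * 8]]
    data = bytes(
        int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits), 8)
    )
    return data.decode()

cover = list(range(200, 0, -1))       # stand-in for pixel bytes
stego = hide(cover, "meet at dawn")
message = reveal(stego, len("meet at dawn"))
```

Because each carrier byte changes by at most one, the stego image is visually indistinguishable from the cover, which is precisely why, as the article notes, it arouses no suspicion where a ciphertext would.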
INTRODUCTION (p. 36) This issue presents a tutorial introduction to the field of optical information processing, in particular, digital optical computing. We present current trends in optical computing research, in the hope of fostering closer interaction with the broader computer science and engineering community.
FREE-SPACE OPTICS (p. 38) Like the human eye, which takes in an enormous amount of information in parallel, a low-cost lens can provide over a million independent connections. We aim to exploit optoelectronic computing's capability for such massively parallel data transfers.
DATA PROCESSING (p. 45) So photonics already contributes to data storage and data communication. The missing link between storage and communication is data processing. We believe that photonics, in the form of special optoelectronic architectures, can support data processing and in fact can enhance the performance of electronic computers by speeding up specific computer tasks.
HOLOGRAPHY (p. 52) Digital data storage using volume holograms offers high density and fast readout. Current research concentrates on system design, understanding and combating noise, and developing appropriate storage materials. Possible applications include fast data servers and high-capacity optical disks.
PARALLEL COMPUTING (p. 61) However, if we integrate suitable optoelectronic devices with silicon electronics, we can use optical communication channels to transfer data on and off chips. Optics can effectively communicate data to the chip surface in a massively parallel fashion and at high speed.
SWITCHING (p. 69) Two architectures have been developed that use and demonstrate free-space optical interconnects for digital logic: a high-performance optoelectronic computing module and a second-generation digital optoelectronic computer.
EXTRAORDINARY DESIGN (p. 76) Thomas S. Kuhn describes "ordinary science" as the modification of existing paradigms to suit the current requirements of scientific investigation and "extraordinary science" as the development of a new (and hopefully simpler) paradigm that subsumes the old paradigm and subsequent modifications. It could be argued that the OO approach is an example of ordinary science and the collection-oriented approach (among others) is extraordinary science, within the context of language design.
SYSTEMS ECOLOGY (p. 107) Managing information effectively so that it facilitates a smooth transition into the information age calls for a systems ecology, a body of methods for engineering systems based on analogous models of natural ecologies.
URLs IN PRINT (p. 114) While contact and supplementary information is one question, the relevance and quality of nonpeer-reviewed Web sites being used as references in journal articles is another. Computer asked for comments on the topic.
COMPUTER SCIENCE EVALUATION (p. 116) Pygmalion was a sculptor who fell in love with his statue, Galatea, a statue that was brought to life for him by Aphrodite. In the 1960s, two American psychologists, Robert Rosenthal and Lenore Jacobson, used this myth to name an observation of theirs: Whenever someone evaluates something, the evaluator's expectations concerning the evaluated object influence the evaluation, in a way that tends to prove the evaluator's initial hypothesis.