Issue No. 12, December 2004 (vol. 30)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TSE.2004.89
Ted J. Biggerstaff , IEEE
A challenge for many transformation-based generators is that they must achieve three mutually antagonistic goals simultaneously: 1) deeply factored operators and operands, to gain the combinatorial programming leverage provided by composition, 2) high-performance code in the generated program, and 3) small (i.e., practical) generation search spaces. The Anticipatory Optimization Generator (AOG) was built to explore architectures and strategies that address this challenge. The fundamental principle underlying all of AOG's strategies is to solve separate, narrow, and specialized generation problems with strategies tailored to each specific problem, rather than applying a single universal strategy to all problems. A second fundamental notion is the preservation and use of domain-specific information as a way to gain extra leverage on generation problems. This paper focuses on two specific mechanisms: 1) Localization, the generation and merging of implicit control structures, and 2) Tag-Directed Transformations, a new control structure for transformation-based optimization that allows differing kinds of retained domain knowledge (e.g., optimization knowledge) to be anticipated, affixed to the component parts in the reuse library, and triggered when the time is right for its use.
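The tag-directed idea can be illustrated with a minimal sketch. All names here (Tag, Component, run_phase, the "localization" phase, and the toy loop-fusion rewrite) are invented for illustration and are not AOG's actual representation; the point is only the control structure: optimization knowledge is affixed to a library component in advance and fires at a named generation phase, instead of being rediscovered by a universal search.

```python
# Hypothetical sketch of tag-directed transformations (names invented for
# illustration; the abstract's AOG uses its own representation).

class Tag:
    """Optimization knowledge affixed to a reuse-library component,
    anticipated to fire at a specific generation phase."""
    def __init__(self, phase, transform):
        self.phase = phase          # e.g., "localization"
        self.transform = transform  # rewrites a component body when triggered

class Component:
    """A reuse-library component carrying its body and its attached tags."""
    def __init__(self, name, body, tags=()):
        self.name, self.body, self.tags = name, body, list(tags)

def run_phase(phase, components):
    """Trigger only the tags anticipated for this phase - a narrow,
    specialized strategy rather than a global transformation search."""
    for c in components:
        for tag in c.tags:
            if tag.phase == phase:
                c.body = tag.transform(c.body)
    return components

# Toy example: a component tagged with a loop-fusion rewrite that fires
# during "localization", merging two implicit loops over the same index.
fuse = Tag("localization",
           lambda body: body.replace("loop(i){A}; loop(i){B}",
                                     "loop(i){A; B}"))
conv = Component("conv", "loop(i){A}; loop(i){B}", tags=[fuse])
run_phase("localization", [conv])
print(conv.body)  # -> loop(i){A; B}
```

The design point the sketch tries to capture is that the fusion opportunity is not searched for at generation time; it was anticipated when the component was placed in the library, so the generator's search space stays small.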
Index Terms: Domain-specific architectures, image processing, inference engines, logic programming, optimization, partial evaluation, pattern matching, program synthesis, reusable software, search, program transformations.
Ted J. Biggerstaff, "A New Architecture for Transformation-Based Generators", IEEE Transactions on Software Engineering, vol. 30, no. 12, pp. 1036-1054, December 2004, doi:10.1109/TSE.2004.89