Issue No. 02 - March/April (1998 vol. 13)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/5254.671089
<p>Inductive-learning algorithms are powerful tools for identifying meaningful patterns in large volumes of data, and their use is increasing in fields such as data mining and computer vision. However, conventional inductive-learning algorithms are selective: they rely on existing, user-provided data to build their descriptions. Thus, data analysts must assume the important and sizeable task of determining relevant attributes. If they provide inadequate attributes for describing the training examples, the descriptions the program creates are likely to be excessively complex and inaccurate. </p> <p>Attributes can be inadequate for the learning task when they are weakly or indirectly relevant, conditionally relevant, or inappropriately measured. Constructive induction is a general approach for coping with inadequate attributes found in original data. It uses two intertwined searches, one for the best representation space and the other for the best hypothesis within that space, to formulate a generalized description of examples. </p> <p>Originally, constructive induction focused on improving the representation space by generating additional task-relevant attributes. It was subsequently observed that this was only one way of modifying the space. Attribute construction is a form of representation space expansion; attribute selection and attribute value abstraction are forms of representation space destruction. Furthermore, it became clear that this improvement of the representation space by expansion and destruction could have a profound impact on the simplicity and predictive accuracy of concepts induced from that space. The better the representation space, the easier it is for the program to learn. It is thus important not only to add relevant attributes, but also to remove irrelevant ones and to find a useful level of precision for the attribute values. 
</p> <p>Constructive induction methods are classified according to the information used to search for the best representation space: </p> <p>• data-driven constructive induction (DCI) uses input examples, </p> <p>• hypothesis-driven constructive induction (HCI) uses intermediate hypotheses, and </p> <p>• knowledge-driven constructive induction (KCI) uses domain knowledge provided by an expert.</p> <p>In multistrategy constructive induction (MCI), two or more of these methods are used. </p> <p>This expanded definition of constructive induction guided our development of several constructive induction programs: AQ17-DCI, AQ17-HCI, and AQ17-MCI. These all use an AQ-type rule-learning algorithm for conducting hypothesis search, hence the "AQ" prefix. Here we describe our latest methodology for data-driven constructive induction, implemented in AQ17-DCI. Our methodology combines the AQ-15c learning algorithm with a range of operators for improving the representation space. These operators are classified into constructors and destructors: constructors extend the representation space using attribute generation methods, and destructors reduce the space using attribute selection methods and attribute abstraction. We integrated these operators, which are usually considered separately, into AQ17-DCI in a synergistic fashion. We tested the method on two real-world problems: text categorization and natural scene interpretation. </p> <p>The power of a constructive induction approach is illustrated by an example from the "second Monk's problem," which was used in an international competition of machine-learning programs. </p>
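The second Monk's problem gives a feel for why attribute construction helps: an example is positive exactly when two of its six attributes take their first value, a concept that is awkward to express as rules over the raw attributes but trivial after a counting attribute is constructed. The sketch below is only an illustration of this idea, not the AQ17-DCI implementation; the function names and encoding (values numbered from 1) are our own assumptions.

```python
# Illustrative sketch of attribute construction on the second Monk's
# problem (NOT the AQ17-DCI implementation). The concept "exactly two
# of the six attributes take their first value" is hard to state over
# the raw attributes, but a single constructed counting attribute
# reduces it to a one-condition rule.
from itertools import product

# Attribute domain sizes for the Monk's problems; values encoded 1..k.
DOMAINS = [3, 3, 2, 3, 4, 2]

def target(example):
    """Ground-truth concept for Monk-2: exactly two attributes equal 1."""
    return sum(1 for v in example if v == 1) == 2

def count_first_values(example):
    """Constructed attribute: number of attributes taking their first value."""
    return sum(1 for v in example if v == 1)

def rule(example):
    """In the improved representation space the concept is one selector:
    [count_first_values = 2]."""
    return count_first_values(example) == 2

# The constructed-attribute rule matches the target concept on the
# entire instance space (3*3*2*3*4*2 = 432 examples).
all_examples = list(product(*[range(1, k + 1) for k in DOMAINS]))
assert all(rule(e) == target(e) for e in all_examples)
```

In the raw space, a rule learner must enumerate many disjuncts, one per pair of attributes set to their first value; in the constructed space, one condition suffices, illustrating how expanding the representation space can simplify the induced concept.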
Machine learning, data mining, constructive induction, feature generation, attribute selection, attribute abstraction, discretization.
Eric Bloedorn, Ryszard S. Michalski, "Data-Driven Constructive Induction", IEEE Intelligent Systems, vol. 13, no. 2, pp. 30-37, March/April 1998, doi:10.1109/5254.671089