Issue No. 02 - March/April (2002 vol. 14)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/69.991727
<p>We present an analytic evaluation of the runtime behavior of the C4.5 algorithm, which highlights some efficiency improvements. Based on this analytic evaluation, we have implemented a more efficient version of the algorithm, called EC4.5. It improves on C4.5 by adopting the best of three strategies for computing the information gain of continuous attributes. All three strategies adopt a binary search for the threshold in the whole training set, starting from the local threshold computed at a node. The first strategy computes the local threshold using the algorithm of C4.5, which sorts cases by means of the <it>quicksort</it> method. The second strategy also uses the algorithm of C4.5, but adopts a <it>counting sort</it> method instead. The third strategy calculates the local threshold using a main-memory version of the RainForest algorithm, which does not require sorting. Our implementation computes the same decision trees as C4.5, with a performance gain of up to five times.</p>
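The core step the abstract refers to is the local threshold computation for a continuous attribute: sort the cases at a node, then evaluate candidate thresholds (midpoints between consecutive distinct values) by their information gain. The following is a minimal illustrative sketch of that computation, not the paper's EC4.5 implementation; function names are my own, and the quicksort vs. counting-sort strategies discussed in the paper would differ only in how the values are sorted.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of class labels, in bits.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_local_threshold(values, labels):
    """Return (threshold, gain) maximizing information gain for a
    continuous attribute at a node. Candidate thresholds are midpoints
    between consecutive distinct sorted values, as in C4.5."""
    pairs = sorted(zip(values, labels))  # C4.5 uses quicksort here
    n = len(pairs)
    base = entropy(labels)
    best_gain, best_t = -1.0, None
    for i in range(1, n):
        if pairs[i][0] == pairs[i - 1][0]:
            continue  # a threshold must separate distinct values
        t = (pairs[i][0] + pairs[i - 1][0]) / 2
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        gain = (base
                - (len(left) / n) * entropy(left)
                - (len(right) / n) * entropy(right))
        if gain > best_gain:
            best_gain, best_t = gain, t
    return best_t, best_gain
```

For example, `best_local_threshold([1, 2, 3, 4], ['a', 'a', 'b', 'b'])` finds the threshold 2.5, which separates the classes perfectly and so achieves the maximal gain of 1 bit. This quadratic sketch recomputes the split partitions from scratch at each candidate; an efficient implementation would instead maintain running class counts while scanning the sorted cases.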
C4.5, decision trees, inductive learning, supervised learning, data mining
S. Ruggieri, "Efficient C4.5," in IEEE Transactions on Knowledge & Data Engineering, vol. 14, no. 2, pp. 438-444, 2002.