Issue No. 2, March/April 2002 (vol. 14), pp. 438-444
ABSTRACT
We present an analytic evaluation of the runtime behavior of the C4.5 algorithm, which highlights some efficiency improvements. Based on the analytic evaluation, we have implemented a more efficient version of the algorithm, called EC4.5. It improves on C4.5 by adopting the best among three strategies for computing the information gain of continuous attributes. All the strategies adopt a binary search of the threshold in the whole training set starting from the local threshold computed at a node. The first strategy computes the local threshold using the algorithm of C4.5, which, in particular, sorts cases by means of the quicksort method. The second strategy also uses the algorithm of C4.5, but adopts a counting sort method. The third strategy calculates the local threshold using a main-memory version of the RainForest algorithm, which does not need sorting. Our implementation computes the same decision trees as C4.5 with a performance gain of up to five times.
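As a rough illustration of the threshold selection the abstract refers to, the sketch below shows a generic C4.5-style computation of the best local threshold for a continuous attribute: sort the cases at a node, then scan the sorted values while maintaining running class counts to score every candidate binary split by information gain. This is a minimal, assumed implementation for illustration only, not the paper's EC4.5 code; the function and variable names (best_threshold, entropy, etc.) are hypothetical.

from collections import Counter
from math import log2

def entropy(counts, total):
    """Shannon entropy of a class-count distribution."""
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

def best_threshold(values, labels):
    """Return (threshold, gain) for the best binary split value <= threshold.

    Sketch of the first strategy described in the abstract: the local cases
    are sorted (C4.5 uses quicksort here), then scanned once with running
    class counts on each side of the candidate split.
    """
    pairs = sorted(zip(values, labels))      # comparison sort of the local cases
    n = len(pairs)
    right = Counter(labels)                  # class counts to the right of the split
    left = Counter()                         # class counts to the left of the split
    base = entropy(right, n)                 # entropy before splitting
    best_gain, best_t = -1.0, None
    for i in range(1, n):
        v, lab = pairs[i - 1]
        left[lab] += 1
        right[lab] -= 1
        if v == pairs[i][0]:
            continue                         # thresholds lie only between distinct values
        gain = base - (i / n) * entropy(left, i) \
                    - ((n - i) / n) * entropy(right, n - i)
        if gain > best_gain:
            best_gain, best_t = gain, v
    return best_t, best_gain

The second and third strategies of the paper keep the same split-scoring step but avoid the comparison sort: a counting sort exploits attribute values with a limited range, while the main-memory RainForest-style variant aggregates per-class counts directly without sorting.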
INDEX TERMS
C4.5, decision trees, inductive learning, supervised learning, data mining
CITATION
S. Ruggieri, "Efficient C4.5," IEEE Transactions on Knowledge and Data Engineering, vol. 14, no. 2, pp. 438-444, March/April 2002, doi:10.1109/69.991727