Issue No. 10 - October 2000 (vol. 22)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.879795
<p><b>Abstract</b>—In this paper, we motivate the need for estimating bounds on the learning curves of average-case learning algorithms when they perform worst on training samples. We then apply the method of reducing learning problems to hypothesis-testing problems to investigate the learning curves of a so-called ill-disposed learning algorithm in terms of a system complexity, the Boolean interpolation dimension. Since the ill-disposed algorithm behaves worse than ordinary ones, and the Boolean interpolation dimension is generally bounded by the number of system weights, the results can be applied to interpreting or bounding the worst-case learning curve in real learning situations. This study leads to a new understanding of worst-case generalization in real learning situations, which differs significantly from that in the uniformly learnable setting via Vapnik-Chervonenkis (VC) dimension analysis. We illustrate the results with numerical simulations.</p>
Generalization, concept learning, generalization error, learning curves, sample complexity, PAC learning, worst-case learning, interpolation dimension.
H. Takahashi and H. Gu, "How Bad May Learning Curves Be?," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 22, no. 10, pp. 1155-1167, 2000.