We propose a multistage recognition method built as a cascade of a linear parametric model and a $k$-nearest neighbor ($k$-NN) nonparametric classifier. The linear model learns a ``rule'' and the $k$-NN learns the ``exceptions'' rejected by the rule. Because the rule-learner handles a large percentage of the examples with a simple and general rule, only a small subset of the training set is stored as exceptions during training. Similarly, during testing most patterns are handled by the rule-learner and only a few are passed to the exception-learner, so the second stage adds little memory and computation. A multistage method such as cascading is preferable to a multiexpert method such as voting, where all learners are consulted for every case: the extra computation and memory of the second learner is unnecessary whenever we are sufficiently certain that the first learner's response is correct. We discuss how such a system can be trained using cross-validation. The method is tested on the real-world application of handwritten digit recognition.
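The sketch below illustrates one way such a cascade could be assembled, assuming scikit-learn as the toolkit. Logistic regression as the linear stage, a confidence threshold of $0.9$, $k=3$, and 5-fold cross-validated posteriors for selecting exceptions are illustrative choices, not the paper's exact settings.

\begin{verbatim}
# Hypothetical sketch of a rule + exceptions cascade (not the authors' code).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict

class Cascade:
    def __init__(self, theta=0.9, k=3):
        self.theta = theta                                       # rejection threshold (assumed value)
        self.rule = LogisticRegression(max_iter=1000)            # stage 1: linear "rule"
        self.exceptions = KNeighborsClassifier(n_neighbors=k)    # stage 2: k-NN "exceptions"
        self.has_exceptions = False

    def fit(self, X, y):
        # Cross-validated posteriors decide which training examples the rule
        # cannot handle confidently; only those are stored as exceptions.
        proba = cross_val_predict(self.rule, X, y, cv=5, method="predict_proba")
        rejected = proba.max(axis=1) < self.theta
        self.rule.fit(X, y)                                      # final rule trained on all data
        if rejected.any():
            self.exceptions.fit(X[rejected], y[rejected])
            self.has_exceptions = True
        return self

    def predict(self, X):
        # Use the rule where it is confident; defer the rest to the k-NN.
        proba = self.rule.predict_proba(X)
        y_hat = self.rule.classes_[proba.argmax(axis=1)]
        if self.has_exceptions:
            uncertain = proba.max(axis=1) < self.theta
            if uncertain.any():
                y_hat[uncertain] = self.exceptions.predict(X[uncertain])
        return y_hat
\end{verbatim}

In this reading, the threshold trades off the two stages: a higher threshold rejects more training examples to the exception store and routes more test patterns to the $k$-NN, while a lower threshold keeps the cascade closer to the linear model alone.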
AMS: 62H