Speaker

Probal Chaudhuri

Title

Multiscale Approach in Supervised Statistical Learning

Abstract

In supervised statistical learning, often called statistical discriminant analysis, one uses a training sample to build a classifier. For the construction of nonparametric classifiers, one popular approach is to plug nonparametric estimates of the class densities into the Bayes rule. As with other nonparametric methods, the performance of these density estimates, and hence of the resulting classifier, depends on the values of the associated smoothing parameters. One generally uses cross-validation or bootstrap-type methods to select optimum values of these smoothing parameters, which are then used for classification of all future observations. However, in addition to depending on the training data, good choices of smoothing parameters depend on the observation to be classified: a fixed level of smoothing may not work well in all parts of the measurement space. Moreover, for a specific data point, one may wish to assess the strength of evidence in favor of the different competing classes at different levels of smoothing. Therefore, instead of using a fixed level of smoothing, one useful idea is to adopt a multi-scale approach, in which one studies the classification results over wide ranges of smoothing parameters simultaneously. Such multi-scale methods are more informative than classifiers based on fixed smoothing parameters. In this talk, multi-scale smoothing techniques in classification will be demonstrated using some standard nonparametric classifiers. Techniques for visualizing classification results over varying choices of smoothing parameters, and procedures for aggregating those results into a definite classification, will be discussed.
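The plug-in idea described above can be illustrated with a minimal sketch. The code below uses Gaussian kernel density estimates with equal class priors, evaluates the plug-in Bayes rule over a grid of bandwidths, and aggregates the per-bandwidth labels by majority vote. The aggregation rule and the bandwidth grid here are illustrative assumptions, not the specific procedures of the talk.

```python
import numpy as np

def kde(x, sample, h):
    """Gaussian kernel density estimate at point x with bandwidth h."""
    return np.mean(np.exp(-0.5 * ((x - sample) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

def multiscale_classify(x, class_samples, bandwidths):
    """Plug KDEs into the Bayes rule at each bandwidth (equal priors assumed),
    then aggregate the per-bandwidth labels by majority vote."""
    votes = []
    for h in bandwidths:
        # With equal priors, the Bayes rule picks the class with the
        # larger estimated density at x.
        densities = [kde(x, s, h) for s in class_samples]
        votes.append(int(np.argmax(densities)))
    # Aggregate across scales: the label chosen most often wins.
    label = max(set(votes), key=votes.count)
    return label, votes

# Two well-separated univariate classes as toy training data.
rng = np.random.default_rng(0)
class0 = rng.normal(-2.0, 1.0, 100)
class1 = rng.normal(2.0, 1.0, 100)

# Classify one point over nine bandwidths spanning two orders of magnitude;
# inspecting `votes` shows how the decision behaves at each scale.
label, votes = multiscale_classify(0.5, [class0, class1], np.logspace(-1, 1, 9))
```

Examining the full vote vector, rather than only the aggregated label, is what makes the multi-scale view informative: a point classified consistently at all scales carries stronger evidence than one whose label flips as the smoothing varies.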