# SCH

## Seminar

### Some thoughts on nonparametric classification: nearest neighbours, bagging and max likelihood estimation of shape-constrained densities

Seminar Room 1, Newton Institute

#### Abstract

The $k$-nearest neighbour rule is arguably the simplest and most intuitively appealing nonparametric classifier. We will discuss recent results on the optimal choice of $k$ in situations where the underlying populations have densities with a certain smoothness in $\mathbb{R}^d$. Extensions to the bagged nearest neighbour classifier, which can be regarded as a weighted $k$-nearest neighbour classifier, are also possible, and yield a somewhat surprising comparison with the unweighted case.
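The unweighted $k$-nearest neighbour rule mentioned above can be sketched in a few lines: classify a query point by majority vote among the labels of its $k$ closest training points. This is an illustrative sketch only (the function name and data layout are our own, not from the talk):

```python
import math
from collections import Counter

def knn_classify(train, query, k):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs, where each point is a tuple
    of floats in R^d. Ties in the vote are broken by Counter ordering.
    """
    # Sort training points by Euclidean distance to the query
    by_dist = sorted(train, key=lambda pl: math.dist(pl[0], query))
    # Count labels among the k nearest and return the most common one
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Two well-separated classes in R^2
train = [((0.0, 0.0), 0), ((0.1, 0.2), 0), ((0.2, 0.1), 0),
         ((1.0, 1.0), 1), ((0.9, 1.1), 1), ((1.1, 0.9), 1)]
print(knn_classify(train, (0.05, 0.1), k=3))  # → 0
```

A bagged (weighted) variant would replace the uniform vote over the $k$ nearest points by a vote with weights decreasing in the distance rank.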

Another possibility for nonparametric classification is based on estimating the underlying densities explicitly. An attractive alternative to kernel methods is based on the maximum likelihood estimator, which can be shown to exist if the densities satisfy certain shape constraints, such as log-concavity. We will also discuss an algorithm for computing the estimator in this case, which results in a classifier that is fully automatic yet still nonparametric.
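The density-based approach described above amounts to a plug-in Bayes rule: estimate each class density, then assign a point to the class maximising prior times estimated density. The sketch below illustrates this generic rule with two hypothetical one-dimensional normal densities standing in for the shape-constrained estimates (the helper names and example densities are ours, not from the talk):

```python
import math

def plugin_classifier(density_estimates, priors):
    """Return a classifier x -> argmax_j priors[j] * f_hat_j(x),
    i.e. the plug-in Bayes rule for the given estimated class densities."""
    def classify(x):
        scores = [p * f(x) for f, p in zip(density_estimates, priors)]
        return max(range(len(scores)), key=scores.__getitem__)
    return classify

# Illustration: standard normal vs. normal with mean 2, equal priors.
# In the talk's setting these would be log-concave maximum likelihood
# estimates fitted to each class's training sample.
f0 = lambda x: math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)
f1 = lambda x: math.exp(-(x - 2)**2 / 2) / math.sqrt(2 * math.pi)

clf = plugin_classifier([f0, f1], [0.5, 0.5])
print(clf(0.2))  # → 0
print(clf(1.8))  # → 1
```

Because the log-concave maximum likelihood estimator needs no bandwidth or other tuning parameter, the resulting classifier is fully automatic while remaining nonparametric.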

**Related Links**

- http://www.statslab.cam.ac.uk/~rjs57/Research.html - Personal webpage
