
Cross Validation for Kernel Density Estimation

In a post published in July, I mentioned the so-called Goldilocks principle in the context of kernel density estimation and bandwidth selection: the bandwidth should not be too small (the variance would be too large) and it should not be too large (the bias would be too large). Another standard method to select the bandwidth, mentioned this afternoon in class, is the cross-validation technique described in Chiu (1991). Here, we would like to minimize

$$\mathbb{E}\left[\int \big[\widehat{f}_h(x)-f(x)\big]^2\,dx\right]$$

The integral can be written

$$\int \widehat{f}_h(x)^2\,dx-2\int \widehat{f}_h(x)f(x)\,dx+\int f(x)^2\,dx$$

Since the third term does not depend on the bandwidth h, we only have to minimize the expected value of the sum of the first two.
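The first term can be computed from the estimate itself, but the second one involves the unknown density f. The standard trick is that this cross term is the expected value of the estimator at a new observation drawn from f, so it can be estimated by averaging the estimator over the observations themselves, as long as each observation is left out of the estimate evaluated at that point (otherwise very small bandwidths would be artificially favoured): for a kernel K,

$$\int \widehat{f}_h(x)f(x)\,dx\approx\frac{1}{n}\sum_{i=1}^n \widehat{f}_{h,(-i)}(X_i),\qquad\text{where }\widehat{f}_{h,(-i)}(x)=\frac{1}{(n-1)h}\sum_{j\neq i}K\!\left(\frac{x-X_j}{h}\right)$$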

The idea is to approximate it as

$$J(h)=\int \widehat{f}_h(x)^2\,dx-\frac{2}{n}\sum_{i=1}^n \widehat{f}_{h,(-i)}(X_i)$$

which can easily be computed. Consider a sample of 50 observations from a Gaussian distribution,

> set.seed(1)
> X=rnorm(50)
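To visualize the tradeoff itself, one can overlay an estimate with a clearly too small bandwidth and one with a clearly too large bandwidth, together with the true density (the values .05 and 2 below are arbitrary choices, for illustration only):

> plot(density(X,bw=.05),main="")  # undersmoothed: wiggly, high variance
> lines(density(X,bw=2))           # oversmoothed: flat, high bias
> curve(dnorm(x),add=TRUE,lty=2)   # true N(0,1) density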

From Silverman's rule of thumb (which should be appropriate here, since the sample is drawn from a Gaussian distribution), the optimal bandwidth is

> 1.06*sd(X)*length(X)^(-1/5)
[1] 0.4030127
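For comparison, this rule of thumb is built into R as bw.nrd, with the same 1.06 constant but applied to the smaller of the standard deviation and a rescaled interquartile range, so it should return a similar value on this sample,

> bw.nrd(X)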

Using the cross-validation technique mentioned above, we can compute the criterion J(h) as follows,

> J=function(h){
+ # kernel density estimate with bandwidth h, evaluated at a single point x
+ fhat=Vectorize(function(x) density(X,from=x,to=x,n=1,bw=h)$y)
+ # leave-one-out estimate, evaluated at the observation that was removed
+ fhati=Vectorize(function(i) density(X[-i],from=X[i],to=X[i],n=1,bw=h)$y)
+ F=fhati(1:length(X))
+ # first term of J(h), minus twice the average of the leave-one-out estimates
+ return(integrate(function(x) fhat(x)^2,-Inf,Inf)$value-2*mean(F))
+ }
> vx=seq(.1,1,by=.01)
> vy=Vectorize(J)(vx)
> df=data.frame(vx,vy)
> library(ggplot2)
> qplot(vx,vy,geom="line",data=df)
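A side note: qplot() has since been deprecated in ggplot2, and the same line plot can be written with the standard syntax,

> ggplot(df,aes(x=vx,y=vy))+geom_line()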

The function has the following shape

and the optimal value is

> optimize(J,interval=c(.1,1))
$minimum
[1] 0.4687553

$objective
[1] -0.3355477

Note that, indeed, it is close to Silverman's optimal bandwidth.
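Note also that this least-squares cross-validation approach is built into R as bw.ucv (for "unbiased cross-validation"), which minimizes a closed-form version of the same criterion and should therefore return a bandwidth in the same neighbourhood,

> bw.ucv(X)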

Statistics, and the Goldilocks Principle

At the end of May, in Toronto, we had a great talk at the SSC by Jeff Rosenthal on Monte Carlo techniques, and Jeff mentioned the "Goldilocks principle" (it was in the context of MCMC, and I did mention it in my talk on MCMC in London, when I discussed the rejection rate of the Metropolis-Hastings algorithm, which should be neither too large nor too small...). In the Goldilocks story, there are always three alternatives: one is too much in one extreme (too hot, for the soup, or too large, for the bed or the chair), one is too much in the opposite extreme (too cold, or too small), and one is "just right".
