
Cádiz, Nonparametric Statistics

Emmanuel Flachaire will be presenting some joint work in Cádiz, Spain, tomorrow evening, at the second conference of the International Society for Nonparametric Statistics. Jeff invited me to go there a few months ago, but unfortunately I have already been traveling a lot recently. The talk will be based on the same work I mentioned at the SSC annual meeting (Statistical Society of Canada) in Toronto, at the end of May. His talk is on quantile and inequality index estimation for heavy-tailed distributions. As mentioned in my previous post, we will upload the slides (and the paper) in the near future.

So, Emmanuel will go there, and enjoy the beach (and the conference: the program is truly amazing).

(nonparametric) copula density estimation

Today, we will go further with the inference of copula functions. Some code (and references) can be found in a previous post, on nonparametric estimators of copula densities (among other related things). Consider (as before) the loss-ALAE dataset (since we have been working a lot with that dataset),

> library(MASS)  # for kde2d()
> library(evd)   # for the lossalae dataset
> X=lossalae
> # pseudo-observations: normalized ranks, on the unit square
> U=cbind(rank(X[,1])/(nrow(X)+1),rank(X[,2])/(nrow(X)+1))

The standard tool for plotting a nonparametric density estimate is the multivariate kernel estimator. We can look at the density using

> mat1=kde2d(U[,1],U[,2],n=35)
> persp(mat1$x,mat1$y,mat1$z,col="green",
+ shade=TRUE,theta=30,
+ xlab="",ylab="",zlab="",zlim=c(0,7))

or at level curves (isodensity curves), with a more detailed estimate (on a grid with a finer step),

> mat1=kde2d(U[,1],U[,2],n=101)
> image(mat1$x,mat1$y,mat1$z,col=
+ rev(heat.colors(100)),xlab="",ylab="")
> contour(mat1$x,mat1$y,mat1$z,add=
+ TRUE,levels = pretty(c(0,4), 11))

http://freakonometrics.blog.free.fr/public/perso6/3dcop-est1.gif

Kernels are nice, but we clearly observe some border bias, which is extremely strong in the corners (there, the estimator is one fourth of what it should be, see another post for more details). Instead of working with the sample $(U_i,V_i)$ on the unit square, consider a transformed sample $(Q(U_i),Q(V_i))$, where $Q:(0,1)\rightarrow\mathbb{R}$ is a given function, e.g. the quantile function of an unbounded distribution, for instance that of the $\mathcal{N}(0,1)$ distribution. Then we can estimate the density of the transformed sample and, using the inversion technique, derive an estimator of the density of the initial sample. Since the inverse of a (general) function is not that simple to compute, the code might be a bit slow. But it does work,

> gaussian.kernel.copula.surface <- function (u,v,n) {
+   s=seq(1/(n+1), length=n, by=1/(n+1))
+   mat=matrix(NA,nrow=n, ncol=n)
+   # kernel density estimate of the Gaussian-transformed sample, on a fine grid
+   sur=kde2d(qnorm(u),qnorm(v),n=1000,
+   lims=c(-4, 4, -4, 4))
+   su<-sur$z
+   for (i in 1:n) {
+     for (j in 1:n) {
+       # index of the grid point closest to (qnorm(s[i]),qnorm(s[j]))
+       Xi<-round((qnorm(s[i])+4)*999/8)+1
+       Yj<-round((qnorm(s[j])+4)*999/8)+1
+       # back-transformation: divide by the Gaussian marginal densities
+       mat[i,j]<-su[Xi,Yj]/(dnorm(qnorm(s[i]))*
+       dnorm(qnorm(s[j])))
+     }
+   }
+   return(list(x=s,y=s,z=data.matrix(mat)))
+ }
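
For instance, with the loss-ALAE pseudo-observations, the estimated copula density is obtained with

> mat2=gaussian.kernel.copula.surface(U[,1],U[,2],n=35)
> persp(mat2$x,mat2$y,mat2$z,col="green",shade=TRUE,
+ theta=30,xlab="",ylab="",zlab="")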

Here, we get

http://freakonometrics.blog.free.fr/public/perso6/3dcop-est2.gif

Note that it is possible to consider another transformation, e.g. the quantile function of a Student-t distribution.

> student.kernel.copula.surface = function (u,v,n,d=4) {
+   s <- seq(1/(n+1), length=n, by=1/(n+1))
+   mat <- matrix(NA,nrow=n, ncol=n)
+   # kernel density estimate of the Student-transformed sample
+   sur<-kde2d(qt(u,df=d),qt(v,df=d),n=5000,
+   lims=c(-8, 8, -8, 8))
+   su<-sur$z
+   for (i in 1:n) {
+     for (j in 1:n) {
+       # index of the grid point closest to (qt(s[i]),qt(s[j]))
+       Xi<-round((qt(s[i],df=d)+8)*4999/16)+1
+       Yj<-round((qt(s[j],df=d)+8)*4999/16)+1
+       # back-transformation: divide by the Student marginal densities
+       mat[i,j]<-su[Xi,Yj]/(dt(qt(s[i],df=d),df=d)*
+       dt(qt(s[j],df=d),df=d))
+     }
+   }
+   return(list(x=s,y=s,z=data.matrix(mat)))
+ }
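
The call is then similar (be patient: the underlying grid is fine, so this one takes a while),

> mat3=student.kernel.copula.surface(U[,1],U[,2],n=35)
> persp(mat3$x,mat3$y,mat3$z,col="green",shade=TRUE,
+ theta=30,xlab="",ylab="",zlab="")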

Another strategy is to consider kernels whose support is precisely the unit square. The idea is to use products of Beta kernels, where the parameters depend on the location,

> beta.kernel.copula.surface = function (u,v,bx=.025,by=.025,n) {
+   s <- seq(1/(n+1), length=n, by=1/(n+1))
+   mat <- matrix(0,nrow=n, ncol=n)
+   for (i in 1:n) {
+     a <- s[i]
+     for (j in 1:n) {
+       b <- s[j]
+       # product of Beta kernels, with parameters driven by the observations
+       mat[i,j] <- sum(dbeta(a,u/bx,(1-u)/bx) *
+       dbeta(b,v/by,(1-v)/by)) / length(u)
+     }
+   }
+   return(list(x=s,y=s,z=data.matrix(mat)))
+ }
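
Again, on the pseudo-observations, the surface is obtained with

> mat4=beta.kernel.copula.surface(U[,1],U[,2],n=35)
> persp(mat4$x,mat4$y,mat4$z,col="green",shade=TRUE,
+ theta=30,xlab="",ylab="",zlab="")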

http://freakonometrics.blog.free.fr/public/perso6/3dcop-est3.gif

On those two graphs, we can clearly observe the strong tail dependence in the upper right corner, something that cannot be perceived with a standard kernel estimator…

Beta kernel and transformed kernel

This Thursday I will give a talk at Laval University, on “Beta kernel and transformed kernel: applications to copula density estimation and quantile estimation”. This time, I will talk at the Department of Mathematics and Statistics (13:30, at the pavillon Adrien-Pouliot). “Because copulas have bounded support (the unit square in dimension 2), standard kernel-based density estimators are (multiplicatively) biased on the borders and in the corners of the support. Two techniques can be used to avoid that underestimation: Beta kernels and the transformed kernel. We will describe and discuss those two techniques in the first part of the talk. Then we will see that it is possible to combine them to get nice estimators of several quantities (e.g. quantiles): transform the data to the unit interval (the transformed kernel step), estimate the (transformed) quantile on [0,1] using a Beta kernel, then map the result back to the initial support. As we will see in simulations, this technique can outperform standard quantile estimators, especially when the data are heavy tailed.” Slides can be downloaded here, and a rough sketch of the combined idea is given below.
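
To fix ideas, here is a minimal univariate sketch of that combination, under simplifying assumptions: the transformation is a lognormal cdf fitted on the log-moments, and the Beta-kernel step uses Harrell-Davis-type weights on the transformed order statistics. The function name and these choices are mine, for illustration only, and do not match the exact estimator of the paper.

# sketch of the combined estimator: transform, Beta-kernel quantile, back-transform
# (illustrative choices: lognormal transformation, Harrell-Davis weights)
transformed.beta.quantile = function(x, p){
  m = mean(log(x)); s = sd(log(x))     # fitted transformation G (lognormal cdf)
  u = sort(plnorm(x, m, s))            # transformed (sorted) sample, on (0,1)
  n = length(u)
  # Beta-kernel weights, concentrated around probability level p
  w = pbeta((1:n)/n, (n+1)*p, (n+1)*(1-p)) -
    pbeta((0:(n-1))/n, (n+1)*p, (n+1)*(1-p))
  qlnorm(sum(w*u), m, s)               # back on the initial support
}
# e.g. transformed.beta.quantile(rlnorm(200), p=.99)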

  • kernel based density estimation

Kernel-based estimation is a popular (and natural) technique to estimate densities. It is simply an extension of the moving histogram:

we count how many observations fall in the neighborhood of the point where we want to estimate the density. Then it is natural to consider a smoothing function, i.e. instead of a step function (either an observation is close enough, or it is not), we can give the observations weights that decrease with the distance, as in the naive estimator sketched below.
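
A minimal version of that moving histogram, in R, could for instance be written as

# naive moving-histogram estimator: proportion of observations within
# distance h of x, rescaled by the width 2h of the window
f.naive = function(x, X, h=.5) mean(abs(X-x)<=h)/(2*h)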

With a smooth kernel, we get a smooth estimate of the density

http://freakonometrics.blog.free.fr/public/perso3/kernel-f-01.gif

Then it is possible to play with the bandwidth, either to get a more accurate, but less smooth, estimate of the density (small bias but large variance),

or a smoother one (large bias, but small variance),

In R, it is simply

> X=rnorm(100)
> (D=density(X))
 
Call:
	density.default(x = X)
 
Data: X (100 obs.);	Bandwidth 'bw' = 0.3548
 
       x                   y            
 Min.   :-3.910799   Min.   :0.0001265  
 1st Qu.:-1.959098   1st Qu.:0.0108900  
 Median :-0.007397   Median :0.0513358  
 Mean   :-0.007397   Mean   :0.1279645  
 3rd Qu.: 1.944303   3rd Qu.:0.2641952  
 Max.   : 3.896004   Max.   :0.3828215  
 
> plot(D$x,D$y)
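
To visualize the impact of the bandwidth discussed above, we can simply compare a small and a large value,

> plot(density(X,bw=.1),col="red",main="")
> lines(density(X,bw=1),col="blue")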
  • Beta kernel

The idea of Beta kernels is to consider kernels having support [0,1]. In the univariate case, the estimator is

$$\hat{f}(x)=\frac{1}{n}\sum_{i=1}^{n}k_{\beta}\!\left(x;\frac{X_i}{b},\frac{1-X_i}{b}\right)$$

where $k_{\beta}(\,\cdot\,;\alpha,\beta)$ is the density of a Beta distribution, i.e.

$$k_{\beta}(x;\alpha,\beta)=\frac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha,\beta)},\qquad x\in[0,1].$$
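
In R, a direct transcription of that estimator is simply

# univariate Beta-kernel density estimate, at point x
f.beta = function(x, X, b=.05) mean(dbeta(x, X/b, (1-X)/b))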

For additional material, I have uploaded some R code to fit copula densities using beta kernels,

library(copula)
beta.kernel.copula.surface = function (u,v,bx,by,p) {
  s = seq(1/p, len=(p-1), by=1/p)
  mat = matrix(0,nrow=p-1, ncol=p-1)
  for (i in 1:(p-1)) {
    a = s[i]
    for (j in 1:(p-1)) {
      b = s[j]
      # product of Beta kernels, with parameters driven by the observations
      mat[i,j] = sum(dbeta(a,u/bx,(1-u)/bx) *
        dbeta(b,v/by,(1-v)/by)) / length(u)
    }
  }
  return(data.matrix(mat))
}

Then we can use it to see what we get on a simulated sample,

library(copula)
COPULA = frankCopula(param=5, dim=2)
X = rCopula(n=1000, copula=COPULA)   # rcopula() in older versions of the package
p0 = 26
Z = beta.kernel.copula.surface(X[,1],X[,2],bx=.01,by=.01,p=p0)
u = seq(1/p0, len=(p0-1), by=1/p0)
persp(u,u,Z,theta=30,col="green",shade=TRUE,
  box=FALSE,zlim=c(0,6))

http://freakonometrics.free.fr/copula-kernel-beta.gif
(yes, the surface is changing… to illustrate the impact of the bandwidth on the estimation).

  • transformed kernel estimation

In the talk, I will also mention the transformed kernel estimate, as introduced in the book on L1 density estimation by Luc Devroye and László Györfi (the book can be downloaded here). I will probably spend a few minutes on the original chapter, in order to provide another application of that technique (not only to estimate copula densities, but also to estimate quantiles of heavy-tailed distributions). In the univariate case, the R code is the following (here I consider two transformations: the quantile function of the Gaussian distribution, and the quantile function of the Student distribution with 3 degrees of freedom),

set.seed(1)
sample = rbeta(100,4,3)   # simulated sample, with support [0,1]
 
# transformed kernel estimate, using the Gaussian quantile transformation
transfN = function(x){
  Y = qnorm(sample)               # transformed sample, on the real line
  f = density(Y,from=-4,to=4,n=2001)
  ny = sum(f$x<=qnorm(x))         # index of the grid point closest to qnorm(x)
  g = f$y[ny]/dnorm(qnorm(x))     # back-transformation
  return(g)
}
 
df0 = 3
 
# the same, using the Student quantile transformation
transfT = function(x){
  Y = qt(sample,df=df0)
  f = density(Y,from=-4,to=4,n=2001)
  ny = sum(f$x<=qt(x,df=df0))
  g = f$y[ny]/dt(qt(x,df=df0),df=df0)
  return(g)
}
 
tN = Vectorize(transfN)
tT = Vectorize(transfT)
 
u = seq(.01,.99,by=.01)
vN = tN(u)
vT = tT(u)
plot(u,vN,type="l",lwd=3,col="blue")
lines(u,vT,lwd=3,col="green")
lines(u,dbeta(u,4,3),col="red",lty=2)   # true density, for comparison

The density estimation is the following,

(the red dotted line is the true density, since we work on a simulated sample). Now, let us get back to the original chapter,

In the book, this is introduced as follows,

The original idea we had was to use this kernel-based estimator for copulas: since we can estimate densities in high dimension, with unbounded support, using

$$\hat{f}(\boldsymbol{x})=\frac{1}{nh^d}\sum_{i=1}^{n}K\!\left(\frac{\boldsymbol{x}-\boldsymbol{X}_i}{h}\right),$$

the idea is to transform the marginal observations,

$$(Q(U_i),Q(V_i)),\qquad i=1,\dots,n,$$

and to use the fact that the associated copula density can be written

$$c(u,v)=\frac{g\left(Q(u),Q(v)\right)}{q\left(Q(u)\right)\,q\left(Q(v)\right)},$$

where $g$ is the density of the transformed pair and $q$ is the density associated with the quantile function $Q$, to derive an intuitive estimator of the copula density,

$$\hat{c}(u,v)=\frac{\hat{g}\left(Q(u),Q(v)\right)}{q\left(Q(u)\right)\,q\left(Q(v)\right)}.$$

An important issue is how to choose the transformation.

And Luc Devroye and László Györfi mention that this can be used to deal with extremes.

Well, extremes are introduced through bumps (which is not the way I would have dealt with extremes),

and note that several results can be derived on those bumps,

e.g.

Then, there is an interesting discussion about estimating the optimal transformation

and I will show that this can be an extremely interesting idea, for instance to estimate quantiles of heavy-tailed distributions, if we also use the Beta kernel estimator on the unit interval. This idea was developed in a paper with Abder Oulidi, available online here.

Remark: actually, in the book, an additional reference is mentioned,

but I have never been able to find a copy… if anyone has one, I’d be glad to read it…