We have seen extreme value copulas in the section where we considered general families of copulas. In the bivariate case, an extreme value copula can be written
$$C(u,v)=\exp\!\left(\log(uv)\,A\!\left(\frac{\log(u)}{\log(uv)}\right)\right),\qquad (u,v)\in(0,1)^2,$$
where $A(\cdot)$ is Pickands' dependence function, which is a convex function satisfying
$$\max\{t,1-t\}\leq A(t)\leq 1\quad\text{for all }t\in[0,1].$$
Observe that in this case, Kendall's tau,
$$\tau(X,Y)=4\int_{[0,1]^2}C(u,v)\,dC(u,v)-1,$$
can be written in terms of the dependence function as
$$\tau=\int_0^1\frac{t(1-t)}{A(t)}\,dA'(t).$$
For instance, if
$$A(t)=\left[t^{\theta}+(1-t)^{\theta}\right]^{1/\theta},\qquad\theta\geq 1,$$
then we obtain the Gumbel copula,
$$C(u,v)=\exp\!\left(-\left[(-\log u)^{\theta}+(-\log v)^{\theta}\right]^{1/\theta}\right).$$
This is what we've seen in the section where we introduced this family.
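As a quick sanity check (a sketch added here, not part of the original post, with θ = 2), one can verify numerically that plugging this dependence function into the representation above gives back the closed form of the Gumbel copula, and that the integral formula for Kendall's tau returns the well-known Gumbel value τ = 1 − 1/θ,

> theta=2
> A.gumbel=function(t) (t^theta+(1-t)^theta)^(1/theta)
> u0=.3; v0=.7
> exp(log(u0*v0)*A.gumbel(log(u0)/log(u0*v0)))        # extreme value representation
> exp(-((-log(u0))^theta+(-log(v0))^theta)^(1/theta)) # closed form: same value
> d2A=function(t,h=1e-4) (A.gumbel(t+h)-2*A.gumbel(t)+A.gumbel(t-h))/h^2
> integrate(function(t) t*(1-t)/A.gumbel(t)*d2A(t),
+ lower=0,upper=1)$value                              # should be close to 1-1/theta = 0.5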
Now, let us talk about (nonparametric) inference, and more precisely the estimation of the dependence function. The starting point of the most standard estimator is to observe that if $(U,V)$ has (extreme value) copula $C$, then
$$Z=\frac{\log(U)}{\log(UV)}$$
has distribution function
$$H(z)=\mathbb{P}(Z\leq z)=z+z(1-z)\,\frac{A'(z)}{A(z)},\qquad z\in(0,1).$$
And conversely, Pickands' dependence function can be written
$$A(t)=\exp\!\left(\int_0^t\frac{H(z)-z}{z(1-z)}\,dz\right),\qquad t\in[0,1].$$
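To see this inversion formula at work (again a small sketch of mine, for the Gumbel case with θ = 2, where A, and hence H, are known explicitly), integrating (H(z) − z)/(z(1 − z)) and exponentiating does recover the dependence function,

> theta=2
> A0=function(t) (t^theta+(1-t)^theta)^(1/theta)
> dA0=function(t,h=1e-6) (A0(t+h)-A0(t-h))/(2*h)  # numerical derivative of A
> H0=function(z) z+z*(1-z)*dA0(z)/A0(z)           # theoretical distribution of Z
> recoverA=function(t) exp(integrate(function(z)
+ (H0(z)-z)/(z*(1-z)),lower=0,upper=t)$value)
> recoverA(.3)                                    # should match A0(.3)
> A0(.3)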
Thus, a natural estimator for Pickands' dependence function is
$$\widehat{A}_n(t)=\exp\!\left(\int_0^t\frac{\widehat{H}_n(z)-z}{z(1-z)}\,dz\right),$$
where $\widehat{H}_n$ is the empirical cumulative distribution function of
$$Z_i=\frac{\log(U_i)}{\log(U_iV_i)},\qquad i=1,\dots,n.$$
This is the estimator proposed in Capéraà, Fougères & Genest (1997). Here, we can compute everything using the ranks of the observations (pseudo-observations),

> library(evd)
> X=lossalae                      # loss-ALAE dataset from the evd package
> U=cbind(rank(X[,1])/(nrow(X)+1),rank(X[,2])/
+ (nrow(X)+1))                    # pseudo-observations (rescaled ranks)
> Z=log(U[,1])/log(U[,1]*U[,2])   # Z_i = log(U_i)/log(U_i*V_i)
> h=function(t) mean(Z<=t)        # empirical cdf of Z
> H=Vectorize(h)
> a=function(t){                  # estimator of A(t)
+ f=function(t) (H(t)-t)/(t*(1-t))
+ return(exp(integrate(f,lower=0,upper=t,
+ subdivisions=10000)$value))
+ }
> A=Vectorize(a)
> u=seq(.01,.99,by=.01)
> plot(c(0,u,1),c(1,A(u),1),type="l",col="red",
+ ylim=c(.5,1))                   # estimated Pickands dependence function
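Since any Pickands dependence function must satisfy max(t, 1 − t) ≤ A(t) ≤ 1, it is worth adding these bounds to the plot (a small addition, not in the original code) to check that the estimated curve stays within the admissible region; note that this raw estimator is not constrained to be convex,

> lines(c(0,.5,1),c(1,.5,1),lty=2)   # lower bound max(t,1-t)
> abline(h=1,lty=2)                  # upper bound A(t) <= 1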
So we compute the empirical distribution function of the $Z_i$'s, and even integrate it to get an estimator of Pickands' dependence function (the red curve on the plot). Note that an interesting point is that the upper tail dependence index, $\lambda_U=2[1-A(1/2)]$, can be read off the graph above, from the value of the estimated curve at $t=1/2$,
> A(.5)/2
[1] 0.4055346
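Hence $\widehat{A}(1/2)=2\times 0.4055\approx 0.81$, so the estimated upper tail dependence index is $\widehat{\lambda}_U=2[1-\widehat{A}(1/2)]\approx 0.38$. As a small addition (not in the original code), it can be computed directly and marked on the plot as the vertical gap between the estimated curve and the upper bound at $t=1/2$,

> 2*(1-A(.5))                         # estimated upper tail dependence index, about 0.38
> segments(.5,A(.5),.5,1,col="blue")  # lambda_U is twice this vertical gap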