In 1940, Wassily Hoeffding published Masstabinvariante Korrelationstheorie, which was an impressive paper. For those (like me) who unfortunately barely speak German, an English translation can be found in The Collected Works of Wassily Hoeffding, published a few years ago. As I keep saying in my courses on copulas, almost everything was already in that paper. For instance, we can see the following graph of a cumulative distribution function,
What is the difference with a copula? A copula (in dimension 2) is the cumulative distribution function of a random pair with uniform margins on $[0,1]$, as defined by Abe Sklar.
But Wassily Hoeffding considered a random pair with uniform margins on $[-1/2,+1/2]$. Everything else is the same. He could even derive the level curves of the density of the Gaussian copula, which we can reproduce with a few lines of R:
> library(mnormt)
> r=.6
> dc=function(u,v) return(
+ as.numeric(dmnorm(cbind(qnorm(u),qnorm(v)),varcov=
+ matrix(c(1,r,r,1),2,2))/dnorm(qnorm(u))/dnorm(qnorm(v))))
> n=500
> vectu=seq(1/n,1-1/n,length=n-1)
> matdc=outer(vectu,vectu,dc)
> contour(vectu,vectu,matdc,levels=
+ c(.325,.944,1.212,1.250,1.290,1.656,3.85),lwd=2)
But another interesting point is the so-called Hoeffding equality,
$$\text{cov}(X,Y)=\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}\big[F_{(X,Y)}(x,y)-F_X(x)\,F_Y(y)\big]\,dx\,dy$$
which is quite important, actually, to understand that the covariance (or the correlation) can be seen as some 'distance' to independence. More precisely, observe that
$$\text{cov}(X,Y)=\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}\big[F_{(X,Y)}(x,y)-F_{\perp}(x,y)\big]\,dx\,dy$$
where $F_{\perp}(x,y)=F_X(x)\,F_Y(y)$ would be the joint cumulative distribution function of some independent variables, with the same marginal distributions.
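Just to visualize that identity, here is a quick numerical check (a sketch of mine, not in the original paper): for a bivariate Gaussian pair with unit variances and correlation $r=0.6$, the covariance is $0.6$, and the double integral can be approximated by a Riemann sum, using pmnorm (from the mnormt package, used above) for the joint cumulative distribution function,
> library(mnormt)
> r=.6
> S=matrix(c(1,r,r,1),2,2)
> x=seq(-4,4,by=.05)
> grid=as.matrix(expand.grid(x=x,y=x))
> FXY=pmnorm(grid,mean=c(0,0),varcov=S)
> FXFY=pnorm(grid[,1])*pnorm(grid[,2])
> sum(FXY-FXFY)*.05^2
which should return a value close to $0.6$, the covariance of the pair (the grid is truncated at $\pm 4$, where the integrand is negligible).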
Of course, it is not exactly a distance, since it can be negative. But still. Now, the thing is that the proof is not trivial. But it relies on interesting identities. For instance, in 1885, Franklin wrote a nice paper, Proof of a Theorem of Tchebycheff's on Definite Integrals, in the American Journal of Mathematics. To get some heuristics about the identity, consider some (finite) sequences $(a_i)$ and $(b_i)$; then one can prove that
$$n\sum_{i=1}^n a_i b_i-\left(\sum_{i=1}^n a_i\right)\left(\sum_{i=1}^n b_i\right)=\frac{1}{2}\sum_{i=1}^n\sum_{j=1}^n\big(a_i-a_j\big)\big(b_i-b_j\big)$$
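A quick numerical check of that identity (again, a sketch, not in the original post), with two arbitrary Gaussian sequences,
> set.seed(1)
> a=rnorm(10); b=rnorm(10); n=length(a)
> n*sum(a*b)-sum(a)*sum(b)
> sum(outer(a,a,"-")*outer(b,b,"-"))/2
where the two last lines should return exactly the same value.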
And there is a continuous version of that identity. Consider two bounded functions $f$ and $g$ on some interval $[a,b]$; then
$$(b-a)\int_a^b f(x)\,g(x)\,dx-\left(\int_a^b f(x)\,dx\right)\left(\int_a^b g(x)\,dx\right)$$
is equal to
$$\frac{1}{2}\int_a^b\int_a^b\big(f(x)-f(y)\big)\big(g(x)-g(y)\big)\,dx\,dy$$
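Here also, a quick numerical check (a sketch), with two arbitrary functions, say $f(x)=x^2$ and $g(x)=e^x$, on $[0,1]$ (so that $b-a=1$); the left part is obtained with integrate, and the right part with a simple midpoint Riemann sum,
> f=function(x) x^2
> g=function(x) exp(x)
> integrate(function(x) f(x)*g(x),0,1)$value-
+ integrate(f,0,1)$value*integrate(g,0,1)$value
> x=seq(1/2000,1-1/2000,by=1/1000)
> sum(outer(f(x),f(x),"-")*outer(g(x),g(x),"-"))/2*(1/1000)^2
and both should return (almost) the same value, about 0.1455.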
In 1979, in Monotone Regression and Covariance Structure, Gerald Shea gave a more probabilistic interpretation of that result, using the fact that
$$\text{cov}(X,Y)=\mathbb{E}[XY]-\mathbb{E}[X]\,\mathbb{E}[Y]$$
and using a different measure. More precisely, assume now that the functions $f$ and $g$ are integrable with respect to some measure $\mu$, on some set $S$. Then
$$\mu(S)\int_S f(x)\,g(x)\,d\mu(x)-\left(\int_S f(x)\,d\mu(x)\right)\left(\int_S g(x)\,d\mu(x)\right)$$
is equal to
$$\frac{1}{2}\int_S\int_S\big(f(x)-f(y)\big)\big(g(x)-g(y)\big)\,d\mu(x)\,d\mu(y)$$
In the case where $\mu$ is a probability measure on $S\subset\mathbb{R}^2$, i.e. $\mu(S)=1$, this equality is the one used by Wassily Hoeffding, in 1940. The interpretation in terms of random variables is simply that
$$\mathbb{E}\big[f(X_1,Y_1)\,g(X_1,Y_1)\big]-\mathbb{E}\big[f(X_1,Y_1)\big]\,\mathbb{E}\big[g(X_1,Y_1)\big]=\frac{1}{2}\,\mathbb{E}\Big[\big(f(X_1,Y_1)-f(X_2,Y_2)\big)\big(g(X_1,Y_1)-g(X_2,Y_2)\big)\Big]$$
(with standard assumptions on the existence of those quantities) where $(X_1,Y_1)$ and $(X_2,Y_2)$ are two independent vectors, with identical distribution $\mu$. Actually, this relationship can also be found in Some Concepts of Dependence, by E. L. Lehmann, published in 1966. Oh, and by the way, the connection with Chebyshev's inequality (claimed in the title of Franklin's seminal paper) comes from the fact that if $f$ and $g$ are monotonic (in the same direction) functions on $[a,b]$, then the integrand on the right-hand side of the continuous identity above is nonnegative, so the left part of the identity is positive, and thus
$$\frac{1}{b-a}\int_a^b f(x)\,g(x)\,dx\ \geq\ \left(\frac{1}{b-a}\int_a^b f(x)\,dx\right)\left(\frac{1}{b-a}\int_a^b g(x)\,dx\right)$$
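To see that probabilistic version (and the positivity) in action, here is a small Monte Carlo sketch (mine, not from the original papers), with two increasing functions of a single Gaussian variable, say $f(x)=x^3$ and $g(x)=e^x$, chosen arbitrarily,
> set.seed(1)
> f=function(x) x^3
> g=function(x) exp(x)
> X1=rnorm(1e6); X2=rnorm(1e6)
> 2*(mean(f(X1)*g(X1))-mean(f(X1))*mean(g(X1)))
> mean((f(X1)-f(X2))*(g(X1)-g(X2)))
where the two last lines should return approximately the same value, and a positive one here, since both functions are increasing.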
But let's get back to Hoeffding's result. How do we get it from that lemma? The idea is to write
$$2\,\text{cov}(X,Y)=\mathbb{E}\big[(X_1-X_2)(Y_1-Y_2)\big]$$
as
$$2\,\text{cov}(X,Y)=\mathbb{E}\left[\left(\int_{-\infty}^{+\infty}\big[\mathbf{1}(X_2\leq x)-\mathbf{1}(X_1\leq x)\big]\,dx\right)\left(\int_{-\infty}^{+\infty}\big[\mathbf{1}(Y_2\leq y)-\mathbf{1}(Y_1\leq y)\big]\,dy\right)\right]$$
(using the fact that $X_1-X_2=\int_{-\infty}^{+\infty}\big[\mathbf{1}(X_2\leq x)-\mathbf{1}(X_1\leq x)\big]\,dx$, and similarly for $Y_1-Y_2$), i.e.
$$2\,\text{cov}(X,Y)=\mathbb{E}\left[\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}\big[\mathbf{1}(X_2\leq x)-\mathbf{1}(X_1\leq x)\big]\big[\mathbf{1}(Y_2\leq y)-\mathbf{1}(Y_1\leq y)\big]\,dx\,dy\right]$$
We can then interchange the integral and the expectation, use the fact that
$$\mathbb{E}\Big[\big[\mathbf{1}(X_2\leq x)-\mathbf{1}(X_1\leq x)\big]\big[\mathbf{1}(Y_2\leq y)-\mathbf{1}(Y_1\leq y)\big]\Big]=2\big[F_{(X,Y)}(x,y)-F_X(x)\,F_Y(y)\big]$$
and then some integral calculus can be used to rewrite that expression as
$$2\,\text{cov}(X,Y)=2\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}\big[F_{(X,Y)}(x,y)-F_X(x)\,F_Y(y)\big]\,dx\,dy$$
So we get here Hoeffding's identity. Actually, as mentioned by Ben Derrett about the equality above, it can be observed (see http://math.stackexchange.com/105713) that
$$2\,\text{cov}(X,Y)=2\big(\mathbb{E}[XY]-\mathbb{E}[X]\,\mathbb{E}[Y]\big)$$
can also be written
$$\mathbb{E}\big[(X_1-X_2)(Y_1-Y_2)\big]$$
where again $(X_1,Y_1)$ and $(X_2,Y_2)$ are two independent vectors, with identical distribution $F_{(X,Y)}$. The latter can be written
$$\mathbb{E}\left[\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}\big[\mathbf{1}(X_2\leq x)-\mathbf{1}(X_1\leq x)\big]\big[\mathbf{1}(Y_2\leq y)-\mathbf{1}(Y_1\leq y)\big]\,dx\,dy\right]$$
which, interchanging the expectation and the integral as before, is again $2\int\!\!\int\big[F_{(X,Y)}(x,y)-F_X(x)\,F_Y(y)\big]\,dx\,dy$.
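And to conclude, a quick Monte Carlo illustration of that last relationship (a sketch, not in the original post), with the Gaussian pair used above for the copula density ($r=0.6$, unit variances, so the covariance is $0.6$),
> library(mnormt)
> set.seed(1)
> r=.6
> S=matrix(c(1,r,r,1),2,2)
> Z1=rmnorm(1e6,mean=c(0,0),varcov=S)
> Z2=rmnorm(1e6,mean=c(0,0),varcov=S)
> mean((Z1[,1]-Z2[,1])*(Z1[,2]-Z2[,2]))
which should return a value close to $2\,\text{cov}(X,Y)=1.2$.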