A short post to get back to a property I mentioned briefly in the MAT8595 class in January, and again in the MAT8181 class this week (to illustrate the distinction between weak and strong white noises). Recall that (real-valued) random variables $X$ and $Y$ are independent if, for all $x,y\in\mathbb{R}$,
$$\mathbb{P}(X\leq x,\,Y\leq y)=\mathbb{P}(X\leq x)\cdot\mathbb{P}(Y\leq y).$$
Another characterization, for integrable variables, is that
$$\mathbb{E}[f(X)\,g(Y)]=\mathbb{E}[f(X)]\cdot\mathbb{E}[g(Y)]$$
for all bounded measurable functions $f$ and $g$, which can be written, if the variables are square integrable,
$$\text{cov}(f(X),g(Y))=0.$$
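As a small numerical side note (my own Python sketch, not from the course notes): take $X$ standard Gaussian and $Y=X^{2}$. The pair is uncorrelated, but clearly not independent, and a suitable pair of transformations reveals the dependence,

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10**6)
y = x**2  # uncorrelated with x (since E[X^3] = 0), yet fully dependent

print(np.corrcoef(x, y)[0, 1])     # ~ 0: cov(X, Y) = 0
print(np.corrcoef(x**2, y)[0, 1])  # = 1: cov(f(X), g(Y)) != 0 with f(x) = x^2, g(y) = y
```

This is exactly the kind of example behind the distinction between weak (uncorrelated) and strong (independent) white noises.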
The idea to prove this characterization is to observe that if $X$ and $Y$ are independent, then, for all measurable sets $A$ and $B$, $\mathbb{E}[\mathbf{1}_A(X)\,\mathbf{1}_B(Y)]$ can be written
$$\mathbb{E}[\mathbf{1}_A(X)\,\mathbf{1}_B(Y)]=\mathbb{P}(X\in A,\,Y\in B)=\mathbb{P}(X\in A)\cdot\mathbb{P}(Y\in B)=\mathbb{E}[\mathbf{1}_A(X)]\cdot\mathbb{E}[\mathbf{1}_B(Y)].$$
Using a standard argument in integration theory, the equality
$$\mathbb{E}[f(X)\,g(Y)]=\mathbb{E}[f(X)]\cdot\mathbb{E}[g(Y)]$$
is valid for step functions (not only indicators), then for positive measurable functions, and finally for integrable functions. Proving this result is not that difficult; the key approximation step is recalled below.
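For instance, to pass from step functions to positive measurable functions, one can use the standard approximation (spelled out here for completeness; this is textbook material, not from the original post):
$$f_n(x)=\sum_{k=1}^{n2^n}\frac{k-1}{2^n}\,\mathbf{1}\!\left\{\tfrac{k-1}{2^n}\leq f(x)<\tfrac{k}{2^n}\right\}+n\,\mathbf{1}\{f(x)\geq n\},\qquad f_n\nearrow f,$$
so that the equality, valid for each simple $f_n$ by linearity, passes to the limit by monotone convergence.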
Observe that Rényi (1959) – inspired by Gebelein (1947) – followed by Sarmanov (1958) introduced the concept of maximal correlation, which can be related to this result:
$$\rho^{\star}(X,Y)=\max_{f,g}\big\{\text{corr}(f(X),g(Y))\big\},$$
where the maximum is taken over all functions $f$ and $g$ such that the correlation exists. Actually, it is possible to consider only transformations such that $\mathbb{E}[f(X)]=0$ and $\text{Var}(f(X))=1$ (and similarly for $g$); the idea is that we simply center and scale, which does not impact the correlation. Thus, $X$ and $Y$ are independent if and only if
$$\rho^{\star}(X,Y)=0.$$
Algorithms to estimate that coefficient are interesting. The problem can be written, equivalently, as
$$(f^{\star},g^{\star})=\underset{f,g}{\text{argmin}}\;\mathbb{E}\big[(f(X)-g(Y))^{2}\big],$$
since, under the normalization constraints above, $\mathbb{E}[(f(X)-g(Y))^{2}]=2-2\,\text{corr}(f(X),g(Y))$. And if the minimization is considered over $f$, assuming that $g$ is fixed, then the optimal transformation is
$$f(x)=\mathbb{E}[g(Y)\,\vert\,X=x]$$
(up to recentering and rescaling). And similarly for $g$. So, using an iterative algorithm, it is possible to get $f^{\star}$ and $g^{\star}$ (see Breiman and Friedman (1985) for more details); a small sketch of that alternating procedure is given below.
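Here is a minimal Python sketch of that alternating scheme, assuming continuous variables, with the conditional expectations approximated by averages within quantile bins (Breiman and Friedman use data-driven smoothers instead; the function name ace and the binning scheme are my own illustrative choices, not from the paper):

```python
import numpy as np

def ace(x, y, n_iter=50, bins=50):
    """Crude sketch of alternating conditional expectations (ACE).

    Conditional expectations are approximated by averaging within
    quantile bins, assuming continuous x and y (Breiman and Friedman
    (1985) use smoothers instead). Returns f(x), g(y) and corr(f, g),
    an estimate of the maximal correlation.
    """
    def cond_mean(target, by):
        # estimate E[target | by] by averaging within quantile bins of `by`
        edges = np.quantile(by, np.linspace(0, 1, bins + 1))
        idx = np.clip(np.searchsorted(edges, by, side="right") - 1, 0, bins - 1)
        means = np.array([target[idx == k].mean() for k in range(bins)])
        return means[idx]

    f = (x - x.mean()) / x.std()      # start from standardized x
    for _ in range(n_iter):
        g = cond_mean(f, y)           # g(y) <- E[f(X) | Y = y]
        g = (g - g.mean()) / g.std()  # recenter and rescale
        f = cond_mean(g, x)           # f(x) <- E[g(Y) | X = x]
        f = (f - f.mean()) / f.std()
    return f, g, np.corrcoef(f, g)[0, 1]
```

At convergence, corr(f, g) estimates $\rho^{\star}(X,Y)$; on the $Y=X^{2}$ example above, it should be close to $1$ even though the linear correlation is close to $0$.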
Actually, those functions appear in nonlinear canonical analysis. As mentioned in Lancaster (1957), for a Gaussian random vector $(X,Y)$ with correlation $r$,
$$\rho^{\star}(X,Y)=\vert r\vert,$$
and in that case $f^{\star}$ and $g^{\star}$ are affine functions. This can be related to Hermite polynomials and to the expansion of the bivariate Gaussian density; a quick simulation check is sketched below. I still hope that someone will go further for the project in the MAT8181 course.
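As a quick numerical check of Lancaster's claim (again a sketch, reusing the ace function from above): for a simulated Gaussian vector with correlation $r=0.5$, the estimated maximal correlation should be close to $\vert r\vert=0.5$, and the fitted transformation $f^{\star}$ essentially affine,

```python
rng = np.random.default_rng(1)
r = 0.5
xy = rng.multivariate_normal([0, 0], [[1, r], [r, 1]], size=10**5)
x, y = xy[:, 0], xy[:, 1]

f, g, rho = ace(x, y)
print(rho)                           # should be close to |r| = 0.5
print(abs(np.corrcoef(x, f)[0, 1]))  # close to 1: f* is essentially affine
```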