
The ethics of modelling in a world where normality no longer exists

(this article was originally written in French – part one and two – and published in Risques)

The mechanism for covering natural disasters in France was created to compensate for “direct uninsurable material damage caused by the abnormal intensity of a natural agent” (Article L. 125-1, paragraph 3, of the Insurance Code). Still on the legal level, the Court of Cassation formulated, in November 1986, a principle according to which “no one must cause others an abnormal neighbourhood disturbance”. And in order to be entitled to compensation following pre-trial detention, the plaintiff must prove that the detention caused him “manifestly abnormal and particularly serious harm” (Article 149 of the Code of Criminal Procedure). But what does “abnormality” mean in all these articles? According to the dictionary, abnormality is defined as “contrary to the usual order of things” (one could see in this an empirical, statistical notion), “contrary to the just order of things” (this notion of “just” probably calls for a normative definition), but also “not in conformity with the model”. Defining a norm is already not simple if we are only interested in the descriptive, empirical aspect, as actuaries may do when they build a model (especially in large dimensions where, as we shall see, normality no longer exists); but if we also integrate a dimension of justice and ethics, one wonders whether the task is not simply impossible…

The average man from Quetelet and Galton

In the 19th century, if several astronomers measured the speed of the same celestial object, they (often) obtained several different measurements. In order to know which one to use in their calculations, the idea of the “method of averages” quickly became established – as Stahl [2006] recalls, and especially Sheynin [1973] – this average having greater precision than any other quantity (or “statistic”, as we would say today). From a set of observations \{x_1,\dots,x_n\}, we set

\bar x=\frac{x_1+\cdots+x_n}{n}=\frac{1}{n}\sum_{i=1}^n x_i

Note that this quantity is also the solution of the optimization problem

\bar x=\underset{m}{\text{argmin}}\left\lbrace\sum_{i=1}^n(x_i-m)^2\right\rbrace

which shows the importance of “least squares”. Adolphe Quételet was, it seems, the first to apply this calculation of averages to human measurements, introducing his famous concept of the “average man”. If we define the mean via the minimization of a quadratic error, it has an interpretation in terms of prediction: the average height is the height that a randomly drawn person should be expected to have (up to a random – and unpredictable – variation). In 1846, in a letter, Adolphe Quételet used the image of the gladiator statue to explain what the average man might be:

Suppose a thousand statues were used to copy the gladiator with all the care imaginable. Your Highness certainly does not think that the thousand copies thus made would each reproduce the model exactly, and that by measuring them successively, the thousand measurements I would obtain would be as concordant as if I had taken them all on the statue of the gladiator itself. The first chances of error would be compounded by the inaccuracies of the copyists, so that the probable error might be very large. Despite this, if the copyists have not worked with preconceived ideas, exaggerating or reducing certain proportions according to school prejudices, and if their inaccuracies are only accidental, the thousand measurements, arranged in order of magnitude, will still present a remarkable regularity and will follow one another in the order assigned to them by the law of possibility. I see Your Highness smiling; she will no doubt tell me that such assertions will not compromise me, since we will not be willing to try the experiment. And why not? Perhaps I will surprise her by saying that the experiment is ready-made. Yes indeed, more than a thousand copies of a statue have been measured, which I will not guarantee to be that of the gladiator, but which, in any case, is not far from it: these copies were even alive, so that the measurements were taken with every possible chance of error; I would add, moreover, that the copies could have been distorted by a host of accidental causes. One must therefore expect, here, to find a very appreciable probable error.
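The least-squares characterization of the mean recalled above is easy to check numerically. Here is a minimal sketch in Python (the simulated heights and the crude grid search are purely illustrative assumptions, not part of the original article):

```python
import numpy as np

rng = np.random.default_rng(42)
# simulated "heights", in cm (illustrative values only)
x = rng.normal(loc=179, scale=7, size=1000)

m_bar = x.mean()  # the empirical mean

# brute-force search for the m minimizing the sum of squared deviations
grid = np.linspace(x.min(), x.max(), 2001)
sse = ((x[:, None] - grid[None, :]) ** 2).sum(axis=0)
m_star = grid[sse.argmin()]

print(m_bar, m_star)  # the two values agree, up to the grid resolution
```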

This average man was very popular at the time, especially within the English eugenicist school led by Francis Galton, even if the latter was mainly interested in deviations from this norm (upward deviations and downward deviations). As Bulmer [2004] recalls, “the deviations from that average – upwards towards genius, and downwards towards stupidity – must follow the law that governs deviations from all true averages”. Galton’s work was aimed at understanding these deviations. While Florence Nightingale stated that “the average man is God’s will”, Galton was more interested in the hereditary character of the deviation than in the average itself. But does this average man actually mean anything?

Looking for the “average” person

Rose [2016] presents two examples in his book The End of Average (published in French as La Tyrannie de la norme). The first is drawn from problems encountered by the US military in the 1950s. When designing the cockpits of fighter aircraft, engineers had used the dimensions of more than 4,000 pilots to optimally position the seat relative to the pedals and the joystick, as well as the height of the windscreen, the shape of the seat, the helmet, etc. These measurements made it possible to compute the dimensions of the “median” pilot along about ten dimensions. For example, the average pilot height was 179 cm, which allowed the “average” height to be defined as between 175 and 185 cm. While a majority of the pilots were of medium height, none of the 4,000 pilots was “average” in all dimensions. As Daniels [1952] stated, “designing a cockpit for the average pilot was in fact not designing one for any pilot”.

The second example is linked to two statues, Norma and Normman (historically on display in Cleveland, now in the Harvard Library). The artist Abram Belskie and the obstetrician Robert Latou Dickinson made these statues together in 1943. Their particularity is that no model posed for them: they were meant to represent a woman and a man with the average measurements of the time (based on measurements made on thousands of subjects). Once these statues were made, a contest was held to find the person they could represent. Several thousand people from Ohio sent in their measurements, but none matched those of the statues. Of course, several hundred were the same height. Several hundred had the same chest circumference. But none had all the right measurements, because, as Todd Rose explains, a person is not one-dimensional: we measure people along several dimensions, and trying to summarize them with a single number is far too reductive. This is what he shows with intelligence tests, for example, where the same IQ can correspond to two very different people; it makes no sense to focus on a single indicator when deciding to recruit someone. The concern when working in a multivariate context is that the average loses its meaning. In fact, from a probabilistic point of view, being average can be extraordinary.

The curse of dimensionality

In fact, this problem is well known to statisticians as the “curse of dimensionality”. Let’s take a simple example: suppose that a quantity of interest follows a normal distribution N(\mu,\sigma^2) – weight, height, chest circumference, etc. One could say that being in the norm means lying in the interval [\mu\pm1.5\sigma]. With a normal distribution, this happens in about 87% of cases (precisely, with probability 2\Phi(1.5)-1\approx 0.866). And the remaining 13% or so, outside this range, will be seen as “abnormal”: abnormally small, or abnormally large. This is the top panel of Figure 1. We can now look at two dimensions, weight and height, for example. The norm here would be to lie in the interval [\mu\pm1.5\sigma] in both dimensions. If the quantities are independent, the probability that both are “normal” is about 75%, since 0.866^2\approx 0.75.

In other words, in dimension two, 75% of the observations are globally normal, and 25% are abnormal. In dimension 3, we drop to 65%, that is, more than a third of the observations are abnormal (bottom of Figure 1, the red points being the abnormal ones).


Figure 1: Proportion of “average” individuals in dimensions 1, 2 and 3.

In dimension five, we go below 50%; in other words, being in the norm in all five dimensions is no longer the case for the majority. And in dimension twenty, those who are normal in every dimension are rather atypical, with a proportion on the order of 5%. Thus, in large dimensions, normality is no longer associated with the idea of a majority. This is the problem actuaries face today when using high-dimensional data, in pricing for example: it becomes very difficult to characterize a rate class (by saying what the average insured in that class looks like).
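These proportions are straightforward to reproduce. A minimal sketch in Python (assuming, as in the text, independent normally distributed components; only numpy and scipy are used):

```python
import numpy as np
from scipy.stats import norm

# probability of lying within [mu - 1.5 sigma, mu + 1.5 sigma] in one dimension
p = norm.cdf(1.5) - norm.cdf(-1.5)  # about 0.866

# with d independent components, "normal in every dimension" has probability p**d
for d in (1, 2, 3, 5, 20):
    print(d, round(p**d, 3))  # 0.866, 0.751, 0.65, 0.488, 0.057

# Monte Carlo check in dimension 3: share of points "normal" in all components
X = np.random.default_rng(1).normal(size=(100_000, 3))
print(np.all(np.abs(X) < 1.5, axis=1).mean())  # close to 0.65
```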

Normality, statistics and standards

From an empirical, descriptive point of view, being within the norm means nothing other than being within the average, not straying too far from it. We will then tend to define the norm as the frequency of what happens most often, the attitude most frequently encountered or the preference most regularly expressed. But this normality is not normativity, and “being in the norm”, being exemplary, is a different dimension, which no longer relates to a description of reality but to an identification of what it should tend towards. We thus move from the register of what is to that of what ought to be, from “is” to “ought”, to use the terminology of Hume [1739]. It is indeed difficult to consider the model (or normality) without sliding towards the second meaning of the concept of norm, which has a strictly normative dimension. This vision leads to confusion between norms and laws, even if not all normativity is exhausted by laws. Hume thus notes that, in all moral systems, authors move from statements of fact, that is, statements of the “there is” type, to propositions that include a normative expression, such as “one must” or “one ought”. What Hume disputes is the shift from one type of statement to the other: for him, these are two types of statements that have nothing to do with each other and therefore cannot be logically linked, in particular by passing from an empirical norm to a normative rule. For Hume, an assertion that is not normative cannot give rise to a normative conclusion. Hume’s claim has given rise to numerous comments and interpretations, particularly because, as it stands, it seems to be an obstacle to any attempt to naturalize morality – as McIntyre [1959] or Rescher [1990] detail. In this sense, there is a strong distinction between the norm as regularity (normality) and the rule (normativity).

Statistical laws, from micro to macro

The statistical law is about what “is” because it has been observed (for example, “men are taller than dogs”). Human law (divine, or judicial) is about what “is” because it has been decreed, and therefore “must be” (“men are free and equal” or “man is good”). Finally, the physical law is about what “is” because we can demonstrate it (“the planets are attracted to each other”), often within the framework of hypotheses. The three concepts can be linked: for example, Kepler’s law was historically established from observations (and so historically fell into the first category), before being demonstrated within the Copernican model (thus moving into the third). A concept of equilibrium can also be associated with this law, this “norm”. However, as Hilpinen [1971] points out, probabilistic laws raise many questions; one need only think of dice throws or expected values: what is meant by “it is normal to wait five minutes for the bus”, or, more ethically disturbing, “it is normal for a person remanded in custody to be imprisoned for eighteen months”?

The norm can be seen as a regularity of cases, observed through frequencies (or averages) – of the height of individuals, the duration of sleep, and so on – in other words, the data that constitute the description of individuals. Anthropometric data have thus made it possible to define an average height of individuals in a given population, according to their age; relative to this average height, a deviation of 20% above or below defines gigantism or dwarfism. If we think of road accidents, having a road accident in a given year can be considered abnormal at the individual (micro) level, since the majority of drivers do not have one. Nevertheless, from the insurer’s (macro) point of view, the norm is that 10% of drivers have an accident. It would therefore be abnormal for no one to have an accident.

This is the argument found in Durkheim [1897]. Starting from the singular act that is suicide, considered from the point of view of the individual who commits it, Durkheim tries to see it as a social act, a true regularity within a given society. From then on, according to Durkheim, suicide becomes a normal phenomenon. Statistics then make it possible to quantify the tendency to suicide in a given society, as soon as we no longer observe the irregularity that appears in the singularity of an individual story, but rather a social normality of suicide.
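Returning to the insurance example, this micro/macro inversion is easy to quantify with a binomial model. A back-of-the-envelope sketch in Python (the portfolio of 100 drivers is a hypothetical size; the 10% accident rate comes from the text):

```python
from scipy.stats import binom

n, p = 100, 0.10  # hypothetical portfolio: 100 drivers, 10% accident rate

# micro level: for each driver, having no accident is the norm (probability 0.9)
# macro level: "nobody has an accident" is essentially impossible
print(binom.pmf(0, n, p))  # 0.9**100, about 2.7e-05

# the number of accidents concentrates around 10% of the portfolio
print(binom.interval(0.95, n, p))  # roughly (4, 16) accidents
```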

Norm, convention and ethical aspects

If we take an evolutionary view, what is normal is what is most capable of adapting, of responding to needs, of providing a model for resolving situations (nature making abnormality disappear); normality then tends towards normativity, and it becomes difficult to distinguish the two aspects. In fact, David Hume addresses this point in the well-known example of the rowers, who get into the same boat to cross a river and row in rhythm (an example discussed at length in Mackie [1980]). The two rowers gradually adjust their strokes to one another, and no explicit agreement (which would formulate the norm) is needed for them to comply with it. The law, which consists in imposing a norm, can be useful in case of conflict (if one of the rowers refuses to row, or if the two rowers have very different physical capacities), but very often it is not necessary to state explicitly this norm inherent in their conduct. An external observer will notice a regularity (once the cruising rhythm is reached) that he can model, but this observed normal rhythm is not necessarily imposed by a law. In the case of the rowers, we find the notion of equilibrium mentioned previously. To build a model is to extract the signal from the noise (to use the distinction of Silver [2015]); it is to look for a norm, in the statistical sense. But it goes further if a predictive model is constructed: reality must then conform to the model, as actuaries often hope.

Blackburn P., de Rijke M. & Venema Y., Modal Logic, Cambridge University Press, 2002.

Bulmer M., Francis Galton: Pioneer of Heredity and Biometry. Johns Hopkins University Press, 2004.

Daniels G., “The Average Man”, Air Force Aerospace Medical Research Lab, vol. 53, n° 7, 1952.

Durkheim E., Le suicide, 1897.

Hilpinen R., Deontic Logic: Introductory and Systematic Readings, 1971, Dordrecht, D. Reidel Publishing Company.

Hume D., A Treatise of Human Nature, Book III: Of Morals, 1739.

McIntyre D.C., “Hume on ‘Is’ and ‘Ought’”, The Philosophical Review, vol. 68, n° 4, 1959, pp. 451-468, Duke University Press.

Mackie J.L., Hume’s Moral Theory, Routledge & Kegan Paul Books, 1980.

Rescher N., “How Wide Is the Gap Between Facts and Values?”, Philosophy and Phenomenological Research, vol. 50, 1990, pp. 297-319.

Rose T., The End of Average: How We Succeed in a World That Values Sameness, HarperOne, 2016.

Sheynin O., “Mathematical Treatment of Astronomical Observations (A Historical Essay)”, Archive for History of Exact Sciences, vol. 11, 1973, pp. 97-126.

Silver N., The Signal and the Noise: Why So Many Predictions Fail – But Some Don’t, Penguin Press, 2015.

Stahl S., “The Evolution of the Normal Distribution”, Mathematics Magazine, vol. 79, 2006, pp. 96-113.