Tag Archives: variances.eu

Biases, discrimination and fairness in insurance

On Variances, a short article presenting the report delivered to the Institut Louis Bachelier at the beginning of the summer, Assurance : Discrimination, biais et équité.

Massive data and the performance achieved by machine-learning algorithms have upended insurance and actuarial science. The questions raised by these new tools in other contexts (whether predictive justice (or “actuarial” justice, as Harcourt (2008) calls it), the debates on fake news, autonomous vehicles or predictive medicine) push actuaries towards doubt and mistrust. Kranzberg (1986) asserted that “technology is neither good nor bad; nor is it neutral”, stressing that, even without bad intentions, learning algorithms can be unfair. And correcting these possible injustices is not simple. For Nielsen (2020), “technology does not necessarily self-regulate, via either market or social pressures” (the invisible hand of markets or of social pressure may not be enough). It is in this context that we return here to the issues of bias, discrimination and fairness in the predictive models used in insurance. These changes, in both data and models, observed over the past decade or so, had already called into question the very existence of insurance (to be continued).

From betting to “prediction market”

This is the second part of a series on sports betting

Sports betting has long fascinated economists and statisticians. Griffith (1949) showed early on that horse-race bettors put too much money on horses that have little chance of winning, and too little on those with the best chances. This tendency to underbet on the most likely outcome has been observed in all sports betting, where the “most likely outcome” is computed from recent statistics. And it can be explained in a fundamental way by the mechanics of pari-mutuel betting: the bettor pits his beliefs against those of the crowd, because the various bets will determine the odds.

Predictions, before polls

Today, in the months leading up to each election, we find ourselves drowned in polls, conducted every day (and commented on several times a day, as if estimation noise deserved exegesis). As Frédéric Dabi (Deputy Director General of Ifop) pointed out in a debate organised by Risques magazine in 2017, “polls are an indication of the electoral balance of power, not a prediction”, but it is nevertheless often with the idea of obtaining a prediction that they are used.

But if we go back in time, Rhode & Strumpf (2008) remind us that other techniques were used before the idea of polling took hold, in particular betting. In 1549, Matteo Dandolo (Ambassador of Veneto) noted that “it is therefore more than clear that the merchants are very well informed of the state of the election, and that the employees of the cardinals in conclave (i conclavisti) participate with them in betting, which therefore leads to several tens of thousands of crowns changing hands”, as Baumgartner (2003) tells us. Closer to home, betting markets during elections were popular in the United States until the Second World War. Rhode & Strumpf (2008) suggest several reasons for the loss of interest in the second half of the 20th century: improvements in sampling techniques… and the legalization of horse betting. But online betting sites have revived interest in betting of all kinds. The sites we mentioned in a previous article are often not limited to sports betting: they also allow bets on the magnitude of an earthquake, on an Oscar winner, or even on the observation of the Higgs boson, as proposed by intrade.com, which was liquidated in 2015. As onlinebettingsites.com shows, one could bet on the French elections in 2017, or on the Brexit referendum (even if, for the latter, the prediction markets were not able to reflect the beliefs of the crowds, as an article in The Economist recalled).

Mathematics of pari-mutuel betting

Pari-mutuel theory is not unlike the mutualisation of risks, an important foundation of the insurance mechanism, dear to actuaries. Working on horse betting markets, Edmund Eisenberg and David Gale obtained, in a short three-page article, Consensus of Subjective Probabilities, relatively general results, provided the betting is static.

Suppose that $I$ players can bet on $J$ horses. Each player has a total amount $b_i$, which we normalize so that $b_i$ denotes the share of the total amount wagered (hence $b_1+\cdots+b_I=1$). Player $i$ can then bet the amount $b_{i,j}$ on horse $j$ (with $b_{i,1}+\cdots+b_{i,J}=b_i$). When the bets are closed, let $p_j$ denote the amount bet on horse $j$, i.e. $b_{1,j}+\cdots+b_{I,j}=p_j$. The budget constraint imposes that the sum of these amounts equals 1, which gives the $p_j$ a probabilistic interpretation. We will return to the use of these “prices” later. We can also define the payoff odds $q_j$ by $q_j=p_j^{-1}-1$, so that $p_j=(1+q_j)^{-1}$. If we assume that a fraction $1-a$ is kept by the bookmaker, then $p_j=a(1+q_j)^{-1}$ and $q_j=(a-p_j)/p_j$. The expected returns on each horse must be equal, at equilibrium, to the expected net return, where the expectation is computed under the probability $p$, so as to reflect the beliefs of all the bettors, namely

$$p_jq_j+(1-p_j)(-1)=a-1$$
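The step is worth spelling out: with the bookmaker’s take, $q_j=(a-p_j)/p_j$, so the expected return is the same whatever the horse $j$:

$$p_jq_j+(1-p_j)(-1)=p_j\,\frac{a-p_j}{p_j}-(1-p_j)=(a-p_j)-1+p_j=a-1$$

Under the consensus probability $p$, no horse offers a better expected return than another: every bet yields the net return $a-1$.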

The key result of the Eisenberg & Gale model is to show that an equilibrium exists in this market. More precisely, the fraction bet on each horse must be equal to the probability that the market assigns to that horse. To reach this equilibrium, it is often assumed that the equilibrium odds are found by an auctioneer (a role played here by the bookmaker). As Blough (2008) noted, the hypothesis that no bet is placed until the odds are balanced is one that indeed holds in horse racing.

If we assume that each bettor is risk-neutral (and seeks to maximize his expected gain) and that his beliefs are described by a probability vector $p_i=(p_{i1},\dots,p_{iJ})$ – in the sense that player $i$ thinks that horse $j$ will win with probability $p_{ij}$ – then at equilibrium, if $b_{i,j}>0$,

$$p_{ij}=p_j\max_s\left\{\frac{p_{is}}{p_s}\right\}$$

where $\arg\max_s\{p_{is}/p_s\}=\arg\max_s\{p_{is}(q_s+1)\}$ is the horse on which bettor $i$ must bet everything, if he bets on a single horse. Blough (2008) elaborates at length on the description of this equilibrium, and extends it to the case where agents potentially have risk aversion (the same for all) and potentially different beliefs. This equilibrium is then described as a consensus of beliefs.
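Numerically, this equilibrium can be approximated by proportional-response dynamics, a standard scheme for the linear Fisher markets mentioned at the end of this section, to which Eisenberg and Gale’s model corresponds: each bettor repeatedly re-spends his budget on the horses in proportion to the expected payoff they brought him, and the total amounts bet converge to the consensus probabilities. A minimal sketch in Python, where the beliefs, budgets and iteration count are made up for the illustration (this is not Eisenberg and Gale’s own procedure):

import numpy as np

# hypothetical beliefs p_{ij} of I = 3 bettors over J = 2 horses (rows sum to 1)
beliefs = np.array([[0.7, 0.3],
                    [0.5, 0.5],
                    [0.2, 0.8]])
budgets = np.array([0.5, 0.3, 0.2])   # budgets b_i, normalized to sum to 1

# start by spreading each bettor's budget uniformly over the horses
spend = np.outer(budgets, np.full(2, 0.5))

for _ in range(500):
    prices = spend.sum(axis=0)                # p_j: total amount bet on horse j
    alloc = spend / prices                    # bettor i's share of horse j
    utility = (beliefs * alloc).sum(axis=1)   # bettor i's expected payoff
    # proportional response: re-spend the budget in proportion to payoff earned
    spend = budgets[:, None] * (beliefs * alloc) / utility[:, None]

print(prices)   # consensus probabilities p_1, ..., p_J (they sum to 1)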

In an article entitled Interpreting the Predictions of Prediction Markets, Charles Manski proposed using this theory to interpret these prices, in conjunction with more traditional approaches in economics, such as Arrow-Debreu prices.

To illustrate this consensus, consider a World Cup final that must end either in a victory for A or a victory for B. Imagine a contract offering $1 if A wins, and let this contract be offered at price $p_A$. If no arbitrage is allowed, the law of one price holds, and we deduce that $p_B=1-p_A$. Now imagine a player who thinks that the probability that A wins is greater than $p_A$, that is, with the previous notations, $p_{iA}>p_A$. That player then has an interest in betting all his money on the victory of A, that is, in buying this contract. The aggregate demand for this asset is then

$$\frac{b_1\,\mathbf{P}[p_{1A}>p_A]+\cdots+b_I\,\mathbf{P}[p_{IA}>p_A]}{p_A}$$

and there is an equilibrium if the aggregate demand for the two assets satisfies

$$\frac{b_1\,\mathbf{P}[p_{1A}>p_A]+\cdots+b_I\,\mathbf{P}[p_{IA}>p_A]}{p_A}=\frac{b_1\,\mathbf{P}[p_{1A}<p_A]+\cdots+b_I\,\mathbf{P}[p_{IA}<p_A]}{p_B}$$

so that

$$p_A=b_1\,\mathbf{P}[p_{1A}>p_A]+\cdots+b_I\,\mathbf{P}[p_{IA}>p_A]$$

which allows the price to be written as an average of the players’ beliefs.
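This last equation says that the equilibrium price $p_A$ is a fixed point: it equals the total budget share of the bettors whose belief exceeds it, i.e. a budget-weighted quantile of the distribution of beliefs. A minimal sketch in Python, solving this fixed point by bisection; the beliefs and budgets are made up for the illustration:

import numpy as np

beliefs = np.array([0.9, 0.6, 0.55, 0.3])     # hypothetical beliefs p_{iA}
budgets = np.array([0.25, 0.25, 0.25, 0.25])  # budgets b_i, summing to 1

def excess_demand(p):
    # budget share of bettors who believe p_{iA} > p, minus the candidate price
    return budgets[beliefs > p].sum() - p

lo, hi = 0.0, 1.0
for _ in range(50):            # bisection on the decreasing excess demand
    mid = (lo + hi) / 2
    if excess_demand(mid) > 0:
        lo = mid
    else:
        hi = mid

print((lo + hi) / 2)           # equilibrium price p_A, here close to 0.55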

It should be noted here that the equilibrium is static, allowing the bookmaker to simply post the odds once and for all. Recently, Agrawal et al. (2014) proposed an algorithm to balance this market in continuous time. It may also be noted that this notion of equilibrium appears in many algorithmic settings, such as the so-called Fisher market.

The predictive power of prices

But this idea of seeing in prices an aggregation of players’ beliefs is not new! In 1655, in Van Rekeningh in Spelen van Geluck (published in Latin under the title De Ratiociniis in Ludo Aleae), Christiaan Huygens proposed to extract information on beliefs from prices. In 1671, Johan de Witt noted that, since the price of a contract paying an annuity until death could be seen as a weighted average of annuities with fixed maturities, by observing the prices of the different insurance contracts one could extract probabilities interpreted as survival probabilities.
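In modern actuarial notation (not de Witt’s own), the idea can be written as follows: if $a_{\overline{k}|}$ denotes the price of an annuity-certain paying 1 per year for $k$ years, and $K$ is the random number of years the annuitant survives, then the price of the life annuity is

$$a=\mathbb{E}\big[a_{\overline{K}|}\big]=\sum_{k}\Pr[K=k]\,a_{\overline{k}|}$$

so that observed prices of life annuities, combined with those of fixed-term annuities, implicitly reveal the survival distribution $\Pr[K=k]$.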

These probabilities are “subjective”, as Bruno de Finetti and Frank Ramsey called them. The latter did not see probabilities from a frequentist angle, but as a measure of a degree of belief, which could be elicited through bets, in Truth and Probability (1926). This is ultimately what the theory presented by Kenneth Arrow in 1953, and further developed by Gérard Debreu in 1959, introducing the “Arrow-Debreu prices”, says.

Many websites use odds to infer players’ beliefs, which are then (somewhat abusively) presented as the probability that a team will win a competition. We can also note the work carried out last summer by doctoral students at the University of Rennes, who compared the odds on online betting sites with the forecasts obtained by several algorithms (ranging from a naive Bayes classifier to boosting, SVM and neural networks). A special issue of The Economist, published in 2007, entitled The Future of Futurology, noted that “the most heeded futurists these days are not individuals, but prediction markets, where the informed guesswork of many is consolidated into hard probability”. This idea has now largely returned to the forefront, as predicted in the article by Chen & Pennock (2010) published in AI Magazine.
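In practice, recovering these implied “beliefs” from quoted odds mirrors the relation $p_j=a(1+q_j)^{-1}$ above: the inverses of the decimal odds are normalized so that they sum to one, which removes the bookmaker’s margin. A minimal sketch in Python (the quoted odds are made up):

# decimal odds quoted by a hypothetical bookmaker for a three-way outcome
odds = {"home": 2.10, "draw": 3.40, "away": 3.60}

raw = {k: 1 / o for k, o in odds.items()}          # inverse odds; the sum exceeds 1
margin = sum(raw.values())                         # bookmaker's margin, here about 1.05
implied = {k: v / margin for k, v in raw.items()}  # normalized implied probabilities

print(margin, implied)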

Agrawal, Shipra, Delage, Erick, Peters, Mark, Wang, Zizhuo & Ye, Yinyu (2014). A Unified Framework for Dynamic Prediction Market Design. Operations Research.

Baron, Ken & Lange, Jeffrey (2006). Parimutuel Applications In Finance: New Markets for New Risks. Springer.

Baumgartner, Frederic (2003) Behind locked doors: a history of papal elections. Palgrave.

Blough, Stephen R. (2008) Differences of opinion at the racetrack. In Efficiency of Racetrack Betting Markets, 323-341, World Scientific.

Chen, Yiling & Pennock, David (2010). Designing Markets for Prediction. AI Magazine.

Decker, Wolfgang & Thuillier, Jean-Paul (2004). Le sport dans l’antiquité. Picard.

Eisenberg, Edmund & Gale, David (1959). Consensus of Subjective Probabilities: The Pari-Mutuel Method. Annals of Mathematical Statistics, 30:1, 165-168.

Griffith, R. M. (1949) Odds adjustments by American horse-race bettors. The American Journal of Psychology, 62, 290-294.

Manski, Charles (2005) Interpreting the Predictions of Prediction Markets. NBER 10359.

Rhode, Paul, W. & Strumpf, Koleman (2008) Historical Political Futures Markets: An International Perspective. NBER 14377.

[1] Baron & Lange (2006) discuss the comparison between so-called “risk-neutral” valuation in finance (based on the law of one price and the absence of arbitrage) and the valuation of mutual bets. They speak of “self-hedging” because, in such a bet, the bettors share the winnings in proportion to their initial stakes. This is reminiscent of the way mutual insurance companies operate, where the money used to compensate victims must correspond to the total premiums collected.


A brief history of sports betting

This article was originally published – in French – on variances.eu

A report by the American Gaming Association (May 2017) estimated that between $100 billion and $400 billion is bet each year, for an estimated gross revenue of between $5 billion and $20 billion, on sports betting alone. We return here to a brief history of sports betting, emphasizing the concept of pari-mutuel betting. We will see, in a second article, the links between this principle, mathematical finance, and insurance.

From games to sports

Sports betting has been around for a long time, even if the origin of the first bet is impossible to date. We can think of the Greeks, inventors of the Olympic Games, where it was not uncommon for spectators to bet among themselves on the winners (Decker & Thuillier, 2004). Closer to home, as Georges Vigarello reminds us, “under the Ancien Régime, gambling was the subject of a real passion. It took the form of either betting games or prize games”.

The first, betting games, were played between people from the same social world, between farmers or between nobles. The second, prize games, took place during parish celebrations and displayed different regional practices, such as wrestling in Brittany or jumping in Provence. We can also think of the confrontations between villages at soule, for example. Among the nobles, prize games were organized for special occasions (a birth or a wedding). These games were recreational and festive moments.

It was not until the end of the 19th century that games became sports, in line with the hygienist theories of the time. We can think of Baron Pierre de Coubertin, who wanted to “use all the means appropriate to develop our physical qualities to make them serve the collective good” through “sport”. We find the Baron again in 1887 with the creation of the Union of French Societies of Athletic Sports, the official appearance of the notion of “sport”, replacing that of “game”, as Dietschy & Clastres (2006) point out, noting in passing that this Union was founded on amateurism, in reaction against the cycling (from 1860) and walking (around 1870) societies, which carried on the traditions of prize games and betting. Around 1890, this union, dedicated to athletics, opened up to other sports (rugby, field hockey, fencing, swimming), which were represented by specialized commissions.

The first bookmakers and gambling

A little earlier, during the Industrial Revolution, horse betting organised by bookmakers developed. Such bets had been popular in the United Kingdom in the 16th and 17th centuries, but remained reserved for the aristocracy and the landed gentry. In reality, only horse owners were allowed to bet on the results of these private races, known as “matches”. One of these races, launched by the twelfth Earl of Derby (Edward Smith-Stanley) around 1780, also left its mark on sporting vocabulary. While these races were originally private, Charles II’s passion for them made them more ambitious, attracting huge crowds betting ever larger sums. Innkeepers and pub owners were then major promoters of these races, which encouraged owners to organize them near their establishments. They thus naturally became the first bookmakers, organizing the first steeple-chases, a form of race (first run in Ireland) in which riders raced from one church tower to another, jumping everything in their path! In 1826, at the stables of St Albans, north of London, the idea of having horses start and finish in the same place was launched, giving rise to modern racecourses.

Betting was not yet regulated, and betting on races operated on a credit system. And since gambling near places where alcohol was available in large quantities could have dramatic consequences, the British government banned gambling in pubs, which led to the opening of betting shops, run by bookmakers, following the adoption of the Gaming Act in 1845. Bookmakers did not only play the role of scribes, keeping track of transactions in registers; they also served as arbitrators in bets. The bookmaker became the intermediary with whom to bet: he received the bets, but did not bet against the player. The arbitrator did not only intervene at the end, in the event of a dispute, but above all to make the bet official. Indeed, cash bets were rare, and bookmakers decided whether the items wagered had the same value and, if not, what the difference was. One of the players then added money to a cap. The players put their hands in the cap and removed them, either to accept the assessment or to indicate their disagreement. This is the origin of “hand in cap”, which refers to the amount of money needed to ensure a fair bet. The word “handicap” was then commonly used in horse betting (to designate participants disadvantaged at the start of a race), before taking on a medical connotation from the 1950s onwards.

Thereafter, bookmakers did not lack imagination, introducing cash bets, then offering fixed odds against each horse in a race. Parliament then reversed course with the Suppression of Betting Houses Act in 1853; only credit betting and gambling on racetracks remained allowed. At the same time, in France, Léon Sari invented the “pari mutuel” in 1857 with Charles de Morny, owner of the Maisons-Laffitte racetrack (which gained a building with stands in June 1878). Joseph Oller, who co-founded the Moulin-Rouge, was the concessionaire. As the French Senate report on gambling reminds us, the law of June 2, 1891 legalized betting on horse races and established the principle of mutualization. As we will see later, this principle means that bettors play against each other and share the winnings (once the legal levies provided for by law have been taken, for the benefit of the State and the racing institution). In mathematical finance, we speak of a “self-hedging strategy”. In March 1931, the PMU (“pari mutuel urbain”) was born, and it was not until 1985 that the “sports lotto” arrived.
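To make the mutualization principle concrete: the stakes form a pool, the legal levies are deducted, and the remainder is shared among those who backed the winner, in proportion to their stakes. A minimal sketch in Python (the names, stakes and takeout rate are made up):

def parimutuel_payouts(bets, winner, takeout=0.15):
    """Share the net pool among backers of the winning horse, pro rata to stakes."""
    pool = sum(stake for wagers in bets.values() for stake in wagers.values())
    net_pool = pool * (1 - takeout)   # what remains once the levies are taken
    winning = {name: wagers.get(winner, 0.0) for name, wagers in bets.items()}
    total_winning_stakes = sum(winning.values())
    return {name: net_pool * stake / total_winning_stakes
            for name, stake in winning.items() if stake > 0}

bets = {"Alice": {"Eclipse": 10.0},
        "Bob": {"Eclipse": 5.0, "Gladiateur": 5.0},
        "Carol": {"Gladiateur": 20.0}}
print(parimutuel_payouts(bets, "Eclipse"))   # Alice gets 22.67, Bob 11.33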

From horses to other sports

The “pool” has long referred in England to a card game played for collective stakes, drawing its etymology from the French “poule” (hen), or rather from the Old French word referring to young poultry (compare the Latin pulla, from pullus, the “young animal”), but also to “booty” or “loot”. Here we find the idea of playing for money. This use can be traced back to 1870 (in the sense of “collective betting”), before “pool” came, during the First World War, to designate a group of people sharing skills. As early as 1920, the term “football pool” appeared, as Forrest (1999) recalls.

In Liverpool, England, John Moores founded Littlewoods in 1923, a retail company, before launching mail-order sales, while offering football bets. The most famous game was the “Treble Chance”, where players could choose to bet on 10, 11 or 12 football matches for the coming weekend. Anecdotally, as noted by Forrest & Pérez (2013), when a match could not take place (for example because of rain), a panel of experts appointed by Littlewoods had to model the match and provide a forecast. After the Second World War, so-called 1X2 formulas arrived in Europe, where the player must predict whether, in a set of 12 to 15 games, the home team will win (1), lose (2) or draw (X). It can be noted that these “football pools” could refer to any form of pari-mutuel betting, strongly resembling a lotto. The main difference is that in a lottery, the draw is supposed to be a purely random process, unlike football matches. And for the players, the difference is significant! In the 1980s, Littlewoods was one of the largest private companies in Europe, before declining with the birth of online betting sites.

Internet and online betting

Now, in addition to the betting shops that still exist in the United Kingdom, the strong point of bookmakers is their online presence. The first sites were created around 1995, for example Intertops, which relied on a law passed in 1994 by the island nation of Antigua and Barbuda (an officially independent Commonwealth member country) granting licences to companies wishing to provide gambling services over the Internet (they subsequently obtained licences from the Mohawk territory of Kahnawake in Quebec, and from Malta). Betting on sports events quickly became very popular.

In 2000, Betfair was launched and revolutionized the industry: Betfair itself did not take customers’ bets, but rather allowed customers to bet against one another. This peer-to-peer betting quickly became very popular. In 2002, the first live betting was launched, offering bettors the opportunity to bet on a sporting event while it was taking place. Today, on the larger sites, all kinds of sports are available, whether team sports (football, basketball) or individual ones (tennis, boxing), possibly with competitions involving more than two players or teams (athletics, cycling). The player chooses an objective, which can be a final score (1X2 in football), a number of goals scored, etc., and then concludes the bet by choosing the amount he is willing to wager (the stake). On the larger sites, no fewer than 20,000 bets are available every day.

Decker, Wolfgang & Thuillier, Jean-Paul (2004). Le sport dans l’antiquité. Picard.

Dietschy, Paul & Clastres, Patrick (2006). Sport, société et culture en France du XIXe siècle à nos jours. Hachette, Carré Histoire.

Forrest, David (1999). The Past and Future of the British Football Pools. Journal of Gambling Studies, 15:2, 161-176.

Forrest, David & Pérez, Levi (2013) The Football Pools, in The Oxford Handbook of the Economics of Gambling, 147-162.

Vigarello, Georges (2004) Le sport est-il encore un jeu ? Sciences Humaines, no 152.

To be continued… with a post on how bets, predictions and players’ beliefs are linked.

Mapping cities

A French version of this article is online at http://variances.eu/

Issue 53 of Insee Analyses Ile-de-France provides an analysis of “a social mosaic specific to Paris”, with the map in Figure 1.

Figure 1: INSEE, Insee Analyses 53, 2017

This map is a priori familiar to many people, in the sense that we quickly recognize the city represented, we quickly find the various elements, and we know how to read the information presented, almost instinctively. In urban history, the way we saw cities, and how we represented them on maps, has often been the basis of urban planning. Changing the representation has made it possible to change the structure of cities. We take up here the two major historical turning points mentioned in Söderström (1996), based on two recent works: the representation of Rome at the beginning of the Renaissance, with the first ichnographic plans, described in Maier (2015), and the “social” or “health” maps of Victorian London’s civil servants, described in Vaughan (2018). The latter, in particular, are the ancestors of zoning maps, which are widely used in urban planning, but also correspond to the majority of maps produced by statisticians and economists (the INSEE map is an example). And some maps from the last century stand comparison with the maps produced today, in the era of big data.

Rome, Leon Battista Alberti and Leonardo Bufalini, and the immutable mobiles

Choay (1980) emphasizes the fundamental role in the history of urban planning of Alberti’s De Re Aedificatoria (presented in manuscript form to Pope Nicholas V in 1452, but published only in 1485). Alberti’s treatise is indeed the first text to consider construction (Alberti prefers the term “construction”, ædificatoria, covering both architecture and urban planning) as an autonomous domain to which the rational method must be applied. The history of representation saw a turning point with the Renaissance, with figurative forms used to represent urban space. Medieval aesthetics were left behind with the rediscovery of perspective, which produced a rationalization of what is seen, even if it often induces a partial vision of the object. In his treatise, Leon Battista Alberti proposes a scientific method governing the art of building the house, but also the entire city. But it is in Descriptio urbis Romae, probably written at the same time, that he deepened the idea of urban planning, taking the particular example of Rome.

In his book, Alberti does not propose any map of Rome, but a list of instructions to follow in order to create one, with coordinate tables for several important elements of the city, natural but also artificial. The list includes the ramparts, the river (the Tiber), the city gates, and more than thirty public buildings, including the Capitol, which for Alberti is the reference point of the urban plan. He proposed to represent the city using a disc divided into 48 portions, and using the distance to the Capitol (in addition to a compass bearing) to place any building. All the calculations are detailed in Ludi Matematici, using triangulation techniques. Around 1450, Alberti thus invented the geometric plan, corresponding to what we would today call the plan of a city, even if the circular shape may surprise at first sight (see Figure 2), and does not correspond to the ichnographic plan that we all use today (obtained by horizontal, geometric projection onto a plane).

Figure 2: reconstruction of Alberti’s map from Descriptio urbis Romae, by Luigi Vagnetti in Lo studio di Roma negli scritti albertiani (1974). Source: Maier (2015).

His plan corresponds to the emergence of a new, very geometric mode of representation. But it was not until Leonardo Bufalini’s plan of 1551 that the first ichnographic plan arrived (it would be unfair to forget the plan of Imola drawn in 1503 by Leonardo da Vinci). While Alberti’s plan indicated the coordinates of a building, Bufalini decided to incorporate the ground plan of the buildings into his city map.

Figure 3: Bufalini’s map of Rome, 1551, British Library, London. Source: Maier (2015).

But if Alberti’s plan had such an impact, it is also because it came at the time when Pope Nicholas V launched a plan to rebuild Rome, covering an entire district, from Castel Sant’Angelo to the Vatican. This was probably the first urban planning project on this scale, proposing to use the urban form as an instrument of social engineering. Alberti’s representation helped this project, with a scientific vision of the map, no longer depending on the artist’s skill, nor inscribing the map in a narrative that would give it meaning. This urban map is self-sufficient, containing the terms of its own meaning. In Latour’s (1989) terminology, such representations, which can be detached from the place (or object) they represent “while remaining immutable so that they can be moved in all directions without further distortion, loss or corruption”, are immutable mobiles. Alberti’s map is one of the first examples of these immutable mobiles. It juxtaposes the natural and the human construction, the profane and the sacred, making measurement and position the only values.

These plans see the urban space as a whole, not offering a single point of view, unlike the more classic maps (for the time) of Jacopo Filippo Foresti, for example (see Figure 4). It is possible to stand at Foresti’s viewpoint to see his map; Alberti’s map exists only as an abstract object.

Figure 4: view of Rome by Jacopo Filippo Foresti, 1490. Source: Maier (2015).

While Leonardo Bufalini’s map revolutionized urban mapping, and while the ichnographic plan is the dominant representation today, these maps long remained marginal, because they were reserved exclusively for administrative or military purposes. Foresti’s kind of map has not completely disappeared: it can be found in tourist maps, for example, which care little about proportions and simply seek to stage monuments or indicate itineraries. We can thus contrast an often local, horizontal vision (on a human scale) with a vision sometimes called zenithal, which conceives objects in abstract terms. It is the latter that makes it possible to represent the city as a set of different neighbourhoods, with different levels of wealth for example, leading to the geometric plans for social statistics of Victorian times, which allowed the city to become the subject of censuses, measurements and comparisons.

Also noteworthy is the 1748 map of Rome created by Giambattista Nolli. Before him, Leonardo Bufalini had taken the point of view of an eagle flying over the city; Nolli established the now common practice of representing entire cities from above without a single focal point, each block being drawn as if the cartographer were directly above it.

Figure 5: Giambattista Nolli’s map of Rome, 1748. Source: Sylvain Mottet.

London, Thomas More and Charles Booth, and the zoning maps

At the end of the 19th century (from 1870 onwards), Germany saw the first “social maps”, born in a context of increasingly dense urban populations, high social tensions and deteriorating health conditions. German planners proposed an innovative vision of the city as a living organism that needed to be made to function more efficiently. In 1876, Reinhard Baumeister in Stadterweiterungen in technischer, baupolizeilicher und wirtschaftlicher Beziehung and especially, in 1890, Josef Stübben in Der Städtebau proposed the first urban planning manuals. Towards the end of the first chapter, Baumeister proposes the use of an urban expansion plan, a master plan to organize the future urban space. For him, it was a question of ensuring the stability and proper functioning of a city designed as a living organism, in order to deal with the problems it faces: overcrowding in certain districts, traffic and hygiene problems, social unrest, etc. To do this, he suggests specializing the city’s sectors in functional and social terms – what would later be called a “zoning plan” (or Bauzonenplan) – and ensuring the sustainability of this specialization. However, he warns against an overly rigid and inflexible master plan: urban development cannot be planned with too much precision, and it is therefore counterproductive to try to freeze it in a totally predetermined framework. His plan aims to provide the general guidelines necessary for the cohesion of the urban organization. In particular, he notes that the more guidelines there are, the more they will have to be the subject of local plans with a limited time horizon.

While the zoning plan was not originally conceived as part of the master plan, it quickly became its key document, its clearest and most effective part. The objective was to grasp, at a glance, the whole city as part of an administrative project. It is not only a question of having an overall vision of the city (which the ichnographic plan already allowed) but also of using colour codes that facilitate the total regulation of this city. In particular, the zoning plan made it possible to predict, several years or even decades in advance, what the morphological and functional characteristics of a given area would be. It thus allowed investors to anticipate the future of an area and to guarantee a certain return on their investments.

This vision proposed by Baumeister made it possible to see better, for example, why the most bourgeois areas were often located in the west of cities. This position simply reflects the fact that these areas were often healthier: the smoke and smog produced by cities are dispersed in the upper layers of the atmosphere, and when the wind comes from the west (as it does most often in most European cities), the smoke and smog are transported eastwards and towards the lower layers of the atmosphere. From this observation, it becomes natural to build factories in the east and houses in the west. Baumeister’s work was not only theoretical: he worked on the development of the city of Frankfurt in 1891, then Berlin, Cologne, Essen, etc. In Frankfurt, he proposed the idea of concentric zones, later taken up by many economists. Figure 6 shows this form of city, in an article published in 1925 by Ernest Burgess (who would become one of the founders of the Chicago school). At the beginning of the First World War, all German cities had a zoning plan. In the following years, it was the United States that adopted the concept, with New York in 1916, and more than 500 cities by 1926. In that year, zoning was officially institutionalized, with the approval of the Supreme Court. In 1933, the Athens Charter recognized zoning as the main and central task of urban planning.

Figure 6: the concentric city, Burgess (1925). Source: Vaughan (2018)

But in parallel with this German development, where civil servants were devising the instruments of contemporary urban planning, social mapping in England took place in a context of strong social tensions. The impoverishment of a large part of the population, the many very precarious housing units, the disastrous sanitary conditions and the increase in crime in large cities made the management of urban development an extremely sensitive and political subject. It is not surprising to see the work of Patrick Geddes published in Edinburgh: a biologist by training (the city seen as a living organism) and an anarchist activist, he thought of images and cartography as central tools in the fight against poverty. He developed and advocated the use of statistics and mapping in land use planning and urban development, probably more than anyone else at that time. But history would remember the work carried out by Charles Booth in London from 1886 onwards.

Charles Booth, who began as a merchant and shipowner, devoted himself fully, at the end of the 19th century, to the first social surveys, based on a precise taxonomy of social categories. He was the first to produce social maps covering the entire urban space. His investigations focused first on the East End, London’s most deprived neighbourhood, before spreading throughout the city over more than 17 years. His objective was to provide a scientific study of the living conditions of the London population, in order to put an end to lurid images of deprived neighbourhoods. As he said in 1902, his objective was to establish “the numerical relation which poverty, misery and depravity bear to regular earnings and comparative comfort, and to describe the general conditions under which each class lives”.

Booth’s approach was based on the creation of a statistical classification of social categories, ranging from A (the lowest class) to H (the upper middle class). He created, on the basis of the notes taken in the field by inspectors, a taxonomy distinguishing the different sectors of the social spectrum. He estimated the number of “poor” (classes A-D) at 300,000 people in the East End and 1,300,000 for the city as a whole, almost a third of the total population at the time. The impact of these figures on the public was enormous, and was reinforced by the poverty maps included in the volumes of results dealing first with the East End and then, a few years later, with the entire city, as illustrated in Figure 7.

Figure 7: Charles Booth’s Map Descriptive of London Poverty, 1898. Source: Vaughan (2018). See also https://booth.lse.ac.uk/map/

The map makes it possible to move from a social logic to a spatial logic: a particular class is translated into cartographic terms, becoming a building, a block of houses, a street, an entire urban area. The social map therefore made it possible to think of the city in terms of homogeneous spatial units. This reasoning is essential for urban planning: it could not have developed amid the complexity of a discourse distinguishing between the different inhabitants of the same building. This social vision of mapping, with its focus on slums and poor neighbourhoods, should be related to a public health objective.

That said, thinking about urban development in terms of health interventions to heal society of its ills is not new. In 1516, Thomas More founded one of the main forms of urban planning theory, starting with a diagnosis of the disease and then proposing a definitive solution through a total restructuring of the urban form. During the 18th century, the translation of this principle consisted in isolating particular intervention areas (characterized by their insalubrity) and removing them, sweeping away the urban past. The solution adopted at the end of the 19th century was rather to work from what already existed, and to find the most effective solutions to manage the probable future changes in the urban context.

At the end of the 19th century, we also moved from “descriptive statistics” to “prescriptive statistics”, to use Ogien’s (2013) terms. We no longer simply count the number of smallpox patients; we begin to choose to vaccinate (or not) a specific population, and therefore to set up a mandatory preventive intervention (at the time, the vaccine still killed about 1 person in 300).

Adolphe Quetelet’s homme moyen (average man) launched moral statistics, with the average becoming the norm. Diseases also began to be linked to population density, poor ventilation and humidity. “Dirty, unhealthy, infectious, corrupt or simply stinking are the categories that make it possible to think what we now call pollution”, in the words of Fureix and Jarrige (2015). We then move from the social map to the “moral map”, a city thought up by hygienists. Moral geography, which until then had been the subject of partial and unsystematized observations, found in the map a (graphic) space that synthesizes and organizes it. The social map provided the globalizing vision necessary for the existence of urban planning, and for the precise location of the sites required for the targeted and rational functioning of its therapeutic action. One thinks of Dr John Snow’s 1854 map of the cholera epidemic, presented (and updated) in Figure 8. At the time, the dominant theory was the theory of miasmas, claiming that diseases such as plague or cholera spread in the form of bad air. In 1854, with the help of the Reverend Henry Whitehead, by interviewing local residents, Snow established the geographical distribution of cases and identified the source of the epidemic: a public water pump on Broad Street. While microbial research had not yet scientifically established the danger of the water pump, the cartographic study of the spread of the epidemic was enough to convince the authorities to close it.

Figure 8: John Snow, On the Mode of Communication of Cholera, 1855. Source: https://tabsoft.co/2y82nbf

However, as Vaughan (2018) points out, similar works can be found throughout England at the same time, such as Edwin Chadwick’s Sanitary Map of the Town of Leeds, shown in Figure 9. On this map, Chadwick identified two groups of dwellings: working-class houses and shops, and workhouses and artisans’ houses. Coloured dots, indicating contagious diseases, seem to proliferate only in poor neighbourhoods. In particular, the map showed that the patients did not live in contiguous areas but were scattered across the map, while remaining concentrated in poor neighbourhoods.

Figure 9: Edwin Chadwick, Sanitary Map of the Town of Leeds, 1842. Source: Vaughan (2018) and https://bit.ly/2zL3pM8

These maps had considerable public health impacts, and zoning, formalized by Charles Booth, was the basis of spatial statistics as it developed throughout the 20th century.

While the cartography of the city is now complex and rich, it should be noted that economists took a long time to move beyond the “linear city” model, introduced in Hotelling (1929) and refined over time, as shown in Figure 10, pitting the residential district (RD) against the business district (BD). But that’s another story…

Figure 10: the different forms of the linear city. Source: Fujita & Thisse (1997).

References:

Booth, Charles (1902) Life and Labour in London. 17 volumes.

Burgess, Ernest (1925). The Growth of the City: An Introduction to a Research Project.

Choay, Françoise (1980). La règle et le modèle, Paris, Seuil.

Fujita, Masahisa & Thisse, Jacques-François (1997). Économie géographique, problèmes anciens et nouvelles perspectives. Annales d’Économie et de Statistique, 45, 37-87.

Fureix, Emmanuel & Jarrige, François (2015). La modernité désenchantée : relire l’histoire du XIXe siècle français, Paris, La Découverte.

Hotelling, Harold (1929). Stability in Competition. The Economic Journal, 39, 41-57.

Latour, Bruno (1989). La science en action. Paris, La Découverte.

Maier, Jessica (2015). Rome, measured and imagined. The University of Chicago Press.

Ogien, Albert (2013). Désacraliser le chiffre dans l’évaluation du secteur public, Versailles, Éditions Quæ.

Söderström, Ola (1996) Paper cities : visual thinking in urban planning. Ecumene, 3, 249-281.

Vaughan, Laura (2018) Mapping Society: The Spatial Dimensions of Social Cartography. UCL Press.

Insurance series on Variances.eu

On variances.eu (the ENSAE Alumni blog), a series on insurance begins, with a first post on insurance and the challenge of low interest rates, by Bernard Delas, vice-president of the Autorité de contrôle prudentiel et de résolution (ACPR).

Since the subprime crisis, the main European economies have experienced an uninterrupted fall in interest rates. Over the past five years, rates have reached levels never seen before, below zero. Such a scenario had not been envisaged by the post-crisis regulation of financial institutions and markets. These interest rates, while beneficial in the short term for the financing of commercial and industrial activities, pose unprecedented challenges not only to investment funds, banks and insurers, but also to the supervisory authorities of financial institutions, which are responsible for preserving the stability of the financial system and, especially since the crisis, for protecting consumers and savers.

[to be continued] And for the occasion – my first actuarial science course starts tomorrow at ENSAE – I take the opportunity to repost some statistics on ENSAE graduates who become actuaries.

A personal look at Big Data

On variances.eu, the October theme of Big Data continues, today with the (personal) view of Xavier Dupré,

I am typing this evening on my computer keyboard. It is 11 pm and I am at the hackathon organized by Data Science Game – an association created by ENSAE alumni. The participants are wrestling with actuarial data to predict insurance policy purchase rates. The 500 MB of data would have been considered “big” five years ago. They now fit easily on a laptop, and it takes only a few seconds to compute a mean. All the participants have already taken part in Kaggle competitions and used several machine learning libraries. Some obtain results an hour after the start, despite variables of very diverse types. One suspects this is not the first competition they have entered. I move from one team to another; the screens show Python, a little R, a command line open in one corner, a notebook in another, remote machines running somewhere. There is a lot of discussion within the teams. Most are made up of people who complement one another: the data manager, the programmer, the machine learner. At dinner, I chat with a few teams. One participant tells me about a friend of his who trains deep networks to teach a machine to play the video game Doom, using deep reinforcement learning techniques. We discuss the fields in which these technologies will be applied. Medicine comes first, transport second. The participants, twenty teams of four, were selected through a first competition. Even though I play the role of mentor, few ask me technical questions. There is little they do not already know. They all know programming and statistics. After 24 hours, some assemblages become rather complex, the prediction coming from a patchwork of models. Just as one appreciates a painter’s stroke of the pencil, some participants have a fine stroke of the keyboard. The teams take turns in first place.

[to be continued…]

The necessary transformation of analytical professions

On variances.eu, the October theme of Big Data continues with the point of view of Baptiste Beaume, of Covéa, on the professional aspects.

The arrival, and now the entrenchment, of Big Data in the professional world is having a full impact on the job market: 130,000 job creations expected in France by 2020 (see the Plan Big Data), and an announced shortage, within two years, of 190,000 data scientists in the United States, according to McKinsey. These profiles, halfway between statistics and computer science, will have to be equipped to master new types of data, the latest tools and languages, and also the methods to exploit all this information.

More and more training programmes are therefore adapting, or being created, to supply the market with these still too rare profiles: ENSAE, Telecom ParisTech, ENSAI, ENSIMAG, Polytechnique, Paris Dauphine, Telecom Nancy, etc. Today’s students can decide to take this path and thus secure a bright future in terms of employability – provided they have a real love affair with data, without which the time spent behind a screen can seem very long; given how sought-after these profiles are, companies will have little incentive to steer them towards other fields.

Companies are all currently facing immense competitiveness challenges linked to the mastery of data, which makes the shortage of data scientists all the more worrying from a recruitment perspective. This problem, however, tends to hide another: what about the hundreds of thousands of people who already have an analytical job? Are changes needed? And if so, which ones?

[to be continued…]

Metadata and privacy: an insoluble equation?

As part of the series of articles (on the theme of data) that I am editing on variances.eu (the ENSAE Alumni blog), Yves-Alexandre de Montjoye – @yvesalexandre – looks at metadata and privacy: an insoluble equation?

Like text messages, call detail records (CDRs), once used solely for billing purposes, have far outgrown their original purpose. It has become clear over the past five years that these metadata, literally “data about data”, are useful for much more than just billing. Knowing, even anonymously, who calls or texts whom, at what time, and from where, is an exceptional source of information about ourselves and the society we live in.

Call detail records (CRA in French, CDR in English) have already been used by researchers to study the propagation vectors of epidemics such as malaria [1] or to carry out population censuses in real time [2]. Beyond the research world, the commercial use of these data is already booming in the United States and elsewhere, as shown by Verizon’s “precision insight” programme or Orange’s FluxVision.

The potential of these data must not, however, make us forget that each record is produced by a person making a call, receiving a text from a friend or travelling somewhere. CDRs contain detailed and potentially sensitive information about a person’s behaviour, movement patterns and lifestyle.

Allowing data to be collected and used while respecting users’ privacy often requires, from a practical but also a legal point of view, their “anonymization”. The idea is that if the data are not associated with an individual, the information they contain cannot harm that person.

[to be continued…]

Looking back on 20 years of digital technology and the Internet

This month, I am editing a series of articles on the theme of data, to be published on variances.eu, the ENSAE Alumni blog. For the first article, Philippe Tassi looks back on 20 years of digital technology and the Internet.

Since the upheaval brought about, at the end of the 19th century and the beginning of the 20th, by public access to electricity, no other revolution of the same nature had taken place until the commercial availability of the Internet, in the mid-1990s, and thus the entry of our civilizations into the digital world.

First, some facts: even if today’s figures will quickly become outdated, they readily convey the speed of diffusion of the Internet, in terms of both access and use. In the autumn of 1997, less than 1% of households living in France were connected to the Internet. This proportion rose to 4.7% in 1999, 27.4% in 2003 and 35.5% in 2005. Ten years later, in September-October 2015, 85% of households had access. And beyond access to the network, its use has become massive: 45 million French people connect to the Internet at least once a month, across all screens.

The Internet was not built in a day

The design of the Internet is not recent. It took time between the appearance of the technological foundations – Ray Tomlinson, who died in March 2016, invented e-mail in 1971 – and the implementation of services suited to the public. By comparison, sound could be transmitted via radio waves as early as the 1890s, while radio as a medium only existed from 1922-1923. The transmission of moving images over the same waves dates from the 1930s, while television as a medium was officially created in France in 1949.

The Internet is no exception to this rule. Its starting point was the Cuban missile crisis, in October 1962, at the height of the Cold War between the United States and the USSR. It revealed to President Kennedy the weakness of a centralized system. In 1964, the idea of a decentralized, less vulnerable network appeared. A first version was built in 1969, named Arpanet (Advanced Research Projects Agency Network); it connected the universities of Stanford, UCLA, Santa Barbara and Utah. Electronic mail existed from 1971. The technical foundations of the TCP and TCP/IP protocols date from the 1970s. The same decade saw the birth of Microsoft (1975) and Apple (1976). In 1983, Arpanet was split into Milnet, integrated into the American military network, and a new academic Arpanet, renamed Internet in 1986. 1990 saw the emergence of the http protocol, the html language and the concept of the www. The creations of the future major players multiplied: Yahoo! and Amazon in 1994, Google in 1998, Facebook in 2004, Twitter in 2006. By the end of the 2000s, the paths to the digital world had diversified: the historic personal computer had been joined by new screens – the smartphone from 2007, the tablet from 2010 – favouring mobility. The individual has become ATAWAD: Any Time, Any Where, Any Device. In 2015, the average French household had 6.4 screens.

[to be continued…]