Tag Archives: risk

Exchangeability, credit risk and risk measures

Exchangeability is an extremely useful concept, since (most of the time) analytical expressions can be derived. But it can also be used to observe some unexpected behaviors, which we will discuss later on in a more general setting. For instance, in an old post, I discussed connections between correlation and risk measures (using simulations to illustrate, but in the context of exchangeable risks, calculations can be performed more accurately). Consider again the standard credit risk problem, where the quantity of interest is the number of defaults in a portfolio. Consider a homogeneous portfolio of exchangeable risks. The quantity of interest is here

$$S_n=X_1+\cdots+X_n$$

where $X_i$ is the default indicator of company $i$ (the $X_i$'s being exchangeable Bernoulli variables),

or perhaps the quantile function of the sum (since the Value-at-Risk is the standard risk measure). We have seen yesterday that, given the latent factor $\Theta$, the indicators are independent, with $X_i\mid\Theta=\theta\sim\mathcal{B}(\theta)$ (either the company defaults, or not), so that

$$\mathbb{P}(S_n=s\mid\Theta=\theta)=\binom{n}{s}\theta^s(1-\theta)^{n-s}$$

i.e. we can derive the (unconditional) distribution of the sum

$$\mathbb{P}(S_n=s)=\binom{n}{s}\int_0^1\theta^s(1-\theta)^{n-s}\,dF_\Theta(\theta)$$

so that the probability function of the sum is, assuming that the latent factor is Beta distributed, $\Theta\sim\mathcal{B}eta(a,b)$,

$$\mathbb{P}(S_n=s)=\binom{n}{s}\int_0^1\theta^s(1-\theta)^{n-s}\,\frac{\theta^{a-1}(1-\theta)^{b-1}}{B(a,b)}\,d\theta$$

Thus, the following code can be used to calculate the quantile function

> proba=function(s,a,m,n){
+ b=a/m-a   # so that E(Theta)=a/(a+b)=m
+ choose(n,s)*integrate(function(t){t^s*(1-t)^(n-s)*
+ dbeta(t,a,b)},lower=0,upper=1,subdivisions=1000,
+ stop.on.error =  FALSE)$value
+ }
> QUANTILE=function(p=.99,a=2,m=.1,n=500){
+ V=rep(NA,n+1)
+ for(i in 0:n){
+ V[i+1]=proba(i,a,m,n)}
+ V=V/sum(V)   # renormalise, to absorb numerical integration errors
+ return(min(which(cumsum(V)>p))-1) }   # -1 since V[1] corresponds to 0 defaults
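For instance, with the default values above (n=500 names, a marginal default probability m of 10%, and a=2, purely as an illustration), the 99% quantile of the number of defaults is obtained with

QUANTILE(p=.99, a=2, m=.1, n=500)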

Now observe that, since the variates are exchangeable, it is possible to calculate the correlation between default indicators explicitly. Here

$$\mathbb{P}(X_i=1,X_j=1)=\mathbb{E}\big[\mathbb{E}(X_iX_j\mid\Theta)\big]=\mathbb{E}\big[\mathbb{P}(X_i=1\mid\Theta)\,\mathbb{P}(X_j=1\mid\Theta)\big]$$

i.e.

$$\mathbb{P}(X_i=1,X_j=1)=\mathbb{E}\big[\Theta^2\big]$$

Thus, since $\mathbb{P}(X_i=1)=\mathbb{E}(\Theta)$, the correlation between two default indicators is then

$$\text{corr}(X_i,X_j)=\frac{\mathbb{E}(\Theta^2)-\mathbb{E}(\Theta)^2}{\mathbb{E}(\Theta)\big[1-\mathbb{E}(\Theta)\big]}=\frac{\text{Var}(\Theta)}{\mathbb{E}(\Theta)\big[1-\mathbb{E}(\Theta)\big]}$$

Under the assumption that the latent factor is beta distributed

$$\Theta\sim\mathcal{B}eta(a,b),\qquad f_\Theta(\theta)=\frac{\theta^{a-1}(1-\theta)^{b-1}}{B(a,b)},\quad \theta\in(0,1),$$

we get

$$\text{corr}(X_i,X_j)=\frac{1}{a+b+1}$$

since $\text{Var}(\Theta)=\dfrac{ab}{(a+b)^2(a+b+1)}$ while $\mathbb{E}(\Theta)\big[1-\mathbb{E}(\Theta)\big]=\dfrac{ab}{(a+b)^2}$.
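In the parametrization used in the code above, where the mean $m=a/(a+b)$ is fixed (so that $b=a/m-a$), this correlation is simply $m/(a+m)$. A small helper to go back and forth between the two parametrizations (a sketch, the function names being mine),

cor_default = function(a,m) m/(a+m)    # default correlation implied by (a,m)
a_from_cor  = function(r,m) m*(1-r)/r  # shape parameter a for a given correlation r
cor_default(2,.1)                      # e.g. a=2, m=10%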

Thus, as a function of the parameter of the beta distribution (we consider beta distributions with the same mean, i.e. the same marginal distributions, so that only one parameter is left, which is in one-to-one correspondence with the correlation of the default indicators), it is possible to plot the quantile function,

> PICTURE=function(P){
+ A=seq(.01,2,by=.01)
+ VQ=matrix(NA,length(A),5)
+ for(i in 1:length(A)){
+ VQ[i,1]=QUANTILE(a=A[i],p=.9,m=P)
+ VQ[i,2]=QUANTILE(a=A[i],p=.95,m=P)
+ VQ[i,3]=QUANTILE(a=A[i],p=.975,m=P)
+ VQ[i,4]=QUANTILE(a=A[i],p=.99,m=P)
+ VQ[i,5]=QUANTILE(a=A[i],p=.995,m=P)
+ }
+ plot(A,VQ[,5],type="s",col="red",ylim=
+ c(0,max(VQ)),xlab="",ylab="")
+ lines(A,VQ[,4],type="s",col="blue")
+ lines(A,VQ[,3],type="s",col="black")
+ lines(A,VQ[,2],type="s",col="blue",lty=2)
+ lines(A,VQ[,1],type="s",col="red",lty=2)
+ lines(A,rep(500*P,length(A)),col="grey")   # expected number of defaults
+ legend("topright",c("quantile 99.5%","quantile 99%",
+ "quantile 97.5%","quantile 95%","quantile 90%","mean"),
+ col=c("red","blue","black",
+ "blue","red","grey"),
+ lty=c(1,1,1,2,2,1),bty="n")
+ }

e.g. with a (marginal) default probability of 15%,

> PICTURE(.15)

On this graph, we observe that the stronger the correlation (the more to the left), the higher the quantile… Note that the same graph can be plotted with the correlation on the X-axis,
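A minimal sketch of such a modification (not the original code: only the 99% quantile is plotted, and I use the relationship r=m/(a+m) derived above between the shape parameter a and the default correlation r),

PICTURE_COR=function(P){
  A=seq(.01,2,by=.01)
  R=P/(A+P)                                   # implied default correlation
  VQ=sapply(A,function(a) QUANTILE(a=a,p=.99,m=P))
  o=order(R)
  plot(R[o],VQ[o],type="s",col="blue",
       xlab="default correlation",ylab="99% quantile")
  abline(h=500*P,col="grey")                  # expected number of defaults
}
PICTURE_COR(.15)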


This is quite intuitive, somehow. But if the marginal probability of default decreases, increasing the correlation might decrease the risk (i.e. the quantile function),

> PICTURE(.05)

(with the code modified to visualize the quantile as a function of the underlying default correlation) or even worse,

> PICTURE(.0075)

And it becomes all the more counterintuitive as the default probability decreases! So in the case of a portfolio of not-very-risky bond issuers (with high ratings), assuming a very strong correlation will lower the risk-based capital!

Variable annuities are not a systemic risk?

The Geneva Association just published on its website an interesting report on variable annuities and systemic risk (online here). Based on a definition of potentially systemically risky activities, relying on criteria such as interconnectedness or substitutability, the report claims that since “none of the criteria is triggered”, variable annuities are “not a potentially systemically risky activity”, even if “short-term effects are conceivable”. I guess it is a diplomatic way to say it…

Note that a series of slides can also be downloaded (there) on insurance and systemic risk. But that deserves a more detailed post.

 

Tennis and risk management

As mentioned already here, while we were going to Québec City for the workshop, we had interesting discussions in the car, and Maciej mentioned an article recently published in The Actuary,

Hence, I wanted to discuss (extremely) rare event probabilities in tennis. The story is simple: in June 2010, at Wimbledon, Nicolas Mahut and John Isner played the longest match ever (980 points, over 11 hours of play). But first of all, we need a dataset. Thanks to Duncan Murdoch, I have been able to run a short code to build up a dataset:

CITIES=c("berlin","madrid","paris","rolandgarros","wimbledon","sydney",
"beijing","shanghai","singapore","tokyo","melbourne","melbourne-indoor")
YEARS=1970:2009
BASE0=data.frame(YEAR=NA,TRNMT=NA,LENGTH=NA,SETS=NA)
for(i in 1:length(CITIES)){
for(j in 1:length(YEARS)){
city=CITIES[i]
year=YEARS[j]
localization = paste("http://www.resultsfromtennis.com/",
year,"/atp/",city,".html",sep="")
essai = try(readLines(localization), silent=TRUE)
ERROR404=FALSE
if(inherits(essai, "try-error")){ERROR404=TRUE}
if(ERROR404==FALSE){
B=scan(localization,"character")
SETS=NA
LENGTH=NA
if(length(B)>270){
I=(substr(B,1,10)=="class=rez>")   # cells containing the game scores
X0=B[I]
X3=as.numeric(substr(X0,11,13))    # number of games, read with 3, 2 or 1 digits
X2=as.numeric(substr(X0,11,12))
X1=as.numeric(substr(X0,11,11))
X0=X3
X0[is.na(X3)==TRUE]=X2[is.na(X3)==TRUE]
X0[is.na(X2)==TRUE]=X1[is.na(X2)==TRUE]
JL=c(which(substr(B,1,9)=="class=nl>"),length(B))   # match delimiters
IL=which(substr(B,1,10)=="class=rez>")
IC=cut(IL,JL)                          # allocate each score cell to a match
base=data.frame(IC,X0)
LENGTH=as.numeric(tapply(X0,IC,sum))       # total number of games per match
SETS=as.numeric(tapply(X0,IC,length))/2}   # number of sets per match
BASE=data.frame(YEAR=year,TRNMT=city,LENGTH,SETS)
BASE0=rbind(BASE0,BASE)}}}
write.table(BASE0,"BASE-TENNIS-TOTAL.txt")

Here I consider only tournaments where players have to win 3 sets (and actually more tournaments than those in the code above), and I end up with a bit more than 72,000 matches,

> TENNIS=read.table("BASE-TENNIS-TOTAL.txt",header=TRUE)   # dataset built above
> I=is.na(TENNIS$LENGTH)==FALSE
> BT=TENNIS[I,]
> nrow(BT)
[1] 72754
> maxr=function(x){max(x,na.rm=TRUE)}
> T=paste(BT$TRNMT,BT$YEAR)
> DUREE=tapply(BT$SETS,T,maxr)
> LISTE=names(DUREE[DUREE>3])   # keep only best-of-five tournaments
> BT=BT[T%in%LISTE,]

so, if we look briefly at matches over 35 years, we have the following boxplot (one boxplot per year),

The red line being the epic Isner-Mahut match in June 2010 (4-6, 6-3, 7-6, 6-7, 70-68, i.e. 183 games, here for the score card).

If we look at the theory (e.g. the work of Paul Newton and Kamran Aslam), a lot of results can be obtained for the expected number of games, but if we want to study extremely rare events, we should generate Markov chains (with a lot of simulations, since the probability should be extremely small). But how many? Consider below matches with more than 50 games,

The tail plot (over 50), i.e. the log-log Pareto plot, indicates that it will be difficult to study tails,

and similarly with the Hill plot (assuming that tails are of Pareto type…)

Anyway, if we want to study tails, we should consider a high enough threshold. For instance, with a threshold at 68 (we keep only 24 matches), we have

> library(evir)        # for the gpd() function
> X=BT$LENGTH          # number of games per match
> seuil=68+0.25
> GPD1=gpd(X,seuil,method = "ml")
> GPD2=gpd(X,seuil,method = "pwm")
>
> xi=GPD1$par.ests[1]
> mu=seuil
> beta=GPD1$par.ests[2]
> x=180
> P=exp((-1/xi)*log(1 + (xi * (x - mu))/beta))   # GPD survival function at x
> as.numeric((1-GPD1$p.less.thresh)*P)
[1] 5.621281e-09
>
> xi=GPD2$par.ests[1]
> mu=seuil
> beta=GPD2$par.ests[2]
> x=180
> P=exp((-1/xi)*log(1 + (xi * (x - mu))/beta))
> as.numeric((1-GPD2$p.less.thresh)*P)
[1] 3.027095e-09

I.e. the probability that a match lasts more than 183 games is about one chance in a billion… With, say, 2,500 matches per year, that gives us a return period of the order of 100,000 years. So yes, we might say that this was a rare event… So perhaps, by generating several billion chains, it should be possible to get a more precise estimate of the probability of playing 183 games in a single match…
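For what it is worth, such a simulation could look like the following minimal sketch (not the code behind the figures above): each point is assumed to be an independent Bernoulli trial won by the server with probability p (the same for both players), the tie-break is crudely treated as a single fair-coin game, and there is no tie-break in the fifth set. All names and parameter values are illustrative.

# simulate one service game: returns who wins it
sim_game = function(p){
  a=0; b=0
  while(TRUE){
    if(runif(1)<p) a=a+1 else b=b+1
    if(a>=4 & a-b>=2) return("server")
    if(b>=4 & b-a>=2) return("returner")
  }
}
# simulate one set, alternating server, returns number of games and the winner
sim_set = function(p, tiebreak=TRUE){
  g1=0; g2=0; games=0; server=1
  while(TRUE){
    w = sim_game(p)
    games = games+1
    if((server==1 & w=="server") | (server==2 & w=="returner")) g1=g1+1 else g2=g2+1
    server = 3-server
    if(g1>=6 & g1-g2>=2) return(list(games=games, winner=1))
    if(g2>=6 & g2-g1>=2) return(list(games=games, winner=2))
    if(tiebreak & g1==6 & g2==6){          # tie-break counted as one extra game
      return(list(games=games+1, winner=ifelse(runif(1)<.5, 1, 2)))
    }
  }
}
# best-of-five match, no tie-break in the fifth set: total number of games
sim_match = function(p=.65){
  s1=0; s2=0; total=0
  while(s1<3 & s2<3){
    S = sim_set(p, tiebreak=!(s1==2 & s2==2))
    total = total + S$games
    if(S$winner==1) s1=s1+1 else s2=s2+1
  }
  total
}
N = replicate(1e4, sim_match(.65))
mean(N>50)   # empirical probability of exceeding 50 games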

Millennium Bridge, endogeneity and risk management

In less than 48 hours last week, two friends mentioned the Millennium Bridge as an illustration of a risk management concept. There are several documents using that example, here (for the initial idea of using the Millennium Bridge to illustrate issues in risk management), here or there, e.g.

When we mention resonance effects on bridges, we usually think of the Tacoma Narrows Bridge (where strong winds set the bridge oscillating) or the Basse-Chaîne Bridge (in France, which collapsed on April 16, 1850, when 478 French soldiers marched across it in lockstep). In the first case, there is nothing much we can do about it, but in the second one, this is why soldiers are required to break step on bridges.

But for the Millennium Bridge, a ‘positive feedback’ phenomenon (known as Synchronous Lateral Excitation in physics) has been observed: the natural sway motion of people walking caused small sideways oscillations in the bridge, which in turn caused people on the bridge to sway in step, increasing the amplitude of the oscillations and continually reinforcing the effect. That has been described in a nice paper in 2005 (here). In their initial paper, Jon Danielsson and Hyun Song Shin note that “what is the probability that a thousand people walking at random will end up walking exactly in step? It is tempting to say “close to zero”, or “negligible”. After all, if each person’s step is an independent event, then the probability of everyone walking in step would be the product of many small numbers – giving us a probability close to zero. Presumably, this is the reason why Arup – the bridge engineers – did not take this into account. However, this is exactly where endogenous risk comes in. What we must take into account is the way that people react to their environment. Pedestrians on the bridge react to how the bridge is moving. When the bridge moves under your feet, it is a natural reaction for people to adjust their stance to regain balance. But here is the catch. When the bridge moves, everyone adjusts his or her stance at the same time. This synchronized movement pushes the bridge that the people are standing on, and makes the bridge move even more. This, in turn, makes the people adjust their stance more drastically, and so on. In other words, the wobble of the bridge feeds on itself. When the bridge wobbles, everyone adjusts their stance, which sets off an even worse wobble, which makes the people adjust even more, and so on. So, the wobble will continue and get stronger even though the initial shock (say, a gust of wind) has long passed. It is an example of a force that is generated and amplified within the system. It is an endogenous response. It is very different from a shock that comes from a storm or an earthquake which are exogenous to the system.”

And to go further, they point out that this event is rather similar to what is observed in financial markets (here), quoting The Economist from October 12th 2000: “So-called value-at-risk models (VaR) blend science and art. They estimate how much a portfolio could lose in a single bad day. If that amount gets too large, the VAR model signals that the bank should sell. The trouble is that lots of banks have similar investments and similar VAR models. In periods when markets everywhere decline, the models can tell everybody to sell the same things at the same time, making market conditions much worse. In effect, they can, and often do, create a vicious feedback loop.”

Course on risk measures (in French)

The course on risk measures, in Luminy, starts at 16.00 on Monday (here). The slides can be found here,

Note that additional references can be downloaded from the internet, e.g. the short course on risk measures by Freddy Delbaen (here) or the article from the Encyclopedia of Quantitative Finance, by Hans Föllmer and Alexander Schied (there). See also here for the paper by Jean-Marc Tallon, Johanna Etner and Meglena Jeleva on decision theory under uncertainty.

Lecture notes on risk and insurance

I just finished some lecture notes on risk and insurance. The notes, which can be downloaded [pdf], are in French, and will be used at the JES (Journées d’Etudes Statistiques), organised at the CIRM (mentioned here). Previous notes dealt with risk measures [pdf] and copulas [pdf]. Again, all comments are welcome…

Discussion on stress scenarios

Friday morning, I had the honor to discuss a presentation by Alexander McNeil, on Stress Testing and Reverse Stress Testing, at the Financial Risks International Forum on Risk Dependencies (here).

This was an opportunity to rediscover techniques I had studied briefly a few years ago, on outlier detection, namely the bagplot (I will probably upload a post on that topic soon, in French unfortunately). The slides of my discussion are available here.


Emerging risks, an actuarial perspective

Talk at the conference on assessment and mitigation of emerging risks, organized by the AXA Chair on Large Risks in Insurance (Ecole Polytechnique, ENSAE, Paris Dauphine), with the help of Johanna Etner (Université Paris 5), Meglena Jeleva (Université du Maine), Stéphane Rossignol (Université Paris 5) and Jean-Marc Tallon (Paris School of Economics, Université Paris 1). The conference took place at the Institut Louis Bachelier, Palais Brongniart. I have uploaded my slides there.

Additional material can be found here and there: the KPMG report on asbestos can be found here, the OECD report on emerging risks there, and  the book by Joel Cohen,  Kenneth Manton and Eric Stallard can be downloaded here. Finally a lot of interesting documents can be found on the Lloyd’s website, with realistic disaster scenarios (here) or on nanotechnology (there). And to conclude on emerging risks, I let Donald say what unknown risks are,

SCR calculations, Solvency Capital Requirements

To recall the general context, Solvency II (the analogue, for insurers, of the CRD directive for banks*) rests on 3 pillars,

  1. define quantitative thresholds for the calculation of technical provisions and own funds, thresholds which will eventually become regulatory, namely the MCR (Minimum Capital Requirement, the minimum level of own funds below which intervention by the supervisory authority is automatic) and the SCR (Solvency Capital Requirement, the target capital needed to absorb the shock caused by exceptional claims experience),
  2. set qualitative standards for the internal monitoring of risks within companies, and define how the supervisory authority should exercise its supervisory powers in this context. Note that, in principle, supervisors will be able to require “too risky” companies to hold more capital than the amount suggested by the SCR calculation, and will be able to force them to reduce their risk exposure,
  3. define the set of information that the supervisory authorities will deem necessary in order to exercise their supervisory powers.

This pillar structure can be illustrated as follows

Regarding the first pillar, insurers and reinsurers will have to measure their risks, and will have to make sure they hold enough capital to cover them. In practice, CEIOPS and the European Commission have retained a ruin probability of 0.5%. The capital can then be computed in either of two ways,

  1. using a standard formula. The formula, as well as the calibration of its parameters, has been addressed through the QIS exercises.
  2. using an internal model. On this point, CEIOPS is studying the assessment procedures.

In April 2007, QIS3 was launched, in order to propose a standard formula for the computation of the MCR and the SCR, while studying the specific issues raised by groups. In particular, the documents contain the following formula (for the computation of the basic SCR)
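The aggregation at stake is the familiar square-root formula, $\text{BSCR}=\sqrt{\sum_{i,j}\rho_{i,j}\,\text{SCR}_i\,\text{SCR}_j}$, where the $\text{SCR}_i$'s denote the capital charges of the individual risk modules and the $\rho_{i,j}$'s the prescribed correlation coefficients. A minimal sketch in R, with purely illustrative module names and figures (the actual modules and correlation matrix being those of the QIS3 technical specification),

SCR  = c(market=100, counterparty=30, life=40, health=20, nonlife=60)  # illustrative
CORR = matrix(.25, 5, 5); diag(CORR) = 1                               # illustrative
BSCR = sqrt(as.numeric(t(SCR) %*% CORR %*% SCR))
BSCR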

This formula comes from QIS3, but similar things can be found in Sandström (2004), for instance,

With a strong constraint on the form of the SCR, he then obtains

Where does this formula come from? Some have attempted to provide elements of an answer, for instance

Unfortunately, this result is not very convincing, since nothing is ever said about the dependence between the components, which is troubling. Sandström writes something similar, although for him “normality” is to be understood here in a multivariate setting.

An explanation can be found in a paper by Dietmar Pfeiffer and Doreen Straßburger (here), published in the Scandinavian Actuarial Journal (downloadable here). They seek to explain how to compute the SCR,

They note, and this is indeed the intuition we had, that in a (multivariate) Gaussian world, this formula works, for an SCR based on the VaR as well as on the TVaR. In particular, they cite a book by Sven Koryciorz, corresponding to his doctoral thesis, entitled “Sicherheitskapitalbestimmung und –allokation in der Schadenversicherung. Eine risikotheoretische Analyse auf der Basis des Value-at-Risk und des Conditional Value-at-Risk”, published in 2004.
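To see why the Gaussian case works (a short sketch, with centered risks to keep the notation light): if $(X_1,\ldots,X_d)$ is a centered Gaussian vector with standard deviations $\sigma_i$ and correlations $\rho_{i,j}$, then

$$\text{VaR}_\alpha(X_i)=q_\alpha\,\sigma_i \quad\text{and}\quad X_1+\cdots+X_d\sim\mathcal{N}\Big(0,\sum_{i,j}\rho_{i,j}\sigma_i\sigma_j\Big),$$

so that

$$\text{VaR}_\alpha\Big(\sum_i X_i\Big)=q_\alpha\sqrt{\sum_{i,j}\rho_{i,j}\sigma_i\sigma_j}=\sqrt{\sum_{i,j}\rho_{i,j}\,\text{VaR}_\alpha(X_i)\,\text{VaR}_\alpha(X_j)},$$

which is exactly the square-root aggregation formula; the same computation goes through for the TVaR, since $\text{TVaR}_\alpha(X_i)$ is also proportional to $\sigma_i$.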
Apart from that, to go a bit further, one can also note some rather troubling statements in the CEIOPS reports, for instance

It is however easy to show that this is not the case (even if it is indeed what the “standard formula” recommends). The graph below shows the evolution of the VaR of a sum of correlated (exchangeable) risks as a function of the underlying correlation: in this example, very highly correlated risks turn out to be less risky than moderately correlated ones.

(the underlying distribution is a Student copula). On the other hand, for the TVaR, on the same example, the TVaR of the sum is indeed an increasing function of the correlation,
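To give an idea of the kind of computation behind those graphs, here is a minimal Monte Carlo sketch (not the original code: the lognormal margins, the degrees of freedom of the Student copula, the number of risks and the quantile level are illustrative assumptions, and the function names are mine),

library(mvtnorm)   # for rmvt()
set.seed(1)
VAR_TVAR = function(rho, d=10, nsim=1e5, alpha=.99, df=3){
  S = matrix(rho, d, d); diag(S) = 1          # exchangeable correlation matrix
  U = pt(rmvt(nsim, sigma=S, df=df), df=df)   # Student t copula
  L = rowSums(qlnorm(U))                      # identical lognormal margins
  q = as.numeric(quantile(L, alpha))
  c(VaR=q, TVaR=mean(L[L>q]))
}
RHO = seq(0, .95, by=.05)
RES = sapply(RHO, VAR_TVAR)
plot(RHO, RES["VaR",], type="l", xlab="correlation", ylab="")
lines(RHO, RES["TVaR",], lty=2)
legend("topleft", c("VaR 99%","TVaR 99%"), lty=1:2, bty="n")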


(more details can be found in the slides of the summer school in Lyon last summer, here).

* To borrow from the Wikipedia page (here), the European CRD directive (Capital Requirements Directive) transposes into European law the recommendations of the Basel II accords, in order to compute the capital required for financial institutions (i.e. directives 2006/48/CE and 2006/49/CE).

Dynamic and Multivariate Risk measures

Talk given as part of the working group Dynamic and multivariate risk measures, organized by Rose-Anne Dana (Univ. Paris Dauphine) and Alfred Galichon (Ecole Polytechnique). The conference takes place from Thursday the 16th to Friday the 17th, at the Institut Henri Poincaré.

The speakers are Imen Bentahar, Arthur Charpentier, Rama Cont, Nicole El Karoui, Paul Embrechts, Damir Filipovic, Jean-Charles Rochet, Ludger Ruschendorf, Marco Scarsini and Walter Schachermayer, and the program is online.