In our data-science class, after discussing the limitations of logistic regression (e.g. the fact that the decision boundary is a straight line), we mentioned some natural extensions. Let us consider our (now) standard dataset
clr1 <- c(rgb(1,0,0,1), rgb(0,0,1,1))
clr2 <- c(rgb(1,0,0,.2), rgb(0,0,1,.2))
x <- c(.4,.55,.65,.9,.1,.35,.5,.15,.2,.85)
y <- c(.85,.95,.8,.87,.5,.55,.5,.2,.1,.3)
z <- c(1,1,1,1,1,0,0,1,0,0)
df <- data.frame(x,y,z)
plot(x, y, pch=19, cex=2, col=clr1[z+1])
One can consider a quadratic function of the covariates (instead of a linear one)
# quadratic logistic regression: add the squares and the cross-product
reg <- glm(z~x+y+I(x^2)+I(y^2)+I(x*y), data=df, family=binomial)
summary(reg)
# classify a point as 1 when the predicted probability exceeds 1/2
pred_1 <- function(x,y){
  predict(reg, newdata=data.frame(x=x, y=y), type="response") > .5
}
x_grid <- seq(0,1,length=101)
y_grid <- seq(0,1,length=101)
z_grid <- outer(x_grid, y_grid, pred_1)
image(x_grid, y_grid, z_grid, col=clr2)
points(x, y, pch=19, cex=2, col=clr1[z+1])
But one can also consider an additive model, with splines
library(splines)
reg <- glm(z~bs(x)+bs(y), data=df, family=binomial)
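To visualise this spline-based classifier, one can reuse the grid defined above (a quick sketch, not in the original code; pred_bs is just a name introduced here):

# sketch: decision regions of the additive spline model, on the same grid as above
pred_bs <- function(x,y){
  predict(reg, newdata=data.frame(x=x, y=y), type="response") > .5
}
z_grid_bs <- outer(x_grid, y_grid, pred_bs)
image(x_grid, y_grid, z_grid_bs, col=clr2)
points(x, y, pch=19, cex=2, col=clr1[z+1])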
or, even more generally, a model with bivariate splines,
library(mgcv)
reg <- gam(z~s(x,y,k=3), data=df, family=binomial)
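The same grid-based visualisation applies verbatim to this fit, since predict() on a gam object also returns probabilities with type="response".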
With a (generalized) linear model on nonlinear transformations of the covariates, we can obtain very general classifiers.
We also mentioned connections between the multinomial regression model and multiple logistic regressions. Here we consider three classes, say,
clr1 <- c(rgb(1,0,0,1), rgb(1,1,0,1), rgb(0,0,1,1))
clr2 <- c(rgb(1,0,0,.2), rgb(1,1,0,.2), rgb(0,0,1,.2))
x <- c(.4,.55,.65,.9,.1,.35,.5,.15,.2,.85)
y <- c(.85,.95,.8,.87,.5,.55,.5,.2,.1,.3)
z <- c(1,2,2,2,1,0,0,1,0,0)
df <- data.frame(x,y,z)
plot(x, y, pch=19, cex=2, col=clr1[z+1])
Can’t we consider three (binomial) logistic regressions, with \(\mathbf{1}(z=1)\), \(\mathbf{1}(z=0)\) and \(\mathbf{1}(z=2)\) as responses?
# three one-vs-rest logistic regressions
reg1 <- glm((z==1)~x+y, data=df, family=binomial)
summary(reg1)
reg0 <- glm((z==0)~x+y, data=df, family=binomial)
summary(reg0)
reg2 <- glm((z==2)~x+y, data=df, family=binomial)
summary(reg2)
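Note that, unlike multinomial probabilities, nothing constrains these three fitted probabilities to sum to one; a quick check (not in the original code):

# the three one-vs-rest fitted probabilities need not sum to one
p0 <- predict(reg0, type="response")
p1 <- predict(reg1, type="response")
p2 <- predict(reg2, type="response")
round(p0 + p1 + p2, 3)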
If we look at the separation lines,
# decision regions (probability above 1/2) for each of the three models
pred_0 <- function(x,y){
  predict(reg0, newdata=data.frame(x=x, y=y), type="response") > .5
}
z_grid0 <- outer(x_grid, y_grid, pred_0)
pred_1 <- function(x,y){
  predict(reg1, newdata=data.frame(x=x, y=y), type="response") > .5
}
z_grid1 <- outer(x_grid, y_grid, pred_1)
pred_2 <- function(x,y){
  predict(reg2, newdata=data.frame(x=x, y=y), type="response") > .5
}
z_grid2 <- outer(x_grid, y_grid, pred_2)
and if we consider a multinomial regression
library(nnet)
reg <- multinom(z~x+y, data=df)
plot(x, y, pch=19, cex=2, col=clr1[z+1])
we get
# predicted class = the one with the highest fitted probability
pred_3class <- function(x,y){
  which.max(predict(reg, newdata=data.frame(x=x, y=y), type="probs"))
}
# outer() requires a vectorised function, so loop explicitly over the grid
Outer <- function(x,y,fun){
  mat <- matrix(NA, length(x), length(y))
  for (i in seq_along(x)){
    for (j in seq_along(y)) mat[i,j] <- fun(x[i], y[j])
  }
  return(mat)
}
z_grid <- Outer(x_grid, y_grid, pred_3class)
image(x_grid, y_grid, z_grid, col=clr2)
points(x, y, pch=19, cex=2, col=clr1[z+1])
# overlay the three one-vs-rest separation lines
contour(x_grid, y_grid, z_grid0, levels=.5, add=TRUE)
contour(x_grid, y_grid, z_grid1, levels=.5, add=TRUE)
contour(x_grid, y_grid, z_grid2, levels=.5, add=TRUE)
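As an aside, the double loop in Outer can be avoided: predict() on a multinom object is vectorised over the rows of newdata, and type="class" returns the most likely class directly (a sketch, equivalent up to the coding of the levels):

# vectorised alternative to the double loop above
grid <- expand.grid(x=x_grid, y=y_grid)
z_class <- predict(reg, newdata=grid, type="class")
# expand.grid varies x fastest, matching the column-wise fill of matrix()
z_grid_alt <- matrix(as.numeric(z_class), length(x_grid), length(y_grid))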
which is slightly different. Since the iso-probability curves of a logistic regression are all parallel lines, we should focus on the slopes of the lines here. Except for the one on the left, the three logistic regressions and the multinomial regression generate different classifiers.
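To see that parallelism, one can draw several probability levels of one of the binomial fits (a sketch; pred_p1 is a name introduced here):

# iso-probability curves of a single logistic regression: parallel straight lines
pred_p1 <- function(x,y){
  predict(reg1, newdata=data.frame(x=x, y=y), type="response")
}
p_grid1 <- outer(x_grid, y_grid, pred_p1)
contour(x_grid, y_grid, p_grid1, levels=c(.25,.5,.75))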