In statistics, the backfitting algorithm is a simple iterative procedure used to fit a generalized additive model. It was introduced in 1985 by Leo Breiman and Jerome Friedman along with generalized additive models. In most cases, the backfitting algorithm is equivalent to the Gauss–Seidel method, an algorithm used for solving a certain linear system of equations.

Algorithm

Additive models are a class of non-parametric regression models of the form:

$$Y = \alpha + \sum_{j=1}^{p} f_j(X_j) + \varepsilon$$

where each $X_j$ is a variable in our $p$-dimensional predictor $X$, and $Y$ is our outcome variable. $\varepsilon$ represents our inherent error, which is assumed to have mean zero. The $f_j$ represent unspecified smooth functions of a single $X_j$. Given the flexibility in the $f_j$, we typically do not have a unique solution: $\alpha$ is left unidentifiable, as one can add any constant to any of the $f_j$ and subtract this value from $\alpha$. It is common to rectify this by constraining

$$E[f_j(X_j)] = 0$$ for all $j$,

leaving

$$\alpha = E[Y]$$

necessarily.
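
To see the lack of identifiability concretely, note that for any constants $c_1, \dots, c_p$,

$$\alpha + \sum_{j=1}^{p} f_j(X_j) = \left(\alpha - \sum_{j=1}^{p} c_j\right) + \sum_{j=1}^{p}\left(f_j(X_j) + c_j\right),$$

so the same regression function admits infinitely many decompositions; the mean-zero constraint on each $f_j$ removes this freedom and pins down $\alpha$.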

The backfitting algorithm is then:

   Initialize $\hat{\alpha} = \frac{1}{N}\sum_{i=1}^{N} y_i$, $\hat{f}_j \equiv 0$ for all $j$
   Do until the $\hat{f}_j$ converge:
       For each predictor $j$:
           (a) $\hat{f}_j \leftarrow \text{Smooth}\left[\left\{ y_i - \hat{\alpha} - \sum_{k \neq j} \hat{f}_k(x_{ik}) \right\}_{i=1}^{N}\right]$ (backfitting step)
           (b) $\hat{f}_j \leftarrow \hat{f}_j - \frac{1}{N}\sum_{i=1}^{N} \hat{f}_j(x_{ij})$ (mean centering of estimated function)

where $\text{Smooth}$ is our smoothing operator. This is typically chosen to be a cubic spline smoother but can be any other appropriate fitting operation, such as:

  • local polynomial regression,
  • kernel smoothing methods,
  • more complex operators, such as surface smoothers for second and higher-order interactions.

In theory, step (b) in the algorithm is not needed, as the function estimates are constrained to sum to zero. However, in practice numerical error can cause the estimates to drift away from this constraint, so the centering step is retained.[1]
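
To make the loop concrete, here is a minimal NumPy sketch of plain backfitting. It is illustrative rather than a reference implementation: the kernel smoother `smooth`, the function name `backfit`, and the stopping parameters are assumptions standing in for whatever smoother and convergence rule one would actually use (for example, a cubic spline smoother).

   import numpy as np

   def smooth(x, residual, bandwidth=0.3):
       """Stand-in smoother: Nadaraya-Watson kernel regression of the
       residuals on one predictor (a real GAM would typically use a
       cubic spline smoother here)."""
       w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
       return (w @ residual) / w.sum(axis=1)

   def backfit(X, y, max_iter=100, tol=1e-6):
       """Plain backfitting for y = alpha + sum_j f_j(X_j) + noise."""
       n, p = X.shape
       alpha = y.mean()                 # initialize alpha-hat as the sample mean
       f = np.zeros((n, p))             # f[:, j] holds f_j evaluated at the data
       for _ in range(max_iter):
           f_old = f.copy()
           for j in range(p):
               # (a) smooth the partial residuals against predictor j
               partial = y - alpha - f[:, np.arange(p) != j].sum(axis=1)
               f[:, j] = smooth(X[:, j], partial)
               # (b) re-center so the estimated function averages to zero
               f[:, j] -= f[:, j].mean()
           if np.max(np.abs(f - f_old)) < tol:
               break
       return alpha, f

   # toy usage: two additive effects plus noise
   rng = np.random.default_rng(0)
   X = rng.uniform(-1, 1, size=(200, 2))
   y = 1.0 + np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
   alpha_hat, f_hat = backfit(X, y)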

Motivation

If we consider the problem of minimizing the expected squared error:

$$\min E\left[Y - \left(\alpha + \sum_{j=1}^{p} f_j(X_j)\right)\right]^2$$

There exists a unique solution by the theory of projections given by:

$$f_i(X_i) = E\left[Y - \left(\alpha + \sum_{j \neq i} f_j(X_j)\right) \,\middle|\, X_i\right]$$

for i = 1, 2, ..., p.

This gives the matrix interpretation:

$$\begin{pmatrix} I & P_1 & \cdots & P_1 \\ P_2 & I & \cdots & P_2 \\ \vdots & & \ddots & \vdots \\ P_p & P_p & \cdots & I \end{pmatrix} \begin{pmatrix} f_1(X_1) \\ f_2(X_2) \\ \vdots \\ f_p(X_p) \end{pmatrix} = \begin{pmatrix} P_1 Y \\ P_2 Y \\ \vdots \\ P_p Y \end{pmatrix}$$

where $P_i(\cdot) = E(\cdot \mid X_i)$. In this context we can imagine a smoother matrix, $S_i$, which approximates our $P_i$ and gives an estimate, $S_i y$, of $E(Y \mid X_i)$:

$$\begin{pmatrix} I & S_1 & \cdots & S_1 \\ S_2 & I & \cdots & S_2 \\ \vdots & & \ddots & \vdots \\ S_p & S_p & \cdots & I \end{pmatrix} \begin{pmatrix} f_1 \\ f_2 \\ \vdots \\ f_p \end{pmatrix} = \begin{pmatrix} S_1 y \\ S_2 y \\ \vdots \\ S_p y \end{pmatrix}$$

or, in abbreviated form,

$$\hat{S} f = Q y,$$

where $\hat{S}$ denotes the block matrix on the left-hand side above and $Q y = (S_1 y, \dots, S_p y)^{T}$ the stacked right-hand side.

An exact solution of this system is infeasible to calculate for large $np$, so the iterative technique of backfitting is used. We take initial guesses $\hat{f}_i^{(0)}$ and update each $\hat{f}_i$ in turn to be the smoothed fit for the residuals of all the others:

$$\hat{f}_i \leftarrow \text{Smooth}\left[\left\{ y_k - \hat{\alpha} - \sum_{j \neq i} \hat{f}_j(x_{kj}) \right\}_{k=1}^{N}\right]$$

Looking at the abbreviated form, it is easy to see the backfitting algorithm as equivalent to the Gauss–Seidel method for linear smoothing operators $S_i$: each pass updates one block of $f$ at a time using the most recent values of the other blocks.
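
To illustrate the equivalence, the sketch below builds the block system explicitly for small data, using a kernel-ridge-style hat matrix as a stand-in for each $S_i$ (an assumption; names such as `smoother_matrix` are illustrative), and confirms that cycling through the backfitting updates reproduces the direct solution of the linear system.

   import numpy as np

   def smoother_matrix(x, penalty=1.0):
       """A simple symmetric linear smoother (hat matrix), standing in for
       a spline smoother; its eigenvalues lie in [0, 1)."""
       K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
       return K @ np.linalg.inv(K + penalty * np.eye(len(x)))

   rng = np.random.default_rng(1)
   n, p = 50, 3
   X = rng.normal(size=(n, p))
   y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)
   y = y - y.mean()                      # centered response, so alpha = 0
   S = [smoother_matrix(X[:, j]) for j in range(p)]

   # stacked system: row i reads  f_i + S_i * sum_{k != i} f_k = S_i y
   M = np.block([[np.eye(n) if i == j else S[i] for j in range(p)] for i in range(p)])
   rhs = np.concatenate([S[i] @ y for i in range(p)])
   f_direct = np.linalg.solve(M, rhs).reshape(p, n)

   # backfitting = block Gauss-Seidel sweeps on the same system
   f = np.zeros((p, n))
   for _ in range(200):
       for i in range(p):
           f[i] = S[i] @ (y - f[np.arange(p) != i].sum(axis=0))

   print(np.max(np.abs(f - f_direct)))   # small: the two solutions agree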

Explicit derivation for two dimensions

Following [2], we can formulate the backfitting algorithm explicitly for the two-dimensional case. We have:

$$f_1 = S_1(Y - f_2), \qquad f_2 = S_2(Y - f_1).$$

If we denote $\hat{f}_1^{(i)}$ as the estimate of $f_1$ in the $i$th updating step, the backfitting steps are

$$\hat{f}_1^{(i)} = S_1\left[Y - \hat{f}_2^{(i-1)}\right], \qquad \hat{f}_2^{(i)} = S_2\left[Y - \hat{f}_1^{(i)}\right].$$

By induction we get

$$\hat{f}_1^{(i)} = Y - \sum_{\alpha=0}^{i-1}(S_1 S_2)^{\alpha}(I - S_1)Y - (S_1 S_2)^{i-1} S_1 \hat{f}_2^{(0)}$$

and

$$\hat{f}_2^{(i)} = S_2\left[Y - \hat{f}_1^{(i)}\right] = S_2\left[\sum_{\alpha=0}^{i-1}(S_1 S_2)^{\alpha}(I - S_1)Y + (S_1 S_2)^{i-1} S_1 \hat{f}_2^{(0)}\right].$$

If we set $\hat{f}_2^{(0)} = 0$ then we get

$$\hat{f}_1^{(i)} = \left[I - \sum_{\alpha=0}^{i-1}(S_1 S_2)^{\alpha}(I - S_1)\right] Y$$
$$\hat{f}_2^{(i)} = S_2\left[\sum_{\alpha=0}^{i-1}(S_1 S_2)^{\alpha}(I - S_1)\right] Y,$$

where we have obtained $\hat{f}_2^{(i)}$ by plugging $\hat{f}_1^{(i)}$ directly into $\hat{f}_2^{(i)} = S_2\left[Y - \hat{f}_1^{(i)}\right]$.

We have convergence if $\left\| S_1 S_2 \right\| < 1$. In this case, letting $i \to \infty$:

$$\hat{f}_1^{(\infty)} = \left[I - (I - S_1 S_2)^{-1}(I - S_1)\right] Y$$
$$\hat{f}_2^{(\infty)} = S_2 (I - S_1 S_2)^{-1}(I - S_1) Y.$$

We can check that these are a solution to the problem, i.e. that $\hat{f}_1^{(i)}$ and $\hat{f}_2^{(i)}$ converge to $f_1$ and $f_2$ respectively, by plugging these expressions into the original equations.
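
These limit expressions are also easy to verify numerically. The sketch below uses arbitrary matrices scaled so that $\left\| S_1 S_2 \right\| < 1$ as stand-ins for the smoothers (an assumption made purely for illustration) and checks that the closed forms satisfy the fixed-point equations and match many explicit backfitting sweeps.

   import numpy as np

   rng = np.random.default_rng(2)
   n = 6
   Y = rng.normal(size=n)
   I = np.eye(n)

   # arbitrary "smoothers", scaled so that ||S1 S2|| < 1 guarantees convergence
   S1 = rng.normal(size=(n, n)); S1 /= 1.5 * np.linalg.norm(S1, 2)
   S2 = rng.normal(size=(n, n)); S2 /= 1.5 * np.linalg.norm(S2, 2)

   # closed-form limits, using sum_{a>=0} (S1 S2)^a = (I - S1 S2)^{-1}
   f1_inf = (I - np.linalg.inv(I - S1 @ S2) @ (I - S1)) @ Y
   f2_inf = S2 @ np.linalg.inv(I - S1 @ S2) @ (I - S1) @ Y

   # they satisfy the fixed-point equations f1 = S1(Y - f2), f2 = S2(Y - f1)
   print(np.allclose(f1_inf, S1 @ (Y - f2_inf)))              # True
   print(np.allclose(f2_inf, S2 @ (Y - f1_inf)))              # True

   # and they match explicit backfitting iterations started from f2 = 0
   f1, f2 = np.zeros(n), np.zeros(n)
   for _ in range(500):
       f1 = S1 @ (Y - f2)
       f2 = S2 @ (Y - f1)
   print(np.allclose(f1, f1_inf), np.allclose(f2, f2_inf))    # True True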

Issues

The choice of when to stop the algorithm is arbitrary, and it is hard to know a priori how long it will take to reach a specific convergence threshold. Also, the final model depends on the order in which the predictor variables $X_j$ are fit.

As well, the solution found by the backfitting procedure is non-unique. If $b$ is a vector such that $\hat{S} b = 0$ from above, then if $\hat{f}$ is a solution, $\hat{f} + \alpha b$ is also a solution for any $\alpha \in \mathbb{R}$. A modification of the backfitting algorithm involving projections onto the eigenspaces of the $S_i$ can remedy this problem.

Modified algorithm

We can modify the backfitting algorithm to make it easier to provide a unique solution. Let $\mathcal{V}_1(S_i)$ be the space spanned by all the eigenvectors of $S_i$ that correspond to eigenvalue 1. Then any $b$ satisfying $\hat{S} b = 0$ has $b_i \in \mathcal{V}_1(S_i)$ for all $i = 1, \dots, p$ and $b_1 + b_2 + \dots + b_p = 0$. Now if we take $A$ to be a matrix that projects orthogonally onto $\mathcal{V}_1(S_1) + \dots + \mathcal{V}_1(S_p)$, we get the following modified backfitting algorithm:

   Initialize $\hat{\alpha} = \frac{1}{N}\sum_{i=1}^{N} y_i$, $\hat{f}_j \equiv 0$ for all $j$, $\hat{f}_{+} = \hat{\alpha} + \hat{f}_1 + \dots + \hat{f}_p$
   Do until the $\hat{f}_j$ converge:
       Regress $y - \hat{f}_{+}$ onto the space $\mathcal{V}_1(S_1) + \dots + \mathcal{V}_1(S_p)$, setting $a = A(y - \hat{f}_{+})$
       For each predictor $j$:
           Apply the backfitting update to $(y - a)$ using the smoothing operator $(I - A) S_j$, yielding new estimates for $\hat{f}_j$
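
A minimal sketch of the new ingredient, the orthogonal projector $A$ onto $\mathcal{V}_1(S_1) + \dots + \mathcal{V}_1(S_p)$, is shown below. It assumes symmetric smoother matrices so that an orthogonal eigendecomposition exists; the helper name `eigenvalue_one_projector` and the tolerance are illustrative.

   import numpy as np

   def eigenvalue_one_projector(smoothers, tol=1e-10):
       """Orthogonal projector onto V_1(S_1) + ... + V_1(S_p): the span of
       all eigenvectors of the (symmetric) S_i with eigenvalue 1."""
       n = smoothers[0].shape[0]
       basis = []
       for S in smoothers:
           eigvals, eigvecs = np.linalg.eigh(S)             # S assumed symmetric
           basis.append(eigvecs[:, np.abs(eigvals - 1.0) < tol])
       B = np.hstack(basis)
       if B.shape[1] == 0:
           return np.zeros((n, n))                          # no eigenvalue-1 directions
       U, s, _ = np.linalg.svd(B, full_matrices=False)      # orthonormal basis of the span
       U = U[:, s > tol]
       return U @ U.T

   # example: the mean smoother reproduces constants, so constants lie in V_1
   n = 5
   S_mean = np.full((n, n), 1.0 / n)     # symmetric, idempotent, eigenvalue 1 on constants
   A = eigenvalue_one_projector([S_mean])
   print(np.allclose(A @ np.ones(n), np.ones(n)))           # True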

References

  1. ^ Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer. ISBN 0-387-95284-5.
  2. ^ Härdle, Wolfgang; et al. (June 9, 2004). "Backfitting". Archived from the original on 2015-05-10. Retrieved 2015-08-19.
  • Breiman, L.; Friedman, J. H. (1985). "Estimating optimal transformations for multiple regression and correlation (with discussion)". Journal of the American Statistical Association. 80 (391): 580–619. doi:10.2307/2288473. JSTOR 2288473.
  • Hastie, T. J.; Tibshirani, R. J. (1990). Generalized Additive Models. Monographs on Statistics and Applied Probability. 43.
