## How to evaluate the different fitting function for a set of data?

(OP)

I fitted a set of data (around 10 points) with several polynomial functions. All of them fit well, with similar correlation coefficients. I want to compare these functions and choose the best one.

Are there other parameters or methods that can be used to compare these fitting functions?

Thanks in advance.

## RE: How to evaluate the different fitting function for a set of data?

Unless you have some other theoretical basis, you should simply pick the simplest one.

TTFN

## RE: How to evaluate the different fitting function for a set of data?

corus

## RE: How to evaluate the different fitting function for a set of data?

As a general rule, a higher-order polynomial will fit the same curve as a simpler one, but with more bumps and wiggles that are not physically meaningful.
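To illustrate the point, here is a small sketch using numpy with made-up data: ten points on a straight line, one of them deliberately perturbed to mimic noise. A degree-1 fit barely moves, while a degree-8 fit chases the perturbed point and oscillates between the data points.

```python
import numpy as np

# Hypothetical data: 10 points on the line y = 2x + 1, with one
# point deliberately perturbed to mimic measurement noise.
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + 1.0
y[4] += 0.2

# Fit a low-order and a high-order polynomial to the same points.
low = np.polynomial.Polynomial.fit(x, y, deg=1)
high = np.polynomial.Polynomial.fit(x, y, deg=8)

# Measure how far each fit strays from the underlying line on a
# fine grid: the high-order fit develops bumps between the data
# points that the low-order fit does not have.
grid = np.linspace(0.0, 1.0, 200)
wiggle_low = np.ptp(low(grid) - (2.0 * grid + 1.0))
wiggle_high = np.ptp(high(grid) - (2.0 * grid + 1.0))
print(wiggle_low, wiggle_high)
```

Both fits report a high correlation with the data; only by looking between the points do the non-physical wiggles of the high-order fit show up.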

TTFN

## RE: How to evaluate the different fitting function for a set of data?

In nonlinear best-fit procedures, one criterion (among others) that has been widely adopted is the so-called Akaike information criterion (AIC), usually calculated as AIC = -2*log(likelihood) + 2*(number of parameters in the model). We select the model with the smallest AIC.
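For least-squares fits with (assumed) Gaussian errors, the log-likelihood term reduces to a function of the residual sum of squares, giving AIC = n*ln(RSS/n) + 2k up to an additive constant. A sketch with hypothetical data, comparing polynomial degrees:

```python
import numpy as np

def aic_ls(y, y_hat, n_params):
    """AIC for a least-squares fit, assuming Gaussian errors:
    AIC = n * ln(RSS / n) + 2 * k  (additive constants dropped)."""
    resid = y - y_hat
    rss = float(np.sum(resid ** 2))
    n = len(y)
    return n * np.log(rss / n) + 2 * n_params

# Hypothetical data: ~10 noisy points from a straight line.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 10)
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)

# Score several polynomial degrees; a degree-d fit has d+1 parameters.
scores = {}
for deg in (1, 2, 3, 4):
    p = np.polynomial.Polynomial.fit(x, y, deg=deg)
    scores[deg] = aic_ls(y, p(x), deg + 1)

best = min(scores, key=scores.get)
print(best, scores)
```

The 2k term is what penalizes the extra coefficients: a higher-degree polynomial always lowers RSS a little, but its AIC only improves if the reduction is large enough to pay for the added parameters.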

m777182

## RE: How to evaluate the different fitting function for a set of data?

With such a small number of data points, just remove a single point and recalculate the fit. If the curve fit is good, removing a single data point should not alter it greatly. You can try removing each of the data points in turn to see what effect each has on the fit returned. What you are really doing is determining how sensitive your fit is to the data used to create it.

An example of this would be a linear fit on a tight cluster of points with one outlier well outside the cluster. A linear fit on all the points gives an R squared close to 1, but if you then remove that outlier, the descriptors (slope and intercept) change substantially, as does the fit quality.

This method is quite good for testing a fit graphically, and it requires very little knowledge of statistics.
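The leave-one-out check described above can be sketched in a few lines of numpy. The data below are hypothetical (a tight linear cluster plus one outlier, as in the example): refitting with each point dropped in turn shows the slope barely moving for cluster points but jumping when the outlier is removed.

```python
import numpy as np

def loo_coefs(x, y, deg=1):
    """Refit after dropping each point in turn; return one row of
    polynomial coefficients (constant term first) per dropped point.
    A stable fit changes little when any single point is removed."""
    coefs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        p = np.polynomial.Polynomial.fit(x[mask], y[mask], deg=deg)
        coefs.append(p.convert().coef)  # coefficients in the plain basis
    return np.array(coefs)

# Hypothetical data: a tight linear cluster plus one outlier (last point).
x = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 5.0])
y = np.array([1.0, 1.2, 1.4, 1.6, 1.8, 20.0])

coefs = loo_coefs(x, y)
slopes = coefs[:, 1]
# Removing any cluster point leaves the slope near the full-data value,
# while removing the outlier changes it dramatically.
print(slopes)
```

Plotting the refitted lines over the data makes the sensitivity visible at a glance, which is the graphical test the post describes.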