Phil - There is an abundance of methodology of this sort. The most commonly known is regression, in which the criterion of fit is the sum of squared "mistakes" (the difference between the value given by your function/equation and the value in the data is the mistake, called the "residual"). The method of least squares minimizes the sum of the squared residuals by choice of the unknown parameters in the equations. See almost any applied low-level statistics textbook (say, sophomore level).

I'm not certain where you are going with this. I have a PhD in the field of statistics and can steer you to reading if you want to learn about the large body of methodology that has been developed over many decades to handle such problems. There are a variety of different criteria (the pros and cons of these can be described), the associated algorithms, etc. The properties of the least squares estimators (ordinary regression) are those discussed in most elementary statistics textbooks. Under the acronym "GLM" you can find a broad variety of other methodologies, of which the above is a special case.

Let me know what is of interest... Bill P
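[A minimal sketch of the least-squares idea described above, using made-up data and NumPy's standard `lstsq` solver; the numbers and variable names are illustrative only, not from any real dataset:]

```python
import numpy as np

# Hypothetical data: five x values and noisy observations y
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix for the line y = b0 + b1*x (intercept column plus x)
X = np.column_stack([np.ones_like(x), x])

# Least squares chooses the parameters (b0, b1) that minimize
# the sum of squared residuals
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta

# Residuals: observed value minus the value given by the fitted equation
residuals = y - X @ beta
sse = np.sum(residuals ** 2)

print(f"intercept={intercept:.3f}, slope={slope:.3f}, SSE={sse:.4f}")
```

[Any other choice of intercept and slope would give a larger sum of squared residuals on these data; that is the defining property of the least-squares fit.]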