Non-linear least squares
==Definition==

'''Non-linear least squares''' ('''NLLS''') is a generalized problem type related to the problem of [[linear least squares]]. It occurs frequently in the context of optimization problems.
Consider the following setup: we have a model function <math>y = f(x,\vec{\beta})</math> (here, <math>x</math> may be a scalar or vector variable, but <math>y</math> must be scalar; for simplicity, we will notationally treat <math>x</math> as a scalar). The vector <math>\vec{\beta}</math> is an unknown parameter vector with <math>n</math> coordinates <math>\beta_1, \beta_2, \dots, \beta_n</math>. We are given a set of <math>m</math> data points <math>(x_1,y_1), (x_2,y_2), \dots, (x_m,y_m)</math> with <math>m \ge n</math>.
For <math>1 \le i \le m</math>, we define the residual <math>r_i</math> as follows:

<math>r_i = y_i - f(x_i,\vec{\beta})</math>

Our goal is to find a choice of the parameter vector <math>\vec{\beta}</math> that minimizes the sum of squared residuals:

<math>\sum_{i=1}^m r_i^2</math>

In other words, we want to minimize the sum:

<math>\sum_{i=1}^m (y_i - f(x_i,\vec{\beta}))^2</math>
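To make the definition concrete, here is a minimal sketch in Python using SciPy's <code>least_squares</code> routine. The exponential model, the synthetic data, and the starting guess are illustrative assumptions, not part of the definition above.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import least_squares

# Illustrative model f(x, beta) = beta_1 * exp(beta_2 * x);
# it is non-linear in beta_2, so this is an NLLS problem.
def model(beta, x):
    return beta[0] * np.exp(beta[1] * x)

# Residuals r_i = y_i - f(x_i, beta), matching the definition above.
def residuals(beta, x, y):
    return y - model(beta, x)

# Synthetic data: m = 50 points from beta = (2, -1) plus noise (assumed values).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 3.0, 50)
y = 2.0 * np.exp(-1.0 * x) + 0.05 * rng.standard_normal(x.size)

# Numerically minimize sum_i r_i^2 over beta from an initial guess.
result = least_squares(residuals, x0=[1.0, 0.0], args=(x, y))
print(result.x)  # estimated (beta_1, beta_2), close to (2, -1)
</syntaxhighlight>

Unlike the linear case, there is in general no closed-form solution, so routines such as <code>least_squares</code> minimize <math>\sum_{i=1}^m r_i^2</math> iteratively from a starting guess.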
===How linear least squares is a special case===

The case of [[linear least squares]] is the case where the function <math>f(x,\vec{\beta})</math> is linear as a function of the vector <math>\vec{\beta}</math> for each value of <math>x</math>. It need not be linear in <math>x</math>.
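For instance, <math>f(x,\vec{\beta}) = \beta_1 + \beta_2 x^2</math> is linear in <math>\vec{\beta}</math>, and hence a linear least squares problem, even though it is quadratic in <math>x</math>; by contrast, <math>f(x,\vec{\beta}) = \beta_1 e^{\beta_2 x}</math> is non-linear in <math>\beta_2</math> and gives a genuinely non-linear least squares problem.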