
Class K1b2a: Linearly constrained nonlinear least squares approximation

General Information

Parent Class: K1b2
Top of Tree: GAMS
Keywords: Linear constraints

Modules

Package IMSLM (Installed on ITL)

BCLSF
Solve a nonlinear least squares problem subject to bounds on the variables using a modified Levenberg-Marquardt algorithm and a finite-difference Jacobian.
BCLSJ
Solve a nonlinear least squares problem subject to bounds on the variables using a modified Levenberg-Marquardt algorithm and a user-supplied Jacobian.
BCNLS
Solve nonlinear least squares problems subject to bounds on the variables and general linear constraints. The Jacobian is approximated using finite differences. (Based on DQED by Hanson and Krogh.)
BCONF
Minimize a function of N variables subject to bounds on the variables using a quasi-Newton method and a finite-difference gradient.
DBCLSF
Solve a nonlinear least squares problem subject to bounds on the variables using a modified Levenberg-Marquardt algorithm and a finite-difference Jacobian.
DBCLSJ
Solve a nonlinear least squares problem subject to bounds on the variables using a modified Levenberg-Marquardt algorithm and a user-supplied Jacobian.
DBCNLS
Solve nonlinear least squares problems subject to bounds on the variables and general linear constraints. The Jacobian is approximated using finite differences. (Based on DQED by Hanson and Krogh.)
DBCONF
Minimize a function of N variables subject to bounds on the variables using a quasi-Newton method and a finite-difference gradient.
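The bound-constrained Levenberg-Marquardt approach behind BCLSF/DBCLSF can be sketched in a few lines. The sketch below is illustrative only, not IMSL code: it uses a forward-difference Jacobian and enforces the bounds by simple projection (the library routines use a more careful strategy), and the model, data, and tolerances are made up for the example.

```python
import numpy as np

def fd_jacobian(r, x, h=1e-7):
    """Forward-difference approximation to the Jacobian of r at x."""
    r0 = r(x)
    J = np.empty((r0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (r(xp) - r0) / h
    return J

def lm_bounded(r, x0, lo, hi, mu=1e-3, iters=50):
    """Levenberg-Marquardt with bounds enforced by projection (a simplification)."""
    x = np.clip(np.asarray(x0, float), lo, hi)
    for _ in range(iters):
        res = r(x)
        J = fd_jacobian(r, x)
        # Damped normal equations: (J^T J + mu I) step = -J^T res
        A = J.T @ J + mu * np.eye(x.size)
        step = np.linalg.solve(A, -J.T @ res)
        x_new = np.clip(x + step, lo, hi)
        if np.sum(r(x_new)**2) < np.sum(res**2):
            x, mu = x_new, mu * 0.5      # accept the step: reduce damping
        else:
            mu *= 10.0                   # reject the step: increase damping
    return x

# Fit y = a*exp(-b*t) with 0 <= a <= 10, 0 <= b <= 1 (synthetic, noise-free data).
t = np.linspace(0.0, 4.0, 20)
y = 2.0 * np.exp(-0.5 * t)
residual = lambda p: p[0] * np.exp(-p[1] * t) - y
p = lm_bounded(residual, [1.0, 0.1], lo=[0.0, 0.0], hi=[10.0, 1.0])
print(p)   # close to [2.0, 0.5]
```

The damping parameter mu interpolates between Gauss-Newton (mu small) and gradient descent (mu large), which is what makes the method robust far from the solution.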

Package NLR (Downloadable)

DGLFGB
Solve statistical parameter estimation problems for general nonlinear models with simple bounds on the parameters, e.g., nonlinear least squares, maximum likelihood, maximum quasi-likelihood, generalized nonlinear least squares, and some robust fitting problems.
SGLFGB
Solve statistical parameter estimation problems for general nonlinear models with simple bounds on the parameters, e.g., nonlinear least squares, maximum likelihood, maximum quasi-likelihood, generalized nonlinear least squares, and some robust fitting problems.

Package OPT (Downloadable)

DQED
Solve nonlinear least squares problems with linear constraints and simple bounds using a quadratic-tensor local model. Some problems with nonlinear constraints can also be solved. Derivatives are required; forward or reverse communication is supported. (See R.J. Hanson, SIAM J. Sci. Stat. Comput. 7 (1986), pp. 826-834.)
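To see how linear equality constraints interact with a least squares fit, one standard device (not necessarily DQED's internal quadratic-tensor algorithm) is elimination: parameterize the feasible set as x = x_p + Z v, where C x_p = d and the columns of Z span the null space of C, then fit over the unconstrained variable v. A minimal sketch with a hypothetical two-term exponential model whose coefficients must sum to one:

```python
import numpy as np

# Constrain x1 + x2 = 1 while fitting y = x1*exp(-t) + x2*exp(-2t).
C = np.array([[1.0, 1.0]])
d = np.array([1.0])

# A particular feasible point and a null-space basis of C (via SVD).
x_p, *_ = np.linalg.lstsq(C, d, rcond=None)
_, s, Vt = np.linalg.svd(C)
Z = Vt[len(s):].T            # columns span {z : C z = 0}

t = np.linspace(0.0, 3.0, 30)
y = 0.3 * np.exp(-t) + 0.7 * np.exp(-2.0 * t)   # synthetic, noise-free data

# This model is linear in x, so the reduced problem in v is an ordinary
# linear least squares problem: residual(v) = A (x_p + Z v) - y.
A = np.column_stack([np.exp(-t), np.exp(-2.0 * t)])
v, *_ = np.linalg.lstsq(A @ Z, y - A @ x_p, rcond=None)
x = x_p + Z @ v
print(x)   # close to [0.3, 0.7], with x.sum() = 1 to machine precision
```

Every candidate x = x_p + Z v satisfies C x = d by construction, so the constraint never needs to be checked during the fit; for a nonlinear model the same substitution yields an unconstrained nonlinear least squares problem in v.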

Package PORT (Downloadable)

DN2FB
Minimize a twice continuously differentiable nonlinear sum of squares using residual values only, subject to simple upper and lower bounds on the variables. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian are used to aid convergence from poor starting values.
DN2GB
Minimize a twice continuously differentiable nonlinear sum of squares using user-supplied residual values and their derivatives, subject to simple upper and lower bounds on the variables. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian are used to aid convergence from poor starting values.
DN2PB
Minimize a twice continuously differentiable nonlinear sum of squares using residual values only, subject to simple upper and lower bounds on the variables. The user supplies residuals in chunks to reduce storage. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian are used to aid convergence from poor starting values.
DNSFB
Solve a separable nonlinear least squares problem (i.e., one with both linear and nonlinear parameters) subject to simple upper and lower bound constraints on the nonlinear parameters. Derivatives are approximated using finite differences. Uses a variable-projection technique to eliminate the linear parameters, yielding a simpler nonlinear least squares problem.
DNSGB
Solve a separable nonlinear least squares problem (i.e., one with both linear and nonlinear parameters) subject to simple upper and lower bound constraints on the nonlinear parameters. The user supplies derivatives. Uses a variable-projection technique to eliminate the linear parameters, yielding a simpler nonlinear least squares problem.
DRN2GB
Reverse communication version of PORT's N2GB. Minimize a twice continuously differentiable nonlinear sum of squares using user-supplied residual values and derivatives, subject to simple upper and lower bounds. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian aid convergence from poor starting values.
DRNSGB
Reverse communication version of PORT's NSGB. Solve a separable nonlinear least squares problem (i.e., one with both linear and nonlinear parameters) subject to simple upper and lower bound constraints on the nonlinear parameters. The user supplies derivatives. Uses a variable-projection technique to eliminate the linear parameters, yielding a simpler nonlinear least squares problem.
N2FB
Minimize a twice continuously differentiable nonlinear sum of squares using residual values only, subject to simple upper and lower bounds on the variables. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian are used to aid convergence from poor starting values.
N2GB
Minimize a twice continuously differentiable nonlinear sum of squares using user-supplied residual values and their derivatives, subject to simple upper and lower bounds on the variables. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian are used to aid convergence from poor starting values.
N2PB
Minimize a twice continuously differentiable nonlinear sum of squares using residual values only, subject to simple upper and lower bounds on the variables. The user supplies residuals in chunks to reduce storage. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian are used to aid convergence from poor starting values.
NSFB
Solve a separable nonlinear least squares problem (i.e., one with both linear and nonlinear parameters) subject to simple upper and lower bound constraints on the nonlinear parameters. Derivatives are approximated using finite differences. Uses a variable-projection technique to eliminate the linear parameters, yielding a simpler nonlinear least squares problem.
NSGB
Solve a separable nonlinear least squares problem (i.e., one with both linear and nonlinear parameters) subject to simple upper and lower bound constraints on the nonlinear parameters. The user supplies derivatives. Uses a variable-projection technique to eliminate the linear parameters, yielding a simpler nonlinear least squares problem.
RN2GB
Reverse communication version of PORT's N2GB. Minimize a twice continuously differentiable nonlinear sum of squares using user-supplied residual values and derivatives, subject to simple upper and lower bounds. Uses a variant of Newton's method in which the Hessian is partly exact and partly approximated by a secant update. A model/trust-region approach and an adaptive Hessian aid convergence from poor starting values.
RNSGB
Reverse communication version of PORT's NSGB. Solve a separable nonlinear least squares problem (i.e., one with both linear and nonlinear parameters) subject to simple upper and lower bound constraints on the nonlinear parameters. The user supplies derivatives. Uses a variable-projection technique to eliminate the linear parameters, yielding a simpler nonlinear least squares problem.
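The variable-projection idea used by NSFB/NSGB (and their double-precision and reverse-communication variants) can be sketched directly: for fixed nonlinear parameters, the optimal linear parameters are the solution of a linear least squares problem, and substituting them back reduces the fit to the nonlinear parameters alone. The sketch below uses a hypothetical two-exponential model and a plain damped Gauss-Newton iteration with a finite-difference Jacobian; the PORT routines implement the same reduction with a more refined algorithm.

```python
import numpy as np

# Model: y(t) = c1*exp(-a1*t) + c2*exp(-a2*t); the c_j enter linearly,
# the decay rates a_j nonlinearly (synthetic, noise-free data).
t = np.linspace(0.0, 5.0, 40)
y = 1.5 * np.exp(-0.4 * t) + 0.5 * np.exp(-2.0 * t)

def projected_residual(a):
    """For fixed nonlinear parameters a, the best linear parameters solve
    a linear least squares problem; substituting them back leaves a
    residual depending on a alone (variable projection)."""
    Phi = np.exp(-np.outer(t, a))            # one basis column per rate
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return Phi @ c - y

def fd_jacobian(f, a, h=1e-6):
    """Forward-difference Jacobian of the projected residual."""
    r0 = f(a)
    J = np.empty((r0.size, a.size))
    for j in range(a.size):
        ap = a.copy()
        ap[j] += h
        J[:, j] = (f(ap) - r0) / h
    return J

# Damped Gauss-Newton on the reduced (nonlinear-parameters-only) problem.
a = np.array([0.3, 1.5])
for _ in range(50):
    r = projected_residual(a)
    J = fd_jacobian(projected_residual, a)
    step, *_ = np.linalg.lstsq(J, -r, rcond=None)
    while (np.sum(projected_residual(a + step)**2) > np.sum(r**2)
           and np.linalg.norm(step) > 1e-12):
        step *= 0.5                          # halve until the step descends
    a = a + step

Phi = np.exp(-np.outer(t, a))
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # recover the linear parameters
print(a, c)   # near a = [0.4, 2.0], c = [1.5, 0.5]
```

The payoff of the reduction is that the iteration runs over only the nonlinear parameters, which shrinks the search space and typically improves conditioning and convergence compared with fitting all parameters at once.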
Comments? gams@nist.gov