    A Parallel Iterative Method for Solving Nonlinear Least-Squares Problems
    Citations: 0 | References: 5 | Related papers: 10
    Abstract:
    In this paper, we are concerned with a parallel iterative method for nonlinear least-squares problems, which can be viewed as an alternative ordering of the variables in the Jacobi method: the variables are divided into non-disjoint (overlapping) groups, and the corresponding subproblems are handled concurrently. Furthermore, a parallel algorithm is proposed and run on an HP SPP-1600. The numerical results show that the algorithm achieves good speedup and parallel efficiency.
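    The abstract does not spell out the grouping or the update rule. A minimal Python sketch of the general idea follows, assuming a Jacobi-style sweep in which each overlapping group solves its own small subproblem (via scipy.optimize.least_squares) with the remaining variables frozen, and overlapping proposals are averaged; the model, groups, and averaging rule are illustrative assumptions, not the paper's scheme.

        # Hedged sketch of a block-Jacobi-style parallel iteration for nonlinear
        # least squares: minimize ||r(x)||^2 by solving small subproblems over
        # overlapping ("non-disjoint") groups of variables in parallel, then
        # averaging the overlapping updates. Grouping and combination rules are
        # illustrative assumptions, not the paper's exact scheme.
        import numpy as np
        from concurrent.futures import ThreadPoolExecutor
        from scipy.optimize import least_squares

        def parallel_block_iteration(residual, x, groups, workers=4):
            """One Jacobi-style sweep: every group refines its own variables
            while all other variables stay frozen at the current iterate."""
            def solve_group(group):
                def sub_residual(z):
                    y = x.copy()
                    y[group] = z              # only this group's variables move
                    return residual(y)
                sol = least_squares(sub_residual, x[group])
                return group, sol.x

            proposals = np.zeros_like(x)
            counts = np.zeros(len(x))
            with ThreadPoolExecutor(max_workers=workers) as pool:
                for group, z in pool.map(solve_group, groups):
                    proposals[group] += z     # accumulate overlapping proposals
                    counts[group] += 1
            return np.where(counts > 0, proposals / np.maximum(counts, 1), x)

        # Usage: fit a two-exponential model; variables split into overlapping groups.
        t = np.linspace(0, 1, 50)
        data = 2.0 * np.exp(-1.5 * t) + 0.5 * np.exp(-4.0 * t)
        def residual(p):
            return p[0] * np.exp(p[1] * t) + p[2] * np.exp(p[3] * t) - data

        x = np.array([1.0, -1.0, 1.0, -1.0])
        groups = [np.array([0, 1, 2]), np.array([1, 2, 3])]   # non-disjoint groups
        for _ in range(20):
            x = parallel_block_iteration(residual, x, groups)

    Because every group reads the same frozen iterate and writes only its own proposal, the group solves are independent, which is what makes the Jacobi ordering parallelizable.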
    Keywords:
    Speedup
    Disjoint sets
    Least-squares function approximation
    In this paper, we define the QLY total order ≤_Q over D^m to compare the magnitudes of dual vectors. We then consider the QLY least-squares problem and give a compact formula for its solution. By comparing with the least-squares and minimal-norm least-squares solutions, we investigate the QLY least-squares and QLY minimal-norm least-squares solutions of linear dual least-squares problems. In particular, when a least-squares solution exists, the QLY least-squares solution is more accurate than the least-squares solution under the QLY total order.
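    The abstract does not define the order itself. As a hedged sketch, assuming QLY abbreviates the Qi-Ling-Yan total order on dual numbers p = p_s + p_i·ε (compare standard parts first, then infinitesimal parts), the order and the induced problem can be written in LaTeX as:

        % Hedged sketch: the lexicographic total order on dual numbers,
        % assumed here to be what "QLY" refers to, extended to dual vectors.
        \[
          p \le_Q q \iff p_s < q_s \ \text{or}\ \bigl(p_s = q_s \ \text{and}\ p_i \le q_i\bigr),
        \]
        % and the linear dual least-squares problem over D^m seeks
        \[
          \min_{x \in \mathbb{D}^m} \lVert Ax - b \rVert,
        \]
        % with the minimum taken with respect to \le_Q applied to the
        % dual-valued norm.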
    Least-squares function approximation
    In addition to the three existing nonlinear least-squares algorithms (the Gauss-Newton method, the damped least-squares method, and the quasi-Newton method for least squares), a stronger algorithm, SQPM (the Sequential Quadratic Programming Method), one of the most powerful algorithms of nonlinear programming, is applied. The step-length policy of SQPM is improved in order to accelerate iterative convergence. The improved SQPM is a useful and effective algorithm for solving parameter problems by nonlinear least-squares adjustment without requiring exact initial approximations of the parameters.
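    The improved step-length policy itself is not given in the abstract. As a hedged Python illustration of the general approach, an off-the-shelf SQP-type solver (SciPy's SLSQP) can minimize the sum of squared residuals directly; the exponential model, data, and deliberately rough starting point below are illustrative assumptions:

        # Hedged sketch: treating nonlinear least squares as a general nonlinear
        # program and handing it to an SQP-type solver (SciPy's SLSQP). The model,
        # data, and starting point are illustrative, not from the paper.
        import numpy as np
        from scipy.optimize import minimize

        t = np.linspace(0.0, 2.0, 40)
        y = 3.0 * np.exp(-1.2 * t) + 0.1 * np.sin(5.0 * t)   # synthetic observations

        def sum_of_squares(p):
            a, k = p
            r = a * np.exp(k * t) - y          # residual vector
            return float(r @ r)                # SQP minimizes the scalar objective

        # A deliberately rough starting point: an SQP method with a good
        # step-length (line-search) policy can still converge from far away.
        result = minimize(sum_of_squares, x0=np.array([1.0, -0.1]), method="SLSQP")
        print(result.x)                        # fitted (a, k)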
    Least-squares function approximation
    Sequential quadratic programming
    Citations (0)
    Abstract: The sections in this article are: Preliminaries; More Basic Concepts; Basic Linear-Algebra Concepts; Least-Squares Curve Fitting; The General Linear Least-Squares Problem; Solving the Linear Least-Squares Problem; Weighted Least Squares; Polynomial Fitting and Spline Interpolation; Nonlinear Least Squares; Sequential Least Squares; Predictive Least Squares; The Bootstrap Method.
    Least-squares function approximation
    Interpolation
    Citations (5)
    Least squares is by far the simplest and most commonly applied computational method in many fields. In almost all applications, the least squares objective is rarely the true objective. We account for this discrepancy by parametrizing the least squares problem and automatically adjusting these parameters using an optimization algorithm. We apply our method, which we call least squares auto-tuning, to data fitting.
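    As a hedged Python sketch of the idea, the inner problem below is a ridge-regularized least-squares solve and the outer "auto-tuning" loop adjusts the regularization weight against a held-out validation loss; the paper differentiates through the solve, whereas this sketch grid-searches the single parameter, and all data here are synthetic:

        # Hedged sketch of "least squares auto-tuning": wrap a regularized least
        # squares solve in an outer loop that adjusts the regularization weight
        # to minimize the true objective (here, held-out prediction error). The
        # paper tunes by gradient; grid search is a simplifying assumption.
        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(80, 10))
        x_true = rng.normal(size=10)
        b = A @ x_true + 0.3 * rng.normal(size=80)
        A_tr, b_tr, A_va, b_va = A[:60], b[:60], A[60:], b[60:]

        def ridge_solve(lam):
            """Inner problem: minimize ||A_tr x - b_tr||^2 + lam * ||x||^2."""
            n = A_tr.shape[1]
            return np.linalg.solve(A_tr.T @ A_tr + lam * np.eye(n), A_tr.T @ b_tr)

        # Outer problem: pick the parameter that minimizes the *true* objective,
        # the validation error, rather than the training residual itself.
        lams = np.logspace(-4, 2, 25)
        val_err = [np.linalg.norm(A_va @ ridge_solve(l) - b_va) for l in lams]
        best = lams[int(np.argmin(val_err))]
        print(f"auto-tuned lambda = {best:.4g}")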
    Least-squares function approximation
    Citations (4)
    Abstract: Least-squares problems arise when one attempts to fit a model y = η(x, β) to points (y_1, x_1), ..., (y_n, x_n). Solutions to such problems are obtained by optimizing the sum of squared deviations over an admissible region. This paper discusses the basic theory of optimization for a general objective function and applies this material to both the linear and nonlinear least-squares problems. In linear least squares, normal equations for both the full-rank and rank-deficient cases are considered, and the Kuhn-Tucker conditions are used to obtain the normal equations under linear inequality constraints. In nonlinear least squares, the iterative procedures that may be used to obtain a solution are discussed: steepest descent, Newton-Raphson, Gauss-Newton, Hartley's modified Gauss-Newton, and Marquardt's method. Results are obtained which relate Marquardt's method to equality-constrained least squares.
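    As a hedged Python sketch of the surveyed iterative procedures, the step below is the Levenberg-Marquardt update, which reduces to Gauss-Newton when the damping mu is zero; the exponential model, Jacobian, and data are illustrative, not from the paper:

        # Hedged sketch of one surveyed iterative procedure: a minimal
        # Levenberg-Marquardt step (Gauss-Newton when mu = 0). The model,
        # Jacobian, and data are illustrative textbook choices.
        import numpy as np

        def lm_step(residual, jacobian, beta, mu):
            """Solve (J^T J + mu I) delta = -J^T r for the parameter update."""
            r = residual(beta)
            J = jacobian(beta)
            n = len(beta)
            delta = np.linalg.solve(J.T @ J + mu * np.eye(n), -J.T @ r)
            return beta + delta

        # Fit y = beta0 * exp(beta1 * x) to clean synthetic data.
        x = np.linspace(0, 1, 30)
        y = 2.0 * np.exp(-3.0 * x)
        residual = lambda b: b[0] * np.exp(b[1] * x) - y
        jacobian = lambda b: np.column_stack([np.exp(b[1] * x),
                                              b[0] * x * np.exp(b[1] * x)])

        beta = np.array([1.0, -1.0])
        for _ in range(15):
            beta = lm_step(residual, jacobian, beta, mu=1e-3)
        print(beta)   # should approach [2, -3]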
    Least-squares function approximation
    Rank (graph theory)
    Levenberg–Marquardt algorithm
    Citations (0)
    Least-squares function approximation
    Levenberg–Marquardt algorithm
    Nonlinear least-squares estimation has been widely applied in positioning; however, such estimates are generally biased. As the Gauss-Newton method is widely used to obtain a nonlinear least-squares solution, we propose an iterative procedure for obtaining unbiased estimates with this method. The characteristics of the linearization error are discussed, and a systematic error source in the linearization error must be removed to guarantee unbiasedness. Both the geometrical and the statistical conditions for unbiased nonlinear least-squares estimation are derived. It is shown that for long-distance observations of high precision, or for a positioning configuration with the lowest Geometric Dilution Of Precision (GDOP), the nonlinear least-squares estimates tend to be unbiased; for short-distance cases, however, the bias in the nonlinear least-squares solution should be estimated and removed to obtain unbiased values. The proposed results are verified by the Monte Carlo method, which shows that the bias in the nonlinear least-squares solution cannot be ignored in short-distance cases.
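    As a hedged Python sketch, the code below runs Gauss-Newton on a range-based positioning problem and uses a Monte Carlo loop to expose the bias of the estimate; the anchor layout, noise level, and trial count are illustrative assumptions, not the paper's setup:

        # Hedged sketch: Gauss-Newton range-based positioning plus a Monte Carlo
        # estimate of the solution bias. Anchor layout, noise level, and trial
        # count are illustrative assumptions, not the paper's configuration.
        import numpy as np

        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        p_true = np.array([3.0, 4.0])
        sigma = 0.5                                  # range-noise std (short range)

        def gauss_newton(ranges, p0, iters=10):
            p = p0.copy()
            for _ in range(iters):
                d = np.linalg.norm(anchors - p, axis=1)
                r = d - ranges                       # range residuals
                J = (p - anchors) / d[:, None]       # d(range)/d(position)
                p = p - np.linalg.solve(J.T @ J, J.T @ r)
            return p

        rng = np.random.default_rng(1)
        true_ranges = np.linalg.norm(anchors - p_true, axis=1)
        # Monte Carlo: the mean estimate minus the truth exposes the bias.
        est = np.array([gauss_newton(true_ranges + sigma * rng.normal(size=4),
                                     np.array([5.0, 5.0])) for _ in range(2000)])
        bias = est.mean(axis=0) - p_true
        print(bias)   # nonzero at short range, consistent with the claim above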
    Least-squares function approximation
    Linearization
    Citations (6)