Newton with Line Search Algorithm

{{CommandManualMenu}}


This command is used to construct a NewtonLineSearch algorithm object which introduces line search to the [[Newton_Algorithm | Newton-Raphson]] algorithm to solve the nonlinear residual equation. Line search increases the effectiveness of the Newton method when convergence is slow due to roughness of the residual.
The command is of the following form:

 algorithm NewtonLineSearch <-type $typeSearch> <-tol $tol> <-maxIter $maxIter> <-minEta $minEta> <-maxEta $maxEta>
{|
| '''$typeSearch''' || line search algorithm. optional, default = InitialInterpolated. valid types are:
|-
| || Bisection, Secant, RegulaFalsi, InitialInterpolated
|-
| '''$tol''' || tolerance for search. optional, default = 0.8
|-
| '''$maxIter''' || max number of iterations to try. optional, default = 10
|-
| '''$minEta''' || a min <math>\eta\!</math> value. optional, default = 0.1
|-
| '''$maxEta''' || a max <math>\eta\!</math> value. optional, default = 10.0
|}
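
For example, the following snippet (a minimal sketch; the model, constraint handler, numberer, system, integrator, and convergence test are assumed to be defined elsewhere in the script) selects the Bisection search with a tighter tolerance:

 algorithm NewtonLineSearch -type Bisection -tol 0.5 -maxIter 10
 analysis Static
 analyze 10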


----
NOTES:




REFERENCES:


M.A. Crisfield, "Nonlinear Finite Element Analysis of Solids and Structures, Volume 1: Essentials", Wiley, 1991.


----
THEORY:


The rationale behind line search is that:
* the direction <math>\Delta U\,\!</math> found by the [[Newton_Algorithm | Newton-Raphson method]] is often a good direction, but the step size <math>\parallel\Delta U\parallel</math> is not.
* it is cheaper to compute the residual for several points along <math>\Delta U\,\!</math> than to form and factor a new system Jacobian.


In NewtonLineSearch the regular [[Newton_Algorithm | Newton-Raphson method]] is used to compute the <math>\Delta U\,\!</math>, but the update that is used is modified. The modified update is:


:<math> U_{n+1} = U_n + \eta \Delta U\,\!</math>




The different line search algorithms use different root finding methods to obtain <math>\eta\,\!</math>, a root to the function <math>s(\eta)</math> defined as:
 
:<math> s(\eta) = \Delta U R(U_{n} + \eta \Delta U)\,\!</math>

with

:<math> s_0 = \Delta U R(U_n)\,\!</math>

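The product <math>\Delta U R(\cdot)\,\!</math> above denotes the scalar (dot) product of the increment vector with the residual vector. A minimal Tcl sketch of that evaluation, assuming the two vectors are available as plain Tcl lists (the lists and the procedure name are illustrative, not part of the OpenSees API):

 # illustrative helper: returns dU . R, the scalar s used by the line searches
 proc dotProduct {dU R} {
     set s 0.0
     foreach u $dU r $R {
         set s [expr {$s + $u * $r}]
     }
     return $s
 }
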
== Interpolated Line Search: ==


while (<math>\frac{s_n}{s_0}\!</math> > $tol && count < $maxIter) {
:<math> \eta_{n+1} = \frac{\eta_n s_0}{s_0 - s_{n+1}}\,\!</math>
}
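
A minimal Tcl sketch of this loop, starting from the full Newton step <math>\eta = 1\,\!</math>; the command sEval, which evaluates <math>s(\eta)\,\!</math> against the actual model residual, is assumed and not part of the OpenSees API:

 # illustrative sketch: interpolated search for eta
 # sEval is an assumed command returning s(eta) = dU . R(Un + eta*dU)
 proc interpolatedSearch {s0 tol maxIter} {
     set eta 1.0
     set s $s0
     set count 0
     while {abs($s / $s0) > $tol && $count < $maxIter} {
         set eta [expr {$eta * $s0 / ($s0 - $s)}]
         set s [sEval $eta]
         incr count
     }
     return $eta
 }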


__NOTOC__


== RegulaFalsi Line Search: ==


while (<math>\frac{s_n}{s_0}\!</math> > $tol && count < $maxIter) {
:<math> \eta_{n+1} = \eta_U - \frac{s_U (\eta_L - \eta_U)}{s_L - s_U}\,\!</math>
:if <math> s_{n+1} s_L < 0 \Rightarrow \eta_U = \eta_{n+1},\ s_U = s_{n+1}\,\!</math>
:if <math> s_{n+1} s_U < 0 \Rightarrow \eta_L = \eta_{n+1},\ s_L = s_{n+1}\,\!</math>
}
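
One Regula Falsi step in the same illustrative Tcl setting; the caller maintains the bracket <math>[\eta_L, \eta_U]\,\!</math> and re-brackets using the sign tests above:

 # illustrative sketch: regula-falsi trial point from the current bracket
 proc regulaFalsiStep {etaL etaU sL sU} {
     return [expr {$etaU - $sU * ($etaL - $etaU) / ($sL - $sU)}]
 }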


== Bisection Line Search: ==




while (<math>\frac{s_n}{s_0}\!</math> > $tol && count < $maxIter) {
:<math> \eta_{n+1} = \frac{\eta_L + \eta_U}{2.0}\,\!</math>
:if <math> s_{n+1} s_L < 0 \Rightarrow \eta_U = \eta_{n+1},\ s_U = s_{n+1}\,\!</math>
:if <math> s_{n+1} s_U < 0 \Rightarrow \eta_L = \eta_{n+1},\ s_L = s_{n+1}\,\!</math>
}
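
The corresponding bisection step simply returns the bracket midpoint (note the sum, not the difference, in the numerator); an illustrative Tcl sketch:

 # illustrative sketch: bisection trial point is the bracket midpoint
 proc bisectionStep {etaL etaU} {
     return [expr {($etaL + $etaU) / 2.0}]
 }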


== Secant Line Search: ==


while (<math>\frac{s_n}{s_0}\!</math> > $tol && count < $maxIter) {
:<math> \eta_{n+1} = \eta_j - \frac{s_j (\eta_{j-1} - \eta_j)}{s_{j-1} - s_j}\,\!</math>
:if <math> s_{n+1} s_L < 0 \Rightarrow \eta_U = \eta_{n+1},\ s_U = s_{n+1}\,\!</math>
:if <math> s_{n+1} s_U < 0 \Rightarrow \eta_L = \eta_{n+1},\ s_L = s_{n+1}\,\!</math>
}
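
The secant step uses the two most recent trial points <math>\eta_{j-1}, \eta_j\,\!</math> rather than a maintained bracket; an illustrative Tcl sketch:

 # illustrative sketch: secant trial point from the last two iterates
 proc secantStep {etaPrev etaCur sPrev sCur} {
     return [expr {$etaCur - $sCur * ($etaPrev - $etaCur) / ($sPrev - $sCur)}]
 }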


----


Code Developed by: <span style="color:blue"> fmk </span>
