Modified Newton Algorithm

{{CommandManualMenu}}


This command is used to construct a ModifiedNewton algorithm object, which uses the modified Newton-Raphson algorithm to solve the nonlinear residual equation. The command is of the following form:


{|  
| style="background:lightgreen; color:black; width:800px" | '''algorithm ModifiedNewton <-initial> '''
|}

{|
|  style="width:150px" | '''-initial''' || optional flag indicating that the initial stiffness should be used for the iterations.
|}
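A minimal sketch of how the command might appear inside an OpenSees Tcl analysis script is shown below; the surrounding commands are standard OpenSees analysis commands, but the particular solver choices and parameter values are illustrative only, not recommendations:

 # illustrative analysis setup -- solver choices and tolerances are examples only
 constraints Plain
 numberer RCM
 system BandGeneral
 test NormDispIncr 1.0e-6 25;         # allow more iterations than Newton would typically need
 algorithm ModifiedNewton;            # tangent is formed once at the start of each step
 # algorithm ModifiedNewton -initial; # alternative: iterate on the initial (undeformed) stiffness
 integrator LoadControl 0.1
 analysis Static
 analyze 10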


----

NOTES:




----
REFERENCES:
[http://en.wikipedia.org/wiki/Newton%27s_method Read the page at Wikipedia]


----
THEORY:


The theory for the ModifiedNewton method is similar to that for the [[Newton_Algorithm | Newton-Raphson method]]. The difference is that the tangent at the initial guess is used in the iterations instead of the current tangent. The modified Newton method is thus an iterative method in which, starting at a good initial guess <math>U_0\,\!</math>, we keep iterating until <math>\Delta U\,\!</math> is small enough using the following:


:<math> \Delta U = - K_0^{-1}R(U_n)\,\!</math>


:<math> U_{n+1} = U_n + \Delta U\,\!</math>
 


where:


:<math>K_0 = \frac{\partial R(U_0)}{\partial U}\,\!</math>


The advantage of this method over the regular Newton method is that the system Jacobian is formed only once, at the start of the step, and needs to be factored only once if a direct solver is used. The drawback is that it requires more iterations than Newton's method.
 


note: when the -initial flag is provided, <math>K_0\,\!</math> is the Jacobian of the undeformed configuration (the initial stiffness) rather than the tangent formed at the start of the current step.
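To make the reuse of <math>K_0\,\!</math> concrete, the following scalar sketch (in Tcl, matching the scripting language of the command) applies the iteration to the hypothetical residual <math>R(u) = u^3 - 2\,\!</math>; the residual, tolerance, and starting guess are illustrative choices, not part of the OpenSees implementation:

 # scalar modified Newton sketch for the illustrative residual R(u) = u^3 - 2
 proc R  {u} { return [expr {$u*$u*$u - 2.0}] }
 proc dR {u} { return [expr {3.0*$u*$u}] }
 
 set u  1.0;        # good initial guess U_0
 set K0 [dR $u];    # tangent formed once, at the initial guess, and reused
 for {set n 0} {$n < 100} {incr n} {
     set dU [expr {-[R $u] / $K0}];  # dU = -K0^{-1} R(U_n)
     set u  [expr {$u + $dU}];       # U_{n+1} = U_n + dU
     if {abs($dU) < 1e-10} { break }
 }
 puts "converged to u = $u"

Because <math>K_0\,\!</math> is never updated, each iteration costs only a residual evaluation and a back-substitution, at the price of the slower (linear rather than quadratic) convergence noted above.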


----


Code Developed by: <span style="color:blue"> fmk </span>
