Newton Algorithm

This command is used to construct a NewtonRaphson algorithm object. The command is of the following form:

algorithm Newton
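
For orientation, the short sketch below shows where this command typically appears in a static analysis script; it is illustrative only, and the constraint handler, numberer, system, convergence test, tolerance, and load increment used here are assumptions, not part of the Newton command itself.

 # Illustrative sketch (assumed settings): build the analysis objects and
 # use the Newton algorithm to solve the nonlinear equations at each load step.
 constraints Plain
 numberer RCM
 system BandGeneral
 test NormDispIncr 1.0e-6 10
 algorithm Newton
 integrator LoadControl 0.1
 analysis Static
 analyze 10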



NOTES:



REFERENCES:

[http://en.wikipedia.org/wiki/Newton%27s_method Read the page at Wikipedia]


THEORY:

The Newton (also known as Newton-Raphson) method is the most widely used and robust method for solving nonlinear algebraic equations. The Newton method used in finite element analysis is identical to that taught in basic calculus courses. As taught in basic calculus, it is a root-finding algorithm that uses the first few terms of the Taylor series of a function f(x) in the vicinity of a suspected root to find the root. Newton's method is sometimes also known as Newton's iteration, although in this work the latter term is reserved for the application of Newton's method to computing square roots.

The Taylor series of <math>f(x)</math> about the point <math>x=x_0+\epsilon</math> is given by

<math>f(x_0+\epsilon) = f(x_0)+f'(x_0)\epsilon + \tfrac{1}{2}f''(x_0) \epsilon^2+\ldots</math>

Keeping terms only to first order,

<math>f(x_0+\epsilon) \approx f(x_0)+f'(x_0)\epsilon </math>

and since the function vanishes at the root <math>x_0 + \epsilon</math> that we wish to find, <math>f(x_0+\epsilon) = 0</math>, we can solve for an approximate <math>\epsilon</math>:

<math> \epsilon \approx - \frac{f(x_0)}{f'(x_0)}</math>
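
As a simple numerical illustration (not part of the original text), take <math>f(x) = x^2 - 2</math> with initial guess <math>x_0 = 1</math>: then <math>f(x_0) = -1</math> and <math>f'(x_0) = 2</math>, so <math>\epsilon \approx -\tfrac{-1}{2} = 0.5</math>, and the improved estimate <math>x_0 + \epsilon = 1.5</math> is already much closer to the root <math>\sqrt{2} \approx 1.414</math>.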

The Newton method is thus an iterative method: we keep iterating until <math>\epsilon</math> is small enough.

The process starts with an initial guess <math>x_0</math> and, provided the function is well behaved, arrives at a better approximation <math>x_1</math>:

<math>x_{1} = x_0 - \frac{f(x_0)}{f'(x_0)}.\,\!</math>

and so on, the general iteration being

<math>x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.\,\!</math>
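
Purely as an illustration of this scalar iteration (in a finite element analysis the same idea is applied to the system of nonlinear equilibrium equations rather than to a scalar function), a short Tcl sketch using <math>f(x) = x^2 - 2</math> might look as follows; the procedure names and the tolerance are made up for this sketch.

 # Illustrative sketch only: scalar Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n),
 # applied to f(x) = x^2 - 2, whose positive root is sqrt(2) = 1.41421356...
 proc f  {x} { return [expr {$x*$x - 2.0}] }
 proc df {x} { return [expr {2.0*$x}] }
 set x 1.0
 for {set i 0} {$i < 20} {incr i} {
     set eps [expr {-[f $x] / [df $x]}]
     set x   [expr {$x + $eps}]
     if {abs($eps) < 1.0e-12} { break }
 }
 puts "root approximation: $x"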



Code Developed by: fmk