Originally posted by: RaynorWolfcastle
Use Newton's method; you can do it by hand, and it converges quickly if you start with a decent approximation.
to use it, let
f(x) = e^(-x)-1+x/3
f'(x) = 1/3 - e^(-x)
x[n+1] = x[n] - f(x[n])/f'(x[n])
where x[n] is the nth iteration
so if you pick x = 2 as an initial guess
x[1] = 2
x[2] = 3
x[3] = 2.8244
x[4] = 2.8214
Voila. There are probably methods that converge more quickly but this one is really easy to work out with a scientific calculator 🙂
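The hand iteration above is easy to sketch in Python. This is just an illustrative implementation (the function names `newton`, `f`, `fp` are mine, not from the thread), using the same f(x) = e^(-x) - 1 + x/3 and starting guess x = 2:

```python
import math

def newton(f, fprime, x0, tol=1e-6, max_iter=50):
    """Newton's method: repeatedly follow the tangent line toward a root of f."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x = x - step          # x[n+1] = x[n] - f(x[n]) / f'(x[n])
        if abs(step) < tol:   # stop once the update is negligible
            break
    return x

f = lambda x: math.exp(-x) - 1 + x / 3
fp = lambda x: 1 / 3 - math.exp(-x)

root = newton(f, fp, 2.0)  # iterates 2, 3, 2.8244, 2.8214, ... as in the post
```

Printing the iterates reproduces the sequence above.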
Indeed. For a problem like this, where f/f' is well behaved (specifically, f' doesn't get near 0, so the quotient won't blow up), Newton's Method is pretty much preferred.
It has quadratic convergence here (i.e., the number of correct digits roughly doubles each iteration).
There are faster methods that take advantage of higher-order terms of the Taylor series expansion (Newton uses up to the linear term), such as Halley's method, which takes the quadratic term as well. The denominator in Halley's method is 2[f']^2 - f*f''...so if that term is ~0, you get big problems and convergence falls to linear. The payoff is that the convergence is cubic: the number of correct digits roughly triples per iteration (after the first 2 or so runs, though, if your initial guess isn't "good" enough).
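For comparison, here's a sketch of Halley's iteration, x <- x - 2 f f' / (2 f'^2 - f f''), on the same problem. The helper names are mine; the third lambda is the second derivative, f''(x) = e^(-x):

```python
import math

def halley(f, fp, fpp, x0, tol=1e-12, max_iter=50):
    """Halley's method: cubic convergence, but needs f'' as well."""
    x = x0
    for _ in range(max_iter):
        fx, fpx, fppx = f(x), fp(x), fpp(x)
        # x <- x - 2 f f' / (2 f'^2 - f f'')
        step = 2 * fx * fpx / (2 * fpx**2 - fx * fppx)
        x -= step
        if abs(step) < tol:
            break
    return x

f = lambda x: math.exp(-x) - 1 + x / 3
fp = lambda x: 1 / 3 - math.exp(-x)
fpp = lambda x: math.exp(-x)

root = halley(f, fp, fpp, 2.0)  # converges to the same root near 2.8214
```

Fewer iterations than Newton, but each one costs an extra function evaluation.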
But in general, Newton's is preferred among numerical folks b/c it's fast and requires relatively few function evals. By comparison, Halley's method requires an extra function evaluation per step (f'' as well as f and f'), in addition to the constraints on f, f', and f''.
In general, there's a family called Householder's methods that yields the general formula starting from an n-th order Taylor approximation. Householder w/n=1 is Newton, n=2 is Halley, and so forth.
edit: oh yeah, these are some of the more famous superlinear root-finders. Other nifty algorithms include the secant method, whose convergence order is the golden ratio, ~1.618 (I thought that was cute when I derived it), and inverse quadratic interpolation (better than secant, worse than Newton, riskier than both). There's also fixed-point iteration (what dullard has), which is the slowest of them all (linear convergence)...but there are cute ways of speeding it up.
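The secant method mentioned above is just Newton with the derivative replaced by a finite-difference slope, so it needs no f' at all. A minimal sketch (names are mine, applied to the same f from the original post):

```python
import math

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: Newton with f' replaced by the slope of the chord
    through the last two iterates; convergence order is the golden ratio."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:  # flat chord -- can't take another step
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, x1 = x1, x2
        if abs(x1 - x0) < tol:
            break
    return x1

f = lambda x: math.exp(-x) - 1 + x / 3
root = secant(f, 2.0, 3.0)  # converges to the root near 2.8214
```

Slower per iteration than Newton in terms of order, but each step costs only one new function evaluation, so it often wins on total work when f' is expensive.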