
Taylor Expansion and approximating roots of polynomials over the rational field...

I answered a nice question on WyzAnt about approximating the roots of a quadratic equation that has real but irrational roots.
 
Calculus enables us to find rational approximations to these irrational roots.  I will re-present the solution to the problem, and then derive a general expression for doing this with an arbitrary polynomial.
 
"Problem: approximate roots of -9x^2+8x+5.
 
There is a nice approach using calculus to estimate/approximate these roots without a square root or a calculator.

We can use the concept of moments to get an approximation to the function. For this example, we have a quadratic function in x with coefficients a=-9, b=8, and c=5, as indicated in a previous solution.

Thus f(x)=ax^2+bx+c.

First we need to get a general idea of where a root (an x where f(x)=0) is located. To do this we can check the value of the function for some easy numbers; x=0 and x=1 are always good choices:

f(1)=a+b+c=-9+8+5=4
f(0)=c=5

We notice the function is decreasing from x=0 to x=1 (the value dropped from 5 to 4 as x increased). So if we try x>1 maybe we will find a zero... next evaluate f(2):

f(2)=a*4+b*2+c=-36+16+5=-15 < 0

Awesome, since this is a continuous function we know there must be a root between 1 and 2 (by the intermediate value theorem)... let's call this root x=1+\epsilon, for some little \epsilon that is between 0 and 1.
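This sign-check is easy to mechanize. Here is a minimal sketch in Python; the function and helper names below are my own illustration, not part of the original solution:

```python
# Bracket a root of f(x) = -9x^2 + 8x + 5 by scanning consecutive
# integers for a sign change (bracket_root is a hypothetical helper).
def f(x):
    return -9 * x**2 + 8 * x + 5

def bracket_root(f, start=0, stop=5):
    """Return (n, n+1) with f(n) and f(n+1) of opposite sign, if any."""
    for n in range(start, stop):
        if f(n) * f(n + 1) < 0:  # opposite signs => a root in between
            return n, n + 1
    return None

print(bracket_root(f))  # (1, 2), since f(1) = 4 > 0 and f(2) = -15 < 0
```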

We are now ready to use a linear approximation and a first derivative to approximate where this root is!

Let (x_0,y_0)=(1,4), and set \Delta x=\epsilon.

We need to recall that the derivative f'(x) gives the slope of the function at a point, so we will approximate the slope near the root using f'(1)... Since f'(x)=2ax+b... f'(1)=2a+b=-18+8=-10...

So near x=1, say at the root we seek at x=1+\epsilon, the slope of a line approximating this function should be about -10... thus when y=0 at x=1+\epsilon, a line approximating the curve should satisfy:

y-y_0=m(x-x_0)... 0-4=-10(\epsilon)... so \epsilon=4/10=2/5... and the root should be close to x=1.4.

Now evaluate f(1.4)= -9*1.4^2+8*1.4+5 = -9*1.96+11.2+5=-17.64+16.2 = -1.44...
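In code, this first step looks like the following (a minimal sketch; the variable names are mine):

```python
# One linear-approximation step from (x0, y0) = (1, 4) with slope f'(1).
def f(x):
    return -9 * x**2 + 8 * x + 5

def fprime(x):
    return -18 * x + 8          # f'(x) = 2ax + b with a = -9, b = 8

x0, y0 = 1.0, 4.0
m = fprime(x0)                  # -10
eps = -y0 / m                   # 0 - y0 = m*eps  =>  eps = 4/10
x1 = x0 + eps
print(x1, f(x1))                # 1.4 and about -1.44: we overshot the root
```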

So we overshot by a little: our \epsilon was too large, and it made our y value decrease from 4 to something past 0, a negative number... thus the true \epsilon should have been smaller. We have a choice now: among the possible answer values given, {1.31, -0.42}, we can quickly see which choice has an x greater than 1 and less than 1.4.

Of course evaluating all the choices initially works too...but this allows us to apply calculus to get an approximation, in case no choices were available and a linear approximation would be acceptable.

We can use this new point (x_1,y_1)=(1.4,-1.44)=(7/5,-36/25) to find a better approximation.

Use x=7/5-\delta, for some small \delta between 0 and 1, and repeat the procedure:

y-y_1=m(x-x_1), where we now let m be the derivative at x_1=7/5...

Thus 0-(-36/25)=f'(7/5)*(-\delta)... Since f'(7/5)=2a(7/5)+b=-126/5+8=-86/5, we get \delta=(36/25)/(86/5)=18/215... so x=1+\epsilon-\delta=1+2/5-18/215=(215+86-18)/215=283/215... doing long division to a few decimal places yields 1.316.
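Since every quantity in this computation is rational, we can verify the whole thing exactly with Python's fractions module (a sketch under the same setup; the names are mine):

```python
from fractions import Fraction

def f(x):
    return -9 * x**2 + 8 * x + 5

def fprime(x):
    return -18 * x + 8

x0 = Fraction(1)
eps = -f(x0) / fprime(x0)        # 2/5
x1 = x0 + eps                    # 7/5
delta = f(x1) / fprime(x1)       # (-36/25)/(-86/5) = 18/215
x2 = x1 - delta                  # 283/215
print(eps, x1, delta, x2, float(x2))  # 2/5 7/5 18/215 283/215 1.3162...
```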

This shows how we can, by hand... using no machinery, calculators, or spreadsheets... produce a better approximation over the field of rational numbers than the given choices offer!"
 
 
We see that in general, for a function y=f(x) whose root we seek... if we find an x_0 where f(x_0)>0 and f(x_0+1)<0 (or vice versa)... then, by the intermediate value theorem for continuous functions, there exists an \epsilon in (0,1) with x=x_0+\epsilon and f(x)=0.
 
We can find this by writing down the linear approximation:
 
0-f(x_0)=f'(x_0)\epsilon, and solving for \epsilon gives \epsilon=-f(x_0)/f'(x_0)... this is the first moment, an approximation of first order:
root ~ x_0-f(x_0)/f'(x_0)
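Readers may recognize this first-order step as one iteration of Newton's method. As a function it is a one-liner; a minimal sketch, with hypothetical names:

```python
def first_order_step(f, fprime, x0):
    """root ~ x0 - f(x0)/f'(x0): one linear-approximation step."""
    return x0 - f(x0) / fprime(x0)

# For the quadratic above, one step from x0 = 1 lands at 1.4.
print(first_order_step(lambda x: -9*x**2 + 8*x + 5,
                       lambda x: -18*x + 8,
                       1.0))    # 1.4
```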
 
We can now evaluate f at x_0+\epsilon=x_1 and find a new base point to approximate near....x_1=x_0-f(x_0)/f'(x_0)...
 
We first need to check whether f(x_1) is positive or negative and then proceed by adding or subtracting appropriately... but we end up with a telescoping sum of epsilons and deltas that gives better and better approximations of our root...
 
WLOG assume that f(x_1)<0, meaning our epsilon was too large and we should subtract off a \delta in (0,1), and let
x = x_1 - \delta = x_0 + \epsilon - \delta.
 
We then have a linear approximation of this x by looking at the line near x_1=x_0+\epsilon:
 
0-f(x_1)=f'(x_1)(-\delta)...so \delta = f(x_1)/f'(x_1)...and a second order approximation for the irrational root becomes:
 
root ~ x_0+\epsilon-\delta = x_0-f(x_0)/f'(x_0)-f(x_1)/f'(x_1)
 
Note how this is a sort of generating function for moments... substituting our formula for x_1 in terms of x_0, x_1=x_0-f(x_0)/f'(x_0), yields:
 
root ~ x_0+\epsilon-\delta = x_0 - f(x_0)/f'(x_0) - f(x_0-f(x_0)/f'(x_0))/f'(x_0-f(x_0)/f'(x_0))
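Iterating the same step reproduces exactly this nested expression. A sketch (names mine) that telescopes the corrections, compared against the exact root:

```python
def refine(f, fprime, x0, steps=2):
    """Apply the linear-approximation step repeatedly; each pass adds
    the next epsilon/delta correction in the telescoping sum."""
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

f  = lambda x: -9 * x**2 + 8 * x + 5
fp = lambda x: -18 * x + 8
print(refine(f, fp, 1.0, steps=2))   # 1.3162..., i.e. 283/215
print((4 + 61 ** 0.5) / 9)           # exact root (4 + sqrt(61))/9 = 1.3122...
```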
 
Note the nested derivative in the second order term here.
 
Untangling this using what we know about derivatives leads to the general Taylor expansion for a function near a point x=a, when the function is infinitely differentiable at x=a.
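For reference, that expansion reads f(x) = f(a) + f'(a)(x-a) + f''(a)(x-a)^2/2! + ... = \sum_{n\ge 0} f^{(n)}(a)(x-a)^n/n!, and its first-order truncation, f(x) ~ f(a) + f'(a)(x-a), is exactly the linear approximation we used at every step above.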
 