Ok, I have this problem to do and I can't even begin to figure out what it means: (sub will be represented by _)
Construct a sequence of interpolating values y_n to f(1 + sqrt(10)), where f(x) = 1/(1 + x^2) for -5 <= x <= 5, as follows: for each n = 1, 2, ..., 10, let h = 10/n and y_n = P_n(1 + sqrt(10)), where P_n(x) is the interpolating polynomial for f(x) at the nodes (x_0)^(n), (x_1)^(n), ..., (x_n)^(n), with (x_j)^(n) = -5 + j*h for each j = 0, 1, 2, ..., n. Does the sequence {y_n} appear to converge to f(1 + sqrt(10))?
Help please 🙁
EDIT: This is in the section of the book after Lagrange polynomials and Neville's iteration, so I assume we have to use one of those somehow.
EDIT v2.0: A scan of the problem from the book is here
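For anyone comparing results: here is a minimal sketch of how the experiment could be set up with Neville's iterated interpolation, assuming the nodes are x_j = -5 + j*h (consistent with the interval -5 <= x <= 5 and h = 10/n). The function name `neville` and the loop layout are my own choices, not from the book.

```python
import math

def f(x):
    return 1.0 / (1.0 + x * x)

def neville(xs, ys, x):
    """Neville's iterated interpolation: evaluate the interpolating
    polynomial through the points (xs[i], ys[i]) at the point x."""
    q = list(ys)                    # q starts as the 0th-degree approximations
    n = len(xs)
    for i in range(1, n):           # build degree-i approximations in place
        for j in range(n - 1, i - 1, -1):
            q[j] = ((x - xs[j - i]) * q[j] - (x - xs[j]) * q[j - 1]) \
                   / (xs[j] - xs[j - i])
    return q[n - 1]

x_eval = 1 + math.sqrt(10)          # about 4.1623, inside [-5, 5]
target = f(x_eval)                  # the value y_n should approach, if it converges

for n in range(1, 11):
    h = 10.0 / n
    xs = [-5 + j * h for j in range(n + 1)]   # nodes x_j = -5 + j*h
    ys = [f(x) for x in xs]
    y_n = neville(xs, ys, x_eval)
    print(f"n={n:2d}  y_n={y_n: .6f}  |y_n - f| = {abs(y_n - target):.6f}")
```

Printing the errors for n = 1, ..., 10 lets you judge convergence directly; f(x) = 1/(1 + x^2) on equally spaced nodes is the classic example where equispaced interpolation misbehaves near the ends of the interval, so it's worth looking closely at how the error behaves as n grows rather than assuming it shrinks.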