Linear Algebra and Its Applications (5th Edition)

Published by Pearson
ISBN 10: 032198238X
ISBN 13: 978-0-32198-238-4

Chapter 2 - Matrix Algebra - Supplementary Exercises - Page 162: 11


Work Step by Step

(a) By supposition, we know that
\[
\left[\begin{array}{ccccc} 1 & x_{1} & x_{1}^{2} & \cdots & x_{1}^{n-1} \\ 1 & x_{2} & x_{2}^{2} & \cdots & x_{2}^{n-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_{n} & x_{n}^{2} & \cdots & x_{n}^{n-1} \end{array}\right]\left[\begin{array}{c} c_{0} \\ c_{1} \\ \vdots \\ c_{n-1} \end{array}\right]=\left[\begin{array}{c} y_{1} \\ y_{2} \\ \vdots \\ y_{n} \end{array}\right]
\]
Multiplying out the matrices on the left-hand side of this equation gives
\[
\left[\begin{array}{c} c_{0}+c_{1} x_{1}+c_{2} x_{1}^{2}+\cdots+c_{n-1} x_{1}^{n-1} \\ c_{0}+c_{1} x_{2}+c_{2} x_{2}^{2}+\cdots+c_{n-1} x_{2}^{n-1} \\ \vdots \\ c_{0}+c_{1} x_{n}+c_{2} x_{n}^{2}+\cdots+c_{n-1} x_{n}^{n-1} \end{array}\right]=\left[\begin{array}{c} y_{1} \\ y_{2} \\ \vdots \\ y_{n} \end{array}\right]
\]
Comparing the rows on the left-hand side of this equation with $p(t)=c_{0}+c_{1} t+\cdots+c_{n-1} t^{n-1}$, we see that
\[
\left[\begin{array}{c} p\left(x_{1}\right) \\ p\left(x_{2}\right) \\ \vdots \\ p\left(x_{n}\right) \end{array}\right]=\left[\begin{array}{c} y_{1} \\ y_{2} \\ \vdots \\ y_{n} \end{array}\right]
\]
from which we conclude that $p\left(x_{i}\right)=y_{i}$ for each $i=1,2, \ldots, n$.

(b) We note that the Fundamental Theorem of Algebra (FTA) tells us that the maximum number of real zeros that a nonzero polynomial of degree $n$ can have is $n$. Notice that any nonzero solution $\mathbf{c}$ of the homogeneous equation $V \mathbf{c}=\mathbf{0}$ gives the coefficients of a polynomial of degree $n-1$ or smaller,
\[
p(t)=c_{0}+c_{1} t+\cdots+c_{n-1} t^{n-1}.
\]
But then, from part (a), we know that all $n$ of the numbers $x_{1}, \ldots, x_{n}$ are zeros of this polynomial.* From the FTA we know that it is not possible for a nonzero polynomial of degree $n-1$ or smaller to have $n$ zeros. Hence the only solution to this equation is the trivial solution $(\mathbf{c}=\mathbf{0})$. The Invertible Matrix Theorem (IMT) then tells us that the columns of $V$ are linearly independent.

$*$ Set all of the $y$'s in part (a) to $0$ to see that $p\left(x_{i}\right)=0$ for all $i$; i.e., each $x_{i}$ is a zero of the polynomial $p$.

(c) From part (b), we know that if $x_{1}, \ldots, x_{n}$ are distinct numbers, then the columns of $V$ are linearly independent, and from the IMT we then see that $V$ is invertible. Since $V$ is invertible, there is a solution $\mathbf{c}$ to the equation $V \mathbf{c}=\mathbf{y}$ for every $\mathbf{y}$ in $\mathbb{R}^{n}$. From part (a) we know that the entries of a solution to $V \mathbf{c}=\mathbf{y}$ are the coefficients of an interpolating polynomial of degree $n-1$ (or smaller) for the points $\left(x_{1}, y_{1}\right), \ldots,\left(x_{n}, y_{n}\right)$. Hence we find that if $x_{1}, \ldots, x_{n}$ are distinct numbers, then for any numbers $y_{1}, \ldots, y_{n}$ there is an interpolating polynomial of degree $n-1$ or smaller, as required.
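
To make the argument concrete, here is a minimal numerical sketch (assuming NumPy is available; the nodes and values below are made-up example data, not from the exercise). It builds the Vandermonde matrix $V$ from distinct nodes, solves $V\mathbf{c}=\mathbf{y}$ for the coefficients, and checks that the resulting polynomial interpolates the points, mirroring parts (a)-(c).

```python
import numpy as np

# Hypothetical example data: distinct nodes x_1,...,x_n and arbitrary values y_1,...,y_n.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 0.0, -1.0, 5.0])

# V has rows [1, x_i, x_i^2, ..., x_i^(n-1)], matching the matrix displayed above.
V = np.vander(x, increasing=True)

# Parts (b)/(c): the x_i are distinct, so V is invertible and V c = y has a solution.
c = np.linalg.solve(V, y)

# Part (a): the entries of c are the coefficients of
# p(t) = c_0 + c_1 t + ... + c_{n-1} t^{n-1}, so p(x_i) should reproduce y_i.
p = np.polynomial.Polynomial(c)
print(np.allclose(p(x), y))  # expected: True
```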