Calculus (3rd Edition)

Published by W. H. Freeman
ISBN 10: 1464125260
ISBN 13: 978-1-4641-2526-3

Chapter 15 - Differentiation in Several Variables - 15.7 Optimization in Several Variables - Exercises - Page 823: 56

Answer

We show that the minimum value of $E$ occurs for $m$ and $b$ satisfying the two equations
$m\left( {\sum\limits_{j = 1}^n {{x_j}} } \right) + bn = \sum\limits_{j = 1}^n {{y_j}}$
$m\sum\limits_{j = 1}^n {{x_j}^2} + b\sum\limits_{j = 1}^n {{x_j}} = \sum\limits_{j = 1}^n {{x_j}{y_j}}$

Work Step by Step

We have the sum of the squares given by
$E\left( {m,b} \right) = \sum\limits_{j = 1}^n {{{\left( {{y_j} - f\left( {{x_j}} \right)} \right)}^2}}$,
where $f\left( x \right) = mx + b$. Substituting $f\left( {{x_j}} \right)$ in $E$ gives
$E\left( {m,b} \right) = \sum\limits_{j = 1}^n {{{\left( {{y_j} - m{x_j} - b} \right)}^2}}$

The partial derivatives of $E$ with respect to $m$ and $b$ are
${E_m} = - 2\sum\limits_{j = 1}^n {\left( {{y_j} - m{x_j} - b} \right){x_j}}$
${E_b} = - 2\sum\limits_{j = 1}^n {\left( {{y_j} - m{x_j} - b} \right)}$
${E_{mm}} = 2\sum\limits_{j = 1}^n {{x_j}^2}$, $\ \ $ ${E_{bb}} = 2n$, $\ \ $ ${E_{mb}} = 2\sum\limits_{j = 1}^n {{x_j}}$

We find the critical points of $E$ by solving the equations ${E_m} = 0$ and ${E_b} = 0$:

(1) $\ \ $ ${E_m} = - 2\sum\limits_{j = 1}^n {\left( {{y_j} - m{x_j} - b} \right){x_j}} = 0$

(2) $\ \ $ ${E_b} = - 2\sum\limits_{j = 1}^n {\left( {{y_j} - m{x_j} - b} \right)} = 0$

1. Equation (1) becomes
$\sum\limits_{j = 1}^n {\left( {{x_j}{y_j} - m{x_j}^2 - b{x_j}} \right)} = 0$
$\sum\limits_{j = 1}^n {{x_j}{y_j}} - m\sum\limits_{j = 1}^n {{x_j}^2} - b\sum\limits_{j = 1}^n {{x_j}} = 0$
Hence,
$m\sum\limits_{j = 1}^n {{x_j}^2} + b\sum\limits_{j = 1}^n {{x_j}} = \sum\limits_{j = 1}^n {{x_j}{y_j}}$

2. Equation (2) becomes
$\sum\limits_{j = 1}^n {\left( {{y_j} - m{x_j} - b} \right)} = 0$
$\sum\limits_{j = 1}^n {{y_j}} - m\sum\limits_{j = 1}^n {{x_j}} - bn = 0$
Hence,
$m\left( {\sum\limits_{j = 1}^n {{x_j}} } \right) + bn = \sum\limits_{j = 1}^n {{y_j}}$

Next, we show that $E$ has a minimum at the critical point. The discriminant of $E$ is
$D = {E_{mm}}{E_{bb}} - {E_{mb}}^2 = 4n\sum\limits_{j = 1}^n {{x_j}^2} - 4{\left( {\sum\limits_{j = 1}^n {{x_j}} } \right)^2}$

Write ${\bf{v}} = \left( {{x_1},{x_2},...,{x_n}} \right)$ and ${\bf{w}} = \left( {1,1,...,1} \right)$. Then
${\bf{v}}\cdot{\bf{w}} = \sum\limits_{j = 1}^n {{x_j}}$, $\ \ $ $||{\bf{v}}|| = \sqrt {\sum\limits_{j = 1}^n {{x_j}^2} }$, $\ \ $ $||{\bf{w}}|| = \sqrt n$
By the Cauchy-Schwarz inequality (on page 668), $\left| {{\bf{v}}\cdot{\bf{w}}} \right| \le ||{\bf{v}}||\,||{\bf{w}}||$, so
$\left| {\sum\limits_{j = 1}^n {{x_j}} } \right| \le \sqrt n \sqrt {\sum\limits_{j = 1}^n {{x_j}^2} }$
Squaring both sides gives ${\left( {\sum\limits_{j = 1}^n {{x_j}} } \right)^2} \le n\sum\limits_{j = 1}^n {{x_j}^2}$, and therefore $D \ge 0$. Equality holds in Cauchy-Schwarz only when ${\bf{v}}$ is a scalar multiple of ${\bf{w}}$, that is, only when all the ${x_j}$ are equal. Since the data contain at least two distinct $x$-values, $D > 0$.

Since $D > 0$ and ${E_{mm}} > 0$, by the Second Derivative Test, $E$ has a local minimum at the critical point. Hence, the minimum value of $E$ occurs for $m$ and $b$ satisfying the two equations
$m\left( {\sum\limits_{j = 1}^n {{x_j}} } \right) + bn = \sum\limits_{j = 1}^n {{y_j}}$
$m\sum\limits_{j = 1}^n {{x_j}^2} + b\sum\limits_{j = 1}^n {{x_j}} = \sum\limits_{j = 1}^n {{x_j}{y_j}}$
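As a numerical illustration (not part of the textbook exercise), here is a short Python sketch that forms the two normal equations derived above as a 2x2 linear system and solves it for $m$ and $b$. The function name `fit_line` and the sample data are hypothetical choices for this example; the cross-check against NumPy's built-in `polyfit` confirms that the two equations do pick out the least-squares line.

```python
import numpy as np

def fit_line(x, y):
    """Solve the two normal equations derived above for m and b.

    The system is:
        m * sum(x^2) + b * sum(x) = sum(x * y)
        m * sum(x)   + b * n      = sum(y)
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    A = np.array([[np.sum(x**2), np.sum(x)],
                  [np.sum(x),    n        ]])
    rhs = np.array([np.sum(x * y), np.sum(y)])
    m, b = np.linalg.solve(A, rhs)  # unique solution when D > 0
    return m, b

# Arbitrary sample data, chosen only for illustration
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.2, 6.8, 9.1]

m, b = fit_line(x, y)
print(m, b)  # slope and intercept of the best-fit line

# Cross-check against NumPy's least-squares polynomial fit of degree 1
m_ref, b_ref = np.polyfit(x, y, 1)
assert np.allclose([m, b], [m_ref, b_ref])
```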
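The Cauchy-Schwarz step can also be checked numerically. The sketch below (again an illustration, with a hypothetical helper name `discriminant`) computes $D = {E_{mm}}{E_{bb}} - {E_{mb}}^2$ directly from the second partials: it comes out positive for distinct $x$-values and exactly zero when all the $x_j$ are equal, which is the equality case of Cauchy-Schwarz where the normal equations become degenerate.

```python
import numpy as np

def discriminant(x):
    """D = E_mm * E_bb - E_mb^2 = 4*n*sum(x^2) - 4*(sum(x))^2."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    E_mm = 2.0 * np.sum(x**2)
    E_bb = 2.0 * n
    E_mb = 2.0 * np.sum(x)
    return E_mm * E_bb - E_mb**2

print(discriminant([0.0, 1.0, 2.0, 3.0, 4.0]))  # positive: distinct x-values
print(discriminant([2.0, 2.0, 2.0]))            # zero: all x-values equal
```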