College Algebra (6th Edition)

Published by Pearson
ISBN 10: 0-32178-228-3
ISBN 13: 978-0-32178-228-1

Chapter 8 - Sequences, Induction, and Probability - Exercise Set 8.3 - Page 742: 107



Work Step by Step

An infinite geometric series converges if and only if $|r|\lt1$, where $r$ is the common ratio. If it converges, its sum is $\frac{a_1}{1-r}$, where $a_1$ is the first term.

The common ratio is the quotient of two consecutive terms: $r=\frac{a_2}{a_1}=\dfrac{-5}{10}=-\frac{1}{2}$. Since $\left|-\frac{1}{2}\right|=\frac{1}{2}\lt1$, the series converges.

Hence the sum (with $a_1=10$) is $\dfrac{10}{1-(-\frac{1}{2})}=\dfrac{10}{1+\frac{1}{2}}=\dfrac{10}{\frac{3}{2}}=\dfrac{20}{3}$. Thus the statement is false.
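The convergence test and sum formula above can be sketched in a few lines of Python; this is an illustrative helper (the function name and use of `fractions.Fraction` for exact arithmetic are my own choices, not part of the textbook):

```python
from fractions import Fraction

def infinite_geometric_sum(a1, r):
    """Sum of the infinite geometric series a1 + a1*r + a1*r**2 + ...

    The series converges only when |r| < 1; otherwise it diverges.
    """
    if abs(r) >= 1:
        raise ValueError("series diverges: |r| must be less than 1")
    return a1 / (1 - r)

# The series from this exercise: first term 10, second term -5.
a1 = Fraction(10)
r = Fraction(-5, 10)  # common ratio = a2 / a1 = -1/2, and |r| < 1
print(infinite_geometric_sum(a1, r))  # 20/3
```

Using `Fraction` keeps the result exact ($\frac{20}{3}$) rather than a rounded decimal.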