Answer
False.
Work Step by Step
An infinite geometric series converges if and only if $|r|\lt1$, where $r$ is the common ratio. If it converges, then it equals $\frac{a_1}{1-r}$ where $a_1$ is the first term.
The common ratio is the quotient of two consecutive terms: $r=\frac{a_2}{a_1}=\dfrac{-5}{10}=-\frac{1}{2}$. Since $\left|-\frac{1}{2}\right|=\frac{1}{2}\lt1$, the series converges.
Hence the sum (since $a_1=10$): $\dfrac{10}{1-(-\frac{1}{2})}=\dfrac{10}{1+\frac{1}{2}}=\dfrac{10}{\frac{3}{2}}=\dfrac{20}{3}$, thus the statement is false.
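The convergence and the closed-form value can be checked numerically. A minimal sketch, assuming the series starts $10, -5, \dots$ as above (the variable names are illustrative only):

```python
# Assumed first term and common ratio from the series 10, -5, 2.5, ...
a1 = 10
r = -0.5  # r = a2/a1 = -5/10

# Closed-form sum of an infinite geometric series, valid because |r| < 1
closed_form = a1 / (1 - r)

# A long partial sum should approach the same value
partial = sum(a1 * r**n for n in range(100))

print(closed_form)                          # 20/3 = 6.666...
print(abs(partial - closed_form) < 1e-12)   # True: partial sums converge
```

The partial sum agrees with $\frac{20}{3}$ to machine precision, confirming the computation above.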