Answer
Proof given below.
Work Step by Step
Let $f(n)$ and $g(n)$ be strictly increasing functions from the positive integers to the positive integers, and let their values be the indices of two subsequences such that
$ a_{f(n)}\rightarrow L_{1} \quad$ and $\quad a_{g(n)}\rightarrow L_{2} \quad$ and $\quad L_{1}\neq L_{2}.$
The result of exercise 130 tells us that if $\{a_{n}\}$ is convergent, then, given an arbitrarily small positive number $\epsilon$, there is an index $N$ after which any two terms of the sequence lie within $\epsilon$ of each other.
So, if we show that no such $N$ exists for some chosen $\epsilon$, then the sequence is divergent.
Since $ a_{f(n)}\rightarrow L_{1} \quad$ and $\quad a_{g(n)}\rightarrow L_{2}$,
$| a_{f(n)}-a_{g(n)}|\rightarrow|L_{1} -L_{2} |,$
so, if we take $\displaystyle \epsilon=\frac{|L_{1} -L_{2} |}{2},$
then for any $N$, however large, we can find two members of the sequence with indices greater than $N$ whose distance is greater than $\epsilon$: take one term from $\{a_{f(n)}\}$ and the other from $\{a_{g(n)}\}$ for $n$ sufficiently large (this is possible because $f$ and $g$ are strictly increasing, so eventually $f(n)>N$ and $g(n)>N$); their distance approaches $2\epsilon$, so it eventually exceeds $\epsilon$.
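One way to make that last estimate explicit (a standard triangle-inequality step, filled in here for completeness): choose $n_{0}$ so that, for all $n>n_{0}$, both $|a_{f(n)}-L_{1}|<\displaystyle \frac{\epsilon}{2}$ and $|a_{g(n)}-L_{2}|<\displaystyle \frac{\epsilon}{2}$. Then, for all such $n$,
$ |a_{f(n)}-a_{g(n)}|\geq|L_{1}-L_{2}|-|a_{f(n)}-L_{1}|-|a_{g(n)}-L_{2}|>2\epsilon-\displaystyle \frac{\epsilon}{2}-\frac{\epsilon}{2}=\epsilon.$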
Thus, such an N does not exist.
Hence, $\{a_{n}\}$ is divergent.
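For a concrete illustration (an example chosen here, not part of the exercise): the sequence $a_{n}=(-1)^{n}$ has subsequences $a_{2n}\rightarrow 1$ and $a_{2n+1}\rightarrow-1$ (take $f(n)=2n$ and $g(n)=2n+1$). Since $1\neq-1$, the argument above shows that $\{(-1)^{n}\}$ diverges.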