Answer
Proof given below.
Work Step by Step
The result of Exercise 130 tells us that if $\{a_{n}\}$ is convergent, then, given an arbitrarily small positive number $\epsilon$, there is an index $N$ after which the distance between any two terms is less than $\epsilon$.
So, if we show that no such $N$ exists for some chosen $\epsilon$, then the sequence is divergent.
Let us define the subsequences as in the problem: let $f(n)$ and $g(n)$ be strictly increasing integer functions whose values are the indices of two subsequences such that
$ a_{f(n)}\rightarrow L_{1} \quad$ and $\quad a_{g(n)}\rightarrow L_{2} \quad$ and $\quad L_{1}\neq L_{2}.$
Since $ a_{f(n)}\rightarrow L_{1}$ and $a_{g(n)}\rightarrow L_{2}$, it follows from the limit laws that
$| a_{f(n)}-a_{g(n)}|\rightarrow|L_{1} -L_{2} |.$
If we take $\displaystyle \epsilon=\frac{|L_{1} -L_{2} |}{2}$, then for any $N$, however large, we can find two terms of the sequence whose indices are greater than $N$ and whose distance exceeds $\epsilon$
(one term belongs to $\{a_{f(n)}\}$ and the other to $\{a_{g(n)}\}$, so their distance approaches $2\epsilon$ as $n$ grows large).
Hence the $N$ required by Exercise 130 does not exist for our chosen $\epsilon$.
Thus, $\{a_{n}\}$ is divergent.
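As a concrete check (not part of the proof), a minimal sketch using the classic divergent sequence $a_n=(-1)^n$: its even-indexed subsequence converges to $L_1 = 1$ and its odd-indexed subsequence to $L_2 = -1$, so $\epsilon = |L_1 - L_2|/2 = 1$, and beyond any candidate $N$ we can always pick one term from each subsequence whose distance is $2 > \epsilon$. The index functions $f(n) = 2n$ and $g(n) = 2n+1$ are our choice for this example.

```python
# Sequence a_n = (-1)^n: divergent, with two subsequential limits.
def a(n):
    return (-1) ** n

L1, L2 = 1, -1                 # limits of the even and odd subsequences
eps = abs(L1 - L2) / 2         # eps = 1, as chosen in the proof

# For any candidate N, pick one term from each subsequence beyond N.
for N in [10, 100, 1000]:
    f_n = 2 * (N + 1)          # even index > N  (f(n) = 2n)
    g_n = 2 * (N + 1) + 1      # odd index  > N  (g(n) = 2n + 1)
    gap = abs(a(f_n) - a(g_n))
    assert gap > eps           # distance is 2 > eps: no valid N exists
```

Every candidate $N$ fails the Exercise 130 criterion, matching the proof's conclusion that the sequence diverges.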