Answer
The interval of convergence for the series is $(-1,1)$, the radius of convergence is $R=1$, and on this interval $\sum_{k=0}^{\infty} (-1)^k x^k = \dfrac{1}{1+x}$.
Work Step by Step
Ratio Test: For a series $\sum a_k$, let $L=\lim\limits_{k \to \infty} \left|\dfrac{a_{k+1}}{a_k}\right|$.
1. If $L \lt 1$, the series converges absolutely.
2. If $L \gt 1$, the series diverges.
3. If $L = 1$, the test is inconclusive.
Here, $a_k=(-1)^k x^k$, so $L=\lim\limits_{k \to \infty} \left|\dfrac{(-1)^{k+1} x^{k+1}}{(-1)^k x^k}\right|=\lim\limits_{k \to \infty} |x|=|x|$
So, by the Ratio Test, the given series converges absolutely for $|x| \lt 1$ and diverges for $|x| \gt 1$. The test is inconclusive when $|x|=1$, so we check the endpoints separately.
$\bf{Case~1:}$ For $x=1$, the series becomes $\sum_{k=0}^{\infty} (-1)^k (1)^k = \sum_{k=0}^{\infty}(-1)^k$, which diverges because its terms do not approach $0$.
$\bf{Case~2:}$ For $x=-1$, the series becomes $\sum_{k=0}^{\infty} (-1)^k (-1)^k = \sum_{k=0}^{\infty}(-1)^{2k} = \sum_{k=0}^{\infty} 1$, which also diverges because its terms do not approach $0$.
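The endpoint behavior can be seen numerically (a sketch, not a proof): at $x=1$ the partial sums oscillate, and at $x=-1$ they grow without bound, so neither endpoint yields a convergent series.

```python
def partial_sums(x, n):
    """Return the first n partial sums of sum_{k=0}^{n-1} (-1)^k * x^k."""
    sums, total = [], 0.0
    for k in range(n):
        total += (-1) ** k * x ** k
        sums.append(total)
    return sums

print(partial_sums(1, 6))   # oscillates: [1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
print(partial_sums(-1, 6))  # grows: [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
```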
Thus, the interval of convergence for the series is: $(-1,1)$ and the radius of convergence is $R=1$.
Finally, since this is a geometric series with common ratio $-x$, for $|x| \lt 1$ we have: $\sum_{k=0}^{\infty} (-1)^k x^k = \sum_{k=0}^{\infty} (-x)^k = \dfrac{1}{1-(-x)}=\dfrac{1}{1+x}$
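As a quick numerical check (illustrative only, with $x=0.5$ chosen as a sample point inside the interval), the partial sums do approach $\dfrac{1}{1+x}$:

```python
def geometric_partial_sum(x, n):
    """Sum of (-1)^k * x^k for k = 0, ..., n-1."""
    return sum((-1) ** k * x ** k for k in range(n))

x = 0.5
approx = geometric_partial_sum(x, 50)
exact = 1 / (1 + x)  # the closed form 1/(1+x)
print(abs(approx - exact) < 1e-12)  # True
```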