Answer
$\approx 29\;\rm s$
Work Step by Step
A water wave travels 100 m horizontally, starting where the water depth is 5 m and ending at the shore, where the depth is 0 m. We need to find the time the wave takes to reach the shore.
Measuring $x$ horizontally from the shore, the depth varies linearly with distance, so it follows the straight-line form $y=mx+b$, where $y=d$, $m={\rm slope}=\dfrac{5\;\rm m}{100\;\rm m}=0.05$, and $b=0$.
Thus,
$$d=0.05x\tag 1$$
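As a quick check of this model, at the wave's starting point ($x=100\;\rm m$ from the shore) Eq. (1) gives $d=0.05\times 100=5\;\rm m$, which matches the stated initial depth.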
Recalling that the speed of shallow-water waves is given by $$v=\sqrt{gd}$$
where $v=dx/dt$
Substituting $d$ from (1):
$$\dfrac{dx}{dt}=\sqrt{g(0.05x)}$$
To find the travel time $t$, separate variables:
$$dt=\dfrac{dx}{\sqrt{g(0.05x)}}$$
Integrating both sides:
$$\int_0^t dt=\int_0^{100}\dfrac{dx}{\sqrt{g(0.05x)}}$$
$$t=\frac{1}{\sqrt{0.05g}}\int_0^{100}x^{-\frac{1}{2}}\,dx=\frac{2\sqrt{x}}{\sqrt{0.05g}}\bigg|_0^{100}$$
$$t =\frac{2\sqrt{100}}{\sqrt{0.05(9.8)}}-0=\color{red}{\bf 28.6}\;\rm s$$
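As an optional numerical sanity check, here is a minimal Python sketch (assuming $g=9.8\;\rm m/s^2$ and the linear depth profile $d=0.05x$ from Eq. (1)) that integrates $dt=dx/\sqrt{0.05\,g\,x}$ numerically and compares it with the closed-form result above:

```python
import numpy as np
from scipy.integrate import quad

g = 9.8       # gravitational acceleration (m/s^2)
slope = 0.05  # depth gradient: d = slope * x, with x measured from the shore (m)
L = 100.0     # horizontal distance the wave travels (m)

# Shallow-water wave speed v(x) = sqrt(g * d) = sqrt(g * slope * x)
def speed(x):
    return np.sqrt(g * slope * x)

# Travel time: t = integral from 0 to L of dx / v(x)
t_numeric, _ = quad(lambda x: 1.0 / speed(x), 0.0, L)

# Closed-form result for comparison: t = 2 * sqrt(L) / sqrt(slope * g)
t_closed = 2.0 * np.sqrt(L) / np.sqrt(slope * g)

print(f"numerical:   t = {t_numeric:.1f} s")  # ~28.6 s
print(f"closed form: t = {t_closed:.1f} s")   # ~28.6 s
```

Both the numerical integration and the closed-form expression give about 28.6 s, consistent with the rounded answer of $\approx 29\;\rm s$.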