Continuous but Not Differentiable Functions: Weierstrass Function
Theorem
There exists a function that is continuous on all of $\mathbb{R}$ but differentiable nowhere.
Proof
Strategy: Consider the continuous functions $g_{1} (x) := | x - 1 |$ and $g_{2} (x) := | x - 2 |$. $g_{1}$ is not differentiable at $x=1$, and $g_{2}$ is not differentiable at $x=2$, so $(g_{1} + g_{2})$ is not differentiable at either of the points $x = 1$ and $x = 2$. Continuing in this way with $g_{k} (x) := | x - k |$, the sum $\displaystyle G := \sum_{k=1}^{\infty} g_{k}$ would fail to be differentiable at every $x \in \mathbb{N}$ (this is only a heuristic; the series itself does not converge). Of course, such a $G$ is still differentiable at far too many points to be called a Weierstrass function. The actual Weierstrass function $F$ is built as a sum of functions $f_{k}$ whose points of non-differentiability increase rapidly while continuity is maintained.
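The kink at $x = 1$ can be seen numerically: the one-sided difference quotients of $g_{1} + g_{2}$ disagree there. The following sketch is our own illustration, not part of the proof; the function names are hypothetical.

```python
# Hypothetical illustration: G(x) = |x - 1| + |x - 2| is continuous,
# but its one-sided slopes at x = 1 disagree, so it has a kink there.
def G(x):
    return abs(x - 1) + abs(x - 2)

def one_sided_slopes(f, x, h=1e-6):
    """Numerical left and right difference quotients of f at x."""
    return (f(x) - f(x - h)) / h, (f(x + h) - f(x)) / h

left, right = one_sided_slopes(G, 1.0)
# left slope is -2 (G(x) = 3 - 2x for x < 1), right slope is 0 (G(x) = 1 on [1, 2])
```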
Part 1. Continuity of $F$
$$ f_{0} (x) := \begin{cases} x &, 0 \le x < {{ 1 } \over { 2 }} \\ 1 - x &, {{1} \over {2}} \le x < 1 \end{cases} $$
$$ f_{0} (x + 1) = f_{0} (x) $$
Define the periodic function $f_{0}$ as above, and then define $f_{k}$ and $F$ as follows.
$$ f_{k} (x) := {{ f_{0} ( 2^{k} x ) } \over { 2^{k} }} $$
$$ F (x) := \sum_{k=0}^{\infty} f_{k} (x) $$
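As a quick sanity check, the definitions above translate directly into code. This is our own sketch, not part of the proof: `f0` is the period-$1$ triangle wave (the distance from $x$ to the nearest integer), and a partial sum approximates $F$.

```python
def f0(x):
    """Triangle wave of period 1: x on [0, 1/2), 1 - x on [1/2, 1)."""
    t = x % 1.0
    return t if t < 0.5 else 1.0 - t

def fk(k, x):
    """f_k(x) = f_0(2^k x) / 2^k: twice as many kinks, half the height."""
    return f0(2 ** k * x) / 2 ** k

def F(x, N=30):
    """Partial sum of the series defining F; the tail is below 1/2^N."""
    return sum(fk(k, x) for k in range(N))
```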
Each $f_{k}$ is continuous on $\mathbb{R}$, as shown in the figure above.
Weierstrass M-test: If there exists a sequence of positive numbers $M_{n}$ for the sequence of functions $\left\{ f_{n} \right\}$ such that $|f_{n}(x)| \le M_{n}$ for all $x \in E$ and $\displaystyle \sum_{n=1}^{\infty} M_{n}$ converges, then $\displaystyle \sum_{n=1}^{\infty} f_{n}$ converges absolutely and uniformly on $E$.
Properties of Function Series: Suppose $\displaystyle F := \sum_{k=1}^{ \infty } f_{k}$ converges uniformly on $E$. If each $f_{n}$ is continuous at $x_{0} \in E$, then $F$ is also continuous at $x_{0} \in E$.
If we set
$$ M_{n} := {{ 1 } \over { 2^{n+1} }} $$
then, since the maximum of $f_{n}$ is ${{1} \over {2}} \cdot {{1} \over {2^{n}}}$,
$$ | f_{n} (x) | \le M_{n} $$
$$ \sum_{n=0}^{\infty} M_{n} = 1 $$
Hence, by the M-test, the series converges uniformly, and therefore $F$ is continuous.
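The bound $|f_{n}(x)| \le 1/2^{n+1}$ and the value of $\sum M_{n}$ can be checked numerically. This sketch is ours (the triangle wave `f0` is redefined so the snippet stands alone):

```python
def f0(x):
    """Triangle wave of period 1, peaking at 1/2."""
    t = x % 1.0
    return t if t < 0.5 else 1.0 - t

# f_n inherits f0's maximum 1/2, scaled by 1/2^n, so |f_n| <= M_n = 1/2^(n+1).
for n in range(8):
    M_n = 1 / 2 ** (n + 1)
    worst = max(f0(2 ** n * (i / 997)) / 2 ** n for i in range(998))
    assert worst <= M_n

# The geometric series of bounds converges (to 1), so the M-test applies.
total = sum(1 / 2 ** (n + 1) for n in range(60))
```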
Part 2. Non-differentiability of $F$
Since $F$ has period $1$, it suffices to show that it is not differentiable on $[ 0 , 1 )$ in order to conclude that it is not differentiable anywhere on $\mathbb{R}$. Suppose, for contradiction, that $F$ is differentiable at some $x_{0} \in [0,1)$.
Part 2-1. Divergence of $\displaystyle \sum_{k=0}^{\infty} c_{k}$
$$ \displaystyle \alpha_{n} := {{p} \over {2^{n} }} $$
$$ \displaystyle \beta_{n} := {{p+1} \over {2^{n} }} $$
With these definitions, for each $n \in \mathbb{N}$ we can choose $p \in \mathbb{Z}$ so that $x_{0} \in [ \alpha_{n} , \beta_{n} )$. As shown in the next figure, $[ \alpha_{n} , \beta_{n} )$ is the $(p+1)$st of the intervals of length $\displaystyle {{1} \over {2^{n}}}$ partitioning $[0,1)$, and it contains $x_{0}$.
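Choosing $p$ amounts to taking $p = \lfloor 2^{n} x_{0} \rfloor$; a small sketch of our own (the function name is hypothetical):

```python
import math

def dyadic_interval(x0, n):
    """Return (alpha_n, beta_n) = (p/2^n, (p+1)/2^n) with x0 in [alpha_n, beta_n)."""
    p = math.floor(2 ** n * x0)
    return p / 2 ** n, (p + 1) / 2 ** n

a, b = dyadic_interval(0.3, 3)   # (0.25, 0.375): the 3rd interval of length 1/8
```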
As $n$ grows, the interval $[ \alpha_{n} , \beta_{n} )$ containing $x_{0}$ halves at each step, so for every $k < n$
$$ [ \alpha_{n} , \beta_{n} ] \subseteq [ \alpha_{k+1} , \beta_{k+1} ] $$
Since $f_{k}$ only increases or decreases, with slope $\pm 1$, on $[ \alpha_{k+1} , \beta_{k+1} ]$, it does the same on the smaller or equal interval $[ \alpha_{n} , \beta_{n} ]$. Therefore, if we define $c_{k}$ as
$$ c_{k} := {{ f_{k} ( \beta_{n} ) - f_{k} ( \alpha_{n} ) } \over { \beta_{n} - \alpha_{n} }} $$
Then $c_{k}$ must be either $c_{k} = 1$ or $c_{k} = -1$ for every $k < n$, and this value does not depend on which $n > k$ is chosen. According to the properties of infinite series, since $c_{k}$ does not converge to $0$, $\displaystyle \sum_{k=0}^{\infty} c_{k}$ must diverge.
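A numerical check confirms that each slope $c_{k}$ is exactly $\pm 1$ for $k < n$. Again this is our own sketch, with hypothetical names; all quantities involved are dyadic rationals, so the floating-point arithmetic is exact.

```python
import math

def f0(x):
    """Triangle wave of period 1."""
    t = x % 1.0
    return t if t < 0.5 else 1.0 - t

def fk(k, x):
    return f0(2 ** k * x) / 2 ** k

def c(k, n, x0):
    """Slope of f_k over the dyadic interval [alpha_n, beta_n) containing x0."""
    p = math.floor(2 ** n * x0)
    a, b = p / 2 ** n, (p + 1) / 2 ** n
    return (fk(k, b) - fk(k, a)) / (b - a)

# For k < n, f_k is linear on [alpha_n, beta_n], so each slope is exactly +1 or -1.
for n in (4, 6, 8):
    assert all(c(k, n, 0.3) in (1.0, -1.0) for k in range(n))
```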
Part 2-2. $\displaystyle F ' (x_{0}) = \sum_{k=0}^{\infty} c_{k}$
Now assume $F$ is differentiable at $x_{0}$. As $n \to \infty$ we have $[ \alpha_{n} , \beta_{n} ] \to [ x_{0} , x_{0} ]$, and since the difference quotient over $[ \alpha_{n} , \beta_{n} ]$ is a convex combination of the two one-sided difference quotients at $x_{0}$, both of which tend to $F ' (x_{0})$,
$$ F ' (x_{0}) = \lim_{n \to \infty} {{ F ( \beta_{n} ) - F ( \alpha_{n} ) } \over { \beta_{n} - \alpha_{n} }} $$
Meanwhile, for $k \ge n$, both $2^{k} \alpha_{n}$ and $2^{k} \beta_{n}$ are integers, so $f_{k} ( \alpha_{n} ) = f_{k} ( \beta_{n} ) = 0$ and $F$ can be expressed as a finite sum:
$$ F ( \alpha_{n} ) = \sum_{k=0}^{\infty} f_{k} ( \alpha_{n} ) = \sum_{k=0}^{n-1} f_{k} ( \alpha_{n} ) $$
$$ F ( \beta_{n} ) = \sum_{k=0}^{\infty} f_{k} ( \beta_{n} ) = \sum_{k=0}^{n-1} f_{k} ( \beta_{n} ) $$
Then
$$ \begin{align*} \sum_{k=0}^{\infty} c_{k} =& \lim_{n \to \infty} \sum_{k=0}^{n-1} c_{k} \\ =& \lim_{n \to \infty} {{ \sum_{k=0}^{n-1} f_{k} ( \beta_{n} ) - \sum_{k=0}^{n-1} f_{k} ( \alpha_{n} ) } \over { \beta_{n} - \alpha_{n} }} \\ =& \lim_{n \to \infty} {{ F ( \beta_{n} ) - F ( \alpha_{n} ) } \over { \beta_{n} - \alpha_{n} }} \\ =& F ' ( x_{0} ) \end{align*} $$
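The vanishing of the tail terms, and the resulting identity between the difference quotient of $F$ and the finite sum of slopes, can also be verified numerically. This is our own sketch under the same hypothetical names as above:

```python
import math

def f0(x):
    """Triangle wave of period 1."""
    t = x % 1.0
    return t if t < 0.5 else 1.0 - t

def fk(k, x):
    return f0(2 ** k * x) / 2 ** k

x0, n = 0.3, 6
p = math.floor(2 ** n * x0)
a, b = p / 2 ** n, (p + 1) / 2 ** n

# For k >= n, 2^k * a and 2^k * b are integers, so f_k vanishes at both endpoints.
assert all(fk(k, a) == 0.0 and fk(k, b) == 0.0 for k in range(n, n + 12))

# The difference quotient of F over [a, b] therefore equals the finite sum of c_k,
# which is a sum of n terms, each +1 or -1 (hence an integer).
quotient = (sum(fk(k, b) for k in range(n)) - sum(fk(k, a) for k in range(n))) / (b - a)
slopes = sum((fk(k, b) - fk(k, a)) / (b - a) for k in range(n))
```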
Part 2-2 showed that $\displaystyle F ' (x_{0}) = \sum_{k=0}^{\infty} c_{k}$, but Part 2-1 showed that $\displaystyle \sum_{k=0}^{\infty} c_{k}$ diverges. This contradiction means $F$ is differentiable at no point of $[0,1)$, and hence nowhere on $\mathbb{R}$.
■