
Rigorous Definition of Dynamical Systems

Definition 1

For a space $X$ and a time $t \in T$, an operator $\varphi^{t} : X \to X$ is called a Flow. If the set of flows $F := \left\{ \varphi^{t} \right\}_{t \in T}$ forms a group $\left( F , \circ \right)$ under the function composition operation $\circ$, that is, if $$ \begin{align*} \varphi^{0} =& \text{id} \\ \varphi^{t+s} =& \varphi^{t} \circ \varphi^{s} \end{align*} $$ then the triple $\left( T, X, \varphi^{t} \right)$ is called a Dynamical System.
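
As a simple check, take for example $X = \mathbb{R}$, $T = \mathbb{R}$, and the flow $\varphi^{t} (x) = e^{at} x$ of the linear equation $\dot{x} = a x$. Then $$ \begin{align*} \varphi^{0} (x) =& e^{0} x = x = \text{id} (x) \\ \left( \varphi^{t} \circ \varphi^{s} \right) (x) =& e^{at} e^{as} x = e^{a(t+s)} x = \varphi^{t+s} (x) \end{align*} $$ so both conditions hold and $\left( \mathbb{R}, \mathbb{R}, \varphi^{t} \right)$ is a dynamical system.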

Explanation

It is often said that maps correspond to $T = \mathbb{Z}$ and differential equations to $T = \mathbb{R}$. Put differently, strictly speaking a dynamical system is not defined only through maps and differential equations; these are just the two most common cases.
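
For example, a map $f : X \to X$ gives a dynamical system with $T = \mathbb{Z}_{\ge 0}$ (or $T = \mathbb{Z}$ when $f$ is invertible) by taking the flows to be the iterates $$ \varphi^{0} := \text{id} , \qquad \varphi^{n} := \underbrace{f \circ \cdots \circ f}_{n} $$ so that $\varphi^{n+m} = \varphi^{n} \circ \varphi^{m}$ holds by construction; likewise, the time-$t$ solution operator of a differential equation plays the role of $\varphi^{t}$ with $T = \mathbb{R}$.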

Dynamical systems are often explained as systems in which the state at any moment can be expressed in terms of past states, but this description is neither rigorous nor particularly intuitive. To understand the concept without formulas, it is better to study examples such as dynamical systems given by maps or dynamical systems given by differential equations; for those who prefer formulas, the definition above should be appealing.

  1. Kuznetsov. (1998). Elements of Applied Bifurcation Theory (2nd Edition): p27. ↩︎