Asymptotic Notation of the Cost of Algorithms 📂Algorithm

Definition

The cost of an algorithm for data of size $n$ is denoted as follows:

  1. $O$ notation: $$ O(g(n)) := \left\{ f(n) \ | \ \exists c > 0, n_{0} \in \mathbb{N} : \forall n \ge n_{0}, \ f(n) \le c g(n) \right\} $$
  2. $\Omega$ notation: $$ \Omega (g(n)) := \left\{ f(n) \ | \ \exists c > 0, n_{0} \in \mathbb{N} : \forall n \ge n_{0}, \ f(n) \ge c g(n) \right\} $$
  3. $\Theta$ notation: $$ \Theta (g(n)) := O (g(n)) \cap \Omega (g(n)) $$
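The definition of $O$ can be illustrated numerically: to show $7n^3 + 2n \in O(n^3)$, pick the witnesses $c = 8$ and $n_{0} = 2$, since $2n \le n^3$ once $n \ge 2$. A minimal sketch in Python (the function names and witnesses are my own choices, not from the definition above):

```python
def f(n):
    """An example cost function: 7n^3 + 2n."""
    return 7 * n**3 + 2 * n

def g(n):
    """The candidate growth rate: n^3."""
    return n**3

# Witnesses for f(n) in O(g(n)): c = 8, n0 = 2.
# The definition requires f(n) <= c * g(n) for every n >= n0.
c, n0 = 8, 2
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

Of course, a finite check only illustrates the definition; the inequality $2n \le n^3$ for $n \ge 2$ is what proves it for all $n$.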

Explanation

Asymptotic notation is a mathematical representation of the cost of an algorithm. It won’t be too difficult or unnatural to grasp if one understands the epsilon-delta argument. To add a more intuitive explanation:

  • $O(g(n))$ is an upper bound: the cost grows no faster than $g(n)$, so it caps how bad the algorithm can get.
  • $\Omega (g(n))$ is a lower bound: the cost grows at least as fast as $g(n)$, so it caps how good the algorithm can get.
  • $\Theta (g(n))$ is a tight bound: the cost grows at exactly the rate of $g(n)$, since it is bounded above and below by it.
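As a concrete illustration (insertion sort is my example here, not from the text above): insertion sort makes only $n - 1$ comparisons on already-sorted input, but $n(n-1)/2$ on reversed input, so its comparison count is $\Omega(n)$ and $O(n^2)$. A sketch that counts comparisons directly:

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` by insertion sort; return the number of key comparisons."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
best = insertion_sort_comparisons(list(range(n)))         # sorted input: n - 1 = 99
worst = insertion_sort_comparisons(list(range(n, 0, -1)))  # reversed input: n(n-1)/2 = 4950
```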

Since these are sets, the correct statement is $7n^3 \in O (n^3)$, but for convenience it is often written as $7n^3 = O (n^3)$. Also, since $O(n) \subset O(n^3)$, expressions like $O(n^3) + O(n) = O(n^3)$ may appear in the development of equations. Intuitively, if the cost is already an enormous $O(n^3)$, then an additional cost of about $O(n)$ is negligible, so nothing is lost by absorbing it into $O(n^3)$.
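This absorption can be seen numerically: for a cost with both a cubic and a linear term, the ratio to the dominant term settles toward a constant as $n$ grows, so only the cubic term matters asymptotically. A small sketch (the coefficients are arbitrary choices for illustration):

```python
def f(n):
    """A cost with a cubic and a (deliberately large) linear component."""
    return 7 * n**3 + 1000 * n

# As n grows, f(n) / n^3 approaches the constant 7:
# the O(n) term is swallowed by the O(n^3) term.
for n in (10, 100, 1000, 10_000):
    print(n, f(n) / n**3)
```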

From a mathematician's perspective, the most interesting object may be the limiting behavior of the cost, $O(g(n))$. Engineers, on the other hand, who must choose algorithms appropriately, should care about the practical cost, $\Theta (g(n))$. Going further, anyone optimizing in a low-level language should consider even $\Omega (g(n))$.