Asymptotic Notation of the Cost of Algorithms
Definition
The cost T(n) of an algorithm for input data of size n is denoted as follows:
- O notation: T(n) = O(f(n)) if there exist constants c > 0 and n₀ such that T(n) ≤ c·f(n) for all n ≥ n₀.
- Ω notation: T(n) = Ω(f(n)) if there exist constants c > 0 and n₀ such that T(n) ≥ c·f(n) for all n ≥ n₀.
- Θ notation: T(n) = Θ(f(n)) if both T(n) = O(f(n)) and T(n) = Ω(f(n)).
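As a sketch of how the O definition works in practice, the following checks the inequality numerically for a hypothetical cost function T(n) = 3n² + 5n (the function and the witnesses c = 4, n₀ = 5 are illustrative choices, not from the text above):

```python
# Hypothetical cost function: T(n) = 3n^2 + 5n.
def T(n):
    return 3 * n * n + 5 * n

# Claim: T(n) = O(n^2), witnessed by c = 4 and n0 = 5.
c, n0 = 4, 5

# The definition requires T(n) <= c * n^2 for every n >= n0.
assert all(T(n) <= c * n * n for n in range(n0, 10_000))

# Below n0 the inequality is allowed to fail, and here it does:
assert T(4) > c * 4 * 4  # 68 > 64
```

Note that the definition only constrains the cost from n₀ onward; the small-n behavior is irrelevant to the asymptotic claim.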
Explanation
Asymptotic notation is a mathematical way of expressing the cost of an algorithm. If one understands epsilon-delta arguments, it should not feel too difficult or unnatural to grasp. To add a more intuitive explanation:
- O(f(n)) is an upper bound: the worst cost an algorithm can incur grows no faster than f(n).
- Ω(f(n)) is a lower bound: the best cost an algorithm can incur grows no slower than f(n).
- Θ(f(n)) is a tight bound: the cost is pinned down from both sides, growing at exactly the order of f(n).
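To make the upper- and lower-bound bullets concrete, here is a small assumed example (not from the original text): counting comparisons in a linear search, whose best case touches one element and whose worst case touches all n.

```python
# Linear search that also reports how many comparisons it made.
def linear_search(xs, target):
    comparisons = 0
    for i, x in enumerate(xs):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = list(range(100))

# Best case: target is the first element -> 1 comparison (the Omega(1) side).
assert linear_search(data, 0)[1] == 1

# Worst case: target is absent -> n comparisons (the O(n) side).
assert linear_search(data, -1)[1] == len(data)
```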
Since O(f(n)), Ω(f(n)), and Θ(f(n)) are sets of functions, the correct notation is T(n) ∈ O(f(n)), but for convenience it is usually written as T(n) = O(f(n)). Also, since O(n) ⊆ O(n²), expressions like O(n²) + O(n) = O(n²) may appear in the development of equations. To put it intuitively, if the cost is already an enormous amount on the order of n², then an extra cost of about n is negligible, so it does not matter that it is absorbed into O(n²).
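A quick numerical sketch of why the lower-order term is absorbed: as n grows, the n term's share of a total cost n² + n shrinks toward zero.

```python
# Share of the total cost n^2 + n that comes from the lower-order n term.
for n in (10, 1_000, 100_000):
    total = n * n + n
    share = n / total  # equals 1 / (n + 1)
    print(n, share)

# The share shrinks toward 0, so n is asymptotically negligible next to n^2.
```

For n = 100,000 the linear term already accounts for less than 0.001% of the total, which is the intuition behind writing O(n²) + O(n) = O(n²).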
From the perspective of a mathematician, the most interesting quantity might be the limiting order of the cost, Θ. On the other hand, engineers, who need to use algorithms appropriately, should be concerned about the practical worst-case guarantee, O. Going further, when optimizing in a low-level language, even the constant factors hidden by the notation should be considered.