
Time Complexity of Comparison Sorting Algorithms

Theorem

Given $n$ pieces of data, the time complexity of comparison sorting algorithms is as follows:

  • [1] Bubble sort: average case $\Theta ( n^2 )$, worst case $O ( n^2 )$
  • [2] Selection sort: average case $\Theta ( n^2 )$, worst case $O ( n^2 )$
  • [3] Insertion sort: average case $\Theta ( n^2 )$, worst case $O ( n^2 )$
  • [4] Heap sort: average case $\Theta ( n \log n )$, worst case $O ( n \log n )$
  • [5] Merge sort: average case $\Theta ( n \log n )$, worst case $O ( n \log n )$
  • [6] Quick sort: average case $\Theta ( n \log n )$, worst case $O ( n^2 )$

Description

We will not go into the principles behind each of these algorithms. There are already plenty of documents that explain them well, and those principles are somewhat removed from the perspective of the Fresh Shrimp Sushi Shop.

Sorting algorithms are used in virtually every kind of programming, but merely having one ‘implemented’, or blindly trusting the ‘generally best’ algorithm, does not by itself amount to optimization. As the theorem shows, the simple sorting algorithms do sort correctly but can waste a great deal of time, and yet under the right conditions they can outperform the more sophisticated ones. Understanding these differences clearly and using the right algorithm in the right place is the programmer’s job.

[1] Bubble Sort

(Animation: Bubble_sort_animation.gif)

Named after the way values seem to rise like bubbles under the surface of water, as the animation shows. It has the worst time complexity among the comparison sorts listed here, but its implementation is so simple that it can be drafted quickly and used in constrained environments where one cannot rely on packages or built-in functions.
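
As an illustration, here is a minimal bubble sort sketch in Python; the function name `bubble_sort` and the early-exit flag are choices made for this example, not part of the original text.

```python
def bubble_sort(data):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(data)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest i + 1 elements have "bubbled up" to the end.
        for j in range(n - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return data

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```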

[2] Selection Sort

(Animation: Selection_sort_animation.gif)

Since it only ever moves the largest element out of the unsorted region, the only thing that has to be remembered beyond the data itself is that single largest element. It is poor in terms of time complexity, but excellent in terms of space complexity.
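
A minimal in-place sketch in Python, following the “move the largest element to the end of the unsorted region” description above (the names here are illustrative):

```python
def selection_sort(data):
    """Sort in place; beyond the data itself, only the index of the current
    largest element in the unsorted region has to be remembered."""
    n = len(data)
    # Shrink the unsorted region data[0..end] from the right.
    for end in range(n - 1, 0, -1):
        max_idx = 0
        for j in range(1, end + 1):
            if data[j] > data[max_idx]:
                max_idx = j
        # One swap per pass, so the extra memory stays O(1).
        data[max_idx], data[end] = data[end], data[max_idx]
    return data

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
```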

[3] Insertion Sort

(Animation: Insertion_sort_animation.gif)

Each element is taken out of the array ($n$ times) and the position where it belongs among the already-sorted elements is found (up to $n$ comparisons) before it is inserted, so the algorithm is indeed $O(n^2)$. However, if the array is almost sorted, hardly any time is spent searching for a new position, so it can run in close to $\Theta ( n )$ time. In real programming, ‘the array being almost sorted’ often just means the array is small: for $n < 20$ or so, the array will frequently turn out to be nearly sorted by sheer luck.
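
A sketch of the common in-place variant in Python (it shifts elements within the same array rather than building a new one):

```python
def insertion_sort(data):
    """Insert each element into place among the already-sorted prefix."""
    for i in range(1, len(data)):
        key = data[i]
        j = i - 1
        # Shift larger elements one slot to the right.  On nearly sorted
        # input this loop exits almost immediately, giving close to Theta(n).
        while j >= 0 and data[j] > key:
            data[j + 1] = data[j]
            j -= 1
        data[j + 1] = key
    return data

print(insertion_sort([2, 1, 3, 5, 4]))  # [1, 2, 3, 4, 5]
```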

[4] Heap Sort

(Animation: Sorting_heapsort_anim.gif)

Once the data is organized as a heap, each extraction takes only $O ( \log n )$ time, so the sort itself costs $O ( n \log n )$. Building the heap takes $O( n )$ time, but even taking that into account, heap sort is very fast compared to [1]~[3]. It is particularly advantageous when sorting does not have to be processed all at once: the most expensive part is constructing the heap, but if data is added a little at a time, repairing an already-constructed heap is not much of a burden, and once the heap is in place the sorting itself is extremely fast.
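
A sketch using Python’s standard `heapq` module, as one plausible way to realize the “build once, repair cheaply” idea above:

```python
import heapq

def heap_sort(data):
    """Build a heap in O(n), then pop the minimum n times at O(log n) each."""
    heap = list(data)
    heapq.heapify(heap)                       # O(n) heap construction
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort([9, 4, 7, 1, 3]))  # [1, 3, 4, 7, 9]

# When data arrives a little at a time, the existing heap is only repaired:
heap = [9, 4, 7, 1, 3]
heapq.heapify(heap)
heapq.heappush(heap, 5)            # O(log n) insertion into the built heap
print([heapq.heappop(heap) for _ in range(len(heap))])  # [1, 3, 4, 5, 7, 9]
```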

[5] Merge Sort

(Animation: Merge-sort-example-300px.gif)

Developed by John von Neumann, it is a very fast algorithm with a time complexity of $O ( n \log n )$, and it is a stable sort: elements with the same value keep their relative order after sorting. Simply put, if the input is $(1, 3_{2}, 2, 3_{1}, 5)$, the output is $(1, 2, 3_{2}, 3_{1}, 5)$, with $3_{2}$ still ahead of $3_{1}$. This may seem useless for a one-dimensional array, but for a data table that has to be sorted by some key, this stability matters: the order of $3_{2}$ and $3_{1}$ themselves does not matter, but the records they carry may already be arranged in some meaningful order.
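
A minimal merge sort sketch in Python; the `key` parameter and the record example are added here only to show the stability property described above:

```python
def merge_sort(data, key=lambda x: x):
    """Recursively split, then merge.  Using '<=' when keys tie keeps the
    left element first, which is exactly what makes the sort stable."""
    if len(data) <= 1:
        return data
    mid = len(data) // 2
    left, right = merge_sort(data[:mid], key), merge_sort(data[mid:], key)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if key(left[i]) <= key(right[j]):
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

# The two records with key 3 keep their original order, as in (1, 3_2, 2, 3_1, 5).
records = [(1, 'a'), (3, 'second'), (2, 'b'), (3, 'first'), (5, 'c')]
print(merge_sort(records, key=lambda r: r[0]))
# [(1, 'a'), (2, 'b'), (3, 'second'), (3, 'first'), (5, 'c')]
```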

[6] Quick Sort

(Animation: Sorting_quicksort_anim.gif)

Quick sort’s biggest advantage is, as its name implies, its speed. On paper it looks inferior to heap sort and merge sort with a worst case of $O( n^2)$, and its average of $\Theta ( n \log n )$ is no better than theirs, but in practical use it shows superior performance in almost all cases. The ‘generally best algorithm’ mentioned earlier is exactly this quick sort. By the same token, once the characteristics of the other sorting algorithms above are understood, one can reach for quick sort by default while smartly opting for the others where they fit better.
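
For completeness, a short quick sort sketch in Python. A randomly chosen pivot makes the $O(n^2)$ worst case unlikely; note that this list-building version is written for readability, whereas much of the speed quick sort is famous for comes from in-place partitioning and cache-friendly memory access.

```python
import random

def quick_sort(data):
    """Average Theta(n log n); consistently bad pivots degrade it to O(n^2),
    which the random pivot choice makes very unlikely."""
    if len(data) <= 1:
        return data
    pivot = random.choice(data)
    less    = [x for x in data if x < pivot]
    equal   = [x for x in data if x == pivot]
    greater = [x for x in data if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([3, 6, 2, 8, 5, 1]))  # [1, 2, 3, 5, 6, 8]
```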