Visualizing Algorithms: A Deep Dive into Sorting


Tags: Algorithms, DSA



Introduction to Sorting Algorithms

Sorting is one of the most fundamental operations in computer science. It is the process of arranging data in a specific order, typically numerical or lexicographical. Sorting also enables efficient algorithms (like binary search) that require their input to already be sorted.

In this guide, we'll explore four essential sorting algorithms ranging from elementary to advanced: Bubble Sort, Insertion Sort, Merge Sort, and Quick Sort.

While elementary algorithms like Bubble and Insertion sort are not the most efficient for large datasets, they are excellent for understanding the basic concepts of algorithm design. Advanced algorithms like Merge and Quick sort introduce the powerful "Divide and Conquer" paradigm.


1. Bubble Sort

Bubble Sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. The pass through the list is repeated until the list is completely sorted.

It gets its name because smaller (or larger) elements "bubble" to the top of the list during each iteration.

Complexity

| Metric | Complexity | Description |
| --- | --- | --- |
| Best Time | $\mathcal{O}(N)$ | When the array is already sorted. |
| Average Time | $\mathcal{O}(N^2)$ | When elements are in random order. |
| Worst Time | $\mathcal{O}(N^2)$ | When the array is reverse-sorted. |
| Space | $\mathcal{O}(1)$ | In-place sorting; no extra memory needed. |

Implementation (C++)

```cpp
#include <vector>
#include <utility>

void bubbleSort(std::vector<int>& arr) {
    int n = static_cast<int>(arr.size());
    for (int i = 0; i < n - 1; i++) {
        bool swapped = false;
        // Last i elements are already in place
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                std::swap(arr[j], arr[j + 1]);
                swapped = true;
            }
        }
        // If no two elements were swapped, the array is sorted
        if (!swapped) break;
    }
}
```

Bubble Sort performs $n-1$ comparisons in the first pass, $n-2$ in the second, and so on. This reduction of the problem size by 1 in each step leads to the following relation:

$$
T(n) = \begin{cases} \mathcal{O}(1) & \text{if } n = 1 \\ T(n-1) + \mathcal{O}(n) & \text{if } n > 1 \end{cases}
$$

Bubble Sort – Total Comparisons

Bubble Sort's complexity is derived from the sum of the comparisons made in each pass. In the first pass, we make $n-1$ comparisons, then $n-2$, and so on, down to $1$.

$$
\text{Total Comparisons} = \sum_{i=1}^{n-1} (n - i) = (n-1) + (n-2) + \dots + 1
$$

Visualizing Bubble Sort

Let's see Bubble Sort in action. Notice how the largest unsorted element continuously shifts to the rightmost available slot.

*(Interactive visualizer legend: unsorted, comparing/swapping, sorted.)*

2. Insertion Sort

Insertion sort builds the final sorted array one item at a time. It iterates through the input array, consuming one element per iteration, and grows a sorted prefix.

Think of it like sorting a hand of playing cards: you pick up a card, find its correct position among the cards you are already holding, and insert it there.

Complexity

| Metric | Complexity | Description |
| --- | --- | --- |
| Best Time | $\mathcal{O}(N)$ | When the array is already sorted. |
| Average Time | $\mathcal{O}(N^2)$ | When elements are in random order. |
| Worst Time | $\mathcal{O}(N^2)$ | When the array is reverse-sorted. |
| Space | $\mathcal{O}(1)$ | In-place sorting; no extra memory needed. |

Implementation (C++)

```cpp
#include <vector>

void insertionSort(std::vector<int>& arr) {
    int n = static_cast<int>(arr.size());
    for (int i = 1; i < n; i++) {
        int current = arr[i];
        int j = i - 1;
        // Shift elements greater than current one position to the right
        while (j >= 0 && arr[j] > current) {
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = current;
    }
}
```

Similar to Bubble Sort, Insertion Sort takes an element and compares it with the already sorted subset (reducing the unsorted problem size by 1), resulting in:

$$
T(n) = \begin{cases} \mathcal{O}(1) & \text{if } n = 1 \\ T(n-1) + \mathcal{O}(n) & \text{if } n > 1 \end{cases}
$$

Insertion Sort – Worst Case

In the worst case (a reverse-sorted array), the element at index $i$ must be compared with, and shifted past, all $i$ elements to its left.

$$
T(n) = \sum_{i=1}^{n-1} i = 1 + 2 + 3 + \dots + (n-1)
$$

This summation is identical to Bubble Sort:

$$
T(n) = \frac{n(n-1)}{2} \implies \mathcal{O}(n^2)
$$

Visualizing Insertion Sort

Watch how Insertion Sort builds the sorted portion of the array sequentially on the left side.

*(Interactive visualizer legend: unsorted, comparing/swapping, sorted.)*

3. Merge Sort

Merge Sort is a highly efficient, stable sorting algorithm based on the Divide and Conquer strategy. It works by recursively dividing the unsorted list into $N$ sublists, each containing one element, and then repeatedly merging sublists to produce new sorted sublists until only one sorted list remains.

Complexity

| Metric | Complexity | Description |
| --- | --- | --- |
| Best Time | $\mathcal{O}(N \log N)$ | Consistently halves the array. |
| Average Time | $\mathcal{O}(N \log N)$ | Scales well with large datasets. |
| Worst Time | $\mathcal{O}(N \log N)$ | Performance is guaranteed regardless of input. |
| Space | $\mathcal{O}(N)$ | Requires temporary arrays for the merging phase. |

Implementation (C++)

```cpp
#include <vector>

void merge(std::vector<int>& arr, int left, int mid, int right) {
    std::vector<int> temp(right - left + 1);
    int i = left, j = mid + 1, k = 0;
    while (i <= mid && j <= right) {
        if (arr[i] <= arr[j]) temp[k++] = arr[i++];
        else temp[k++] = arr[j++];
    }
    while (i <= mid) temp[k++] = arr[i++];
    while (j <= right) temp[k++] = arr[j++];
    for (int p = 0; p < k; p++) {
        arr[left + p] = temp[p];
    }
}

void mergeSort(std::vector<int>& arr, int left, int right) {
    if (left >= right) return;
    int mid = left + (right - left) / 2;
    mergeSort(arr, left, mid);
    mergeSort(arr, mid + 1, right);
    merge(arr, left, mid, right);
}
```

Merge Sort follows a perfect "Divide and Conquer" strategy, splitting the array into two equal halves and then performing a linear-time merge:

$$
T(n) = \begin{cases} \mathcal{O}(1) & \text{if } n = 1 \\ 2T\left(\frac{n}{2}\right) + \mathcal{O}(n) & \text{if } n > 1 \end{cases}
$$

Merge Sort – Expansion Method

To find the complexity of Merge Sort, we expand the recurrence relation $T(n) = 2T(n/2) + n$ through repeated substitution:

1. Level 1: $T(n) = 2T(n/2) + n$
2. Level 2: $T(n) = 2[2T(n/4) + n/2] + n = 4T(n/4) + 2n$
3. Level 3: $T(n) = 8T(n/8) + 3n$
4. Level $k$: $T(n) = 2^k T(n/2^k) + kn$

The recursion stops when $n/2^k = 1$, i.e. when $k = \log_2 n$. Substituting $k$ back into the equation:

$$
T(n) = n \cdot T(1) + n \log_2 n = n + n \log_2 n \implies T(n) = \mathcal{O}(n \log n)
$$

Visualizing Merge Sort

Observe how the algorithm groups smaller segments and zips them together into larger, sorted blocks.

*(Interactive visualizer legend: unsorted, comparing/swapping, sorted.)*

4. Quick Sort

Quick Sort is another prominent Divide and Conquer algorithm. It works by selecting a "pivot" element from the array and partitioning the other elements into two sub-arrays, according to whether they are less than or greater than the pivot. The sub-arrays are then sorted recursively.

Despite having a worse theoretical worst-case time than Merge Sort, it is often faster in practice because its inner loop can be efficiently implemented on most architectures.

Complexity

| Metric | Complexity | Description |
| --- | --- | --- |
| Best Time | $\mathcal{O}(N \log N)$ | When the pivot perfectly halves the array. |
| Average Time | $\mathcal{O}(N \log N)$ | Expected time for randomized inputs. |
| Worst Time | $\mathcal{O}(N^2)$ | When the array is already sorted (if the pivot is poorly chosen). |
| Space | $\mathcal{O}(\log N)$ | Call stack space for recursion. |

Implementation (C++)

```cpp
#include <vector>
#include <utility>

int partition(std::vector<int>& arr, int low, int high) {
    int pivot = arr[high];  // Choosing the last element as pivot
    int i = low - 1;        // Index of the last element smaller than the pivot
    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {
            i++;
            std::swap(arr[i], arr[j]);
        }
    }
    std::swap(arr[i + 1], arr[high]);
    return i + 1;
}

void quickSort(std::vector<int>& arr, int low, int high) {
    if (low < high) {
        int pi = partition(arr, low, high);
        // Recursively sort elements before and after the partition
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}
```

In the average case (where the pivot splits the array into roughly equal halves), Quick Sort follows a relation identical to Merge Sort:

$$
T(n) = \begin{cases} \mathcal{O}(1) & \text{if } n = 1 \\ 2T\left(\frac{n}{2}\right) + \mathcal{O}(n) & \text{if } n > 1 \end{cases}
$$

In the worst case (where the pivot is the smallest or largest element, such as in an already sorted array), the problem size only reduces by 1, making it as slow as Bubble Sort:

$$
T(n) = \begin{cases} \mathcal{O}(1) & \text{if } n = 1 \\ T(n-1) + \mathcal{O}(n) & \text{if } n > 1 \end{cases}
$$

Quick Sort – Average Case Expansion

In the average case, where each partition splits the array into two roughly equal halves, the total cost is the sum of the work at each level of the recursion tree. Since the tree depth is $\log n$ and the work at each level is $n$:

$$
T(n) = \sum_{i=1}^{\log_2 n} n = \underbrace{n + n + \dots + n}_{\log_2 n \text{ times}} = n \log_2 n = \mathcal{O}(n \log n)
$$

Quick Sort – Worst Case Expansion

In the worst case, the pivot is always the smallest or largest element. This produces a highly unbalanced recursion tree with depth $n$. The total work is:

$$
T(n) = n + (n-1) + (n-2) + \dots + 1 = \sum_{k=1}^{n} k = \frac{n(n+1)}{2} \implies T(n) = \mathcal{O}(n^2)
$$

Visualizing Quick Sort

Notice how a pivot is chosen, and elements are physically thrown to the left or right of the pivot before recursing on those halves.

*(Interactive visualizer legend: unsorted, comparing/swapping, sorted.)*

Comparison Summary Table

| Algorithm | Recurrence $T(n)$ | Expansion Form | Closed Form |
| --- | --- | --- | --- |
| Bubble | $T(n-1) + n$ | $\sum_{i=1}^{n} i$ | $\mathcal{O}(n^2)$ |
| Insertion | $T(n-1) + n$ | $\sum_{i=1}^{n} i$ | $\mathcal{O}(n^2)$ |
| Merge | $2T(n/2) + n$ | $\sum_{i=1}^{\log n} n$ | $\mathcal{O}(n \log n)$ |
| Quick (Avg) | $2T(n/2) + n$ | $\sum_{i=1}^{\log n} n$ | $\mathcal{O}(n \log n)$ |
| Quick (Worst) | $T(n-1) + n$ | $\sum_{i=1}^{n} i$ | $\mathcal{O}(n^2)$ |

Conclusion

Understanding these sorting algorithms is a crucial milestone in mastering data structures and computer science fundamentals.

- Use Bubble/Insertion sort when dealing with tiny datasets or nearly sorted data.
- Use Merge sort when you need a guaranteed $\mathcal{O}(N \log N)$ stable sort and have memory to spare.
- Use Quick sort for standard, highly optimized, in-place sorting on general data.

Keep practicing and analyzing their visual behaviors to build an intuitive understanding of how these algorithms manipulate memory in real-time!
