Module 5 · Lesson 1

Sorting Algorithms

Implement and compare bubble sort, merge sort, and quicksort. Understand why O(n log n) is the gold standard.



Sorting is one of the most studied problems in computer science. Understanding sorting algorithms teaches you fundamental concepts like divide-and-conquer, time complexity, and algorithm tradeoffs.

Bubble Sort — O(n^2)

The simplest sorting algorithm: repeatedly walk through the list, compare adjacent elements, and swap them if they're in the wrong order. Like bubbles rising to the surface.

Pros: simple to understand and implement.
Cons: extremely slow for large datasets.
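The adjacent-swap idea above can be sketched in a few lines. This is a minimal illustration (the lesson doesn't fix a language, so Python is an assumption here); the `swapped` flag is a common refinement that stops early once a pass makes no swaps:

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # after pass i, the largest i+1 elements have "bubbled" to the end
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # → [1, 2, 4, 5, 8]
```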

Merge Sort — O(n log n)

Divide the array in half, recursively sort each half, then merge the sorted halves back together. This is divide and conquer in action.

Pros: guaranteed O(n log n) performance; stable sort.
Cons: uses O(n) extra memory for the merge step.
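The divide-and-merge structure described above can be sketched like this (again in Python as an assumed language). Note the `<=` in the merge: taking from the left half on ties is what makes the sort stable:

```python
def merge_sort(items):
    """Return a new sorted list using divide and conquer."""
    if len(items) <= 1:
        return items  # a list of 0 or 1 elements is already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # recursively sort each half
    right = merge_sort(items[mid:])
    # merge the two sorted halves — this is the O(n) extra-memory step
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps equal elements in order (stability)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])           # append whatever remains of either half
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # → [3, 9, 10, 27, 38, 43, 82]
```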

The O(n log n) Barrier

No comparison-based sorting algorithm can do better than O(n log n) in the worst case. This is a mathematically proven lower bound: a comparison sort must distinguish between all n! possible orderings of the input, so it needs at least log₂(n!) ≈ n log n comparisons. Merge sort achieves this bound in the worst case; quicksort achieves it on average (its worst case is O(n²), though a good pivot choice makes that rare in practice).
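You can see the lower bound concretely by computing log₂(n!) — the minimum number of yes/no comparisons needed to pick out one of n! orderings — and comparing it to n log₂ n. A small sketch:

```python
import math

def min_comparisons(n):
    # Any comparison sort must distinguish all n! input orderings,
    # so its worst case needs at least ceil(log2(n!)) comparisons.
    return math.ceil(math.log2(math.factorial(n)))

for n in (10, 100, 1000):
    print(n, min_comparisons(n), round(n * math.log2(n)))
```

The two columns grow at the same rate, which is why O(n log n) is the gold standard for comparison sorts.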

Try It Yourself

  • Add a counter to track how many comparisons each algorithm makes
  • Implement quicksort (pick a pivot, partition, recurse on each side)
  • Test with sorted, reverse-sorted, and random input — how does performance differ?
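As a starting point for the first and third exercises, here is one way to instrument bubble sort with a comparison counter and run it on the three input shapes (a sketch only; the same counting idea carries over to merge sort and quicksort):

```python
import random

def bubble_sort_counting(items):
    """Bubble sort that also returns how many comparisons it made."""
    items = list(items)  # work on a copy
    comparisons = 0
    for i in range(len(items) - 1):
        swapped = False
        for j in range(len(items) - 1 - i):
            comparisons += 1
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # early exit: one clean pass means we're done
            break
    return items, comparisons

n = 100
for name, data in [("sorted", list(range(n))),
                   ("reverse-sorted", list(range(n, 0, -1))),
                   ("random", random.sample(range(n), n))]:
    _, count = bubble_sort_counting(data)
    print(f"{name}: {count} comparisons")
```

On sorted input the early exit fires after one pass (n−1 comparisons); reverse-sorted input is the worst case at n(n−1)/2.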

