Sorting and Algorithms - From Chaos to Order
Introduction: When I Finally Got It
I didn't understand why algorithms mattered. I just wrote loops to sort data. Sure, my code was slow sometimes, but it worked.
Then I interviewed at a company and they asked me to sort 1 million numbers. I wrote bubble sort. They said "okay, how long would that take?" I said "a few seconds probably."
They explained: wrong. About 15 minutes on a decent computer.
Then they showed me merge sort. Same 1 million numbers: 20 milliseconds.
That's when it clicked. Algorithm choice isn't a minor detail; it's the difference between "this works" and "this doesn't finish at all."
I started learning sorting algorithms, complexity analysis, Big O notation. Not for interviews (though that helped). For actual coding. Because picking the right algorithm changed everything.
This guide is about practical algorithmic thinking.
1. Big O Notation and Complexity Analysis
Understanding Big O
Big O measures how algorithm performance scales as input grows
Notation: O(f(n)) - upper bound on growth
Common Classes (fastest to slowest):
O(1) - Constant time (array access, hash lookup)
O(log n) - Logarithmic (binary search)
O(n) - Linear (simple loop through array)
O(n log n) - Linearithmic (merge sort, quick sort)
O(n²) - Quadratic (nested loops)
O(2ⁿ) - Exponential (recursive without memoization)
O(n!) - Factorial (permutations)
Visual Scalability:
Input Size | O(1) | O(n) | O(n²) | O(2ⁿ)
1000 | 1 | 1K | 1M | 10³⁰⁰ (impossible)
10,000 | 1 | 10K | 100M | ∞
100,000 | 1 | 100K | 10B | ∞
Analyzing Complexity
// O(1) - Constant time
function getFirst(arr) {
return arr[0];
}
// O(n) - Linear time (loop once)
function findMax(arr) {
let max = arr[0];
for (let i = 1; i < arr.length; i++) {
if (arr[i] > max) max = arr[i];
}
return max;
}
// O(n²) - Quadratic time (nested loops)
function bubbleSort(arr) {
for (let i = 0; i < arr.length; i++) {
for (let j = 0; j < arr.length - 1; j++) {
if (arr[j] > arr[j + 1]) {
[arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
}
}
}
return arr;
}
// O(log n) - Binary search on sorted array
function binarySearch(arr, target) {
let left = 0, right = arr.length - 1;
while (left <= right) {
const mid = Math.floor((left + right) / 2);
if (arr[mid] === target) return mid;
if (arr[mid] < target) left = mid + 1;
else right = mid - 1;
}
return -1;
}
// O(n log n) - Merge sort
function mergeSort(arr) {
if (arr.length <= 1) return arr;
const mid = Math.floor(arr.length / 2);
const left = mergeSort(arr.slice(0, mid));
const right = mergeSort(arr.slice(mid));
return merge(left, right);
}
function merge(left, right) {
const result = [];
let i = 0, j = 0;
while (i < left.length && j < right.length) {
if (left[i] <= right[j]) result.push(left[i++]);
else result.push(right[j++]);
}
return result.concat(left.slice(i), right.slice(j));
}
2. Sorting Algorithms
Bubble Sort - O(n²)
Concept: Repeatedly swap adjacent elements if they're in wrong order
Process (sorting [5, 2, 8, 1]):
Pass 1: [2, 5, 1, 8] (largest bubbles to end)
Pass 2: [2, 1, 5, 8] (second largest in place)
Pass 3: [1, 2, 5, 8] (sorted)
Time Complexity:
- Best: O(n) - already sorted with optimization
- Average: O(n²)
- Worst: O(n²) - reverse order
Space: O(1) - sorts in-place
function bubbleSort(arr) {
const n = arr.length;
for (let i = 0; i < n - 1; i++) {
let swapped = false;
// Last i elements already sorted
for (let j = 0; j < n - i - 1; j++) {
if (arr[j] > arr[j + 1]) {
[arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
swapped = true;
}
}
// Optimization: if no swaps, already sorted
if (!swapped) break;
}
return arr;
}
// Visualization: [5, 2, 8, 1, 9]
// Pass 1: compare 5-2, 2-8, 8-1, 1-9 → [2, 5, 1, 8, 9]
// Pass 2: compare 2-5, 5-1, 1-8 → [2, 1, 5, 8, 9]
// Pass 3: compare 2-1, 1-5 → [1, 2, 5, 8, 9]
// Done!
When to use:
- Educational purposes
- Nearly sorted data
- Small datasets
- NOT for production (too slow)
Selection Sort - O(n²)
Concept: Find minimum, move to sorted position, repeat
Process:
1. Find minimum in unsorted portion
2. Swap with first unsorted position
3. Move boundary left
4. Repeat
Time: O(n²) always - no optimization possible
Space: O(1) - in-place
function selectionSort(arr) {
const n = arr.length;
for (let i = 0; i < n - 1; i++) {
let minIdx = i;
// Find minimum in remaining array
for (let j = i + 1; j < n; j++) {
if (arr[j] < arr[minIdx]) {
minIdx = j;
}
}
// Swap minimum to sorted position
[arr[i], arr[minIdx]] = [arr[minIdx], arr[i]];
}
return arr;
}
Insertion Sort - O(n²) Average
Concept: Build sorted array one item at a time
Like sorting playing cards in hand:
- Take next card
- Find correct position
- Insert in sorted array
- Repeat
Time:
- Best: O(n) - already sorted
- Average: O(n²)
- Worst: O(n²)
Space: O(1) - in-place
Stable: Yes (maintains order of equal elements)
function insertionSort(arr) {
for (let i = 1; i < arr.length; i++) {
const key = arr[i];
let j = i - 1;
// Find correct position for key
while (j >= 0 && arr[j] > key) {
arr[j + 1] = arr[j];
j--;
}
arr[j + 1] = key;
}
return arr;
}
// Why it's good for nearly sorted data:
// [1, 2, 3, 4, 5, 0]
// 0 moves to position 0 → O(n) because mostly in order
Merge Sort - O(n log n) Guaranteed
Concept: Divide-and-conquer
1. Divide array in half recursively
2. Merge sorted halves
Time: O(n log n) - always
Space: O(n) - requires extra space
Stable: Yes
function mergeSort(arr) {
if (arr.length <= 1) return arr;
const mid = Math.floor(arr.length / 2);
const left = mergeSort(arr.slice(0, mid));
const right = mergeSort(arr.slice(mid));
return merge(left, right);
}
function merge(left, right) {
const result = [];
let i = 0, j = 0;
while (i < left.length && j < right.length) {
if (left[i] <= right[j]) {
result.push(left[i++]);
} else {
result.push(right[j++]);
}
}
// Add remaining elements
return result.concat(left.slice(i), right.slice(j));
}
// Visualization:
// [38, 27, 43, 3, 9, 82, 10]
// Divide: [38, 27, 43] | [3, 9, 82, 10]
// Divide: [38] [27, 43] | [3, 9] [82, 10]
// Merge: [27, 38, 43] | [3, 9, 10, 82]
// Merge: [3, 9, 10, 27, 38, 43, 82]
Advantages:
- Predictable O(n log n)
- Stable
- Good for linked lists
- Parallelizable
Disadvantages:
- Requires O(n) extra space
- Slower for small arrays than insertion sort
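The "good for linked lists" point deserves a sketch: merging two sorted linked lists needs only pointer rewiring, no auxiliary array, which is why merge sort is the natural choice there. A minimal version, assuming a `{ value, next }` node shape (the names `mergeLists`, `toList`, `toArray` are mine, not from the text above):

```javascript
// Merge two sorted singly linked lists by relinking nodes - O(1) extra space
// Assumed node shape: { value, next }
function mergeLists(a, b) {
  const dummy = { value: null, next: null };
  let tail = dummy;
  while (a && b) {
    if (a.value <= b.value) { tail.next = a; a = a.next; }
    else { tail.next = b; b = b.next; }
    tail = tail.next;
  }
  tail.next = a || b; // Attach whichever list still has nodes
  return dummy.next;
}

// Helpers to convert between arrays and lists for demonstration
function toList(arr) {
  let head = null;
  for (let i = arr.length - 1; i >= 0; i--) head = { value: arr[i], next: head };
  return head;
}
function toArray(head) {
  const out = [];
  for (let n = head; n; n = n.next) out.push(n.value);
  return out;
}
```

For example, `toArray(mergeLists(toList([1, 4, 5]), toList([2, 3, 6])))` gives `[1, 2, 3, 4, 5, 6]`. Taking `<=` rather than `<` keeps the merge stable.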
Quick Sort - O(n log n) Average, O(n²) Worst
Concept: Partition-based sorting
1. Pick pivot element
2. Partition: smaller left, larger right
3. Recursively sort partitions
Time:
- Best: O(n log n) - balanced partitions
- Average: O(n log n)
- Worst: O(n²) - bad pivot choice
Space: O(log n) - recursion depth
Stable: Depends on implementation
function quickSort(arr, low = 0, high = arr.length - 1) {
if (low < high) {
const pi = partition(arr, low, high);
quickSort(arr, low, pi - 1);
quickSort(arr, pi + 1, high);
}
return arr;
}
function partition(arr, low, high) {
// Pivot strategy: pick last element
const pivot = arr[high];
let i = low - 1;
for (let j = low; j < high; j++) {
if (arr[j] < pivot) {
i++;
[arr[i], arr[j]] = [arr[j], arr[i]];
}
}
[arr[i + 1], arr[high]] = [arr[high], arr[i + 1]];
return i + 1;
}
// Better pivot strategy: random (swap this in for partition above)
function partitionRandom(arr, low, high) {
const randomIdx = low + Math.floor(Math.random() * (high - low + 1));
[arr[randomIdx], arr[high]] = [arr[high], arr[randomIdx]];
const pivot = arr[high];
let i = low - 1;
for (let j = low; j < high; j++) {
if (arr[j] < pivot) {
i++;
[arr[i], arr[j]] = [arr[j], arr[i]];
}
}
[arr[i + 1], arr[high]] = [arr[high], arr[i + 1]];
return i + 1;
}Why Quick Sort is preferred:
- Faster in practice than merge sort (lower constants)
- O(log n) space vs merge sort's O(n)
- Cache-friendly (works in-place)
- Variants used in production (introsort in C++ std::sort, dual-pivot quicksort for Java primitives)
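The O(n²) worst case is easy to provoke with the last-element pivot: feed it already sorted input, and every partition is maximally unbalanced. A rough way to see it, using an instrumented comparison counter (`quickSortCount` is a sketch introduced here, not a benchmark):

```javascript
// Last-element-pivot quicksort that counts comparisons in counter.n
function quickSortCount(arr, low = 0, high = arr.length - 1, counter = { n: 0 }) {
  if (low < high) {
    const pivot = arr[high];
    let i = low - 1;
    for (let j = low; j < high; j++) {
      counter.n++;
      if (arr[j] < pivot) {
        i++;
        [arr[i], arr[j]] = [arr[j], arr[i]];
      }
    }
    [arr[i + 1], arr[high]] = [arr[high], arr[i + 1]];
    quickSortCount(arr, low, i, counter);      // left of pivot
    quickSortCount(arr, i + 2, high, counter); // right of pivot
  }
  return counter.n;
}

const sorted = Array.from({ length: 1000 }, (_, i) => i);
// Fisher-Yates shuffle for a genuinely random permutation
const shuffled = [...sorted];
for (let i = shuffled.length - 1; i > 0; i--) {
  const j = Math.floor(Math.random() * (i + 1));
  [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
}

const worst = quickSortCount([...sorted]);  // exactly n(n-1)/2 = 499,500 comparisons
const typical = quickSortCount(shuffled);   // roughly 2n ln n, around 14,000
```

On sorted input every element lands left of the pivot, so the recursion shrinks by one element per level: quadratic work, linear recursion depth. Randomized pivots (or shuffling first) make this pattern vanishingly unlikely.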
Heap Sort - O(n log n)
Concept: Build a max heap, then repeatedly swap the maximum to the end and re-heapify the shrinking prefix
Time: O(n log n) - always
Space: O(1) - in-place
Stable: No (not suitable when stability is needed)
function heapSort(arr) {
const n = arr.length;
// Build max heap
for (let i = Math.floor(n / 2) - 1; i >= 0; i--) {
heapify(arr, n, i);
}
// Extract elements from heap
for (let i = n - 1; i > 0; i--) {
[arr[0], arr[i]] = [arr[i], arr[0]];
heapify(arr, i, 0);
}
return arr;
}
function heapify(arr, n, i) {
let largest = i;
const left = 2 * i + 1;
const right = 2 * i + 2;
if (left < n && arr[left] > arr[largest]) largest = left;
if (right < n && arr[right] > arr[largest]) largest = right;
if (largest !== i) {
[arr[i], arr[largest]] = [arr[largest], arr[i]];
heapify(arr, n, largest);
}
}
Radix Sort - O(d * n)
Non-comparison sort: sorts by individual digits
Time: O(d * n) where d = number of digits
Space: O(n + k) where k = range of digits
Great for: Integers, strings of fixed length
Example: Sorting [170, 45, 75, 90, 2, 802, 24, 2, 66]
Least Significant Digit (LSD) approach:
1. Sort by ones place
2. Sort by tens place
3. Sort by hundreds place
...
Result: [2, 2, 24, 45, 66, 75, 90, 170, 802]
// Assumes non-negative integers
function radixSort(arr) {
const max = Math.max(...arr);
let exp = 1;
while (Math.floor(max / exp) > 0) {
countingSortByDigit(arr, exp);
exp *= 10;
}
return arr;
}
function countingSortByDigit(arr, exp) {
const n = arr.length;
const output = new Array(n);
const count = new Array(10).fill(0);
for (let i = 0; i < n; i++) {
const digit = Math.floor((arr[i] / exp) % 10);
count[digit]++;
}
for (let i = 1; i < 10; i++) {
count[i] += count[i - 1];
}
for (let i = n - 1; i >= 0; i--) {
const digit = Math.floor((arr[i] / exp) % 10);
output[count[digit] - 1] = arr[i];
count[digit]--;
}
for (let i = 0; i < n; i++) {
arr[i] = output[i];
}
}
3. Searching Algorithms
Linear Search - O(n)
function linearSearch(arr, target) {
for (let i = 0; i < arr.length; i++) {
if (arr[i] === target) return i;
}
return -1;
}
// Use when: Array unsorted or small
// Time: O(n) - worst case entire arrayBinary Search - O(log n)
Prerequisite: Array must be sorted
Concept: Eliminate half of remaining elements each iteration
Time: O(log n) - 1M items = 20 comparisons max
Space: O(1) for iterative, O(log n) for recursive
// Iterative (more efficient)
function binarySearch(arr, target) {
let left = 0, right = arr.length - 1;
while (left <= right) {
const mid = Math.floor((left + right) / 2);
if (arr[mid] === target) return mid;
if (arr[mid] < target) left = mid + 1;
else right = mid - 1;
}
return -1; // Not found
}
// Recursive
function binarySearchRec(arr, target, left = 0, right = arr.length - 1) {
if (left > right) return -1;
const mid = Math.floor((left + right) / 2);
if (arr[mid] === target) return mid;
if (arr[mid] < target) return binarySearchRec(arr, target, mid + 1, right);
return binarySearchRec(arr, target, left, mid - 1);
}
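The leftmost/rightmost variations (Python calls them bisect_left and bisect_right) use the same halving loop, differing only in how ties are handled. A sketch, with function names of my own choosing:

```javascript
// First index where target could be inserted while keeping order
// (equals the index of the first occurrence if target is present)
function bisectLeft(arr, target) {
  let left = 0, right = arr.length;
  while (left < right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] < target) left = mid + 1;
    else right = mid;
  }
  return left;
}

// First index strictly past the last occurrence of target
function bisectRight(arr, target) {
  let left = 0, right = arr.length;
  while (left < right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] <= target) left = mid + 1;
    else right = mid;
  }
  return left;
}

// Count occurrences of target in a sorted array in O(log n)
const countOf = (arr, t) => bisectRight(arr, t) - bisectLeft(arr, t);
```

For `[1, 2, 2, 2, 3]` and target `2`: `bisectLeft` returns 1, `bisectRight` returns 4, so `countOf` returns 3.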
// Variations:
// Find leftmost position: bisect_left
// Find rightmost position: bisect_right
// Check if exists: Any binary search variant
4. Algorithm Complexity Comparison
Algorithm | Best | Average | Worst | Space | Stable
Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes
Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No
Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes
Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes
Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No
Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No
Radix Sort | O(dn) | O(dn) | O(dn) | O(n+k) | Yes
5. Real-World Performance
Arrays of 1000 elements (average case, milliseconds):
Bubble Sort: 15ms
Selection Sort: 12ms
Insertion Sort: 3ms (nearly sorted data)
Merge Sort: 0.5ms
Quick Sort: 0.1ms
Radix Sort: 0.08ms
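Numbers like these are one reason to reach for the language's built-in sort in production. In JavaScript there is a gotcha worth knowing: the default comparator converts elements to strings, so numbers sort lexicographically unless you pass a numeric comparator.

```javascript
const nums = [170, 45, 75, 90, 2, 802, 24];

// Default sort compares elements as strings - wrong for numbers
const byDefault = [...nums].sort();             // [170, 2, 24, 45, 75, 802, 90]

// Numeric comparator gives the expected order
const numeric = [...nums].sort((a, b) => a - b); // [2, 24, 45, 75, 90, 170, 802]
```

Since ES2019 the built-in sort is also guaranteed stable, so it covers most of the cases in this guide.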
Key insights:
- O(n²) not acceptable for n > 10,000
- Quick sort 100x faster than bubble sort
- For nearly sorted: insertion sort competitive
- For production: use language built-ins (highly optimized)
6. Interview Problems
Problem 1: Sort Colors (Medium)
// Given array with 0, 1, 2 → sort in-place
// Input: [2, 0, 2, 1, 1, 0]
// Output: [0, 0, 1, 1, 2, 2]
// Approach: Dutch National Flag (3-way partition)
function sortColors(nums) {
let left = 0, mid = 0, right = nums.length - 1;
while (mid <= right) {
if (nums[mid] === 0) {
[nums[left], nums[mid]] = [nums[mid], nums[left]];
left++;
mid++;
} else if (nums[mid] === 1) {
mid++;
} else { // nums[mid] === 2
[nums[mid], nums[right]] = [nums[right], nums[mid]];
right--;
}
}
return nums;
}
// Time: O(n), Space: O(1)
Problem 2: Merge K Sorted Lists (Hard)
// Input: [[1, 4, 5], [1, 3, 4], [2, 6]]
// Output: [1, 1, 2, 3, 4, 4, 5, 6]
// Approach: Min Heap (Priority Queue)
// Min heap keyed on the first element of each entry ([value, listIdx, elemIdx])
class MinHeap {
constructor() {
this.heap = [];
}
push(val) {
this.heap.push(val);
this.bubbleUp(this.heap.length - 1);
}
pop() {
if (this.heap.length === 0) return null;
const min = this.heap[0];
const last = this.heap.pop();
if (this.heap.length > 0) {
this.heap[0] = last;
this.bubbleDown(0);
}
return min;
}
bubbleUp(idx) {
while (idx > 0) {
const parentIdx = Math.floor((idx - 1) / 2);
// Compare by value (first tuple element), not by array-to-string coercion
if (this.heap[parentIdx][0] > this.heap[idx][0]) {
[this.heap[parentIdx], this.heap[idx]] = [this.heap[idx], this.heap[parentIdx]];
idx = parentIdx;
} else break;
}
}
bubbleDown(idx) {
while (2 * idx + 1 < this.heap.length) {
let smallerIdx = 2 * idx + 1;
if (2 * idx + 2 < this.heap.length && this.heap[2 * idx + 2][0] < this.heap[smallerIdx][0]) {
smallerIdx = 2 * idx + 2;
}
// Compare by value (first tuple element), not by array-to-string coercion
if (this.heap[smallerIdx][0] < this.heap[idx][0]) {
[this.heap[smallerIdx], this.heap[idx]] = [this.heap[idx], this.heap[smallerIdx]];
idx = smallerIdx;
} else break;
}
}
}
function mergeKLists(lists) {
const heap = new MinHeap();
const result = [];
for (let i = 0; i < lists.length; i++) {
if (lists[i].length > 0) {
heap.push([lists[i][0], i, 0]);
}
}
while (heap.heap.length > 0) {
const [val, listIdx, elemIdx] = heap.pop();
result.push(val);
if (elemIdx + 1 < lists[listIdx].length) {
heap.push([lists[listIdx][elemIdx + 1], listIdx, elemIdx + 1]);
}
}
return result;
}
// Time: O(n log k) where n = total elements, k = number of lists
// Space: O(k) for heap
7. Dynamic Programming Basics
// Fibonacci - Exponential without optimization
function fib(n) {
if (n <= 1) return n;
return fib(n - 1) + fib(n - 2);
}
// fib(40) = 26 seconds (exponential explosion!)
// With Memoization - O(n)
function fibMemo(n, memo = {}) {
if (n in memo) return memo[n];
if (n <= 1) return n;
memo[n] = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
return memo[n];
}
// fib(40) = instant!
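Memoization keeps the recursive shape; a bottom-up (tabulation) version drops both the recursion and the memo object. The name `fibTab` is introduced here for illustration:

```javascript
// Bottom-up: start from the base cases, keep only the last two values
// O(n) time, O(1) space
function fibTab(n) {
  if (n <= 1) return n;
  let prev = 0, curr = 1;
  for (let i = 2; i <= n; i++) {
    [prev, curr] = [curr, prev + curr]; // Slide the two-value window forward
  }
  return curr;
}
```

`fibTab(40)` returns 102334155 instantly, same answer as the memoized version without any call stack.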
// Lesson: Memoization caches results, eliminates redundant work
Key Takeaways
- Big O is critical - Algorithm choice matters more than hardware
- Merge sort/Quick sort - Use these for production code
- Non-comparison sorts - Radix sort for integers
- Binary search - Only on sorted arrays, O(log n) is amazing
- Space-time tradeoff - Sometimes worth more memory for speed
- Stability matters - Use a stable sort when the relative order of equal elements must be preserved
- Interview preparation - Master merge sort, quick sort, binary search
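The stability takeaway is concrete once you sort records rather than numbers: with a stable sort, ties keep their original order. A small sketch (the sample data is invented for illustration; the built-in sort has been guaranteed stable since ES2019):

```javascript
const people = [
  { name: "Ana", age: 30 },
  { name: "Bo", age: 25 },
  { name: "Cy", age: 30 },
];

// Stable sort by age: Ana stays ahead of Cy because she appeared first
const byAge = [...people].sort((a, b) => a.age - b.age);
console.log(byAge.map(p => p.name)); // ["Bo", "Ana", "Cy"]
```

An unstable sort (heap sort, plain quick sort) is free to emit Cy before Ana, which matters whenever you sort by one key after another.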