Glossary

Space Complexity

Space complexity measures how much additional memory an algorithm uses as a function of input size, expressed in Big O notation. It includes auxiliary space (extra memory beyond the input) and sometimes input storage.

Explanation

Space complexity counts the extra memory your algorithm allocates: variables, data structures, and call stack frames. Like time complexity, it is expressed as a function of input size n. The key distinction is between total space (input plus auxiliary) and auxiliary space (only the extra memory your algorithm allocates, not counting the input itself). Most modern analysis uses auxiliary space.

Common space complexities:

- O(1) auxiliary space: in-place algorithms like bubble sort and selection sort, which use only a constant number of extra variables.
- O(log n): recursion depth of divide-and-conquer algorithms like merge sort, or balanced BST operations.
- O(n): storing a copy of the input, a hash map of all elements, or the output of a transformation.
- O(n²): a 2D matrix of all element pairs, or an adjacency matrix.

The space-time tradeoff is fundamental: you can often reduce time complexity by using more space (cache results in a hash map), or reduce space by accepting more time (recompute rather than cache). Dynamic programming spends O(n) or O(n²) space to achieve polynomial time; hash tables spend O(n) space for O(1) lookup time. Choosing the right tradeoff depends on the constraints: memory-limited systems (embedded devices, Lambda functions with 128MB) may prefer more time for less space.

Stack space from recursion is easy to forget: a recursive function called n levels deep uses O(n) stack space, and converting it to an iterative version eliminates this. Merge sort uses O(n) auxiliary space for the merge step AND O(log n) stack space for recursion; the O(n) dominates.
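The caching side of that tradeoff can be sketched with Fibonacci. This is a minimal illustration (the function names are ours, not a standard API): the naive version recomputes subproblems in exponential time, the memoized version spends O(n) map space to get O(n) time, and the iterative version drops to O(1) space by keeping only the last two values.

```javascript
// O(2^n) time, O(n) stack space — recomputes every subproblem
function fibNaive(n) {
  if (n < 2) return n;
  return fibNaive(n - 1) + fibNaive(n - 2);
}

// O(n) time, O(n) space — trades map storage for speed
function fibMemo(n, cache = new Map()) {
  if (n < 2) return n;
  if (cache.has(n)) return cache.get(n);
  const result = fibMemo(n - 1, cache) + fibMemo(n - 2, cache);
  cache.set(n, result);
  return result;
}

// O(n) time, O(1) space — keeps only the last two values
function fibIter(n) {
  let a = 0, b = 1;
  for (let i = 0; i < n; i++) [a, b] = [b, a + b];
  return a;
}
```

fibMemo(40) returns instantly; fibNaive(40) makes over a billion calls for the same answer. That gap is the time bought by O(n) extra space.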

Code Example

javascript
// Space complexity analysis

// O(1) space — in-place reversal (no extra arrays)
function reverseInPlace(arr) {
  let lo = 0, hi = arr.length - 1;
  while (lo < hi) {
    [arr[lo], arr[hi]] = [arr[hi], arr[lo]]; // constant-space swap via destructuring
    lo++; hi--;
  }
  // Extra variables: lo, hi — O(1) regardless of input size
}

// O(n) space — creates a new array
function reverseNew(arr) {
  return arr.slice().reverse();
  // New array of n elements — O(n) space
}

// O(n) space — hash map stores up to n entries
function twoSum(nums, target) {
  const seen = new Map(); // grows with input
  for (let i = 0; i < nums.length; i++) {
    const complement = target - nums[i];
    if (seen.has(complement)) return [seen.get(complement), i];
    seen.set(nums[i], i);
  }
}

// O(log n) stack space — recursive binary search
function binarySearch(arr, target, lo = 0, hi = arr.length - 1) {
  if (lo > hi) return -1;
  const mid = (lo + hi) >> 1;
  if (arr[mid] === target) return mid;
  return arr[mid] < target
    ? binarySearch(arr, target, mid + 1, hi)  // log n stack frames
    : binarySearch(arr, target, lo, mid - 1);
}
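
The recursive search above holds O(log n) stack frames at once. The recursion-to-iteration conversion mentioned in the explanation removes them; a sketch of the same search as a loop (binarySearchIter is an illustrative name):

```javascript
// O(1) space — iterative binary search, no recursion
function binarySearchIter(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1; // lo, hi, mid are the only extra variables
}
```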

Why It Matters for Engineers

Space complexity matters in three practical scenarios: Lambda/serverless functions with strict memory limits (128MB–1GB); browser-side JavaScript, where memory usage affects page performance and GC pressure; and large-scale data processing, where O(n²) space on 10M rows means 10^14 entries, roughly 100TB even at a single byte per entry: obviously infeasible. Understanding space complexity also explains why some algorithms are "in-place" (O(1) extra space, modifying the input) while others create copies. This distinction directly affects your code's memory footprint in production: choosing an O(1)-space solution over an O(n)-space one can cut memory usage by orders of magnitude for large inputs.

Learn This In Practice

Go deeper with the full module on Beyond Vibe Code.

Algorithms Fundamentals →