Performance Analysis in Algorithms — Understanding Space Complexity
When we study how good an algorithm is, we generally focus on performance analysis, which has two major components:
- Time Complexity – How long the algorithm takes to run.
- Space Complexity – How much extra memory the algorithm uses.
In this post, we will concentrate entirely on space complexity, because memory usage becomes a key factor when working with large data, recursion-heavy algorithms, or memory-constrained environments.
What Is Space Complexity?
Space complexity refers to the total memory an algorithm needs during execution, including input, variables, data structures, recursion stack, and auxiliary storage.
We calculate it using the formula:
Space = C + Sp
Where:
- C (Fixed Part) – Memory required for constants, simple variables, fixed-size arrays, and program code. This does not change with input size.
- Sp (Variable Part) – Memory that depends on the input size n, such as dynamic arrays, recursion stack frames, and temporary variables inside loops.
Understanding the Fixed Part (C)
The fixed part includes:
- Basic variables like i, sum, temp
- Constant-size arrays (e.g., int arr[10])
- Instructions and compiled code
- Memory required by the environment
This portion is independent of input size. Even if the input grows, C remains constant.
Example:
An algorithm that uses 3 integer variables will always take the same amount of memory, regardless of whether the input has 10 numbers or 10,000 numbers.
So C contributes O(1) to space complexity.
Understanding the Variable Part (Sp)
The variable part depends completely on the input. This includes:
- Arrays whose size depends on n
- Dynamic lists
- Data structures built during execution
- Memory used by recursive function calls
This is the part that pushes space complexity beyond O(1), to O(log n), O(n), O(n²), and so on.
Examples to Understand Space Complexity
Let’s explore examples starting from very basic to more advanced, covering both non-recursive and recursive algorithms.
1. Non-Recursive Example
Sum of first n numbers
Algorithm uses:
- Three variables: i, n, sum
- No extra storage based on input size
Space usage:
C = (memory for i, n, sum)
Sp = 0 (no variable-sized structure)
So Space = C + 0 = O(1)
This is constant space complexity.
2. Non-Recursive Example
Reading an array of size n
Algorithm uses:
- Variables: i, n
- Array of size n → depends on input
Space usage:
C = memory for fixed variables
Sp = memory for array of size n
Space = C + Sp = O(1) + O(n) = O(n)
This is linear space complexity.
3. Recursive Example
Factorial using recursion
Each recursive call requires:
- Parameter n
- Local variables inside the function
- Return address
For n calls, recursion depth = n.
So space = C + Sp
C = constant variables
Sp = memory for n recursive stack frames = O(n)
So total = O(n) space.
4. Recursive Example
Merge Sort
Merge sort splits data into halves, recursively.
Space usage:
- Temporary arrays used during merging ⇒ O(n)
- Recursion depth = log n
- Stack memory = O(log n)
Total space = O(n + log n) = O(n)
(We drop the smaller term.)
5. Recursive Example
Exponential recursion
Naive Fibonacci makes:
- Two recursive calls per step
- Recursion tree height = n
- Stack grows to depth n
Space = O(n)
Time becomes exponential, but space stays linear: the two calls run one after the other, so only a single root-to-leaf path of the recursion tree is on the stack at any moment.
Why Space Complexity Matters
- Recursion-heavy algorithms might run out of stack memory.
- Large input sizes can crash programs if auxiliary arrays are too big.
- Real-world constraints (mobile devices, embedded systems) require memory-efficient design.
- Space vs. time tradeoffs appear frequently (as in dynamic programming).