Time complexity
1. Definition
Time Complexity quantifies the amount of time an algorithm takes to run as a function of the length of the input.
- It does not measure actual seconds (which depend on hardware).
- It measures the number of operations (growth rate) relative to the input size (\(N\)).
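A minimal sketch of this idea: count the operations performed while finding the maximum of a list. The counter (a hypothetical helper added here for illustration) grows linearly with the input size, independent of how fast the hardware runs each step.

```python
def count_linear_ops(arr):
    """Find the max of a list while counting comparisons.

    Illustrative only: the comparison count grows linearly with
    len(arr), which is exactly what O(N) captures.
    """
    comparisons = 0
    best = arr[0]
    for x in arr[1:]:
        comparisons += 1          # one comparison per remaining element
        if x > best:
            best = x
    return best, comparisons

# Doubling the input roughly doubles the work:
_, ops_small = count_linear_ops(list(range(100)))   # 99 comparisons
_, ops_large = count_linear_ops(list(range(200)))   # 199 comparisons
```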
2. The Notation: Big O (\(O\))
We use Big O Notation to describe the Worst-Case Scenario (the upper bound).
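Linear search is a simple way to see why the worst case is what matters: the best case finishes in one step, but Big O describes the upper bound, so we still call it \(O(N)\). A minimal sketch:

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent.

    Best case: target is at index 0 (one comparison).
    Worst case: target is missing, so every element is checked.
    Big O reports the worst case: O(N).
    """
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1
```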
3. The Hierarchy (Fastest to Slowest)
| Notation | Name | Description | Example |
|---|---|---|---|
| \(O(1)\) | Constant | Instant. No matter how much data, time is the same. | Accessing arr[5]; Hash Map lookup. |
| \(O(\log N)\) | Logarithmic | Cuts the problem in half repeatedly. Very fast. | Binary Search. |
| \(O(N)\) | Linear | Loops through the data once. | Finding a max value; Hash Sort. |
| \(O(N\log N)\) | Linearithmic | The standard for efficient sorting. | Merge Sort; Quick Sort; Heap Sort. |
| \(O(N^2)\) | Quadratic | Nested loops (a loop inside a loop). Slow for big data. | Bubble Sort; Insertion Sort. |
| \(O(2^N)\) | Exponential | Work doubles with every element. Unusable for large \(N\). | Recursive Fibonacci; Brute-force passwords. |
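To make the \(O(\log N)\) row concrete, here is a sketch of Binary Search on a sorted list. Each iteration halves the remaining range, so a million elements need at most about 20 comparisons.

```python
def binary_search(arr, target):
    """Binary Search on a sorted list: O(log N).

    Each loop iteration discards half of the remaining range,
    so the number of iterations is at most log2(N) + 1.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1      # target is in the upper half
        else:
            hi = mid - 1      # target is in the lower half
    return -1
```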
4. The Space-Time Trade-off
In computer science, we often trade one resource for another.
- Hash Sort: uses massive space (\(O(N)\) or \(O(K)\)) to achieve blazing-fast time (\(O(N)\)).
- Bubble Sort: uses minimal space (\(O(1)\)) but takes a long time (\(O(N^2)\)).
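The trade-off can be sketched side by side. "Hash Sort" is read here as a counting-style sort over a known integer key range (an assumption, since the term is nonstandard): it allocates a bucket per possible key (\(O(K)\) extra space) to sort in linear time, while Bubble Sort uses no extra space but pays with quadratic time.

```python
def hash_sort(arr, max_key):
    """Counting-style sort (one reading of "Hash Sort"): O(N + K) time.

    Assumes non-negative integer keys no larger than max_key.
    Trades space for speed: allocates one bucket per possible key,
    i.e. O(K) extra space where K = max_key + 1.
    """
    counts = [0] * (max_key + 1)   # the "massive space"
    for x in arr:
        counts[x] += 1
    out = []
    for value, n in enumerate(counts):
        out.extend([value] * n)
    return out

def bubble_sort(arr):
    """Bubble Sort: O(1) extra space, but O(N^2) time (nested loops)."""
    a = list(arr)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a
```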