From Constraints to Code: Learning Time Complexity
Explore the link between problem constraints and algorithmic solutions. Gain a deep understanding of time complexity and how it influences your coding decisions!
n <= 10
Expected Time Complexity: Likely factorial or exponential with a base larger than 2. Examples: O(n^2 * n!) or O(4^n).
Suggested Approach: Consider backtracking or brute-force recursive algorithms. Algorithms that correctly find the answer will usually be fast enough for n <= 10.
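As a rough illustration of how much room n <= 10 gives you, here is a minimal Python sketch for a made-up task: given a hypothetical costs matrix (not from this article), try every ordering of the n items with backtracking and keep the cheapest one, roughly O(n * n!) overall.

```python
from typing import List


def best_ordering_cost(costs: List[List[int]]) -> int:
    """Try every ordering of n items (n <= 10) and keep the cheapest.

    costs[i][j] is a hypothetical transition cost from item i to item j.
    Enumerating all n! orderings and scoring each in O(n) gives O(n * n!),
    which is fine for n <= 10.
    """
    n = len(costs)
    used = [False] * n
    order = []
    best = float("inf")

    def backtrack():
        nonlocal best
        if len(order) == n:
            # Score the complete ordering and keep the best one seen so far.
            total = sum(costs[order[i]][order[i + 1]] for i in range(n - 1))
            best = min(best, total)
            return
        for i in range(n):
            if not used[i]:
                used[i] = True
                order.append(i)
                backtrack()
                order.pop()
                used[i] = False

    return best


# Example: 3 items; the cheapest ordering is 0 -> 1 -> 2 with cost 3.
print(best_ordering_cost([[0, 1, 9], [1, 0, 2], [9, 2, 0]]))  # 3
```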
10 < n <= 20
In cases where 10 < n ≤ 20, the expected time complexity often involves O(2^n). Choosing a higher base or a factorial becomes impractical due to the rapidly growing numbers (e.g., 3^20 ≈ 3.5 billion, and 20! is even larger). When you encounter a time complexity of O(2^n), it typically indicates that you're dealing with a problem that requires considering all possible subsets or subsequences of a collection of elements.
For each element, you have two choices: either include it or exclude it from the current subset or subsequence.
While this bound may seem small, algorithms that correctly address these problems will usually perform fast enough. Consider employing backtracking and recursion techniques to navigate such scenarios effectively.
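To make the include/exclude pattern concrete, here is a small Python sketch for a hypothetical task (the function name and test values are invented for the example): counting the subsets of a list that sum to a target by branching on each element.

```python
from typing import List


def count_subsets_with_sum(nums: List[int], target: int) -> int:
    """Count subsets of nums (n <= 20) whose elements sum to target.

    Each element is either included or excluded, so the recursion explores
    all 2^n subsets -- about a million states for n = 20, which is fine.
    """

    def backtrack(i: int, running_sum: int) -> int:
        if i == len(nums):
            return 1 if running_sum == target else 0
        # Exclude nums[i], then include it.
        return backtrack(i + 1, running_sum) + backtrack(i + 1, running_sum + nums[i])

    return backtrack(0, 0)


print(count_subsets_with_sum([2, 3, 5, 7], 10))  # 2: {3, 7} and {2, 3, 5}
```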
20 < n <= 100
As the input size grows beyond 20 but remains within the range of 20 < n ≤ 100, exponential time complexities become impractical. In most cases, the upper bound will likely involve O(n^3). It's important to note that problems marked as "easy" on platforms like LeetCode often fall within this time complexity bound, which can be deceptive. While there may be solutions that run in linear time O(n), the small constraint allows brute-force solutions to pass as "easy." Finding a linear-time solution might not necessarily be "easy."
In this range, consider exploring brute-force solutions that involve nested loops. If you come up with a brute-force solution, analyze the algorithm to identify the "slow" steps, and aim to optimize those steps using data structures like hash maps or heaps.
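To make the "find the slow step, then optimize it" idea concrete, here is a Python sketch using the classic two-sum task (the helper names are illustrative, not from this article): an O(n^2) nested-loop version next to an O(n) version where the slow inner scan is replaced by a hash-map lookup.

```python
from typing import List, Optional, Tuple


def two_sum_bruteforce(nums: List[int], target: int) -> Optional[Tuple[int, int]]:
    """O(n^2) nested loops: perfectly fine when n <= 100."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return (i, j)
    return None


def two_sum_hashmap(nums: List[int], target: int) -> Optional[Tuple[int, int]]:
    """O(n): the slow inner loop becomes an O(1) hash-map lookup."""
    seen = {}  # value -> index of where it was first seen
    for j, value in enumerate(nums):
        if target - value in seen:
            return (seen[target - value], j)
        seen[value] = j
    return None


nums = [2, 7, 11, 15]
print(two_sum_bruteforce(nums, 9))  # (0, 1)
print(two_sum_hashmap(nums, 9))     # (0, 1)
```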
100 < n <= 1,000
Expected Time Complexity: Quadratic time complexity O(n^2) is generally sufficient.
Suggested Approach: Similar to the previous range, consider nested loops. However, O(n^2) is usually the expected/optimal time complexity here.
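For instance, a standard O(n^2) dynamic-programming solution such as longest increasing subsequence fits comfortably in this range; the sketch below is one possible Python version, included purely as an illustration.

```python
from typing import List


def longest_increasing_subsequence(nums: List[int]) -> int:
    """Classic O(n^2) dynamic programming: two nested loops over n <= 1,000
    elements is roughly a million operations, which is comfortably fast.
    """
    if not nums:
        return 0
    # dp[i] = length of the longest increasing subsequence ending at index i.
    dp = [1] * len(nums)
    for i in range(len(nums)):
        for j in range(i):
            if nums[j] < nums[i]:
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp)


print(longest_increasing_subsequence([10, 9, 2, 5, 3, 7, 101, 18]))  # 4
```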
1,000 < n < 100,000
Expected Time Complexity: The slowest acceptable common time complexity is O(n log n), but O(n) is commonly the goal.
Suggested Approach: Consider sorting the input or using a heap. If not, aim for an O(n) algorithm. Avoid nested loops running in O(n^2) and instead apply techniques like the ones below (a sliding-window sketch follows the note after this list):
Hash map
A two-pointers implementation, like sliding window
Monotonic stack
Binary search
Heap
A combination of any of the above
Note: An O(n) algorithm can have a reasonably large constant factor (around 40). String problems sometimes involve iterating over the alphabet characters, resulting in a time complexity of O(26n).
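As promised above, here is a sliding-window sketch in Python that combines two pointers with a hash map: the classic "longest substring without repeating characters" task runs in O(n), which is fine for n close to 100,000. The function name is illustrative.

```python
def longest_unique_substring(s: str) -> int:
    """Sliding window (two pointers) + hash map: each character enters and
    leaves the window at most once, so the total work is O(n).
    """
    last_seen = {}  # character -> most recent index where it appeared
    best = 0
    left = 0
    for right, ch in enumerate(s):
        if ch in last_seen and last_seen[ch] >= left:
            left = last_seen[ch] + 1  # shrink the window past the repeat
        last_seen[ch] = right
        best = max(best, right - left + 1)
    return best


print(longest_unique_substring("abcabcbb"))  # 3 ("abc")
```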
100,000 < n < 1,000,000
Expected Time Complexity: Rare constraint, usually requiring O(n) or O(n log n) with a small constant factor.
Suggested Approach: Incorporate hash maps as needed.
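As one possible illustration, a single hash-map pass keeps the constant factor small even near the top of this range; the invented helper below is a minimal sketch, not a prescribed solution.

```python
from collections import Counter
from typing import List


def most_frequent(nums: List[int]) -> int:
    """One O(n) pass with a hash map plus one pass over distinct values:
    even for n near 1,000,000 this is only a few million basic operations.
    """
    counts = Counter(nums)              # hash map of value -> frequency
    return max(counts, key=counts.get)  # value with the highest frequency


print(most_frequent([4, 1, 4, 2, 4, 2]))  # 4
```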
1,000,000 < n
When dealing with enormous inputs, typically in the range of 10^9 or more, the most common acceptable time complexity will be logarithmic O(log n) or constant O(1). In these problems, you must either significantly reduce your search space at each iteration (usually through binary search) or use clever tricks to find information in constant time (such as mathematical insights or creative utilization of hash maps).
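For example, binary searching over an answer space of size 10^9 takes only about 30 iterations, since each step halves the remaining range. The Python sketch below, with an invented helper name, finds the smallest x whose square reaches a target.

```python
def smallest_x_with_square_at_least(target: int) -> int:
    """Binary search over the answer space [1, 10^9]: each iteration halves
    the remaining range, so roughly 30 comparisons replace a scan over up to
    a billion candidates.
    """
    lo, hi = 1, 10**9
    while lo < hi:
        mid = (lo + hi) // 2
        if mid * mid >= target:
            hi = mid      # mid might be the answer; keep it in the range
        else:
            lo = mid + 1  # mid is too small; discard the lower half
    return lo


# 31623, since 31622^2 < 1,000,000,007 <= 31623^2
print(smallest_x_with_square_at_least(1_000_000_007))
```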