Saturday, May 17, 2025

Time Complexity




O(1) < O(log n) < O(n) < O(n log n) < O(n ^ 2) < ... < O(2 ^ n) < O(n!)

Try to keep the time complexity at O(n ^ 2) or better.


 ===========================================================================


O(n) * O(log n) = O(n log n)


O(a) + O(b) = O(a + b) = O(a), if a grows faster than b

eg. O(n ^ 2) + O(2 ^ n) = O(2 ^ n),

since n ^ 2 grows slower than 2 ^ n
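The sum rule can be seen directly by counting operations. A minimal Java sketch (class and method names are illustrative) with an O(n ^ 2) phase followed by an O(n) phase:

```java
// Two sequential phases; the slower one dominates the total.
class SumRuleDemo {

    // O(n^2) phase followed by O(n) phase => O(n^2 + n) = O(n^2)
    static long work(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;              // n * n operations
        for (int i = 0; i < n; i++)
            ops++;                  // n more operations
        return ops;                 // n^2 + n in total
    }

    public static void main(String[] args) {
        System.out.println(work(1000)); // 1001000: the n^2 term contributes 1000000
    }
}
```

For n = 1000, the linear phase adds only 1,000 of the 1,001,000 operations, which is why it disappears in the big-O.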


===========================================================================

        // Time Complexity: O(log n)

        // This loop doubles i: i = 1, 2, 4, 8, 16, ... (stops when i >= n)

        for (int i = 1; i < n; i = i * 2) {

            System.out.println("i = " + i);

        }

===========================================================================

๐Ÿ” Loop Complexity Comparison: O(n) vs O(log n) vs O(log log n)

1. ✅ O(n) — Linear Growth


for (int i = 0; i < n; i++) { System.out.println(i); }
  • Iterates from 0 to n-1

  • Runs n times

  • 📈 Time Complexity: O(n)


2. ✅ O(log n) — Exponential Shrinkage


for (int i = n; i > 0; i = i / 2) { System.out.println(i); }
  • Each step divides i by 2

  • Runs about log₂(n) times

  • 📈 Time Complexity: O(log n)


3. ✅ O(log log n) — Double Logarithmic Growth


for (int i = 2; i <= n; i = i * i) { System.out.println(i); }

💡 Why O(log log n)?

Let’s break it down:

  • i = 2

  • i = 4 (2²)

  • i = 16 (4²)

  • i = 256 (16²)

  • i = 65536 (256²)

  • ...

If you start from i = 2 and repeatedly do i = i², the number of iterations to reach n is:

log log n

Because:

  • After k squarings starting from 2, i = 2^(2^k): 2^2 = 4, 2^4 = 16, 2^16 = 65536, ...

  • i reaches n once 2^k ≈ log₂ n, i.e. after k ≈ log₂ log₂ n steps

  • So you're doing a logarithm inside a logarithm

📈 Time Complexity: O(log log n)
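To check this, a small sketch (the class and helper name are illustrative) that counts how many times the squaring loop runs:

```java
// Counts the iterations of `for (i = 2; i <= n; i = i * i)`.
class LogLogDemo {

    static int iterations(long n) {
        int count = 0;
        for (long i = 2; i <= n; i = i * i) {
            count++;                 // i takes the values 2, 4, 16, 256, 65536, ...
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(65536));          // 5 iterations
        System.out.println(iterations(1_000_000_000L)); // still only 5 iterations
    }
}
```

Growing n from 65,536 to 10⁹ adds zero iterations here, which is the signature of log log n growth.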



๐Ÿ” Code:

for (int i = 1; i < n; i = i * 2) { System.out.println(i); }

✅ Time Complexity: O(log n)

🧠 Explanation:

  • i starts at 1 and doubles on each iteration:
    1, 2, 4, 8, 16, ..., < n

  • The number of times you can double i before it reaches or exceeds n is about log₂(n).
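A quick way to verify that count, as a sketch with an illustrative helper name:

```java
// Counts the iterations of `for (i = 1; i < n; i = i * 2)`.
class DoublingDemo {

    static int iterations(int n) {
        int count = 0;
        for (int i = 1; i < n; i = i * 2) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(16));            // 4  (= log2 16)
        System.out.println(iterations(1_000_000_000)); // 30 (≈ log2 of 10^9)
    }
}
```

Even for n = 10⁹ the loop body runs only 30 times.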



๐Ÿ” Code:

for (int i = 1; i < n; i = i * 3) { System.out.println(i); }

Time Complexity: O(log₃ n)

🧠 Explanation:

  • The value of i increases by multiplying it by 3 each time.

  • So the values go like:
    1, 3, 9, 27, 81, ..., < n

  • If we denote the number of iterations as k, then:

    3^k < n  ⇒  k < log₃(n)
  • Therefore, it runs about log₃(n) times, which is asymptotically still O(log n) — just with a different base.
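The same counting trick works for base 3 (again with an illustrative helper name):

```java
// Counts the iterations of `for (i = 1; i < n; i = i * 3)`.
class LogBase3Demo {

    static int iterations(int n) {
        int count = 0;
        for (int i = 1; i < n; i = i * 3) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(81));  // 4: i = 1, 3, 9, 27 (= log3 81)
        System.out.println(iterations(100)); // 5: i = 1, 3, 9, 27, 81
    }
}
```

The base-3 count is smaller than the base-2 count by a constant factor (log₂ 3 ≈ 1.58), which is exactly why the base does not matter in big-O.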





Does O(n) work for inputs up to 10⁹?

In theory, yes — O(n) means the time grows linearly with input size.
But in practice, O(n) does not support input size n = 10⁹.

🧠 Why Not in Practice?

Because on a modern machine or coding platform (like LeetCode, Codeforces, etc.), you are typically allowed 1 to 2 seconds to run your code.

And most systems can safely perform about:

  • 10⁷ to 10⁸ operations per second


📉 What Happens If n = 10⁹?

  • An O(n) algorithm would need ~10⁹ operations

  • That would take ~10 seconds or more, which is too slow

  • Your solution will likely Time Out (TLE)


✅ Practical Limit for O(n)

Time Limit | Max Input Size for O(n)
1 sec      | ~10⁷ (safe), maybe up to 10⁸ (tight)
2 sec      | ~2 × 10⁷ to 2 × 10⁸
Any limit  | 10⁹ ❌ not feasible for O(n) in competitive coding

🧪 Exception: When is n = 10⁹ OK?

Only if you're not actually doing work on every element, such as:

  • Just reading input of size 10⁹ (rare)

  • Doing O(1) or O(log n) work (e.g., binary search)
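For example, binary search over an answer space of size 10⁹ touches only about 30 candidates. A minimal sketch, where the function and the toy problem (smallest x with x² ≥ target) are made up for illustration:

```java
// Binary search over the range [0, n) without materializing any array:
// each step halves the remaining range, so n = 10^9 needs only ~30 steps.
class BinarySearchDemo {

    // Smallest x in [0, n) with x * x >= target (the predicate is monotonic).
    static long firstSquareAtLeast(long target, long n) {
        long lo = 0, hi = n;
        while (lo < hi) {
            long mid = lo + (hi - lo) / 2;   // avoids overflow of lo + hi
            if (mid * mid >= target) hi = mid;
            else lo = mid + 1;
        }
        return lo;
    }

    public static void main(String[] args) {
        // sqrt(10^9) ≈ 31622.77, so the answer is 31623
        System.out.println(firstSquareAtLeast(1_000_000_000L, 1_000_000_000L));
    }
}
```

Even though n = 10⁹, the loop runs about log₂(10⁹) ≈ 30 times, so this finishes instantly.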

✅ Summary:

Scenario  | O(n) Feasible?
n ≤ 10⁷   | ✅ Safe
n ≈ 10⁸   | ⚠️ Risky
n = 10⁹   | ❌ Too large

