### Introduction to Time Complexity

Have you ever wondered how much time an algorithm or a piece of code needs to execute to completion? Are you curious about how complex a piece of code might be from an execution-time perspective? And most importantly, do you want to compare the likely execution time of two different pieces of code that produce the same output? If yes, then read on…

Often there are multiple solutions to a programming problem. We must learn how to compare the performance of various algorithms and select the best one to tackle a specific problem. We usually consider the **time complexity and space complexity** of algorithms while comparing their efficiency. In this article we will discuss time complexity in detail.

Let’s start with the most basic question: how do we compare the execution time of two pieces of code that produce the same output? Simply, which code is going to take less time? A simple answer would be to run the two programs on a machine and check the execution time!

Wait… It is not that easy 🙂

Well, time taken by a piece of code depends on factors mentioned below:

- Programming language, yes, some programming languages are faster than others
- Machine on which we are running the code, a good processor will run the code faster
- Load on the machine at the given time, a free machine will run the code faster

Considering the above factors, how should we compare the execution time of two or more programs or pieces of code? This is where the term “Time Complexity” comes in.

Let us discuss what the time complexity of a code is in the next section.

### What is Time Complexity?

Time complexity is a theoretical analysis of the time taken by a piece of code to run, expressed in terms of the input size. Note that to calculate the time complexity of a code we do not need to actually run it.

The number of operations an algorithm must carry out to complete a task is referred to as its time complexity, assuming each operation takes the same amount of time. In terms of time complexity, the algorithm that completes the task with the fewest operations is regarded as the most efficient one.

A similar concept you may like to read about is the **Space Complexity of Code.**

### Why Use Time Complexity?

Whether it is feasible to apply an algorithm to large input data depends significantly on how much time it takes. So it is always advisable to know the time complexity of your code, so that you know it will not take an impractical amount of execution time when used on real data.

Once we know we need to measure the time complexity of algorithms, we need a notation to express it. The notation we use for this is called “Big O notation”. Let’s discuss Big O notation in the next section.

### What is Big O Notation?

We can represent measurements of time complexity using a variety of notations. Big-O notation is the most popular, so that will be our main emphasis.

Big-O notation describes an upper bound on the complexity. Therefore, it provides the worst-case growth rate of an algorithm’s complexity. We can predict that the running time of the algorithm will not grow any faster than the function represented by the Big O notation, although it may grow more slowly.

Big O notation is a technique for expressing how time-consuming an algorithm is: it describes how the running time grows as the input size increases. In other words, it characterises an algorithm’s worst-case time complexity. In data structures, the maximum runtime of an algorithm is expressed using Big O notation.

For example, O(1) represents constant time complexity, which means that even if you increase the input size, n, the time taken by the code will stay the same.

Similarly, O(n) represents linear time complexity, which means that if you increase the input size, n, the time taken by the code will increase proportionally to n.

Similarly, O(n²) represents quadratic time complexity, which means that if you increase the input size, n, the time taken by the code will increase proportionally to the square of n.

We have discussed enough theory, so now let’s jump into practical Python code to understand the time complexity of the most common loop scenarios.

### Time Complexity of Loops with Examples in Python

In this section we will discuss the time complexity of loops written in the Python programming language.

Note that here the variable **n** indicates the **input size** and the variable **count** stores and prints the **number of steps** taken by a loop before completing its task.

#### Loop with Time Complexity – Constant Time – O(1)

Let us create a while loop in Python such that the number of steps taken by the loop is independent of the input size n. The code is written below:

```python
# Example of Loop with Time Complexity O(1)
# Python code 1
# Note: n NOT in loop condition
# Counter counts the total number of operations
n = 100; i = 0; count = 0
while i < 10:
    count = count + 1
    i = i + 1
print(count)
# output = 10

###############################

# Python code 2
# Find if a number is Even or Odd
def isEven(n):
    return n % 2 == 0

print(isEven(20))
print(isEven(21))
```

The above code is a while loop, and the steps taken by the while loop here do NOT depend on the input size n. Therefore this is an example of O(1) time complexity.

A practical example of O(1) time complexity would be: given an input number, find whether the **number is even or odd**.

#### Loop with Time Complexity – Linear Time – O(n)

Let us create a while loop in Python such that the number of steps taken by the loop is directly proportional to the input size n. The code is written below:

```python
# Example of Loop with Time Complexity O(n)
# Python code 1
# Note: n in loop condition
# Counter counts the total number of operations
n = 100; i = 0; count = 0
while i < n:
    count = count + 1
    i = i + 1
print(count)
# output = 100

####################################

# Python code 2
# Find the largest element in an array
sample_array = [10, 20, 30, 88, 40]
length = len(sample_array)
max_value = sample_array[0]
for i in range(1, length):
    if sample_array[i] > max_value:
        max_value = sample_array[i]
print(max_value)
# output = 88
```

The above code is a while loop, and the number of steps taken by the while loop here is directly proportional to the input size n. Therefore this is an example of O(n) time complexity.

A practical example of O(n) time complexity would be: given an input array of numbers, find the maximum value in the array. That is equivalent to **traversing an array** or a **sequential (linear) search in an array**.
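As a sketch of that linear search idea (the `linear_search` name is ours, not from the article), each element is examined at most once, so the work grows in proportion to the array length:

```python
# Minimal linear search sketch: O(n) in the worst case,
# because every element may need to be examined once.
def linear_search(arr, target):
    for index, value in enumerate(arr):
        if value == target:
            return index  # found: return its position
    return -1  # target not present

print(linear_search([10, 20, 30, 88, 40], 88))  # output = 3
print(linear_search([10, 20, 30, 88, 40], 99))  # output = -1
```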

#### Loop with Time Complexity – Quadratic Time – O(n²)

Let us create a while loop in Python such that the number of steps taken by the loop is directly proportional to the square of the input size n. The code is written below:

```python
# Example of Nested Loop with Time Complexity O(n^2) (n square)
# Counter counts the total number of operations
n = 100; i = 1; j = 1; count = 0
while i < n:
    while j < n:
        j = j + 1
        count = count + 1
    i = i + 1
    j = 1
    count = count + 1
print(count)
# output = 9900
```

The above code is a **nested while loop**, and the number of steps taken by the nested while loop here is directly proportional to the square of the input size n. Therefore this is an example of O(n²) time complexity.

Practical examples of O(n²) time complexity are the worst-case time complexities of **Bubble sort**, **Selection sort** and **Insertion sort**.
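As a rough illustration, here is a minimal bubble sort sketch (the `bubble_sort` name is ours): its nested loops mirror the O(n²) pattern above, making up to n passes that each scan up to n elements:

```python
# Minimal bubble sort sketch: worst-case O(n^2) comparisons,
# because the nested loops compare roughly n * n pairs.
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):            # up to n-1 passes
        for j in range(n - 1 - i):    # each pass scans the unsorted prefix
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]  # swap neighbours
    return arr

print(bubble_sort([30, 10, 88, 20, 40]))  # output = [10, 20, 30, 40, 88]
```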

#### Loop with Time Complexity – Logarithmic Time – O(log n)

Let us create a while loop in Python such that the number of steps taken by the loop is directly proportional to the logarithm of the input size n. The code is written below:

```python
# Example of Loop with Time Complexity O(log n)
# Note: division by 2 inside the loop
# Loop runs 7 times for n = 100
# O(log n) is better than O(n) as log(n) is smaller than n
# Counter counts the total number of operations
n = 1; i = 100; count = 0
while i > n:
    count = count + 1
    i = i / 2  # division by 2
print(count)
# output = 7
```

The above code is a **while loop with division by 2**, and the number of steps taken by the loop here is directly proportional to the logarithm of the input size n. Therefore this is an example of O(log n) time complexity.

A practical example of logarithmic time complexity is **Binary search in a sorted array of n elements**.
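A minimal binary search sketch (the `binary_search` name is ours): each comparison halves the remaining range of the sorted array, so only about log n comparisons are needed:

```python
# Minimal binary search sketch over a sorted array: O(log n),
# because each step discards half of the remaining range.
def binary_search(sorted_arr, target):
    low, high = 0, len(sorted_arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            low = mid + 1   # discard the left half
        else:
            high = mid - 1  # discard the right half
    return -1  # target not present

print(binary_search([10, 20, 30, 40, 88], 40))  # output = 3
```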

#### Loop with Time Complexity – Linear Logarithmic Time – O(n log n)

Let us create a while loop in Python such that the number of steps taken by the loop is directly proportional to the input size n multiplied by the logarithm of the input size n. The code is written below:

```python
# Example of Loop with Time Complexity O(n log n)
# Counter counts the total number of operations
n = 100; i = 1; j = 1; count = 0
while i < n:
    while j < n:
        j = j + 1
        count = count + 1
    i = i * 2  # outer loop doubles i: about log n iterations
    j = 1
    count = count + 1
print(count)
# output = 700
```

The above code is a **nested while loop** in which the outer loop doubles i each iteration (so it runs about log n times) and the inner loop runs about n times. The number of steps taken by the nested while loop is therefore directly proportional to **n times the logarithm** of the input size n, so this is an example of **O(n log n)** time complexity.

Note that O(n log n) is a better time complexity than O(n²), because n log(n) is smaller than n². The smaller the worst-case value, the better the program.

A practical example of O(n log n) time complexity is **MergeSort**.
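A minimal merge sort sketch (illustrative, not a production implementation): the array is halved about log n times, and each level of recursion does O(n) merging work, giving O(n log n) overall:

```python
# Minimal merge sort sketch: O(n log n), since the array is split
# about log n times and each level merges a total of n elements.
def merge_sort(arr):
    if len(arr) <= 1:
        return arr  # base case: already sorted
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append any leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([30, 10, 88, 20, 40]))  # output = [10, 20, 30, 40, 88]
```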

#### Loop with Time Complexity – Double Logarithmic Time – O(log log n)

Let us create a while loop in Python such that the number of steps taken by the loop is directly proportional to the **logarithm of the logarithm** of the input size n. The code is written below:

```python
# Example of Loop with Time Complexity O(log log n)
# Note: squaring (power of 2) inside the loop
# Loop runs 3 times for n = 100
# O(log log n) is better than O(log n) because log log n is smaller than log n
# The smaller the worst case, the better the program
# Counter counts the total number of operations
n = 100; i = 2; count = 0
while i < n:
    count = count + 1
    i = i ** 2  # raise to the power of 2
print(count)
# output = 3
```

The above code is a **while loop with squaring (power of 2)**, and the number of steps taken by the loop here is directly proportional to the **logarithm of the logarithm** of the input size n. Therefore this is an example of **O(log log n)** time complexity.

#### Loop with Time Complexity O( m + n )

Let us create two **sequential while loops** in Python such that the number of steps taken by the loops is directly proportional to the **sum of the input sizes m and n** of the two loops. The code is written below:

```python
# Example of Consecutive Loops with Time Complexity O(m + n)
# Counter counts the total number of operations
m = 100; n = 100; i = 0; j = 0; count = 0
while i < m:
    count = count + 1
    i = i + 1
while j < n:
    count = count + 1
    j = j + 1
print(count)
# output = 200
```

The above code has two sequential **while loops**, and the number of steps taken by both while loops together is directly proportional to the sum of their individual input sizes. Therefore this is an example of **O(m + n)** time complexity.

#### Loop with Time Complexity O( m * n )

Let us create two **nested while loops** in Python such that the number of steps taken by the loops is directly proportional to the **product of the input sizes m and n** of the two loops. The code is written below:

```python
# Example of Nested Loops with Time Complexity O(m * n)
# Counter counts the total number of operations
n = 100; m = 10; i = 1; j = 1; count = 0
while i < n:
    while j < m:
        j = j + 1
        count = count + 1
    i = i + 1
    j = 1
    count = count + 1
print(count)
# output = 990
```

The above code has two **nested while loops**, and the number of steps taken by both while loops together is directly proportional to the product of their individual input sizes. Therefore this is an example of **O(m * n)** time complexity.

#### Loop with Time Complexity = O(n²) + O(n) = O(n²)

Let us create two **sequential loops** in Python such that the overall number of steps is dominated by the **higher-order time complexity** of the two loops. The code is written below:

```python
# Example of a nested while loop followed by a simple loop
# Time Complexity = O(n^2) + O(n) = O(n^2)
# Counter counts the total number of operations
n = 100; i = 1; j = 1; count = 0
while i < n:
    while j < n:
        j = j + 1
        count = count + 1
    i = i + 1
    j = 1
    count = count + 1
k = 1
while k < n:
    count = count + 1
    k = k + 1
print(count)
# output = 9999
```

The above code has a nested while loop followed by a simple while loop, both driven by the same input size n, so the time complexity is **O(n²)** + **O(n)**. The higher-order term, **O(n²)**, is therefore considered the time complexity of the complete code.

If a function is a sum of several terms, the overall time complexity is determined by the fastest-growing term.

#### Code with Time Complexity = O(2^n) = Exponential

Let us create a Python function with **exponential** (2^n), or **two to the power n**, time complexity. The code is written below:

```python
# Example of exponential time complexity O(2^n)
# Calculation of Fibonacci numbers
def fibonacci_calculation(n):
    if n <= 1:
        return n
    return fibonacci_calculation(n-1) + fibonacci_calculation(n-2)
```

The example code above is an exponential time complexity algorithm implemented as a recursive function.

Practical examples of exponential time complexity are the **recursive Fibonacci implementation** shown above and the **Towers of Hanoi** problem.
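As a sketch of why Towers of Hanoi is exponential (the `hanoi_moves` helper is ours), moving n disks requires 2^n − 1 moves: you move n−1 disks aside, move the largest disk, then move the n−1 disks back on top:

```python
# Minimal Towers of Hanoi sketch: counts the moves needed for n disks.
# The recurrence moves(n) = 2 * moves(n-1) + 1 solves to 2^n - 1,
# so the work grows exponentially with n.
def hanoi_moves(n):
    if n == 0:
        return 0
    # move n-1 disks aside, move the largest disk, move n-1 disks back
    return 2 * hanoi_moves(n - 1) + 1

print(hanoi_moves(3))   # output = 7
print(hanoi_moves(10))  # output = 1023
```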

#### Code with Time Complexity = O(n!) = Factorial

Let us create a Python function named factorial_calculation that has **O(n!)**, or **factorial**, time complexity. The code is written below:

```python
# Example of factorial time complexity O(n!)
# Note: the recursive call sits INSIDE the loop, so the work is
# n * (n-1) * (n-2) * ... * 1 = n! print operations.
def factorial_calculation(n, count):
    if n == 1:
        return n
    for item in range(n):
        print(count)
        count = count + 1
        factorial_calculation(n-1, count)

# Call the function
factorial_calculation(4, 1)
```

The example code above is a factorial time complexity algorithm: each of the n loop iterations makes a recursive call on n-1, giving on the order of n! operations in total.

For quick reference, below are the relative values of the time complexity of a program for a problem where the input size is **large**:

**O(1) < O(log log n) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2^n) < O(n!)**

Note that the smaller the time complexity, the better the program is for the same input size. Also, the log in (n log n) is base 2, so for any n >= 2 you have log n >= 1.

### Conclusion

In this article, we discussed the time complexity of various loops with example code in Python. Knowing the **time complexity** and **space complexity** is necessary for creating efficient algorithms. The time complexity can be optimised, making the code more effective and swift.

When can we say that an algorithm is working efficiently? The answer is straightforward: it needs to operate fast and consume the least amount of memory. Interestingly, space and time often pull in opposite directions: the most common outcome of increasing speed is increased memory utilisation, and vice versa.