
ASYMPTOTIC NOTATIONS

 

Asymptotic notations:

While analyzing an algorithm, we often end up with a formula that represents its time complexity, i.e., the amount of time the algorithm takes to execute. This formula often contains details that are of no real use.

 

Hence, with the help of asymptotic notations, we compare functions while ignoring such unnecessary details as behaviour at small input sizes and constant factors.

 

For example:

T(n) = 3n + 2

 

Here, 2 is the unnecessary part: it is a constant and does not make much of a difference as n grows. So 3n is the significant part here.
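To see this concretely, here is a small Python sketch (illustrative only, not from the original text) that compares 3n + 2 against the dominant term 3n; as n grows, the ratio approaches 1, showing that the constant hardly matters:

# Sketch: the constant 2 in T(n) = 3n + 2 becomes negligible as n grows.
for n in (10, 1_000, 1_000_000):
    full = 3 * n + 2            # the complete formula
    dominant = 3 * n            # the significant part only
    print(n, full / dominant)   # ratio tends to 1.0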

 

Following are the most commonly used asymptotic notations for representing the time complexity of algorithms.

 

Big Theta:

The tight bound, i.e., both the lower bound and the upper bound, of an algorithm's running time is formally expressed as Θ(g(n)).

 

The term tight bound means that the time complexity represented by the notation stays within a constant-factor range of the algorithm's actual time of execution, bounding it from above and below.

 

For example, let us consider the expression 3n² + 5n, which is the time complexity of a certain algorithm.

 

When we use the Big-Θ notation to represent this, the time complexity of the algorithm is Θ(n²), which we get after dropping the constant coefficient and removing the insignificant lower-order term 5n.

 

Here, in the example above, the complexity Θ(n²) means that for every sufficiently large input n, the running time stays between c1·n² and c2·n², where c1 and c2 are constants. The expression is tightly bounded from both sides, representing the growth of the algorithm.

 

For a given function g(n) we represent Θ(g(n)) as:

Θ(g(n)) = {f(n): there exist positive constants k1, k2 and n0 such that
          0 <= k1*g(n) <= f(n) <= k2*g(n) for all n >= n0}
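As a quick numeric sanity check, here is a small Python sketch (the constants k1 = 3, k2 = 4 and n0 = 5 are hand-picked for this example) that verifies the definition for f(n) = 3n² + 5n and g(n) = n²:

# Sketch: check 0 <= k1*g(n) <= f(n) <= k2*g(n) for all tested n >= n0.
f = lambda n: 3 * n**2 + 5 * n    # running time from the example above
g = lambda n: n**2                # candidate Θ bound
k1, k2, n0 = 3, 4, 5              # 3n² <= 3n² + 5n <= 4n² once n >= 5

assert all(k1 * g(n) <= f(n) <= k2 * g(n) for n in range(n0, 10_000))
print("f(n) is Θ(n²) over the tested range")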

 

Big O:

This notation is used to represent the upper bound of an algorithm's running time. It tells us that the function will never exceed a constant multiple of the specified bound for any sufficiently large input n.

 

You might wonder why there is a need for this representation when we already have the big-Θ notation, which represents the tightly bound running time of an algorithm. Let's look at an example to understand this clearly.

 

Consider the Linear Search algorithm, where we traverse an array of elements one by one to search for a given number, as sketched below.
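Here is a minimal implementation in Python (the name linear_search is just an illustrative choice):

def linear_search(arr, target):
    # Scan the array front to back and stop at the first match.
    for i, value in enumerate(arr):
        if value == target:
            return i        # found the target at index i
    return -1               # target is not present

data = [7, 3, 9, 1, 4]
print(linear_search(data, 7))   # best case: first element, 1 comparison
print(linear_search(data, 4))   # worst case: last element, n comparisons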

 

In the worst case, we start from the front of the array and find the element we are searching for at the very end. This leads to a time complexity of n, where n is the total number of elements.

 

However, it can also happen that the first element of the array is the one we are searching for; in that case, the time complexity is 1.

 

Now, if we described Linear Search with the big-Θ notation as Θ(n), we would be claiming that the time required is always proportional to n, since a tight bound constrains the running time from both sides; the best case above shows this is not true for every input. With the big-O notation, we instead say that the time complexity is O(n), meaning the running time will never exceed a constant multiple of n. That is, we are only defining the upper bound, so the running time can be less than or equal to that bound, which is the correct representation.

 

This is the reason why, most of the time, the Big-O notation is used to represent the time complexity of an algorithm: it makes more sense.

 

For a given function g(n) we represent O(g(n)) as:

O(g(n)) = { f(n): there exist positive constants k and n0 such that 
          0 <= f(n) <= k*g(n) for all n >= n0}
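Continuing the same sketch style, we can hand-check that f(n) = 3n + 2 is O(n) with k = 4 and n0 = 2 (since 3n + 2 <= 4n exactly when n >= 2):

# Sketch: check 0 <= f(n) <= k*g(n) for all tested n >= n0.
f = lambda n: 3 * n + 2   # running time from the first example
g = lambda n: n           # candidate upper bound
k, n0 = 4, 2              # 3n + 2 <= 4n once n >= 2

assert all(0 <= f(n) <= k * g(n) for n in range(n0, 10_000))
print("f(n) is O(n) over the tested range")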

 

Big Omega:

The Big Omega notation is used to define the lower bound of an algorithm's running time. In other words, it describes the best case of an algorithm.

 

This notation indicates the minimum time an algorithm requires for any input value, hence making it the best-case bound.

 

In simple words, when we represent a time complexity for an algorithm in the form of big-Ω, we mean that the algorithm will take at least a particular amount of time to complete its execution. However, it can take more time than this too.

 

For a given function g(n) we represent Ω(g(n)) as:

Ω (g(n)) = {f(n): there exist positive constants k and n0 such that 
           0 <= k*g(n) <= f(n) for all n >= n0}.
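And the mirror-image check for the lower bound: with k = 3 and n0 = 1, we have k·n <= 3n + 2 for every n >= 1, so f(n) = 3n + 2 is Ω(n):

# Sketch: check 0 <= k*g(n) <= f(n) for all tested n >= n0.
f = lambda n: 3 * n + 2   # running time from the first example
g = lambda n: n           # candidate lower bound
k, n0 = 3, 1              # 3n <= 3n + 2 for every n >= 1

assert all(0 <= k * g(n) <= f(n) for n in range(n0, 10_000))
print("f(n) is Ω(n) over the tested range")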

 

These are the three asymptotic notations used to represent the time complexity of algorithms. Read the next article, on probabilistic analysis, to understand more about estimating computational complexity.