Time Complexity - Big O

What is time complexity and why should you care?


Time complexity is a very important concept to know if you want to write efficient programs that scale. It gives an idea of how the processing time of a function grows as its input size increases.

In this post we will discuss what time complexity is and see some common use cases.

Big O definition as per Wikipedia: "Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. It is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann–Landau notation or asymptotic notation."

What is time complexity?

When you search something on the web, you want the results as fast as possible. The search engine probably has to go through millions (if not billions) of records to answer your query. Even with all that computing power, if the search programs are inefficient, or worse, have a poor time complexity, you might fall asleep before any results show up.

Suppose you have two functions that return the same result. How do we know which one is better in terms of time complexity?

  • Do we measure the absolute running time of the programs? Well, you could, but then a computer with more computing power will run either of them faster than a smartphone with limited capabilities, so absolute time is not a good metric. Our goal is to understand which program is better at scale, i.e., as the input size grows, how does the number of operations the program has to run grow?

So we measure time complexity, denoted by Big-O notation, by measuring how the number of operations grows as the input to the function grows, and we are only concerned with that growth at very large inputs.

Consider the two functions below, both of which return the sum of the first n natural numbers:

# Function One

def sum_till_n(n):
    total = 0
    for num in range(1, n + 1):
        total += num  # This step runs a total of n times
    return total

# Function Two
def sum_till_n_new(n):
    return n * (n + 1) // 2  # This step runs once; // keeps the result an integer

In the first function, suppose the first step (assigning total = 0) takes 'k' units of time, where k is a constant. The for loop body has only one step (total += num), which runs as many times as the loop does, here n times; let each iteration take 'c' units of time, so the total time for (no pun intended) the loop is c*n. The final return statement takes another constant 'i' units. So the total running time can be represented as:

k + c*n + i, which is represented as O(k) + O(c*n) + O(i)

As the input size grows, O(k) and O(i) stay the same while the middle term increases; we represent any such constant-time term as O(1). When the input grows significantly, the only factor that matters in the middle term is n, so we replace c with 1. Remember that we are not concerned with actual times, just with how the number of operations grows as the input grows.

So the term that dominates the expression above is the middle one, which simplifies to O(n). The Big-O (time complexity) of the first function can therefore be written as O(n).
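One way to see this concretely is to instrument the loop with an operation counter instead of a clock. This is a hypothetical sketch (the counter is not part of the original function), but it shows that the number of operations grows exactly in step with n:

```python
# Hypothetical sketch: count loop iterations instead of measuring wall-clock time.
def count_operations(n):
    ops = 0
    total = 0
    for num in range(1, n + 1):
        total += num
        ops += 1  # one unit of work per iteration
    return ops

# The operation count grows linearly with the input size:
for n in (10, 100, 1000):
    print(n, count_operations(n))  # the count always equals n
```

Doubling n doubles the count, which is exactly what O(n) growth means.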

Now consider the second function. The only line that executes is the return statement, a mathematical computation that takes some constant time. Notice that no matter how big n is, that computation happens only once. So the time complexity is O(k), where k is the constant time the computation takes, irrespective of n. Hence we can simplify and write the Big-O as O(1).

Clearly, both functions give near-instant outputs for small n, but as n increases the first function inherently has to do a lot of work (n steps), so the time difference starts becoming visible, while the second function performs just one step no matter the input size. O(1) is the ideal time complexity we would like our functions to have, but not every function can be reduced to O(1).
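If you want to see the difference yourself, a rough timing sketch like the one below (using the standard library's time.perf_counter; the choice of n is arbitrary) makes the gap visible. Absolute numbers will vary by machine, which is exactly why Big-O, not wall-clock time, is the metric we reason with:

```python
import time

def sum_till_n(n):  # O(n): the loop runs n times
    total = 0
    for num in range(1, n + 1):
        total += num
    return total

def sum_till_n_new(n):  # O(1): one computation, regardless of n
    return n * (n + 1) // 2

n = 5_000_000  # arbitrary large input for illustration
start = time.perf_counter()
loop_result = sum_till_n(n)
loop_time = time.perf_counter() - start

start = time.perf_counter()
formula_result = sum_till_n_new(n)
formula_time = time.perf_counter() - start

# Both return the same value, but the loop takes far longer.
print(f"loop:    {loop_result} in {loop_time:.4f}s")
print(f"formula: {formula_result} in {formula_time:.6f}s")
```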

The illustration above shows why knowledge of time complexity is important and how it can help us decide which function to choose. You can, and should, read up more on time complexity if you are serious about scaling your programs.