Time and Space Complexity
Time and space complexity are fundamental concepts in data structures and algorithms. They give us a way to measure, or quantify, how much time and memory a program or algorithm takes.
In practice, running time depends on many factors such as the hardware, operating system, and processor. However, we don't consider any of these factors when analyzing an algorithm. We will consider only the execution time of the algorithm itself.
Let me show you an example:
Imagine that I have a list L of size n and an integer x, and you have to find whether x exists in the list L or not.
A simple solution to this problem is to traverse the whole list L and check whether any element is equal to x.
    def search(L, x):
        for ele in L:
            if ele == x:
                return True
        return False
Each basic operation on a computer takes approximately constant time; say each operation takes c time. The number of operations executed actually depends on the value of x. When analyzing an algorithm, we will mostly consider the worst-case scenario, i.e., when x is not present in the list L. In the worst case, the if condition will run n times, where n is the size of the list L. So in the worst case, the total execution time will be (n*c + c): n*c for the if condition and c for the final return statement.
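As an illustrative sketch (the function name and comparison counter here are my own additions, not part of the original example), we can instrument the search to count exactly how many times the if condition runs in the worst case:

```python
def linear_search_count(L, x):
    """Linear search that also counts the comparisons performed."""
    comparisons = 0
    for ele in L:
        comparisons += 1
        if ele == x:
            return True, comparisons
    return False, comparisons

n = 10
L = list(range(n))          # [0, 1, ..., 9]
# Worst case: x = -1 is not in L, so all n elements are checked.
found, comparisons = linear_search_count(L, -1)
print(found, comparisons)   # False 10  -> exactly n comparisons
```

Running this confirms the analysis above: when x is absent, the comparison inside the loop executes once per element, i.e., n times.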
As we can see, the total time depends on the size of the list L. If the size of the list increases, the execution time will also increase.
Order of growth describes how the execution time depends on the length of the input. In the example above, we can clearly see that the execution time depends linearly on the size of the list. Order of growth helps us reason about running time with ease.
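To see the linear order of growth concretely, here is a small sketch (the helper `worst_case_ops` is my own, hypothetical name) that counts the worst-case work for lists of increasing size. Doubling the input size doubles the count, which is exactly what linear growth means:

```python
def worst_case_ops(n):
    """Comparisons needed when the target is absent from a list of size n."""
    L = list(range(n))
    count = 0
    for ele in L:
        count += 1
        if ele == -1:   # -1 never appears in L, so this is the worst case
            return count
    return count

for n in [100, 200, 400]:
    print(n, worst_case_ops(n))
# Each time n doubles, the work doubles too: the growth is linear in n.
```

Counting operations like this, rather than measuring wall-clock time, is what lets us ignore hardware and focus purely on how the work scales with the input.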