# Time and Space Complexity

Time and space complexity are important concepts in data structures. They give us a way to measure, or quantify, how much time and memory a program or algorithm takes.

The actual running time depends on many factors, such as the hardware, operating system, and processor. However, we don't consider any of these factors while analyzing the algorithm; we consider only the execution time of the algorithm itself.

Let me show you an example:

Imagine I have a list **L** of size **n** and an integer **x**, and you have to find whether **x** exists in the list **L** or not.

A simple solution to this problem is to traverse the whole list **L** and check whether any element is equal to **x**.

```
def search(L, x):
    # Check each element of L against x; worst case scans the whole list.
    for ele in L:
        if ele == x:
            return True
    return False
```

Each operation in a computer takes approximately constant time. Let each operation take time *c*. The number of lines of code executed actually depends on the value of **x**. While analyzing an algorithm, we will mostly consider the worst-case scenario, i.e., when **x** is not present in the list **L**. In the worst case, the **if** condition will run **n** times, where **n** is the size of the list **L**. So in the worst case, the total execution time will be (*n∗c + c*): *n∗c* for the **if** condition and *c* for the **return** statement.
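The worst-case count above can be checked empirically. The sketch below (the function name `count_comparisons` is hypothetical, not from the original text) instruments the same linear search to count how many times the **if** condition runs:

```
def count_comparisons(L, x):
    # Same linear search as above, but also counts if-condition checks.
    comparisons = 0
    for ele in L:
        comparisons += 1  # one if-condition check per element
        if ele == x:
            return True, comparisons
    return False, comparisons

# Worst case: x is not in L, so the if condition runs n times.
found, ops = count_comparisons([1, 2, 3, 4, 5], 99)
# found is False and ops is 5, the size of the list
```

If **x** appears early, the search returns after fewer comparisons, which is why the worst case (x absent) is the one we analyze.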

As we can see, the total time depends on the size of the list **L**. As the size of the list increases, the execution time increases as well.

**Order of growth** is how the execution time depends on the length of the input. In the above example, we can clearly see that the execution time depends linearly on the size of the list. Order of growth helps us compute the running time with ease.
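The linear order of growth can be illustrated with a minimal sketch (the helper name `worst_case_ops` is an assumption, not from the original): doubling the input size doubles the worst-case operation count.

```
def worst_case_ops(n):
    # Run the linear search in its worst case (target absent) on a
    # list of size n, and return the number of if-condition checks.
    L = list(range(n))
    ops = 0
    for ele in L:
        ops += 1
        if ele == -1:  # -1 is never in L, so the loop always runs to the end
            return ops
    return ops

# worst_case_ops(1000) returns 1000; worst_case_ops(2000) returns 2000,
# so the operation count grows linearly with the input size.
```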