Asymptotic Complexity

We want to be able to compare time functions for different algorithms without worrying about how long each primitive operation takes. Asymptotic Complexity gives us a way to do this. Basically, we ask what our time function would look like if our input was really, really big, and ignore the small stuff.
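To see why we can ignore the small stuff, here is a quick numeric sketch (illustrative, not from the video): for a hypothetical time function T(n) = 2n² + 4n + 7, the n² term accounts for almost all of the total once n is large.

```python
def T(n):
    """A hypothetical time function: T(n) = 2n^2 + 4n + 7."""
    return 2 * n**2 + 4 * n + 7

for n in [10, 1_000, 1_000_000]:
    dominant = 2 * n**2
    share = dominant / T(n)  # fraction of T(n) contributed by the n^2 term
    print(n, round(share, 6))
```

As n grows, the share approaches 1, so the lower-order terms barely matter for big inputs.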

Watch the video and then answer the questions below.

Twenty-minute video

Note: In the video I get muddled and say that the algorithm that takes more time is more efficient. Obviously, the algorithm which takes less time is more efficient.

You can also view this video on YouTube


Key Points

  • A polynomial is an expression like 2x² + 4x + 7.
  • To find the time complexity of an algorithm, we look at how its time function grows.
  • For polynomial time functions, the rate of growth is the same as the order of the polynomial, which is the highest power in the expression. The polynomial above is a second-order polynomial, because its highest power is 2.
  • We group algorithms into classes based on their complexity.
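The "highest power" rule can be sketched in code. This is a hypothetical helper (not from the video): it represents a polynomial as a mapping from power to coefficient and reports its order.

```python
def polynomial_order(coeffs):
    """Return the order of a polynomial given as {power: coefficient},
    e.g. {2: 2, 1: 4, 0: 7} for 2n^2 + 4n + 7.
    The order is the highest power with a non-zero coefficient."""
    return max(p for p, c in coeffs.items() if c != 0)

print(polynomial_order({2: 2, 1: 4, 0: 7}))        # 2, so the growth is O(n^2)
print(polynomial_order({3: 1, 2: 4, 1: 6, 0: 6}))  # 3, so the growth is O(n^3)
```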

Common Time Complexities

  • O(1) is constant time, i.e. the time doesn’t change with the size of input
  • O(n) is linear time, i.e. the time is a linear function of the size of input (an input twice as big takes twice as long)
  • O(n²) is quadratic time, so the time goes up with the square of the size of input.
  • O(log n) is logarithmic time, i.e. the time grows with the logarithm of the size of input (doubling the input adds only a constant amount of time).
  • O(2ⁿ) is exponential time, i.e. the time doubles every time the size of input grows by one.
  • And there are others
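To make these classes concrete, here are some toy functions (my own illustrative examples, not from the video), each returning how many "steps" it performs for an input of size n. Watch what happens to the step count when n doubles.

```python
def constant(n):       # O(1): the work doesn't depend on n
    return 1

def logarithmic(n):    # O(log n): halve the problem at each step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def linear(n):         # O(n): one step per element
    return sum(1 for _ in range(n))

def quadratic(n):      # O(n^2): one step per *pair* of elements
    return sum(1 for _ in range(n) for _ in range(n))

for f in (constant, logarithmic, linear, quadratic):
    print(f.__name__, f(8), f(16))
```

Doubling n leaves the constant function unchanged, adds one step to the logarithmic one, doubles the linear one, and quadruples the quadratic one.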

Questions

1. Check your understanding

1. Common Complexities

Assign each expression to its equivalent time complexity.

  Match each expression to one of: O(1), O(log n), O(n), O(n log n), O(n²), O(n³), O(2ⁿ)

1. O(7 + 3)
2. O(3n + 6)
3. O(5n² + 3n + 6)
4. O(n³ + 4n² + 6n + 6)
5. O(5 log (n/2))
6. O(7n log n)
7. O(6n³ log n + 2ⁿ)

Check Answers

2. Graphs

Match the function to the graph. You might want to use this online calculator to help.

  1. y = x
  2. y = x²
  3. y = 2ˣ
  4. y = log x
  5. y = 1
  6. y = x log x

Check Answers


Summary

In this section we have learned the fundamentals of asymptotic complexity. Using Big O notation we can simplify our time functions to compare the efficiency of different algorithms.

  • You should recognise common time complexities and know what they mean
  • You should be able to order time complexities from smallest to biggest
  • You should be able to identify the time complexity of an algorithm from its time function
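One way to check the smallest-to-biggest ordering is to evaluate each complexity at a single, reasonably large n and sort (an illustrative sketch; the value n = 64 is my own choice, picked large enough that the asymptotic order has taken over).

```python
import math

n = 64
classes = {
    "O(1)": 1,
    "O(log n)": math.log2(n),
    "O(n)": n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)": n**2,
    "O(n^3)": n**3,
    "O(2^n)": 2**n,
}

# Sort by the value at n = 64; this matches the growth-rate ordering.
for name, value in sorted(classes.items(), key=lambda kv: kv[1]):
    print(name)
```

Note that for very small n the ordering can differ (e.g. 2ⁿ < n² when n = 2); asymptotic complexity is about what happens as n gets large.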

In the next section we are going to learn some maths that is important for calculating the time functions of many algorithms: summation.