Big O notation is a way to describe how quickly the runtime of an algorithm grows as the size of the input grows. This runtime isn't measured in seconds; it's measured in the number of operations the algorithm performs in the worst case. The most common Big O runtimes are the following (from fastest to slowest):

  • O(1), also known as constant time. Example: looking up an element in a hash table.
  • O(log n), also known as log time. Example: binary search.
  • O(n), also known as linear time. Example: simple search.
  • O(n log n). Example: a fast sorting algorithm, like quicksort.
  • O(n²), also known as quadratic time. Example: a slow sorting algorithm, like selection sort.
  • O(n!), also known as factorial time. Example: a really slow algorithm, like the traveling salesperson.
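
To make the difference concrete, here is a minimal sketch in Python (the function names and the sample data are ours, chosen for illustration) contrasting two of these runtimes: simple search, which is O(n), and binary search, which is O(log n).

```python
def linear_search(items, target):
    """O(n): in the worst case, checks every element once."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return None

def binary_search(sorted_items, target):
    """O(log n): halves the search range on every step (the list must be sorted)."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None

numbers = list(range(1, 1001))            # 1..1000, already sorted
print(linear_search(numbers, 1000))       # worst case: up to 1,000 comparisons
print(binary_search(numbers, 1000))       # at most ~10 comparisons (log2 of 1,000 ≈ 10)
```

Both calls find the same element, but as the list grows from 1,000 to 1,000,000 items, the linear search needs up to 1,000 times more operations while the binary search needs only about twice as many. That growth rate, not the raw timing, is what Big O captures.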