An algorithm takes a number, \(N\), and returns a list of numbers: all of the divisors of \(N\). You may have ideas on how to implement such an algorithm, but you observe the following runtimes as \(N\) grows. Which of these runtimes is the best upper bound for the algorithm?
\(N\) | Runtime (seconds)
3 | 1
6 | 4
9 | 9
12 | 16
Given runtimes and input sizes, the natural way to estimate the big-O complexity of an algorithm is to check how the runtime changes as the input size changes. Let \(f(x)\) denote the runtime when \(N = x\). From the table, \(f(3) = 1\) and \(f(2 \cdot 3) = f(6) = 4\). This suggests the hypothesis \(f(cx) = c^2 f(x)\), and checking it against every data point with \(x = 3\) shows it is consistent with all the data we have: \(f(2 \cdot 3) = 4 = 2^2 \cdot 1\), \(f(3 \cdot 3) = 9 = 3^2 \cdot 1\), and \(f(4 \cdot 3) = 16 = 4^2 \cdot 1\). Since multiplying \(x\) by a constant multiplies \(f(x)\) by the square of that constant, the runtime grows quadratically, and we can conclude that the algorithm is \(O(n^2)\).
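As a quick sanity check, a few lines of Python (illustrative only; the variable names are ours, not part of the problem) confirm that every observation matches the quadratic scaling hypothesis relative to the \(x = 3\) baseline:

```python
# Observed (N, runtime in seconds) pairs from the table.
observations = [(3, 1), (6, 4), (9, 9), (12, 16)]

base_n, base_t = observations[0]  # baseline: N = 3, f(3) = 1

for n, t in observations:
    c = n / base_n                 # scaling factor c, so that N = c * 3
    predicted = (c ** 2) * base_t  # quadratic hypothesis: f(c * 3) = c^2 * f(3)
    print(f"N = {n:2d}: observed {t}, predicted {predicted:.0f}")
    assert t == predicted          # holds for every row of the table
```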
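The problem never specifies how the algorithm is implemented, but for intuition, here is one deliberately naive, purely hypothetical divisor-listing routine that would exhibit this kind of quadratic growth: for every candidate divisor it scans all \(n\) multiples instead of using the modulo operator.

```python
def list_divisors_naive(n):
    """Hypothetical quadratic implementation: for each candidate d,
    scan the multiples d*1, d*2, ..., d*n looking for n itself.
    Most candidates are not divisors, so each one costs about n checks,
    giving roughly n * n = n^2 work overall."""
    divisors = []
    for d in range(1, n + 1):
        if any(d * k == n for k in range(1, n + 1)):
            divisors.append(d)
    return divisors

print(list_divisors_naive(12))  # [1, 2, 3, 4, 6, 12]
```

By contrast, ordinary trial division (testing `n % d == 0` for each candidate) does constant work per candidate, treating each division as constant-time, and is only \(O(n)\); the sketch above is just one way an implementation could end up matching the observed quadratic timings.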