Find the big-O running time of the following program, which finds all lists whose length equals the length of an input list and whose elements are drawn only from the elements of the input list (the input list has no duplicate elements). Let n be the length of the input list:
It has a list of lists A to return, which begins containing one empty list.
It repeats the following steps n times:
For each list ℓ in A, it adds n lists to another list B: the lists formed by appending each element of the input list to ℓ (each of these has one more element than ℓ).
It assigns the value B to A.
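The steps above can be sketched as follows (a hypothetical Python rendering; the function name `all_lists` and the variable names `A` and `B` are assumptions, chosen to match the description):

```python
def all_lists(xs):
    """Enumerate all len(xs)-length lists with elements drawn from xs."""
    n = len(xs)
    A = [[]]                      # begins containing one empty list
    for _ in range(n):            # repeat n times
        B = []
        for lst in A:             # for each list in A...
            for x in xs:          # ...add n extended lists to B
                B.append(lst + [x])   # copying lst costs O(len(lst))
        A = B                     # assign B to A
    return A
```

Note the cost model: `lst + [x]` copies the whole of `lst`, which is what drives the running-time analysis below the naive n^n count of output lists.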
The problem doesn't state the initial contents of A, but the only way I see to get the correct output is to assume that A initially contains one empty list. Counting time according to the number of elements copied into each list added to B:
Step 1 produces n lists of length 1 from the single empty list, for a cost of n·1 = n. Step 2 produces n·n lists of length 2, for a total cost of 2n². Step 3 produces n·n·n lists of length 3, for a total cost of 3n³, and so on.
total cost = $\sum_{i=1}^{n} i \, n^i$
Each term is greater than the sum of all the previous terms, so the last term dominates: the total is $O(n \cdot n^n) = O(n^{n+1})$.
I picked the closest answer, which really doesn't seem to be correct, unless you can duplicate a list of n elements in O(1) time.