I think paxdiablo is nearest, but not for the right reason. Time complexity ignores issues on real hardware such as cache sizes, memory limits and, in this case, the limited number of processes and the behaviour of the scheduler. Based on the Wikipedia page for time complexity, I'd say the answer is that you can't determine the runtime complexity, because it is defined as:

"Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, where an elementary operation takes a fixed amount of time to perform. Thus the amount of time taken and the number of elementary operations performed by the algorithm differ by at most a constant factor."

By that definition we can't talk about the runtime complexity of this algorithm, because the times the elementary operations take differ so vastly that the total time taken would differ by more than a constant factor.
Both the time complexity and the process complexity of that algorithm are O(braindead). With a sufficiently large value in the data set, you'll be waiting for an answer until the sun explodes. With a sufficiently large data set, you'll (1) hit your process limit; and (2) find that early sleeps finish before the later ones even begin, meaning the set (2,9,9,9,9,9,...,9,9,1) will not sort the 1 and 2 correctly.
Time complexity is irrelevant in this case. You can't get any less optimised than "wrong".
O(max(input) + n). The complexity only appears awkward to express because most sorting algorithms are data-agnostic: their time scales with the amount of data, not the data itself. FWIW, as pointed out here, this is not a reliable algorithm for sorting data.
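For reference, here is a minimal sketch of the algorithm under discussion in Python (the original is a Bash one-liner spawning `sleep` processes; the function name `sleep_sort` and the `scale` parameter are my own, and the result is timing-dependent, which is exactly why it's unreliable):

```python
import threading
import time

def sleep_sort(values, scale=0.1):
    """Spawn one thread per value; each sleeps in proportion to its
    value, then appends itself to the shared result list."""
    result = []
    lock = threading.Lock()

    def worker(v):
        time.sleep(v * scale)        # wall-clock delay proportional to the value
        with lock:                   # guard the shared list
            result.append(v)

    threads = [threading.Thread(target=worker, args=(v,)) for v in values]
    for t in threads:
        t.start()
    for t in threads:
        t.join()                     # wall-clock cost ~ max(values) * scale
    return result

print(sleep_sort([3, 1, 2]))         # usually [1, 2, 3], but not guaranteed
```

Note that the total wall-clock time is governed by the largest value, not by the number of values — which is where the O(max(input)) term in the discussion comes from.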
I think it needs to be O(max(input) + n), where n is the number of inputs. If it takes longer to iterate through all the inputs than to process the largest one, then O(max(input)) alone would be incorrect, no? – yarian Jun 24 at 22:32
If you read the thread you'll see that your question is already answered. The time complexity is O(max(input)) and the operational complexity (number of operations) is O(n).
There is an answer in the thread, sure, but I don't know if it is right – justinhj Jul 5 at 7:21
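The split between time complexity and operational complexity can be made concrete with a discrete-event model (my own construction, not from the thread): treat each sleep as an event with a wake-up time and advance a virtual clock instead of actually sleeping. The number of events processed is n, while the final clock reading is max(input).

```python
import heapq

def simulated_sleep_sort(values):
    """Discrete-event model of sleep sort. Returns (sorted values,
    number of wake-up events processed, final virtual clock)."""
    events = [(v, v) for v in values]   # (wake-up time, value)
    heapq.heapify(events)
    clock = 0
    ops = 0
    output = []
    while events:
        wake_time, v = heapq.heappop(events)
        clock = max(clock, wake_time)   # jump the virtual clock forward
        output.append(v)
        ops += 1                        # one elementary "wake-up" operation
    return output, ops, clock

print(simulated_sleep_sort([7, 3, 9, 1]))   # → ([1, 3, 7, 9], 4, 9)
```

Here ops = n = 4 regardless of the values, while the clock ends at max(input) = 9: O(n) operations, O(max(input)) elapsed time.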
One point which nobody seems to have addressed is how those sleeps are implemented. Ultimately they end up in a scheduler somewhere, and the operational complexity will depend on the scheduling algorithm used. For example, if the sleeps are put as events in a priority queue you will likely end up with something equivalent to heapsort, with complexity O(n log n).
A naive scheduling algorithm might result in O(n^2).
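To illustrate the priority-queue case: if the scheduler keeps pending sleeps in a binary heap keyed by wake-up time, then draining the queue fires the wake-ups in sorted order — which is exactly heapsort. A sketch with an instrumented wrapper class (`Counted` and `scheduler_order` are hypothetical names of mine) makes the comparison work visible:

```python
import heapq

class Counted:
    """Wraps a wake-up time and counts comparisons, exposing how much
    work the priority-queue scheduler does."""
    comparisons = 0

    def __init__(self, v):
        self.v = v

    def __lt__(self, other):
        Counted.comparisons += 1
        return self.v < other.v

def scheduler_order(values):
    """The order in which a heap-based scheduler would fire the sleeps."""
    heap = [Counted(v) for v in values]
    heapq.heapify(heap)                 # O(n) to build
    return [heapq.heappop(heap).v for _ in range(len(heap))]  # n pops, O(log n) each

print(scheduler_order([5, 2, 8, 1]))    # wake-ups fire in sorted order: [1, 2, 5, 8]
```

Since building the heap is O(n) and each of the n pops is O(log n), the scheduler does O(n log n) comparisons in total — heapsort in disguise, just as the answer says.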
Update: I've come back to the conclusion that O(m+n), where n is the length of the list and m is the largest element in it, is actually sound. Original answer follows. Justinhj's answer is quite good, but I don't believe he's right in saying that you can't determine the runtime complexity.
Remember, when calculating worst-case time complexity, that it is the worst case. You may not make any assumptions about the size of the input; you must consider it to be infinitely large. What "infinite" means depends on the type of the input and how it is used: you want to consider whatever will take the longest, so in cases where an inverse is taken, this "infinite" may instead be something approaching zero rather than infinity.
For something taking a list, that will generally be a list with infinitely many items. For something taking a number, that will generally be ∞ (hereafter inf). By this correct reasoning, those answers that suggest O(max(input) + n) are faulty.
But still, it is possible to calculate the complexity, whether it is useful or meaningful or not. When considering this algorithm, the item causing problems is clearly sleep. The rest is easy to calculate; it ends up at O(n + sleep), with the knowledge that it doesn't matter how many times you run sleep: their times are independent. (Time complexity careth not for a dodgy task scheduler, a maximum number of runnable processes, or running out of memory; otherwise you could easily call everything O(1).)
But what is sleep's time complexity? Remember you must consider the input which will take the longest. That is, for sleep, an infinitely large number.
Thus, sleep is, with constants still in, O(inf). Then the important question: is inf a constant? If it is, then sleep is O(1); if it is not, sleep is still O(inf).
Whichever it is, there can be no simplification of the complexity of sleep as it appears in the main algorithm, because the inputs must be assumed to be infinitely large. So which is it? Is inf a constant, or not?
I present you, then, with two possible answers for the time complexity of sleepsort (the Bash version):

- O(n), if inf is a constant
- O(inf + n), if inf is not a constant

Which will you go with? For myself, I think I prefer the former.
I don't understand the reasoning behind rejecting the max(input) term. Where did the inf come from? Why would one want to sort infinite numbers? They are not even comparable. :) – Rotsor Jul 12 at 3:30

@Rotsor: (a) I reject max(input) on the same grounds: you must consider infinite data. It is invalid in Big O notation to consider non-infinite data in this way; you must use the greatest max(input). (b) inf comes from the fact that for sleep, Big O notation "describes the limiting behavior of the function when the argument tends towards ... infinity" (#). Are you happy that sleep is O(inf)/O(1) (if inf is constant)? (c) I couldn't care less that you won't be dealing with infinite numbers, but that's what Big O notation requires. – Chris Morgan Jul 12 at 4:04

If you follow the same reasoning, you'd have to reject n too! "You must use the greatest n, which is infinity." Working with infinity values gets us nowhere; we may only work with limits. And the claim of O(n + max(input)) as n -> inf, input_i -> inf is sound, although questionably correct. – Rotsor Jul 12 at 4:21

And no, I'm not happy that sleep is O(inf)/O(1), because sleep(m) takes O(m) time. If you consider m constant, or bounded, then fine, sleep(m) is O(1). If not, then sleep(m) is O(m). Why bring inf into it? – Rotsor Jul 12 at 4:24

@Rotsor: in thinking it over (before reading your last comment, actually) I've come to the conclusion that sleep is indeed O(n). Not sure quite why I was going for inf; it is illogical. – Chris Morgan Jul 12 at 4:44