How do you determine the worst-case complexity of an algorithm?

Starting from the equation is thinking of it a bit backwards. What you really care about is scalability: what is the algorithm going to do as you increase the size of the input? If you just have a single loop over the input, for instance, you have an O(n) time complexity algorithm.
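As a minimal sketch (the class, method, and array names are placeholders, not from the answer): a single pass over the input does a constant amount of work per element, so the total work grows linearly with n.

    public class LinearScan {
        // One loop over the input: the body runs exactly input.length times,
        // so the running time is O(n).
        static long sum(int[] input) {
            long total = 0;
            for (int i = 0; i < input.length; i++) {
                total += input[i];   // constant work per element
            }
            return total;
        }

        public static void main(String[] args) {
            int[] data = {3, 1, 4, 1, 5, 9};
            System.out.println(sum(data));   // prints 23
        }
    }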

If you have a loop within another loop, though, it becomes O(n^2), because it must now do on the order of n^2 things for an input of size n. When you are talking about the worst case, you are usually talking about non-deterministic algorithms, where you might have a loop that can stop prematurely. What you want to do for this is assume the worst and pretend the loop will stop as late as possible. So if we have:

    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            if (Math.random() > .5)
                j = n;   // the inner loop may bail out early
        }
    }

we would say that the worst case is O(n^2).

Even though we know that it is very likely that the inner loop will bust out early, we are looking for the worst possible performance.

Just a caution about non-determinism: a loop terminating early is not non-deterministic. Determinism refers to being able to definitively know what a program will do, versus not being able to determine what it will do. A probabilistic or randomized algorithm would be non-deterministic because there is a step in the algorithm where you don't know what will happen (I'm not talking about random numbers, but randomized algorithms). en.wikipedia.org/wiki/Non-deterministic_algorithm – Kekoa May 11 '09 at 22:25

There is a clear distinction and a close connection between non-determinism and randomness. In terms of Turing machines, we think of randomized algorithms as having access to an (infinite) tape of random numbers. If you fix what's on the random tape (your random seed), the rest is deterministic. Non-deterministic Turing machines, however, can move from one state to several others given fixed input. – Pall Melsted May 12 '09 at 2:02
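A small sketch of the point about the random tape (my own illustration; the class name is made up): fixing the seed fixes the "random tape", and the rest of the computation is then fully deterministic, so two generators with the same seed produce identical sequences.

    import java.util.Random;

    public class FixedSeed {
        public static void main(String[] args) {
            Random r1 = new Random(42);   // same seed = same "random tape"
            Random r2 = new Random(42);
            for (int i = 0; i < 5; i++) {
                // Both generators walk the same tape, so the values always match.
                System.out.println(r1.nextInt(100) + " == " + r2.nextInt(100));
            }
        }
    }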

That equation is more of a definition than an algorithm. Does the algorithm in question care about anything other than the size of its input? If not, then calculating W(n) is "easy".
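For reference, and assuming the question's equation is the usual textbook definition (the notation T(I) for the running time on a particular input I is my own here), the worst-case complexity is just the maximum running time over all inputs of a given size:

    W(n) = \max \{\, T(I) \;:\; I \text{ is an input of size } n \,\}

If the algorithm's behaviour depends only on n and not on which input of size n it gets, every input of size n gives the same T(I), so taking the maximum is trivial; that is the "easy" case referred to here.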

If it does, try to come up with a pathological input. For example, with quicksort it might be fairly obvious that a sorted input is pathological, and you can do some counting to see that it takes O(n^2) steps. At that point you can either:

1. Argue that your input is "maximally" pathological, or
2. Exhibit a matching upper bound on the runtime for any input.

Example of #1: Each pass of quicksort will put the pivot in the right place, and then recurse on the two parts. (Handwave alert.) The worst case is to have all of the rest of the array on one side of the pivot, and a sorted input achieves this; a counting sketch for this case follows below.

Example of #2: Each pass of quicksort puts the pivot in the right place, so there are no more than O(n) passes. Each pass requires no more than O(n) work. As such, no input can cause quicksort to take more than O(n^2) time.

In this case #2 is a lot easier.
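To make the counting in example #1 concrete, here is a rough sketch (my own illustration, not code from the answer above): a textbook quicksort that always picks the last element as the pivot, run on an already-sorted array while counting comparisons. The class and variable names are made up for the example.

    // Hypothetical illustration: Lomuto-partition quicksort with the last
    // element as the pivot. On an already-sorted array every partition
    // leaves the entire remainder on one side of the pivot, so the number
    // of comparisons grows as n*(n-1)/2, i.e. O(n^2).
    public class QuicksortWorstCase {
        static long comparisons = 0;

        static void quicksort(int[] a, int lo, int hi) {
            if (lo >= hi) return;
            int p = partition(a, lo, hi);
            quicksort(a, lo, p - 1);
            quicksort(a, p + 1, hi);
        }

        static int partition(int[] a, int lo, int hi) {
            int pivot = a[hi];                 // last element as pivot
            int i = lo;
            for (int j = lo; j < hi; j++) {
                comparisons++;
                if (a[j] < pivot) {            // move smaller elements left
                    int t = a[i]; a[i] = a[j]; a[j] = t;
                    i++;
                }
            }
            int t = a[i]; a[i] = a[hi]; a[hi] = t;
            return i;                          // final position of the pivot
        }

        public static void main(String[] args) {
            int n = 1000;
            int[] sorted = new int[n];
            for (int i = 0; i < n; i++) sorted[i] = i;   // pathological: already sorted
            quicksort(sorted, 0, n - 1);
            System.out.println(comparisons);   // 499500 = 1000*999/2, quadratic growth
        }
    }

Running the same code on a shuffled array gives a count in the neighbourhood of n log n, which is why the sorted input counts as "maximally" pathological here.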

