Big O Notation - what are the differences?

It's a nice illustration. – Neil Foley Dec 3 '09 at 22:03
The blue line is y = 0.1 x^2. I'm trying to figure out what the red line is, specifically what base the logarithm is. Shouldn't it be base 2?

The general sense is correct, but the magnitude may be misleading. – Ewan Todd Dec 3 '09 at 22:08
The log base should be irrelevant in the long run. – Joe Koberg Dec 3 '09 at 22:18
This doesn't really convey the asymptotic behaviour of O(n^2) and O(nlogn), since if the green curve is exactly n^2 then you could have another function which is still O(n^2) yet looks very similar to the red curve.
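To back up the comment that the log base is irrelevant in the long run, here is a minimal sketch: switching bases only rescales n*log(n) by a constant factor, which Big-O notation discards. The value n = 1024 below is an arbitrary illustrative choice.

```python
import math

# Change of base: log_b(n) = ln(n) / ln(b), so
# log_2(n) = log_10(n) / log_10(2) for every n > 1.
n = 1024.0
ratio = math.log(n, 2) / math.log(n, 10)

# The ratio is the constant 1/log_10(2) ~= 3.32, independent of n.
print(round(ratio, 2))              # 3.32
print(round(1 / math.log10(2), 2))  # 3.32
```

Because the ratio is the same constant for every n, curves of n*log_2(n) and n*log_10(n) differ only by a vertical scaling, never in asymptotic shape.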

-1 for glossing over this critical aspect of Big-O understanding. – harms Dec 3 '09 at 22:25
The question asked the difference between NlogN and N^2, not between N^2 and (0.01 * N^2)... which Big-O notation is SUPPOSED to "gloss over". – Joe Koberg Dec 3 '09 at 22:41

You'll need to be a bit more specific about what you are asking, but in this case O(n log(n)) is faster.

cs.georgetown.edu/~alspaugh/cls/2009b-07... n log(n) grows significantly slower.

Big O gives an upper bound on running time relative to the size of the data set (n). An O(n*log(n)) algorithm is not always faster than an O(n^2) algorithm, but for large inputs it usually is. When you double the working set, an O(n^2) algorithm takes ~4 times longer (worst case), while an O(n*log(n)) algorithm takes only a bit more than twice as long.
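The doubling claim above can be sketched with an idealized cost model (assuming one unit of cost per basic step; the step-count functions are illustrative, not measurements of any real algorithm):

```python
import math

def steps_quadratic(n):
    # Idealized cost model: an O(n^2) algorithm does about n*n basic steps.
    return n * n

def steps_nlogn(n):
    # Idealized cost model: an O(n*log(n)) algorithm does about n*log2(n) steps.
    return n * math.log2(n)

# Doubling n quadruples the quadratic cost ...
print(steps_quadratic(2_000) / steps_quadratic(1_000))    # 4.0
# ... but only slightly more than doubles the n*log(n) cost.
print(round(steps_nlogn(2_000) / steps_nlogn(1_000), 3))  # 2.201
```

The n*log(n) growth factor approaches 2 as n gets larger, which is why the gap between the two widens so quickly.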

The bigger your data set is, the more an O(n*log(n)) algorithm usually pays off. EDIT: Thanks to 'harms', I'll correct a wrong statement in my first answer: I said that when considering the worst case O(n^2) would always be slower than O(n*log(n)); that's wrong, since both bounds only hold up to a constant factor! Sample: say we have the worst case and our data set has size 100.

O(n^2) --> 100*100 = 10000
O(n*log(n)) --> 100*2 = 200 (using log_10)

The problem is that both can be multiplied by a constant factor; say we multiply the latter one by c. The result will be:

O(n^2) --> 100*100 = 10000
O(n*log(n)) --> 100*2*c = 200*c (using log_10)

So for c > 50 we get 200*c > 10000, i.e. the O(n*log(n)) algorithm is slower for n = 100. I have to update my statement: for every problem, when considering the worst case, an O(n*log(n)) algorithm will be quicker than an O(n^2) algorithm for sufficiently big data sets.

The reason is: the choice of c is arbitrary but constant. If you make the data set large enough, the n^2 growth will dominate every constant choice of c, and when comparing two algorithms the constants for both are fixed!
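The crossover can be made concrete with a hypothetical constant c = 100 (any c > 50 flips the n = 100 comparison from the example above); the code then searches for the first n where the constant no longer saves the quadratic algorithm:

```python
import math

# Hypothetical hidden constant: the O(n*log(n)) algorithm really
# costs c * n * log10(n) basic steps.
c = 100

def cost_quadratic(n):
    return n * n

def cost_nlogn(n):
    return c * n * math.log10(n)

# At n = 100 the "slower-growing" algorithm is actually worse ...
print(cost_quadratic(100), cost_nlogn(100))  # 10000 20000.0

# ... but a large enough n swamps any constant choice of c.
n = 100
while cost_nlogn(n) >= cost_quadratic(n):
    n += 1
print(n)  # 238: first n where c*n*log10(n) < n^2
```

Past that crossover point the quadratic algorithm never catches up again, no matter how large c was.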

"An O(n*log(n)) is not always faster than an O(n^2)" - this is correct. But the follow-up "but when considering the worst case it is" is unfortunately incorrect. Are you confusing worst-case analysis with Big-O analysis (and maybe best-case with Big-Omega)?

Those should not be confused; they have in fact nothing to do with each other. – harms Dec 3 '09 at 22:34
You are right, since the bounds only hold 'up to a constant factor'! I will update my answer.

Thanks – Johannes Weiß Dec 4 '09 at 11:28.

Algorithms that run in O(nlog(n)) time are faster than those that run in O(n^2). Big-O defines the upper bound on performance. As the size of the data set (n) grows, so does the length of time it takes to perform the task.

You might be interested in the iTunes U algorithms course from MIT.

-1 for the first statement, +1 for the second. Not all O(n^2) functions are guaranteed to be slower than all O(nlogn) functions for all values of n. – Annabelle Dec 3 '09 at 22:40
True.

I left out that the relationship assumes a sufficiently large value of n. – acrosman Dec 4 '09 at 14:35
You also left out the fact that O() is an upper bound. – Kugel Aug 11 '10 at 18:21
Second paragraph opens: "Big-O defines the upper-bound on performance". – acrosman Aug 11 '10 at 19:02

"Big Oh" notation gives an estimated upper bound on the growth of the running time of an algorithm. If an algorithm is O(n^2), then, naively put, for n=1 it takes at most 1 unit of time, for n=2 at most 4 units, and so on (up to a constant factor). Similarly for O(n log(n)): the growth obeys the upper bound of n log(n).
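The "up to a constant factor" caveat is exactly what the formal definition captures: f(n) is O(g(n)) if f(n) <= C * g(n) for some constants C and n0 and all n >= n0. A minimal sketch, with an illustrative (not canonical) cost function and witness constants:

```python
def f(n):
    # Hypothetical running time: 3n^2 + 10n + 5 basic steps.
    return 3 * n * n + 10 * n + 5

# Witnesses for "f is O(n^2)": C = 4 and n0 = 11, since
# 3n^2 + 10n + 5 <= 4n^2 holds for every n >= 11.
C, n0 = 4, 11

assert all(f(n) <= C * n * n for n in range(n0, 10_000))
```

Note that the bound fails below n0 (f(10) = 405 > 400), which is why the definition only demands the inequality for sufficiently large n.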

(If I am more than naive here, please correct me in a comment). I hope that helps.
