Actually, your first example is O(n^4)! That might sound surprising, but it's because indexing into a VBA Collection has linear, not constant, complexity. The VBA Collection essentially has the performance characteristics of a linked list: getting element N by index takes time proportional to N, so iterating the whole thing by index takes time proportional to N^2.
(I switched cases on you to distinguish N, the number of elements in the data structure, from your n, the number of cells on the side of a square block of cells. So here N = n^2.) That is one reason why VBA has the For...Each notation for iterating Collections. When you use For...Each, VBA uses an iterator behind the scenes, so walking through the entire Collection is O(N), not O(N^2).
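To see why by-index iteration blows up while iterator-based iteration doesn't, here is a small sketch in Python rather than VBA (so the step counts are easy to verify). It simulates a Collection with a singly linked list, which is an assumption about the Collection's internals based on its observed performance, not on its actual implementation:

```python
# Simulate a VBA Collection as a singly linked list and count node hops.

class LinkedList:
    def __init__(self, values):
        self.head = None
        self.steps = 0  # counts node hops, our stand-in for work done
        for v in reversed(values):
            self.head = (v, self.head)

    def get(self, i):
        """Index access: walk i nodes from the head -> O(i) hops."""
        node = self.head
        for _ in range(i):
            self.steps += 1
            node = node[1]
        return node[0]

    def __iter__(self):
        """Iterator access (like For...Each): one hop per element -> O(N)."""
        node = self.head
        while node is not None:
            self.steps += 1
            yield node[0]
            node = node[1]

N = 100
coll = LinkedList(range(N))

total = sum(coll.get(i) for i in range(N))   # walk the whole thing by index
by_index_steps = coll.steps                  # N*(N-1)/2 hops, i.e. O(N^2)

coll.steps = 0
total2 = sum(v for v in coll)                # walk it with the iterator
by_iter_steps = coll.steps                   # exactly N hops, i.e. O(N)
```

With N = 100, the by-index walk costs 4950 hops while the iterator costs 100; double N and the first roughly quadruples while the second merely doubles.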
So, switching back to your n, your first two loops are using For...Each over a Range with n^2 cells, so they are each O(n^2). Your third loop is using For...Next over a Collection with n^2 elements, so that is O(n^4). I actually don't know for sure about your last loop because I don't know exactly how the Cells property of a Range works - there could be some extra hidden complexity there.
But I think Cells will have the performance characteristics of an array, so O(1) for random access by index, and that would make the last loop O(n^2). This is a good example of what Joel Spolsky called "Shlemiel the Painter's Algorithm": "Whenever something seems like it should have linear performance but it seems to have n-squared performance, look for hidden Shlemiels. They are often hidden by your libraries." (See this article from way before Stack Overflow was founded: http://www.joelonsoftware.com/articles/fog0000000319.html)
More about VBA performance can be found at Doug Jenkins's website:
http://newtonexcelbach.wordpress.com/2010/03/07/the-speed-of-loops/
http://newtonexcelbach.wordpress.com/2010/01/15/good-practice-best-practice-or-just-practice/
(I will also second what cyberkiwi said about not looping through Ranges just to copy cell contents, if this were a "real" program and not just a learning exercise.)
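The canonical Shlemiel from Joel's article is C's strcat, which re-scans the destination string from the start on every call, so appending N pieces does quadratic work. Here is that pattern sketched in Python with an explicit scan counter (an illustration of the idea, not anyone's actual library code):

```python
# Build a string of N pieces the Shlemiel way, counting re-scanned characters.

def shlemiel_concat(pieces):
    scanned = 0
    result = ""
    for p in pieces:
        scanned += len(result)  # like strcat: re-scan everything built so far
        result = result + p
    return result, scanned

pieces = ["x"] * 1000
built, scanned = shlemiel_concat(pieces)
# Each single append *sounds* like constant work, but the re-scans total
# 0 + 1 + ... + 999 = 499500 characters: quadratic in the number of pieces.
```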
I never knew this thing about collections, thanks for enlightening me, this is really interesting. – tube-builder Jan 30 at 9:53
Welcome to Stack Overflow! There is also this: stackoverflow.com/questions/2548526/… – jtolle Jan 30 at 14:54
You are right that the first is 3 x O(n^2), but remember that O-notation does not care about constants, so in terms of complexity it is still an O(n^2) algorithm. The first one is not considered a nested loop, even though it works over the same number of cells as the second: it is just a straight iteration over an N-item range in Excel. What makes it N^2 is the fact that you are defining N as the length of a side, i.e. the number of rows/columns (the range being square). Just an Excel VBA note: you shouldn't be looping through cells or storing addresses anyway.
Neither of the approaches is optimal, but I think they serve to illustrate your question about understanding O-notation.

    Rng1.Copy
    rng2.Cells(1).PasteSpecial xlValues
    Application.CutCopyMode = False

I don't think this is correct, but I'm willing to be told why I'm wrong. See my answer. – jtolle Jan 28 at 16:09
Better: rng2.Value = rng1.Value – Tim Williams Jan 29 at 0:07
Remember not to confuse the complexity of YOUR code with the complexity of the work Excel does in the background. Overall, the amount of work done is N^2 in both cases; however, in your first example YOUR code is actually only 3N (N for each of the three loops). The fact that a single statement in Excel can fill in multiple values does not change the complexity of your written code. A For Each loop is the same as a For loop: O(N) by itself. You only get N^2 when you nest loops. To answer your question about which is better: generally it is preferable to use built-in functions where you can. The assumption should be that internally Excel will run more efficiently than code you could write yourself. However (knowing MS), make sure you always check that assumption if performance is a priority.
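The sequential-versus-nested distinction this answer draws can be made concrete with explicit operation counters (sketched in Python rather than VBA, purely for illustration):

```python
# Three sequential loops over N items do 3N operations; one nested loop
# over the same N does N^2. Only nesting changes the complexity class.
N = 50

ops_sequential = 0
for _ in range(3):          # three separate passes, like the first example
    for _ in range(N):
        ops_sequential += 1

ops_nested = 0
for _ in range(N):          # one loop inside another, like the second example
    for _ in range(N):
        ops_nested += 1
```

With N = 50, the sequential version does 150 operations and the nested one 2500; the 3x constant factor vanishes in O-notation, while the nesting does not.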
I don't think this is correct, but I'm willing to be told why I'm wrong. See my answer. – jtolle Jan 28 at 16:08.