I tracked this down to the fnmatch module: glob calls fnmatch to actually perform the matching, and fnmatch keeps a cache of compiled regular expressions that is never cleared.
So in this usage the cache was growing continuously and unchecked. I've filed a bug against the fnmatch library.
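To see why a regex cache is involved at all, here is a hedged sketch of what glob does under the hood: it hands each shell pattern to fnmatch, which translates it into a regular expression with fnmatch.translate() and compiles it (the compiled form is what gets cached). The pattern "*.txt" below is just an illustrative example, not from the original question.

```python
import fnmatch
import re

# fnmatch.translate() turns a shell glob into a regex source string;
# fnmatch compiles and caches one such regex per distinct pattern.
regex_src = fnmatch.translate("*.txt")

pat = re.compile(regex_src)
assert pat.match("notes.txt")            # matches: ends with .txt
assert pat.match("notes.txt.bak") is None  # no match: wrong suffix
```

Every distinct pattern you glob for adds one entry to that cache, which is why a workload with many unique patterns can look like a leak.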
I cannot reproduce any actual leak on my system, but I think your "every 100th iteration, 100 objects are freed" is you hitting the cache for compiled regular expressions (via the glob module). If you peek at re.py you'll see that _MAXCACHE defaults to 100, and by default the entire cache is discarded once you hit that limit (in _compile). If you call re.purge() before your gc calls, you will probably see that effect go away.
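A short sketch of the cache behavior described above, using re.purge() plus the CPython-internal names re._cache and re._MAXCACHE (internal details, so the exact cap and eviction policy vary by Python version; on newer versions the default cap is larger than 100 and old entries are evicted rather than the whole cache being cleared):

```python
import re

re.purge()  # start from an empty regex cache

# Compile many distinct patterns; each one lands in re's internal cache.
for i in range(1000):
    re.compile(f"pattern{i}")

# The cache never grows past the (version-dependent) _MAXCACHE limit.
assert len(re._cache) <= re._MAXCACHE

re.purge()  # explicitly empty the cache, as suggested above
assert len(re._cache) == 0
```

Calling re.purge() before taking gc measurements removes the cached Pattern objects from the count, so the periodic "100 objects freed" blip should disappear.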