You could try using the suppression file that comes with the Python source (Misc/valgrind-python.supp). Reading the Python Valgrind README (Misc/README.valgrind) is a good idea too!
As a high-level note: in general, Valgrind needs some help with custom allocators, since it can't understand the behavior of a custom allocator the way it can a standard implementation. – Falaina Oct 5 '09 at 11:52.
This is quite common in any largish system. You can use Valgrind's suppression system to explicitly suppress warnings that you're not interested in.
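A suppression entry is just a name you pick, the tool and error kind, and a call-stack pattern to match. As a minimal sketch (the names here are arbitrary and the exact error kinds and frames depend on your Python build; the real entries ship with CPython as Misc/valgrind-python.supp):

    {
       pymalloc_free_addr4
       Memcheck:Addr4
       fun:PyObject_Free
    }
    {
       pymalloc_free_cond
       Memcheck:Cond
       fun:PyObject_Free
    }

Pass the file on the command line with valgrind --suppressions=Misc/valgrind-python.supp python your_script.py; running with --gen-suppressions=all makes Valgrind print ready-made blocks for any remaining warnings you decide to silence.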
The most correct option is to tell Valgrind that it should intercept Python's allocation functions. Patch valgrind/coregrind/m_replacemalloc/vg_replace_malloc.c, adding new interceptors for PyObject_Malloc, PyObject_Free and PyObject_Realloc, e.g. ALLOC_or_NULL(NONE, PyObject_Malloc, malloc); (note that the soname for user allocation functions should be NONE).
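A minimal sketch of what that patch looks like. The ALLOC_or_NULL line is the one given above; the FREE and REALLOC lines are assumptions modelled on the existing libc entries in that file, whose macro names and argument lists vary a little between Valgrind releases, so mirror whatever your copy uses:

    /* coregrind/m_replacemalloc/vg_replace_malloc.c: add these next to
       the existing glibc entries.  NONE is the soname used for
       allocation functions that live in the executable itself rather
       than in a shared library. */
    ALLOC_or_NULL(NONE, PyObject_Malloc,  malloc);

    /* Assumed to follow the same pattern as the libc free/realloc
       entries already in the file -- check the exact macro signatures
       against your Valgrind version. */
    FREE         (NONE, PyObject_Free,    free);
    REALLOC      (NONE, PyObject_Realloc, realloc);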
Yes, this is typical. Large systems often leave memory un-freed, which is fine so long as it is a constant amount, and not proportional to the running history of the system. The Python interpreter falls into this category.
Perhaps you can filter the valgrind output to focus only on allocations made in your C extension?
There is another option I found. James Henstridge has a custom build of Python which detects that it is running under Valgrind; in that case the pymalloc allocator is disabled, with PyObject_Malloc/PyObject_Free passing straight through to the normal malloc/free, which Valgrind knows how to track. The package is available here: https://launchpad.net/~jamesh/+archive/python.
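The mechanism behind that build is roughly the following (a sketch of the idea, not the actual patch): the allocator asks Valgrind at runtime whether it is being instrumented, via the RUNNING_ON_VALGRIND macro from <valgrind/valgrind.h>, and if so bypasses pymalloc's arenas so every block goes through plain malloc/free:

    /* Sketch only.  RUNNING_ON_VALGRIND is provided by Valgrind's client
       header and evaluates to non-zero when running under Valgrind. */
    #include <stdlib.h>
    #include <valgrind/valgrind.h>

    /* hypothetical stand-in for pymalloc's real arena-based allocator */
    static void *pymalloc_arena_alloc(size_t nbytes) { return malloc(nbytes); }

    static int running_on_valgrind = -1;     /* -1: not checked yet */

    void *PyObject_Malloc(size_t nbytes)
    {
        if (running_on_valgrind == -1)
            running_on_valgrind = RUNNING_ON_VALGRIND;
        if (running_on_valgrind)
            return malloc(nbytes);           /* Valgrind tracks plain malloc */
        return pymalloc_arena_alloc(nbytes); /* normal pymalloc path */
    }

PyObject_Free would make the symmetric check before handing the pointer back to free.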