[Numpy-discussion] memory leak in array
robert.kern at gmail.com
Thu Jun 15 00:38:41 CDT 2006
saagesen at sfu.ca wrote:
> Update: I posted this message on the comp.lang.python forum and their
> response was to get the numbers of references with sys.getrefcount(obj).
> After doing this I see that iterative counters used to count occurrences
> and nested loop counters (ii & jj) as seen in the code example below are the
> culprits with the worst ones over 1M:
> for ii in xrange(0,40):
>     for jj in xrange(0,20):
Where are you getting this 1M figure? Is that supposed to mean "1 Megabyte of
memory"? Because they don't consume that much memory. In fact, all of the small
integers are shared: CPython caches the integers from -5 through 256 (and
certainly all of those in xrange(0, 40)). There is only one 0 object and only
one 10 object, etc. That is why their refcount is so high.
You're going down a dead end here.
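The point above can be checked directly. The sketch below (illustrative only, not from the original post) shows that two separate uses of the same small integer refer to one cached object, and that its refcount reflects interpreter-wide use, not a leak:

```python
import sys

# CPython caches small integers (-5 through 256), so every use of, say, 0
# refers to one shared object.  Its refcount therefore counts references
# from the entire interpreter, not just your own code.
a = 0
b = 0
print(a is b)              # True: both names point to the same cached int
print(sys.getrefcount(0))  # a large number -- this is normal, not a leak
```

This is why sys.getrefcount() on loop counters is misleading: the counters reuse cached integer objects whose refcounts were already high before the loop ran.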
>     try: nc = y[a+ii,b+jj]
>     except IndexError: nc = 0
>     if nc == "1" or nc == "5":
What is the dtype of y? You are testing for strings, but assigning integers.
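A small sketch of the mismatch (the array `y` here is hypothetical, assuming an integer dtype as the assignment `nc = 0` suggests): comparing an integer element against a string is simply always False, so those branches can never fire.

```python
import numpy as np

# Hypothetical y with an integer dtype, as the fallback "nc = 0" implies.
y = np.array([[1, 5], [0, 2]])

nc = y[0, 0]          # a numpy integer scalar
print(nc == "1")      # False: an integer never equals the string "1"
print(nc == 1)        # True: compare against integers instead
```

If the tests were meant to match, either `y` must actually hold strings, or the comparisons should use the integers 1 and 5.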
"I have come to believe that the whole world is an enigma, a harmless enigma
that is made terrible by our own mad attempt to interpret it as though it had
an underlying truth."
-- Umberto Eco