[Numpy-discussion] Freeing memory allocated in C

Nick Fotopoulos nvf at MIT.EDU
Thu Apr 27 21:02:03 CDT 2006


Dear numpy-discussion,

I have written a python module in C which wraps a C library (FrameL)  
in order to read data from specially formatted files into Python  
arrays.  It works, but I think I have a memory leak, and I can't see  
what I might be doing wrong.  This Python wrapper is almost identical  
to a Matlab wrapper, but the Matlab version doesn't leak.  Perhaps  
someone here can help me out?

I have read in many places that to return an array, one should wrap  
the data with PyArray_FromDimsAndData (or a more modern equivalent)  
and then return it without freeing the underlying memory.  Does the  
same principle hold for strings?  Are the following example snippets  
correct?


// output2 = x-axis values relative to first data point.
data = malloc(nData*sizeof(double));
for(i=0; i<nData; i++) {
   data[i] = vect->startX[0]+(double)i*dt;
}
shape[0] = nData;
out2 = (PyArrayObject *)
         PyArray_FromDimsAndData(1,shape,PyArray_DOUBLE,(char *)data);

//snip

// output5 = gps start time as a string
utc = vect->GTime - vect->ULeapS + FRGPSTAI;
out5 = malloc(200*sizeof(char));
sprintf(out5,"Starting GPS time:%.1f UTC=%s",
     vect->GTime,FrStrGTime(utc));

//snip -- Free all memory not assigned to a return object

return Py_BuildValue("(OOOdsss)",out1,out2,out3,out4,out5,out6,out7);


I see in the Numpy book that I should replace  
PyArray_FromDimsAndData with a more modern call, but will that be  
incompatible for users who only have Numeric?

If the code above does not leak under your inspection, are there  
other common spots where Python C modules tend to leak that I  
should check?

As a side note, here is how I have been defining "leak": I measure  
memory usage by opening a pipe to ps and checking rss between  
reading in frames and invoking del on them.  Memory usage increases  
but never decreases.  In contrast, if I write the same array data to  
a pickle file and read it back in, invoking del does reduce memory  
usage.
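(For concreteness, here is a minimal sketch of that measurement in  
modern Python.  The rss_kb helper is hypothetical, not part of my  
module; it assumes a POSIX ps that supports the -o rss= output  
format.)

```python
import os
import subprocess

def rss_kb():
    # Resident set size of the current process in kB, obtained by
    # piping to ps as described above.  Assumes a POSIX ps.
    out = subprocess.check_output(
        ["ps", "-o", "rss=", "-p", str(os.getpid())])
    return int(out.strip())

before = rss_kb()
data = [0.0] * 1_000_000   # stand-in for an array read from a frame file
after_read = rss_kb()
del data                   # drop the only reference to the data
after_del = rss_kb()
print(before, after_read, after_del)
```

With the FrameL wrapper, after_del stays at the after_read level;  
with pickled data it drops back toward the before level.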

Many thanks,
Nick

