[Numpy-discussion] Not enough storage for memmap on 32 bit WinXP for accumulated file size above approx. 1 GB

Kim Hansen slaunger@gmail....
Mon Jul 27 05:37:53 CDT 2009


>
> I think it would be quite complicated. One fundamental "limitation" of
> numpy is that an array views one contiguous chunk of memory. You can't
> have a numpy array which is the union of two memory blocks with a hole
> in between, so if you slice every 1000 items, the underlying memory of
> the array still needs to "view" the whole thing. I think it is not
> possible to support what you want with one numpy array.
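This is easy to verify: even a strided slice of a memmap is still a
view onto the whole mapping, so the full address-space window stays
reserved. A minimal sketch (the file name, dtype, and size below are
arbitrary placeholders, not my actual data):

import numpy as np

# Create a small memmap purely for illustration.
mm = np.memmap("demo.bin", dtype=np.float64, mode="w+", shape=(1000000,))

sub = mm[::1000]                  # strided slice: a view, not a copy
print(np.shares_memory(sub, mm))  # True -- sub still uses the full mapping
print(sub.flags.owndata)          # False -- no new memory was allocated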

Yes, I see the problem in getting the same kind of reuse of objects
using simple indexing. For my specific case, I will just allocate a
new array containing a copy of every 100th element and return that
array. It will give me essentially the same result, since the original
recarray is used for read-only purposes only. This will be very simple
to implement for the specific cases I have; see the sketch below.
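Roughly like this -- the helper name, file name, and record dtype are
placeholders for my actual setup:

import numpy as np

def decimated_copy(path, dtype, step=100):
    # Map the file read-only, copy every step-th record into fresh
    # in-memory storage, then drop the mapping so the 32-bit address
    # space is released again.
    mm = np.memmap(path, dtype=dtype, mode="r")
    out = mm[::step].copy()  # .copy() forces a real, contiguous copy
    del mm
    return out

# e.g. every_100th = decimated_copy("capture.dat", my_record_dtype)
# (both arguments are hypothetical stand-ins for the real files)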

>
> I think the simple solution really is to go 64 bits, that's exactly the
> kind of things it is used for. If your machine is relatively recent, it
> supports 64 bits addressing.
>
The machine is new and shiny with loads of processing power and many
TB of HDD storage. I am, however, bound to the 32-bit Win XP OS, as
there are some other custom-made, third-party, and very expensive
applications running on that machine (which generate the large files I
analyze), and those can only run on 32 bits, oh well....

Cheers,

Kim


> cheers,
>
> David
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>

