[SciPy-User] Correlation coefficient of large arrays
Tue Mar 16 00:39:45 CDT 2010
> how much memory does a 230000**2 (= 52,900,000,000 element) float (double) array take?
I guess I don't have a real appreciation for how large this is. I can do
numpy.ones((100000, 50000), dtype=np.float64) and it uses about 85% of the
memory I have available, but that's a long way from 230,000 x 230,000. Of
course, the array is symmetric.
Is it feasible to do this by writing it to disk?
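A quick back-of-the-envelope check (plain Python, nothing SciPy-specific) shows why the full matrix won't fit in memory; even the symmetric half is enormous:

```python
# Memory estimate for a 230,000 x 230,000 float64 (double) array.
n = 230_000
elements = n * n                  # 52,900,000,000 entries
bytes_needed = elements * 8       # float64 is 8 bytes per entry
gib = bytes_needed / 2**30
print(f"full matrix: {gib:.0f} GiB, symmetric half: {gib / 2:.0f} GiB")
```

Roughly 394 GiB for the full matrix, so about 197 GiB even if only the upper triangle is stored. Writing it to disk is possible (e.g. with numpy.memmap), but as the reduction below suggests, the full matrix may never need to exist.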
The end goal is to find the difference between two correlation arrays and
then calculate the mean of each column, which then leaves me with an array
of 230,000 values.
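For this particular end goal the huge matrices can be avoided entirely. If each data array's columns are standardized to zero mean and unit norm, the correlation matrix is Z.T @ Z, and the column means of the difference collapse to two small matrix-vector products. A sketch on toy-sized stand-ins (the array names and sizes here are illustrative, not from the original post):

```python
import numpy as np

def standardize(data):
    """Column-standardize so corrcoef(data, rowvar=False) == z.T @ z."""
    z = data - data.mean(axis=0)
    z /= np.linalg.norm(z, axis=0)
    return z

rng = np.random.default_rng(0)
a = rng.standard_normal((10, 500))   # toy stand-in for 10 x 230,000
b = rng.standard_normal((10, 500))

za, zb = standardize(a), standardize(b)
p = a.shape[1]

# mean of column j of (corrA - corrB) is
#   (za.sum(axis=1) @ za[:, j] - zb.sum(axis=1) @ zb[:, j]) / p,
# so only length-10 and 10 x p arrays are ever formed
col_means = (za.sum(axis=1) @ za - zb.sum(axis=1) @ zb) / p

# sanity check against the brute-force version at this toy size
brute = (np.corrcoef(a, rowvar=False) - np.corrcoef(b, rowvar=False)).mean(axis=0)
print(np.allclose(col_means, brute))  # True
```

With only 10 observations the standardized arrays are 10 x 230,000, which fits in memory easily; the 230,000 x 230,000 correlation matrices are never materialized.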
my blog <http://vincentdavis.net>
On Mon, Mar 15, 2010 at 11:16 PM, <email@example.com> wrote:
> On Tue, Mar 16, 2010 at 1:04 AM, Vincent Davis <firstname.lastname@example.org>wrote:
>> I have an array of 10 observations of 230,000 variables and want to find the
>> correlation coefficient between each pair of variables.
>> numpy.corrcoef(data) works, except I can only do it with about 30,000
>> variables at a time: numpy.corrcoef(data[:30000]). It uses up a lot of memory.
>> Is there a better way?
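If slices of the full correlation matrix are genuinely needed (not just column means), one option along the lines of the chunking already tried is to standardize once and then build the matrix block by block, processing and discarding each block. A sketch, assuming only per-column sums are kept (block size and array shapes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.standard_normal((10, 2000))  # toy stand-in for 10 x 230,000

# standardize columns once, so each block of the correlation
# matrix is just a small matrix product
z = data - data.mean(axis=0)
z /= np.linalg.norm(z, axis=0)

block = 500  # tune to available memory
p = z.shape[1]
col_sums = np.zeros(p)
for start in range(0, p, block):
    rows = z[:, start:start + block].T @ z  # a (block, p) slab of the correlation matrix
    col_sums += rows.sum(axis=0)            # use the slab, then let it be freed

col_means = col_sums / p
```

Each slab is block x 230,000, so memory use is controlled by the block size; a block of 10,000 would be about 17 GiB per slab at the real problem size.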
> how much memory does a
> >>> 230000**2
> float (double) array take ?
> (I'm not going to try)
>> *Vincent Davis
>> 720-301-3003 *
>> my blog <http://vincentdavis.net> | LinkedIn<http://www.linkedin.com/in/vincentdavis>