[SciPy-user] Characteristic distance in a cloud of points

Anand Patil anand@soe.ucsc....
Sun Feb 25 12:32:55 CST 2007


>Correcting a stupid mistake. It bothered me to leave it, sorry for the
>noise.
>
>Gaël
>
>On Sun, Feb 25, 2007 at 01:03:03AM +0100, Gael Varoquaux wrote:
>
>>Interesting remarks. You forced me to think a bit more about what I was
>>trying to achieve.
>>
>>What I am trying to do is to find out the right size to use for symbols
>>when used on a 3D cloud of points. I am not sure what the right "typical
>>distance" should be. If those symbols are arrows then it seems they
>>should be smaller than the typical inter-point distance. I have in mind
>>something like this:
>>
>>if you have n points, find out the distribution of distances, divide it
>>by n**3, then take the value at 0.2 from the smallest.
>>
>>I am having difficulties expressing my point, but the idea would be to
>>consider that the typical distance will increase as n**(1/3.) (which is
>>not obvious, for instance if the points are along a plane) and take the
>>lower tail of the distribution, as we are interested in having symbols
>>smaller than the inter-point distance. Taking not the smallest value, but
>>the value at "20%" from the bottom helps getting rid of singular
>>situations where a few points are very close but the major part is spread
>>out.
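[Ed.: Gael's recipe above, a low percentile of the pairwise-distance
distribution, could be sketched as follows. This is only a sketch; the
function name and the use of scipy.spatial.distance.pdist are my choices,
not something from the thread.]

```python
import numpy as np
from scipy.spatial.distance import pdist

def typical_distance(points, q=20.0):
    """q-th percentile of all pairwise distances in the cloud.

    Taking a low percentile rather than the minimum keeps a few
    near-coincident points from shrinking the symbol size to zero.
    """
    d = pdist(points)              # all n*(n-1)/2 pairwise distances
    return np.percentile(d, q)

pts = np.random.randn(200, 3)      # hypothetical cloud of 200 points
scale = typical_distance(pts)      # candidate symbol size
```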
Hi Gael,

Scaling relative to the cloud of points might be an easier way to go 
than scaling relative to the actual interpoint spacing. It would make 
sure your arrowheads are readable on the plot, even though some may 
appear oversized relative to their shafts. Also, points that are far 
apart in 3D space might appear close when viewed from particular angles, 
like 'optical doubles' in astronomy.

If you want to make your symbols roughly the right size relative to the 
whole cloud, you might like a widely used quick-and-dirty method from 
statistics:

import numpy as np
from numpy.linalg import eigh

points = 2.*np.random.randn(100, 3)   # example cloud of 100 points in 3D
C = np.cov(points.T)                  # 3x3 covariance of the coordinates
D, V = eigh(C)                        # eigh of C, not of points: C is symmetric

Then the `error ellipse', the ellipsoid that kind of sort of tries to 
fit the point cloud, has principal axes along the columns of V, with 
lengths equal to the square roots of the corresponding elements of D. 
You could then estimate approximately how big or small the point cloud 
looks by projecting those axes into your viewing plane.
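[Ed.: that projection step might look like the sketch below. It assumes an
orthographic view along a unit vector `view_dir`; the function name and
setup are hypothetical, not from the thread.]

```python
import numpy as np
from numpy.linalg import eigh

def apparent_extent(points, view_dir):
    """Largest ellipsoid axis after projection onto the plane
    perpendicular to view_dir (orthographic viewing assumed)."""
    v = np.asarray(view_dir, dtype=float)
    v = v / np.linalg.norm(v)                 # unit line-of-sight vector
    C = np.cov(points.T)
    D, V = eigh(C)                            # eigenvalues D, eigenvectors V
    sizes = []
    for lam, axis in zip(D, V.T):             # columns of V are the axes
        a = np.sqrt(lam) * axis               # axis scaled to ellipsoid radius
        a_proj = a - np.dot(a, v) * v         # drop the line-of-sight component
        sizes.append(np.linalg.norm(a_proj))
    return max(sizes)

pts = np.random.randn(500, 3) * [3., 1., 0.5]   # anisotropic example cloud
size = apparent_extent(pts, view_dir=[0., 0., 1.])
```

Viewed down the z axis, the cloud above looks about as wide as its
longest (x) axis; viewed down the x axis it looks much smaller.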

Hope that helps... that would be a milestone for me, my first time 
actually helping someone else on a Python mailing list.

Cheers,
Anand

