[Numpy-discussion] making the distinction between -0.0 and 0.0..

Christopher Barker Chris.Barker@noaa....
Tue Sep 29 11:53:40 CDT 2009


Hi folks,

This isn't really a numpy question, and I'm doing this with regular old 
python, but I figure you are the folks that would know this:

How do I get Python to make a distinction between -0.0 and 0.0? In this 
case, I'm starting with user input, so:

In [3]: float("-0.0")
Out[3]: -0.0

so Python seems to preserve the "-". But:

In [12]: float("-0.0") == float("0.0")
Out[12]: True

In [13]: float("-0.0") < float("0.0")
Out[13]: False

In [14]: float("0.0") > float("-0.0")
Out[14]: False

None of the comparisons distinguish -0.0 from 0.0. How can I 
identify -0.0?
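[For future archive readers: one approach is to inspect the sign bit 
instead of the value, since IEEE 754 defines -0.0 == 0.0. A minimal 
sketch using math.copysign (available since Python 2.6); the helper 
name is just for illustration:]

```python
import math

def is_negative_zero(x):
    # -0.0 == 0.0 is True, so compare the sign bit via copysign
    # rather than comparing values.
    return x == 0.0 and math.copysign(1.0, x) < 0

print(is_negative_zero(float("-0.0")))  # True
print(is_negative_zero(float("0.0")))   # False
```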

NOTE: numpy behaves the same way, which I think it should, but still...

My back-up plan is to process the string first, looking for the minus 
sign, but that will require more changes than I'd like to the rest of my 
code...
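[The string-processing back-up plan could look something like this 
sketch; the function name and the (value, had_minus) return convention 
are hypothetical, just to show the idea:]

```python
def parse_preserving_sign(s):
    # Record whether the raw text carried a leading minus sign,
    # in addition to parsing the float normally.
    s = s.strip()
    return float(s), s.startswith("-")

print(parse_preserving_sign("-0.0"))  # (-0.0, True)
print(parse_preserving_sign("0.0"))   # (0.0, False)
```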

thanks,
-Chris



-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker@noaa.gov
