I fixed the problem, but I was wondering if any of you experts know what's going on in the bowels of the computer that would cause this. Here's the code snippet:
SUBROUTINE GET_DATE_TIME(DATE, TIME)
* Example of return values:
* DATE = '03/13/01'
* TIME = ' 9:18 am'
CHARACTER(8) DATE, TIME
INTEGER(2) IYR, IMON, IDAY, IHR, IMIN, ISEC, IHUNDRETHS
* GET DATE
* Note: TMP is never declared, so it is implicitly a single-precision REAL
TMP = IYR / 100.0
IYR = (TMP - INT(TMP)) * 100
To fix it I added a declaration for TMP and changed

TMP = IYR / 100.0

to

TMP = dfloti(IYR) / 100.0
The clue that tipped me off was that I assigned
TMP - INT(TMP) to a variable and looked at it in the debugger. The value was 0.049999 something, something. In our perfect base 10 world it should have been 20.05 - 20 = 0.05.
Yes, I know it's a precision problem, but I want to understand it. Any takers?