You suggest in the on-line documentation for your Command-Line Utilities that the hardware RTC should be set to UTC, which I've done. Some of my OS/2 programs are now talking to me in UTC rather than local time. I have the TZ environment variable set in my CONFIG.SYS file, but that doesn't seem to help. These programs don't respect it. How can I convince them to display local time?
This is the Frequently Given Answer to that question.
There are three approaches to this. The third one is the best, since it fixes the problem in the right way; but for obvious reasons it's also the most difficult to effect.
Your applications are broken. Stop using them and replace them with better ones.
You may think this harsh, but it is no more than the simple truth. Applications that don't understand the TZ environment variable are broken, with respect to timekeeping. Applications that expect the hardware RTC to run in local time are also broken (because this yields unresolvable ambiguities, and a "UTC" time that is neither unidirectional nor monotonic).
In an ideal world, everyone would be using better applications that are not broken with respect to timekeeping; applications that handle timezones properly, in the way that 32-bit OS/2 was obviously originally designed for them to be handled. Depending upon the application, there may or may not be such replacements available.
DIGCLOCK, SAYDATE, and the DATE and TIME commands that accompany the 32-bit Command Interpreter all understand the TZ environment variable and will display either the correct local time (applying DST adjustments automatically) or UTC.
XDIR and the DIR command that accompanies the 32-bit CMD both understand the TZ environment variable and will display either the correct local time (applying DST adjustments automatically) or UTC.
Claim that your local time is UTC. Edit your TZ environment variable so that the offset from UTC to your local time is zero hours. (For Australian Eastern Time, for example, use SET TZ=EST0EDT.) Then set your hardware RTC to this false UTC.
Your hardware RTC will effectively be running in your local standard time. OS/2 applications written with "DOS Think" notions of the system time will display your local standard time because they will use what they obtain from the RTC. OS/2 applications written to handle time properly and do the Right Thing will also display your local standard time, because they will apply an offset of zero to the system clock, which they believe to be running in UTC, when calculating your local time.
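To see what the second class of applications does with the lie, here is a minimal, purely illustrative C sketch (not code from any of the programs named in this answer), assuming a Standard C library that honours TZ and provides the POSIX putenv() and tzset() routines: with a zero-offset TZ string, the "local" time that localtime() computes is simply whatever the clock holds, i.e. the false UTC, i.e. your local standard time.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void)
    {
        char local_buf[64], utc_buf[64];
        time_t now;

        putenv("TZ=EST0EDT");   /* the zero-offset lie described above */
        tzset();                /* re-read TZ */

        now = time(NULL);       /* the system clock, believed to be UTC */
        strftime(local_buf, sizeof local_buf, "%Y-%m-%d %H:%M:%S",
                 localtime(&now));
        strftime(utc_buf, sizeof utc_buf, "%Y-%m-%d %H:%M:%S",
                 gmtime(&now));

        /* Outside the DST period the two lines are identical: the
           "local" time is just the false UTC, i.e. local standard time. */
        printf("localtime(): %s\n", local_buf);
        printf("gmtime()   : %s\n", utc_buf);
        return 0;
    }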
There are three caveats to this solution, which demonstrate why it is imperfect, and also why there will never be a perfect solution as long as the "DOS Think" persists:
If you use the /UTC (/U in version 2.0) option to SAYDATE, DIGCLOCK, or ANACLOCK, for example, you won't see the correct UTC time (because, of course, you have lied to them about what UTC is -- you have pretended that your local time and UTC are identical).
However, many such "DOS Think" applications are pretty dumb when it comes to timezones anyway, and do things like not recording timezones in any timestamps that they generate in their data. So if one were changing the hardware RTC manually at every DST transition, these applications would become confused, or their data would become ambiguous, anyway.
The third approach is to fix the broken Standard C libraries so that existing applications start doing the Right Thing. This isn't as daft as it superficially appears to the novice. It's the Standard C library where most of the code that needs to be changed resides. Most applications programmers are sensible enough to use the standard library routines such as time(), localtime(), gmtime(), mktime(), and strftime() to perform date and time processing in their applications. So one simply needs to recompile and link the application against a Standard C library that has been fixed to work properly. With many applications, one doesn't even need to touch the application binaries themselves, because the application links to the DLL form of the C/C++ compiler's Standard C library. All that one needs to do is create a new Standard C library DLL that contains library routines that have been fixed to work properly, and substitute it for the existing one.
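As a hedged illustration (not taken from any particular application), this is the sort of date and time handling that such a sensibly-written application contains; nothing in it needs to change when the Standard C library DLL underneath it is replaced with a fixed one:

    #include <stdio.h>
    #include <time.h>

    /* Print the current local time, letting the Standard C library do
       all of the timezone and DST work via TZ. */
    int main(void)
    {
        char buffer[64];
        time_t now = time(NULL);            /* seconds since the epoch */
        struct tm *local = localtime(&now); /* applies TZ, including DST */

        strftime(buffer, sizeof buffer, "%Y-%m-%d %H:%M:%S %Z", local);
        printf("Local time: %s\n", buffer);
        return 0;
    }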
The irony of the situation is that 32-bit OS/2 treats the current time more
like Unix does than like DOS does, yet people made the mistake of using
32-bit OS/2 like DOS. The UNIX kernel provides the current time to
applications code as a single (usually 32-bit) number that represents the
number of seconds since 1970-01-01 00:00:00 GMT. DOS merely provides the
local time to applications code in "broken down" form. 32-bit
OS/2, however, presents the system clock to applications code in pretty much
the same way as Unix does. It provides a 64-bit count of the seconds since
1970-01-01 00:00:00 GMT via DosQuerySysInfo().
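For illustration, here is a minimal sketch of reading that count directly, assuming the QSV_TIME_LOW and QSV_TIME_HIGH indices from the Toolkit's <os2.h> (they are adjacent, so a single call fills both halves of the count):

    #define INCL_DOSMISC
    #include <os2.h>
    #include <stdio.h>

    int main(void)
    {
        ULONG secs[2];      /* [0] = QSV_TIME_LOW, [1] = QSV_TIME_HIGH */
        APIRET rc = DosQuerySysInfo(QSV_TIME_LOW, QSV_TIME_HIGH,
                                    secs, sizeof secs);
        if (rc != 0) {
            fprintf(stderr, "DosQuerySysInfo failed, rc = %lu\n", rc);
            return 1;
        }
        /* A 64-bit count of seconds since 1970-01-01 00:00:00 GMT. */
        printf("Seconds since the epoch: high %lu, low %lu\n",
               secs[1], secs[0]);
        return 0;
    }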
The library programmers at Borland, IBM, MetaWare, Watcom, and so forth, when they implemented the Standard C library time functions in the 32-bit OS/2 versions of their C/C++ compilers, used the same mechanism that they did on DOS: read the hardware RTC directly, assume that it runs in local time with the user applying the DST corrections manually, and attempt to work backwards from that to UTC. On DOS, this sort of approach is, sadly, necessary, even though it results in unresolvable ambiguities and a "UTC" time that isn't actually unidirectional and monotonic. These implementations of the Standard C library time functions are a bodge, one which had to be invented in the first place for the standard libraries of DOS C/C++ compilers because DOS didn't work like UNIX did. If DOS had worked as UNIX did, the DOS C/C++ compiler vendors could just have copied the original UNIX standard library implementations, of course. But it didn't, and the DOS implementations are flawed as a result. (It is, in fact, impossible to implement the functions completely correctly, given the constraints imposed.)
However, 32-bit OS/2 does work as UNIX does. On 32-bit OS/2, such a bodge isn't necessary. One can simply use the same approach as is used in UNIX implementations of the Standard C library, which results in a monotonically increasing UTC, automatic adjustment between daylight and standard time without the user having to lift a finger, and (almost incidentally) the ability to use multiple timezones and DST rules on one machine. If the compiler vendors had taken a little more care with their 32-bit OS/2 C/C++ implementations, rather than doing a cheap port from their DOS code, they could have had things working correctly right from the start.
But it is possible to fix some of the Standard C libraries.
To fix (for example) EMX C++ so that it does the Right Thing on 32-bit OS/2, simply remove the current code from its standard library, and put the GNU "libc" time handling code in instead, using a gettimeofday() that is a simple wrapper around DosQuerySysInfo(). Once one has fixed EMXLIBCS.DLL and EMXLIBCM.DLL to handle time properly, of course, one has fixed, in one fell swoop, all of the applications that use them.
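A minimal sketch of such a wrapper, assuming the Toolkit's QSV_TIME_LOW and QSV_TIME_HIGH indices and ignoring sub-second resolution and the obsolete timezone argument (a real replacement inside the EMX library would need more care than this), might look like the following:

    #define INCL_DOSMISC
    #include <os2.h>
    #include <sys/time.h>

    int gettimeofday(struct timeval *tv, struct timezone *tz)
    {
        ULONG secs[2];  /* filled with QSV_TIME_LOW, then QSV_TIME_HIGH */

        if (DosQuerySysInfo(QSV_TIME_LOW, QSV_TIME_HIGH,
                            secs, sizeof secs) != 0)
            return -1;

        /* The system clock is already a count of seconds since
           1970-01-01 00:00:00 GMT; hand it to the caller unmodified.
           (The high half of the count is zero for any date that fits
           in a 32-bit time_t, so only the low half is used here.) */
        tv->tv_sec  = secs[0];
        tv->tv_usec = 0;        /* sub-second resolution omitted */
        (void)tz;               /* obsolete argument; ignored */
        return 0;
    }

With a gettimeofday() like this underneath it, the GNU time handling code derives local time by applying the TZ rules to a true, monotonic UTC, exactly as it does on UNIX.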