Re: Changing the clock speed

From: David Koogler <koogler_at_nospam.org>
Date: Tue Feb 14 1995 - 08:35:52 PST

A frequency counter? Yes, that is the best way to get a calibrated clock. One
technique I saw for MS/DOS and calibrating against the U.S. NIST atomic clock
was to pulse one of the parallel port status lines (Ready to Send?) on each
clock tick. Just hook the frequency counter up to the status line and directly
read off the clock rate.

I agree that the RTC drift can be rather high, depending on the manufacturer.
Most of the high-end equipment, such as Compaq's, has rather low drift rates,
but some of the other clones can be really nasty. However, over a short
period of time, say 1 to 5 minutes, the RTC should be reasonable.

Personally, I wish the PC designers had used the mains frequency as the time
base rather than the internal crystal. The electric power grid has to
maintain a frequency error of something less than 1 ppm, or else the
entire grid can collapse. A simple notch filter and a zero-crossing detector
give you the external clock reference...much cheaper than the crystal
oscillator and interval timer chip on the PC. Most of the minicomputers used
just such a simple time reference (thus the 60 Hz clock interval on the
PDP-11 and Unix).

If you're interested, I can look up the source code for calling NIST and setting
the clock. I believe there are several other international standards
organizations with similar facilities.

-- 
David Koogler (koogler@bedford.progress.com)
Received on Tue Feb 14 07:58:12 1995

This archive was generated by hypermail 2.1.8 : Thu Sep 22 2005 - 15:12:17 PDT