Re: Changing the clock speed

From: David Koogler <koogler_at_nospam.org>
Date: Tue Feb 14 1995 - 11:01:42 PST

Peter Holzer brought up some interesting observations and criticisms:

1). Don't use BogoMips.

I agree that the Linux BogoMips metric is not very useful in and of itself.
However, the measure is relatively easy to make, easy to understand, and does
not require any external equipment. I can easily hook up a frequency counter
to an I/O port or pick the signal off the CPU and use that as a metric, but
not everyone is so equipped.

What we need is some method of characterizing the performance of a machine
against the many others running VSTa, so that performance metrics can be
normalized. The PC hardware provides only two devices for measuring elapsed
time: the interval timer and the RTC. The only way I can see to get an idea
of a machine's performance is to run some fixed set of instructions and see
how far the interval timer and/or RTC counted during the execution. This
metric is crude, but it is easily reproduced on any machine regardless of
configuration.
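
For concreteness, here is the sort of loop I have in mind, in rough C. The
inb()/outb() port helpers (taking outb(port, value)) are an assumption about
the environment, and the iteration count is arbitrary; this is a sketch, not
a finished calibrator:

    /*
     * Crude figure of merit: loop iterations per tick of the 8254's
     * 1.19318 MHz input clock. Assumes inb()/outb() helpers with
     * outb(port, value) ordering (an assumption), and assumes the
     * loop finishes within one pass of the 16-bit down-counter.
     */
    static unsigned short read_pit_counter(void)
    {
        unsigned char lo, hi;

        outb(0x43, 0x00);        /* latch counter 0 */
        lo = inb(0x40);          /* low byte of latched count */
        hi = inb(0x40);          /* high byte of latched count */
        return ((unsigned short)hi << 8) | lo;
    }

    unsigned long loops_per_tick(void)
    {
        volatile unsigned long i;
        unsigned short start, end, elapsed;

        start = read_pit_counter();
        for (i = 0; i < 100000UL; i++)
            ;                    /* the fixed "set of instructions" */
        end = read_pit_counter();
        elapsed = start - end;   /* the counter counts down */
        if (elapsed == 0)
            elapsed = 1;
        return 100000UL / elapsed;
    }

The iterations-per-tick figure plays the same role as BogoMips, but it is
anchored to the 8254's timebase, so any PC can reproduce it.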

2). Why calibrate the RTC against the interval timer?

I want to tune the number that I place into the interval timer. Under MS-DOS,
the CPU just lets the timer roll over and accepts the resulting 1/18.2-second
period. If I want a higher rate, I need to load the counter with a different
value. What value do I use? Without an external time reference of any sort I
can only guess (divide 0xFFFF by 5 to get roughly a 100 Hz rate). That number
then needs adjusting to account for the overhead of processing the interrupt
and resetting the counter.
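
For the arithmetic: the 8253/8254's input clock is 1.193182 MHz, so a divisor
of 1193182/100 (about 11932) gives very nearly 100 Hz. A minimal sketch of
loading it, under the same outb(port, value) assumption as above:

    #define PIT_HZ 1193182UL                /* 8253/8254 input clock */

    void set_timer_rate(unsigned long hz)
    {
        unsigned long div = PIT_HZ / hz;    /* ~11932 for 100 Hz */

        outb(0x43, 0x34);                   /* counter 0, lo/hi, mode 2 */
        outb(0x40, div & 0xff);             /* low byte of divisor */
        outb(0x40, (div >> 8) & 0xff);      /* high byte of divisor */
    }

(In mode 2, the rate-generator mode, the counter reloads itself each period;
the resetting overhead above matters when the counter is driven one-shot
instead.)
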
Why not write a calibrator that runs for a given period of time (as determined
by the RTC) and then sees how many times the interval timer ticked? Long-term
accuracy is not required, only good short-term accuracy.
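
A sketch of that calibrator, polling the CMOS seconds register (select via
port 0x70, read via port 0x71) and reusing read_pit_counter() from the
sketch above; the polling structure assumes interrupts are disabled and is
only an illustration:

    static unsigned char rtc_seconds(void)
    {
        outb(0x70, 0x00);          /* select CMOS register 0: seconds */
        return inb(0x71);
    }

    unsigned long timer_wraps_per_second(void)
    {
        unsigned long wraps = 0;
        unsigned short last, now;
        unsigned char sec;

        sec = rtc_seconds();       /* align to a seconds boundary */
        while (rtc_seconds() == sec)
            ;
        sec = rtc_seconds();
        last = read_pit_counter();
        while (rtc_seconds() == sec) {
            now = read_pit_counter();
            if (now > last)        /* the down-counter reloaded */
                wraps++;
            last = now;
        }
        return wraps;              /* timer periods per RTC second */
    }

If the divisor was set for 100 Hz but the calibrator reports 92 wraps, scale
the divisor by 92/100 and repeat until it converges.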

Note that the error in the interval timer's absolute rate depends on its
crystal oscillator circuit. Typical crystals have a frequency error of
between 1% and 5% (cheap clones have lousy crystals :->). No matter what I
do, the interval timer will never give accurate measures over long time
periods, but it is useful for short-period measures.
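
To put a number on that: a crystal running just 1% fast turns a nominal
100 Hz tick into roughly 101 Hz, so a clock driven from it gains about
36 seconds per hour (0.01 x 3600 s).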

3). You can't set the interval timer's period very precisely.

I absolutely agree with you. I'm not looking for microsecond resolution;
something on the order of 10 milliseconds is more appropriate. For more
precise measurements, I hook up my frequency counter, which is rated to
100 MHz with a 0.1 ppm error rate.

P.S. I have always hated the PC's interval timer implementation. Besides the
accuracy problems, the timer is limited to 16 bits and runs at a very odd-ball
frequency (1.19318 MHz). For industrial control work that needs accurate time,
I use one of the digital I/O and timer cards. The cards have 1 MHz, 1% clocks
and, by ganging the counters together, a 48-bit range.

-- 
David Koogler (koogler@bedford.progress.com)
Received on Tue Feb 14 10:24:04 1995
