Ethereal-dev: Re: [Ethereal-dev] NetXray / Sniffer Time Codes

From: Guy Harris <guy@xxxxxxxxxx>
Date: Thu, 8 Feb 2001 19:09:43 -0800 (PST)
> Thanks, I found it about 3 hours after I sent my prev msg.
> Just out of curiosity, are there any ideas about why that
> value was chosen?

From my copy of "ISA System Architecture", a manual describing
PC/AT-compatible PCs (i.e., PCs prior to EISA, Microchannel, VESA Local
Bus, and PCI):

								Chapter 24

	Prior To This Chapter

		The previous chapter described the numeric coprocessor
		interface and floating-point emulation.

	In This Chapter

		This chapter describes the system timers incorporated in
		all ISA-compatible machines.

	-------------------------------------------------------------------

	The System Timer, Timer 0

		The System Timer, Timer 0, is a programmable frequency
		source.  A 1.19318 MHz signal provides its input clock
		rate.  You can specify a divisor to divide into the
		input clock to yield the desired output frequency. ...

1/(1.19318*10^6 Hz) = 0.838096*10^-6 s, i.e. 1.19318 MHz is one cycle
every 0.838096 microseconds.

The classic Sniffer was a notebook PC with Network General's software
running on top of DOS, but "running on top of DOS" doesn't prevent you
from whacking the PC system timer to keep whatever time you want,
including setting the divisor to 1.
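
For the record, "whacking the PC system timer" on an ISA machine means
reprogramming counter 0 of the 8253/8254 Programmable Interval Timer
through I/O ports 0x40 and 0x43.  A sketch of what that looks like
from a DOS C compiler (assuming a Borland-style outportb(); note that
reprogramming timer 0 also speeds up IRQ 0, so the software has to
field the extra interrupts itself or the BIOS time-of-day clock runs
fast):

	#include <dos.h>		/* outportb() in Borland/Turbo C */

	#define PIT_CTRL 0x43		/* 8253/8254 mode/command register */
	#define PIT_CH0  0x40		/* counter 0 data port */

	/* Counter 0, load low byte then high byte, mode 2 (rate
	   generator), binary counting.  A divisor of 0 is treated as
	   65536, giving the stock BIOS tick of 1193180/65536 = ~18.2 Hz;
	   smaller divisors raise the output frequency correspondingly. */
	void set_timer0_divisor(unsigned divisor)
	{
		outportb(PIT_CTRL, 0x34);
		outportb(PIT_CH0, divisor & 0xFF);
		outportb(PIT_CH0, (divisor >> 8) & 0xFF);
	}

You can also get sub-interrupt resolution without touching the divisor
at all, by latching counter 0 (control word 0x00 to port 0x43), reading
the two count bytes back from port 0x40, and combining them with a
software tick count.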

I seem to remember reading some Sniffer documentation that (partially)
described the classic Sniffer file format; I think it gave the various
time unit code values as

	0	Unspecified - default by network type
	1	"PC" - 0.838096 microseconds
	2	"3Com" - 15 microseconds
	3	"Micom" - 0.5 microseconds
	4	"Sytek" - 2 microseconds

I suspect that "PC" means "we used the PC's timer to time-stamp
packets", and that "3Com", "Micom", and "Sytek" refer to using a timer
on network interfaces from the company in question to time-stamp
packets.
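
A program reading such a file would presumably just use the code to
index a table of tick durations.  A C sketch based on the list above
(the table values are straight from that list; the handling of code 0
and of out-of-range codes is my guess, not anything documented):

	/* Microseconds per time stamp tick, indexed by time unit code.
	   Code 0 means "default by network type" and has to be resolved
	   from the capture's network type, so it's left as 0.0 here. */
	static const double usec_per_tick[] = {
		0.0,            /* 0: unspecified - default by network type */
		0.838096,       /* 1: "PC" system timer tick */
		15.0,           /* 2: "3Com" */
		0.5,            /* 3: "Micom" */
		2.0,            /* 4: "Sytek" */
	};

	double sniffer_stamp_to_usec(unsigned timeunit, unsigned long ticks)
	{
		/* Fall back to the PC tick for unknown or unspecified
		   codes; a real reader would pick the per-network default. */
		if (timeunit >= 5 || usec_per_tick[timeunit] == 0.0)
			return ticks * 0.838096;
		return ticks * usec_per_tick[timeunit];
	}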

> Other than cussedness?  Seems goofy
> that NG would force the use of floating point into something
> as basic and pervasive as measuring time.

Blame IBM, or maybe Intel, depending on who specified the system timer
input signal frequency. :-)