Ethereal-dev: Re: [Ethereal-dev] [Ethereal-users] Packet error handling ideas, was: Ethereal 0


From: Guy Harris <gharris@xxxxxxxxx>
Date: Fri, 15 Jun 2001 02:25:47 -0700

On Fri, Jun 15, 2001 at 09:10:56AM +0200, Biot Olivier wrote:
> All protocol dissectors should define a "protocol_name.error" entity,
> which is 0x0000 if no errors occur in the packet.  Any error that
> occurs on this packet should set a bit in this ".error" entity.

Is that entity an integral data type? If so, it's limited to, at best,
64 bits - and I'd like, at some point, to change the places where we
currently use 64-bit integral quantities so that they use some other
scheme on platforms that don't have gint64 and guint64, which means the
entity could effectively be limited to 32 bits.

That might not be enough bits for some protocols.
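
To make that concrete: a registration for such an entity might look
roughly like the sketch below.  The protocol "foo", the field names,
and the bit assignments are all made up for illustration, and the
details of the registration calls are from memory, so treat them as
approximate:

    #include <epan/packet.h>

    static int proto_foo = -1;
    static int hf_foo_error = -1;
    static int hf_foo_error_checksum = -1;

    static hf_register_info hf[] = {
        { &hf_foo_error,
          { "Error flags", "foo.error", FT_UINT32, BASE_HEX, NULL, 0x0,
            "Errors detected while dissecting this packet", HFILL }},
        { &hf_foo_error_checksum,
          { "Checksum error", "foo.error.checksum", FT_BOOLEAN, 32,
            NULL, 0x00000001, "Checksum is incorrect", HFILL }},
    };

    void
    proto_register_foo(void)
    {
        proto_foo = proto_register_protocol("Foo Protocol", "FOO", "foo");
        proto_register_field_array(proto_foo, hf, array_length(hf));
    }

Note that the per-bit subfields there already end up being Boolean
fields with a bitmask, which is most of the way to the alternative
below.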

An alternative might be to have Boolean fields for various types of
errors, which could be added to the tree as hidden fields - in fact, we
already do that in some cases; those fields currently aren't known to
the Ethereal core as error indicators, but we could perhaps add the
ability to make a field have the property "is an error indication".
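
For reference, the existing hidden-field trick looks roughly like what
packet-ip.c does when an IP checksum doesn't verify.  This is a
paraphrase from memory, with ip_tree, tvb, offset, and computed_cksum
coming from the surrounding dissector code, so treat the exact call as
approximate:

    if (computed_cksum != 0) {
        /* Checksum is wrong; add an invisible Boolean field so that
           "ip.checksum_bad" can be used as a display filter. */
        proto_tree_add_boolean_hidden(ip_tree, hf_ip_checksum_bad,
            tvb, offset + 10, 2, TRUE);
    }

The field shows up in display filters but isn't visible in the protocol
tree.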

That wouldn't be sufficient to let Ethereal label particular visible
fields in the protocol tree as "erroneous", and thus wouldn't let it
display them in a different color, for example.

> In this case, selecting packets with any or specific errors is quite
> straightforward.

How would you select packets with checksum errors? I hope you wouldn't
do it by specifying, as a numerical value, a bitmask with the "checksum
error" bit set; you'd want a name associated with the error.

(Currently, the way you'd select IP packets with checksum errors is with
the display filter "ip.checksum_bad".)
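
The same string works as a read filter from the command line; assuming
the usual tethereal options, something along the lines of

    tethereal -r capture.cap -R "ip.checksum_bad" -w bad-checksums.cap

should write just the frames with bad IP checksums out to a separate
capture file.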