> In Wireshark, does the Max Delta represent the DELAY? Are these
> concepts right?
No, what you mean by DELAY is the end-to-end delay, which should stay
under 150 ms for good quality. Wireshark cannot calculate this
end-to-end delay, since the only information it has is the timestamps
of the packets as they were captured. Max Delta just represents the
maximum gap between two consecutive packets. With the G.711 codec and
20 ms packetization, packets should arrive 20 ms apart, so in the
ideal case, without any jitter, the Max Delta would also be 20 ms. But
because of jitter some packets arrive later, and the Max Delta
increases.
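For illustration, a minimal sketch of that idea (the function name and
inputs are mine, not Wireshark's code): given the capture timestamps
of one RTP stream, Max Delta is just the largest difference between
consecutive packets.

    def max_delta(capture_times):
        # capture_times: capture timestamps (seconds) of the packets
        # of one RTP stream, in arrival order
        return max(t2 - t1
                   for t1, t2 in zip(capture_times, capture_times[1:]))

    # With ideal 20 ms packetization this yields 0.020:
    # max_delta([0.000, 0.020, 0.040, 0.060]) == 0.020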
Regarding the Max and Mean jitter, be aware that the jitter
calculation follows the specification of RFC 1889 (carried over
unchanged into RFC 3550), which says:
"The interarrival jitter is calculated continuously as each data
packet i is received from source SSRC_n, using this difference D for
that packet and the previous packet i-1 in order of arrival (not
necessarily in sequence), according to the formula
J = J + (|D(i-1,i)| - J)/16"
In other words, the jitter values you see in the table are not the
absolute differences between the last two packets; each value is a
running, smoothed estimate. Max jitter is the highest value this
estimate reached during the stream.
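As a rough illustration (my own sketch, not Wireshark's actual
implementation), the estimator can be written like this. Note that
both inputs must be expressed in the same time units:

    def interarrival_jitter(rtp_timestamps, arrival_times):
        # Running jitter estimate per RFC 1889:
        #   J = J + (|D(i-1,i)| - J)/16
        # Both inputs must use the same units, e.g. convert arrival
        # times to RTP timestamp units (8000 Hz clock for G.711).
        j = 0.0
        estimates = []
        for i in range(1, len(arrival_times)):
            # D(i-1,i): change in relative transit time between
            # consecutive packets, in order of arrival
            d = (arrival_times[i] - arrival_times[i - 1]) \
                - (rtp_timestamps[i] - rtp_timestamps[i - 1])
            j += (abs(d) - j) / 16.0   # exponential smoothing, gain 1/16
            estimates.append(j)
        return estimates

    # Max jitter would then be max(estimates),
    # Mean jitter the average of them.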
This is how the first implementation of this function in Ethereal
worked. I haven't looked at the current code, but I think it is still
more or less the same (otherwise someone will hopefully correct me).
Regards, Miha