On Sun, Mar 25, 2001 at 12:33:20PM +0800, Rick Goh wrote:
> Inter-packet gaps if too big may cause real-time application to have
> jitters. But how do i measure the jitter if supposedly i send out a tcp/udp
> stream??
>
> Attached is an extract of tcpdump data.
>
> Any information on how to post-analyse tcpdump for jitter, packet loss is
> very very much appreciated.
The text format isn't very useful for me. :) A tcpdump binary capture
would be much better.
What I normally do is a binary capture set to only grab the flows I'm
interested in (i.e. no spurious ARP activity like in your dump), then I run
the binary output back through tcpdump in delta-time mode, once for each
flow, with the right filters to isolate that flow.
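As a rough sketch, that two-step capture-then-replay workflow looks like the
commands below. The interface name, host, and port are made-up placeholders;
adjust the filter to match your own flow. The capture itself needs root, so
those commands are shown commented rather than run.

```shell
# Sketch of the capture/replay workflow; names here are assumptions.
IFACE=eth0          # hypothetical capture interface
CAP=flow.pcap       # hypothetical capture file name

# 1. Binary capture, filtered so spurious ARP traffic never lands in the
#    file (requires root, so shown but not executed here):
#    tcpdump -i "$IFACE" -w "$CAP" 'udp and host 10.0.0.1 and port 5004'

# 2. Replay the binary capture with -ttt, which prints the delta time
#    (the gap since the previous packet) at the start of each line:
#    tcpdump -ttt -r "$CAP" 'udp and port 5004' > deltas.txt

echo "capture file: $CAP"
```

Running the replay step once per flow, each time with a filter that isolates
just that flow, gives you one clean delta-time series per flow.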
Then I can import the resulting data into R (a free-software statistics
package), where I can see and analyze the delay characteristics of the flow.
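If you want a quick jitter number before (or instead of) pulling the data
into R, a one-pass awk over the delta column works too. This assumes the
delta is the first whitespace-separated field of each line, as delta-time
output prints it; the sample deltas below are invented purely for
illustration, and "jitter" here is computed as the mean absolute deviation
of the inter-packet gaps from their mean.

```shell
# Made-up sample of delta-time output lines (first field = gap in seconds):
cat <<'EOF' > deltas.txt
0.020 IP 10.0.0.1.5004 > 10.0.0.2.5004: UDP
0.021 IP 10.0.0.1.5004 > 10.0.0.2.5004: UDP
0.019 IP 10.0.0.1.5004 > 10.0.0.2.5004: UDP
0.040 IP 10.0.0.1.5004 > 10.0.0.2.5004: UDP
EOF

# Mean gap, plus mean absolute deviation of the gaps as a jitter figure:
awk '{ d[NR] = $1; sum += $1 }
     END {
       mean = sum / NR
       for (i = 1; i <= NR; i++)
         dev += (d[i] > mean ? d[i] - mean : mean - d[i])
       printf "packets=%d mean=%.4f jitter=%.4f\n", NR, mean, dev / NR
     }' deltas.txt
```

Packet loss is easier to judge from sequence numbers (RTP) or retransmissions
(TCP) than from gaps alone, so treat the gap statistics as a jitter measure
only.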
It's also useful to gather the same data on the transmit side and see how
the statistics differ.