Jitter is defined as the variation in the delay of received packets.
Suppose that, at the sending side, packets are transmitted in a continuous stream, spaced evenly apart. Due to network congestion, improper queuing, or configuration errors, this steady stream can become lumpy: the delay between packets varies instead of remaining constant. This variation in delay is ‘jitter’.
While there are many ways of measuring this variation, in NetSim ‘jitter’ is measured as the statistical variance of delay. Variance is the average of the squared deviations from the mean.
The unit of jitter is therefore the square of the unit of delay. If delay is measured in seconds, jitter is in seconds squared; if delay is measured in microseconds, jitter is in microseconds squared.
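The jitter computation described above can be sketched in a few lines. This is an illustrative example, not NetSim source code; the function name and sample values are made up for the demonstration.

```python
# Illustrative sketch: jitter as the statistical variance of packet
# delays, i.e. the average of the squared deviations from the mean.
def jitter_variance(delays):
    """Return the variance of a list of packet delays.

    The result is in the square of the delay unit: delays in seconds
    give seconds squared, delays in microseconds give microseconds
    squared.
    """
    n = len(delays)
    mean = sum(delays) / n
    return sum((d - mean) ** 2 for d in delays) / n

# Example: per-packet delays measured in microseconds
delays_us = [120.0, 135.0, 110.0, 140.0, 125.0]
print(jitter_variance(delays_us))  # 114.0 (microseconds squared)
```

If every packet experienced exactly the same delay, the variance, and hence the jitter, would be zero.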
Introducing Jitter using Background traffic
Background traffic can be used to test the performance of applications when link bandwidth is consumed by other traffic. It can also be used to induce jitter for testing real-time applications.
The background traffic in NetSim can be modelled as a Poisson process in which bursts of data of a fixed size are transmitted at an average rate such that the link is occupied at the specified link utilization rate. Because it is a random process, over short periods the actual background traffic link utilization may vary from the configured value. The rate of arrival of background traffic frames affects the jitter: a larger number of background packets induces greater jitter in competing traffic.
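The Poisson model described above can be sketched as follows. This is a hypothetical illustration, not NetSim's implementation: fixed-size frames arrive with exponentially distributed inter-arrival times whose mean is chosen so that the long-run load matches a target link utilization. All parameter names and values are assumptions made for the example.

```python
import random

def simulate_background(link_rate_bps, utilization, frame_bits,
                        sim_time_s, seed=1):
    """Return frame arrival times (seconds) over [0, sim_time_s)."""
    rng = random.Random(seed)
    # Average frame rate that yields the target utilization
    frames_per_s = (link_rate_bps * utilization) / frame_bits
    mean_gap = 1.0 / frames_per_s
    t, arrivals = 0.0, []
    while True:
        # Poisson process: exponential inter-arrival times
        t += rng.expovariate(1.0 / mean_gap)
        if t >= sim_time_s:
            break
        arrivals.append(t)
    return arrivals

# 10 Mbps link, 50% target utilization, 12000-bit frames, 10 s run
arr = simulate_background(10e6, 0.5, 12000, 10.0)
actual_util = len(arr) * 12000 / (10e6 * 10.0)
print(round(actual_util, 2))  # close to, but not exactly, 0.50
```

Because the process is random, the utilization measured over short windows fluctuates around the configured value, which is exactly the behaviour that perturbs competing traffic and produces jitter.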
In NetSim, the number of arriving background packets is increased by reducing the inter-arrival time of the background traffic application, as explained in the links below.
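The effect of the background packet rate on jitter can be demonstrated with a toy single-link FIFO model: a periodic foreground stream shares the link with Poisson background frames, and reducing the background inter-arrival time raises the delay variance (jitter) of the foreground packets. This sketch is not NetSim; all parameter values are illustrative.

```python
import random

def foreground_jitter(bg_mean_gap_s, seed=7):
    """Jitter (delay variance, s^2) of a periodic foreground stream
    competing with Poisson background frames on one FIFO link."""
    rng = random.Random(seed)
    service = 0.001                          # 1 ms to transmit a packet
    fg = [i * 0.010 for i in range(500)]     # foreground: every 10 ms
    bg, t = [], 0.0
    while t < fg[-1]:
        t += rng.expovariate(1.0 / bg_mean_gap_s)
        bg.append(t)
    # Serve the merged arrival stream in FIFO order
    arrivals = sorted([(a, "fg") for a in fg] + [(a, "bg") for a in bg])
    free_at, delays = 0.0, []
    for a, kind in arrivals:
        start = max(a, free_at)              # wait if link is busy
        free_at = start + service
        if kind == "fg":
            delays.append(free_at - a)       # queueing + transmission
    mean = sum(delays) / len(delays)
    return sum((d - mean) ** 2 for d in delays) / len(delays)

low = foreground_jitter(bg_mean_gap_s=0.008)   # sparse background
high = foreground_jitter(bg_mean_gap_s=0.002)  # dense background
print(low < high)  # denser background traffic gives larger jitter
```

Halving the background inter-arrival time doubles the average background frame rate, so the foreground packets queue behind background frames more often, and their delay variance grows.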