Fall Is Here: Who Has the Jitters?

I’m at the EtherNet/IP ODVA meeting in Charlotte today. One of the biggest questions at the meeting is the definition of jitter; a lot of people are unclear on what the term actually means. In very simple terms, network jitter is the deviation between the time when a device is expected to issue a message and when the message is actually transmitted. For example, if a DeviceNet or EtherNet/IP device is expected to issue cyclic messages every 10 msec (a very typical cycle time) and actually sends them between 9 and 11 msec apart, we have 1 msec of network jitter.
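If you want to put a number on it, jitter falls straight out of the message timestamps. Here's a quick sketch of the calculation (the timestamps and the 10 msec cycle time are made-up illustrative values, not captures from any real device):

```python
EXPECTED_CYCLE_MS = 10.0

# Hypothetical transmit timestamps, in milliseconds, for a cyclic device.
timestamps_ms = [0.0, 9.2, 19.8, 30.9, 40.1, 50.6]

# Jitter for each interval = |actual interval - expected cycle time|
intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
deviations = [abs(i - EXPECTED_CYCLE_MS) for i in intervals]

print(f"intervals (ms):  {intervals}")
print(f"deviations (ms): {deviations}")
print(f"network jitter:  {max(deviations):.1f} ms")  # worst-case deviation
```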


So what’s good? Anything less than 50% jitter is generally acceptable. In our previous example, that means a device could send the next cyclic message anywhere between 5 and 15 msec after sending the last one. Unless you have a very high-speed, timing-critical application, network jitter like this is acceptable. Once a device gets over 50% jitter, though, the Master (Client) device might begin to detect missing messages. If the jitter gets bad enough, the Master might even shut the device down and try to reconnect to it, causing a loss of data or worse.
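Here's the 50% rule as a quick check. This is just an illustration of the arithmetic, not how any particular Master actually implements its timeout logic:

```python
EXPECTED_CYCLE_MS = 10.0
TOLERANCE_MS = EXPECTED_CYCLE_MS * 0.5  # 50% jitter limit: a 5..15 ms window

def check_interval(interval_ms: float) -> str:
    """Classify one inter-message interval against the 50% jitter limit."""
    deviation = abs(interval_ms - EXPECTED_CYCLE_MS)
    if deviation <= TOLERANCE_MS:
        return "OK"
    # Past 50%, a Master may count the message as missing; repeated
    # misses can make it drop and re-establish the connection.
    return "MISSED - Master may time out the connection"

for interval in (9.0, 11.0, 14.9, 16.2):
    print(f"{interval:5.1f} ms -> {check_interval(interval)}")
```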

A good question that came up at today’s meeting was about Ethernet switches: what is the impact of a switch on network jitter in an EtherNet/IP network? Well, it turns out that it is nearly nothing. As a switch receives a packet, it analyzes the first few bytes to identify the outgoing port and immediately starts sending the message out that port, even before the end of the message has been received (this is known as cut-through switching). At that speed, the switch adds almost nothing to network jitter.
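A little back-of-the-envelope arithmetic shows why. Assuming a 100 Mbps link, the switch only needs the 6-byte destination address at the front of the frame before it can pick the outgoing port:

```python
LINK_BPS = 100_000_000  # assumed 100 Mbps link speed
DEST_MAC_BYTES = 6      # bytes the switch reads before forwarding begins

lookup_time_s = DEST_MAC_BYTES * 8 / LINK_BPS
print(f"port lookup delay: {lookup_time_s * 1e6:.2f} microseconds")
# About 0.48 microseconds, versus a 10 msec (10,000 microsecond)
# cycle time: effectively nothing.
```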