Determinism and Ethernet
With the emergence of Ethernet as an industrial fieldbus, many detractors have started to question whether Ethernet is up to the task of being a control network and, in particular, whether Ethernet can be considered deterministic. In this article I will explain some of the developments that have taken place since Ethernet was invented in 1972, developments which allow a properly planned and installed Ethernet network to be considered deterministic.
The start of Ethernet
Ethernet, as we know it, was first invented by Bob Metcalfe of Xerox’s Palo Alto Research Center (PARC) almost 30 years ago. However, it has its roots a few years before that, in some pioneering work done by Norman Abramson at the University of Hawaii in 1967.
Abramson had the task of getting the university mainframe talking to outlying terminals located on some of the other Hawaiian islands. Running a physical cable to them was out of the question, so Abramson looked at using radio. However, the frequencies he was forced to use did not give him enough separate channels for all of his terminals, so some terminals would have to share. The problem with a shared radio frequency is that transmissions interfere with one another, and Abramson realised he would have to find a way of regulating the transmission of data.
Abramson came up with the idea of using just two frequencies for all communication, together with rules dictating when, and what, a terminal or the mainframe could send: one frequency would carry outbound transmissions from the mainframe, and the other would carry inbound transmissions from the terminals. On top of this, Abramson developed a system of addresses and replies for communication.
The mainframe would send out a message with its address field set to one of the terminals. Although every terminal would receive it, only the one it was addressed to would pass it on for further processing; the others would simply discard it. Upon receiving the message, the terminal would check that no other terminal was using the frequency and then transmit a receipt, which would normally be picked up only by the mainframe and any nearby terminals. This worked in both directions, i.e. for messages sent out by the mainframe and for messages sent out by the terminals.
But what if two or more terminals transmitted messages at the same time? Some interference - a collision - would occur, the mainframe would not receive the messages, and it would not send a reply to the terminals. Each terminal had a timeout period built into it, typically 200 to 1500 milliseconds: if the reply had not arrived within that period, the terminal would send the message again. There was also a maximum limit on the number of retries a terminal would attempt before reporting an error to the operator or user.
This method is known as the CSMA/CD model (Carrier Sense Multiple Access with Collision Detection).
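To make that cycle concrete, here is a minimal Python sketch of the send-and-wait-for-a-receipt loop described above. The function names and the simulated lossy channel are purely illustrative assumptions; only the timeout range and the idea of a retry limit come from the description above.

    import random
    import time

    MAX_RETRIES = 5                  # give up after this many attempts
    TIMEOUT_RANGE_MS = (200, 1500)   # per-attempt wait for the receipt

    def send_frame(payload: bytes, dest_addr: int) -> None:
        """Transmit one frame on the shared inbound frequency (stand-in)."""

    def wait_for_receipt(timeout_ms: float) -> bool:
        """Listen on the outbound frequency for a receipt until the timeout
        expires. Stand-in: pretend a collision wiped out the frame 30% of
        the time."""
        time.sleep(timeout_ms / 1000.0)
        return random.random() > 0.3

    def aloha_send(payload: bytes, dest_addr: int) -> bool:
        """Send one frame, retrying when no receipt arrives; True on success."""
        for _ in range(MAX_RETRIES):
            send_frame(payload, dest_addr)
            if wait_for_receipt(random.uniform(*TIMEOUT_RANGE_MS)):
                return True          # receipt arrived: the frame got through
            # No receipt within the timeout: assume a collision and resend.
        return False                 # out of retries: report an error upstream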
A few years later, Bob Metcalfe was tasked with connecting Xerox’s latest invention (a laser printer) to another of its inventions (a personal computer), and he decided against running a separate cable from every PC to the printer. Looking through recent developments in communications he came across Abramson’s work and, with a little re-engineering, transferred it to coaxial cable and made it considerably faster: the Aloha network at the University of Hawaii had a bandwidth of 4800 bps, while Metcalfe got the Ethernet network at PARC up to 2.94 Mbps. He stuck with the CSMA/CD model, but to improve efficiency he dropped the replies as the means of detecting collisions. Because the system now ran over a copper cable, Metcalfe could look at the actual voltage on the cable: when the voltage jumped by a predetermined offset, a collision had occurred. This voltage jump was easily detected by every interface on the cable, and the sending station could then retry after a short delay, known as the backoff time. Ethernet was born!
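The same idea carries over to a sketch of the Ethernet transmit loop: listen before sending, watch for the tell-tale voltage jump while transmitting, and back off before retrying after a collision. The helper functions below are stand-ins, and the truncated binary exponential backoff and slot time come from the later standardised Ethernet rather than from Metcalfe’s original implementation, which is described above only as waiting "a short delay".

    import random
    import time

    SLOT_TIME_S = 51.2e-6    # one contention slot (figure from 10 Mbit/s Ethernet)
    MAX_ATTEMPTS = 16        # classic Ethernet gives up after 16 attempts

    def medium_idle() -> bool:
        """Carrier sense: is anyone else driving the cable? (stand-in: always idle)"""
        return True

    def transmit_and_watch(frame: bytes) -> bool:
        """Drive the frame onto the cable while watching the voltage for the
        jump that marks a collision. Stand-in: collide 30% of the time."""
        return random.random() > 0.3

    def csma_cd_send(frame: bytes) -> bool:
        """Transmit with carrier sense, collision detection and backoff."""
        for attempt in range(1, MAX_ATTEMPTS + 1):
            while not medium_idle():
                pass                                 # defer until the cable is quiet
            if transmit_and_watch(frame):
                return True                          # no collision seen: done
            # Collision: wait a random number of slot times before retrying.
            k = min(attempt, 10)                     # exponent is truncated at 10
            time.sleep(random.randint(0, 2 ** k - 1) * SLOT_TIME_S)
        return False                                 # excessive collisions: give up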
It is at this point that most arguments about the determinism of Ethernet start and finish. The system described above is obviously not deterministic: you could almost never know how long a message was going to take to arrive, because you had no way of knowing what other traffic was on the network. However, Ethernet development has not stood still since 1972 - if anything, the pace has increased in recent years.
The speed of Ethernet
One of the main advantages of Ethernet over almost every other network type is its speed. A common phrase in the networking industry