# Catching a Bus: Basic Serial Communication Part 1, the UART

Bill Marshall
Engineer, PhD, lecturer, freelance technical writer, blogger & tweeter interested in robots, AI, planetary explorers and all things electronic. STEM ambassador. Designed, built and programmed my first microcomputer in 1976. Still learning, still building, still coding today.

March 17, 2020 11:35

Thanks for the excellent write-up. I also had the opportunity to work with a UART on an FPGA quite recently, where I explored the critical role of bit sampling on the receiver side:

https://github.com/Stavros/UART

March 17, 2020 17:01

@stavros Thanks! I was not aware of this 'triple-sampling' technique.
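For what it's worth, the majority-vote idea can be sketched in a few lines of Python. This is my own illustration, not code from the linked project; the 16x oversampling factor and all names are assumptions.

```python
def majority(a, b, c):
    # Two-out-of-three vote on single-bit samples.
    return (a & b) | (a & c) | (b & c)

def recover_bits(samples, oversample=16):
    """Recover logic levels from an oversampled RX line.

    `samples` holds one 0/1 reading per oversampling tick, aligned so each
    bit period spans `oversample` ticks. Rather than trusting one mid-bit
    sample, take three samples around the centre of each bit period and
    majority-vote them, which rides out a single-tick glitch.
    """
    bits = []
    for start in range(0, len(samples) - oversample + 1, oversample):
        mid = start + oversample // 2
        bits.append(majority(samples[mid - 1], samples[mid], samples[mid + 1]))
    return bits

# A 16-tick bit period carrying a logic 1, with a glitch just before centre:
noisy_one = [1] * 7 + [0] + [1] * 8
print(recover_bits(noisy_one))  # → [1]  (the glitch is out-voted)
```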

March 9, 2020 13:15

Thanks. This takes me back to the days when we connected teleprinters to HF radio with devices called Elmuxes to manage the links. Some nice explanations of RS-232 here; I found them valuable.

March 9, 2020 11:07

As you explained, the UART defines the encoding of parallel data into a serial stream of ones and zeroes with specific timing, while RS-232 defines one method of representing those ones and zeroes as signal levels on the transmit and receive data lines, as well as some handshaking signals to manage the flow of information. A logic 1 (Mark) is represented as a voltage at least 3 volts negative with respect to ground, and a logic 0 (Space) as a voltage at least 3 volts above ground. RS-232 output drivers usually have a source impedance of 300 ohms, limiting the current in case of shorts to ground (which should still be avoided)!
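As a quick sketch of the two halves of that picture - the framing a UART applies, and the polarity the RS-232 driver then puts on the wire. The function names and the parity options are my own illustration, not taken from any particular device:

```python
def uart_frame(byte, parity=None):
    """Bit sequence a UART shifts out for one byte: a start bit (0),
    eight data bits LSB first, an optional parity bit, and a stop bit (1)."""
    data = [(byte >> i) & 1 for i in range(8)]   # LSB first
    bits = [0] + data                            # start bit is a Space
    if parity == "even":
        bits.append(sum(data) % 2)               # makes the total 1-count even
    elif parity == "odd":
        bits.append(1 - sum(data) % 2)
    bits.append(1)                               # stop bit is a Mark
    return bits

def rs232_level(bit):
    """RS-232 polarity for a logic level: Mark (1) is at least 3 V negative
    with respect to ground, Space (0) at least 3 V positive."""
    return "negative" if bit else "positive"

frame = uart_frame(0x41)   # ASCII 'A'
print(frame)               # → [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
print(rs232_level(1))      # → negative
```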

There are also several other standard methods for representing binary serial data as electrical signals. One of the early ones is the 20 mA current loop. This became a de facto standard due to its use on the Model 33 Teletype, released commercially in 1963.
As its name indicates, the 20 mA current loop represents the data as changes in current, rather than changes in voltage. A current of 20 mA signifies a logic 1 (Mark), and a current of 0 (open circuit) signifies a logic 0. This method has the advantage of working over distances up to several thousand feet at 19,200 baud, and even longer at lower baud rates. The receiver can easily be an optocoupler with a logic-level output, adding the benefit of voltage isolation between the transmitting and receiving equipment. The disadvantage is the lack of a clear standard for electrical and mechanical implementation. The current source could be anywhere in the loop - at the transmitter, or at the receiver - so the installer needed to know a little bit about the equipment to be connected.

The other prevalent alternatives to RS-232 use balanced data lines, with a two-wire twisted pair carrying the transmitted data instead of a single wire. The receiver looks at the polarity of the difference between the two signal wires, instead of the voltage on a single wire with respect to ground. This gives greater noise immunity: any noise gets coupled onto both wires of the pair nearly equally, so when the receiver takes the difference between them, the noise on one line cancels the noise on the other, leaving just the signal. RS-485 is a common implementation. Like the 20 mA current loop, it is capable of handling relatively long (1/4 mile) distances. The combination of long distance and high noise immunity makes it very popular in industrial settings like manufacturing plants or water treatment facilities. It can also tolerate a fair bit of common-mode voltage, and can drive many receivers from a single transmitter.
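The common-mode cancellation is easy to see in a toy model (every name and number here is made up purely for illustration):

```python
def differential_receive(line_a, line_b, noise):
    # The same common-mode noise couples onto both wires of the pair;
    # the receiver recovers the signal from the difference, so it cancels.
    a = [s + n for s, n in zip(line_a, noise)]
    b = [s + n for s, n in zip(line_b, noise)]
    return [x - y for x, y in zip(a, b)]

signal = [1, -1, 1, 1, -1]          # logic levels driven as +1 / -1
line_a = signal
line_b = [-s for s in signal]       # complementary drive on the second wire
noise  = [3, -2, 5, 0, 1]           # hits both wires nearly equally
print(differential_receive(line_a, line_b, noise))  # → [2, -2, 2, 2, -2]
```

However large the common-mode noise gets, the difference the receiver sees is always twice the driven signal.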

March 9, 2020 11:07

Bill, you mentioned DTR and DSR in your reply to Boss. The RS-232 DTR (Data Terminal Ready) and DSR (Data Set Ready) signals, along with CD (Carrier Detect) and RI (Ring Indicator), provided management of the interface with the modem. The first two allowed the terminal (or computer) and modem to know when the other was present and powered up. Carrier detect let the terminal or computer know when the modem had an active connection to the modem at the far end of the phone or other communications line. And the ring indicator provided indication of an incoming phone call. These four signals, along with TX and RX data, RTS, CTS, and Ground, round out the lines present on a 9-pin serial port.
Note that it wasn't always a mis-use for a program not to communicate if it didn't see the DSR signal. It all depends on context. If a communication link is local, with no modems / phone lines present, then RTS / CTS provide a sufficient mechanism for managing the data flow. But if it is a remote communications link, it is entirely appropriate for software to make sure the modem is powered up and ready before sending data to the modem (which may include commands to the modem itself), and further to require detection of carrier before transmitting data intended for the remote end.
Since the same software would sometimes be used for both link types, a "null modem" connector or jumper wires were used on local lines. This wiring fed the Data Terminal Ready signal from the terminal or computer directly back to the Data Set Ready and Carrier Detect lines, so as soon as the computer or terminal asserts DTR (saying "I'm up and ready"), it sees that same signal coming back telling it that the modem is up and ready (DSR), and that the comm link to the remote end is up (CD). This is the spoofing you referred to.
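That loopback can be written out as a small connection table. This is a sketch of the wiring described above; the RTS-to-CTS loop is a common null-modem variant I've added, not something stated in the comment:

```python
# Null-modem loopback: a device's own DTR is fed straight back to its
# DSR and CD inputs, so asserting "I'm up and ready" is instantly answered
# with "modem ready" and "carrier up".
NULL_MODEM_LOOPBACK = {
    "DTR": ["DSR", "CD"],   # spoofs Data Set Ready and Carrier Detect
    "RTS": ["CTS"],         # common variant: Request To Send answered locally
}

def inputs_seen(asserted_outputs):
    """Input lines a device sees asserted, given its own asserted outputs."""
    seen = set()
    for out in asserted_outputs:
        seen.update(NULL_MODEM_LOOPBACK.get(out, []))
    return seen

print(sorted(inputs_seen({"DTR"})))  # → ['CD', 'DSR']
```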

The additional lines on a 25 pin RS-232 connector were for less commonly used / implemented features. These allowed for a secondary data channel, and for explicit clock signals to be supplied, especially important for synchronous data links. (The ubiquitous UART is for Asynchronous serial, but there were also USART chips that could do asynchronous or synchronous communications. Some of these also included Manchester coding of the data, which allowed the data to contain its own recoverable clock.)

Also, the RTS / CTS handshaking evolved over the years. In the early days of half-duplex modems, the handshake was asymmetric. The terminal would assert RTS, asking the modem to turn on its transmit section, and the modem would assert CTS once it had established synchronization with the modem on the other end. But there was no corresponding signal to tell the modem the terminal's buffer was ready to receive. As modems evolved, full-duplex became the norm, and faster transmission rates meant bi-directional throttling via handshake became more important. (Throttling matters more when new characters arrive fractions of a millisecond apart, at a computer that may be servicing interrupts from many peripherals, increasing the likelihood that the processor won't always be ready for more data.) The RTS signal thus evolved into a meaning more accurately described as Ready To Receive.
All these issues help explain why RS-232 breakout boxes became an essential item in the toolbox of anyone doing much with serial communications!
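A minimal sketch of throttling in that "Ready To Receive" sense, with buffer sizes and watermark thresholds that are purely illustrative:

```python
class ThrottledReceiver:
    """Deassert RTS when the receive buffer nears full; reassert once it
    drains. A real UART or driver does this in hardware or in the ISR."""

    def __init__(self, capacity=16, high_water=12, low_water=4):
        self.buf = []
        self.capacity = capacity
        self.high_water = high_water    # stop the far end above this fill level
        self.low_water = low_water      # resume once drained to this level
        self.rts = True                 # asserted: OK to send to us

    def on_byte(self, byte):
        # Called for each incoming character.
        if len(self.buf) < self.capacity:
            self.buf.append(byte)
        if len(self.buf) >= self.high_water:
            self.rts = False            # throttle the transmitter

    def drain(self, n=1):
        # Called when the application consumes characters.
        del self.buf[:n]
        if len(self.buf) <= self.low_water:
            self.rts = True             # ready for more

rx = ThrottledReceiver()
for b in range(12):
    rx.on_byte(b)
print(rx.rts)   # → False (buffer hit the high-water mark)
rx.drain(8)
print(rx.rts)   # → True
```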

March 9, 2020 18:25

@BradLevy My view is that the RS-232 standard was obviously devised by a very large committee and shows what happens when nobody will compromise! Do you really need 25 signals to provide a low-speed serial comms link between Data Terminal Equipment (DTE) - a teletypewriter - and Data Communication Equipment (DCE) - a modem? The fact is that in most cases only a maximum of perhaps 6 pins on the 25-way DB25 connector were ever actually wired up. This became obvious when IBM reduced the COM port connector on their PC AT range to the smaller 9-pin DE9 type. The trouble was that manufacturers of peripheral devices could pick and choose which pins they used for their particular design, leaving the customer wondering why the product didn't work when they plugged it into their computer. Getting RS-232 links to work was a nightmare. If that wasn't bad enough, using all those tempting 'spare' pins for other purposes created even more confusion. For this reason the RS-232 'standard' became no standard at all.

March 9, 2020 22:25

@Bill Marshall I take a more charitable view of the RS-232 standard. It is true there are some circumstances in which 3 wires would suffice: TXD, RXD, and Ground. That works as long as you don't care if the other end sends data when you aren't ready for it, and don't care if your data gets through. There are plenty of times where that is fine. But there are times when it isn't, so you need handshaking lines. And over a phone connection via modem? You need the ring indicator and carrier detect. You are up to at least seven of the nine pins of that DB-9 at that point. The standard defined what most of the pins in the 25-pin connector were to be used for. Only three were unassigned or reserved for test purposes. The others provided loopback and test capability, a secondary channel, and external timing sources. Each of those was an important capability in some environments and use cases - they just may not have been the environments you worked in. Problems did occur when some people appropriated lines not needed in their use case for other purposes, ignoring the standard, and sometimes created the potential for damage in the process.

One of the biggest causes of cabling not working right off the bat has to do with the sense of direction of TXD and RXD. A straight-through cable worked for connecting the DTE (Data Terminal Equipment) to the DCE (Data Communication Equipment, typically a modem). The TXD line carries data from the DTE to the DCE, and the RXD line carries data from the DCE to the DTE. But if you are connecting two DTE devices (computers or terminals) directly to each other, the TXD line of one needs to go to the RXD line of the other, and vice-versa. Otherwise the two transmitters will be fighting each other over the TXD line, while the two receivers will both be listening on the RXD line with no signal to listen to. Similarly with the handshake lines. So a "null modem" crossover adapter would be used.

This problem wasn't due to lack of adherence to the standard, just to a different use case: two DTEs talking to each other, instead of each talking to a modem. The same issue popped up in 10Base-T Ethernet cabling: peer-to-peer connections (directly connecting two computers) required a crossover cable, while computer-to-router or router-to-modem links used straight-through wiring. In the case of Ethernet, auto MDI-X circuitry was eventually developed that let smart Ethernet interfaces negotiate automatically which end would transmit on which pair, eliminating the issue for newer installations.

March 9, 2020 11:07

Yes, a great post - it brought back so many memories with all the different breakout boxes we had (still have some!). I also recall interfacing an (expensive) RS232 microbalance to a "Pet" computer (this dates it!); after a few minutes smoke came from the balance, which ended up with a burnt-out PCB! The manufacturers had put the +/- 24V power onto "unused" RS232 pins for their own expansion purposes!!!
I particularly like the reference to the mechanical strobe, I was not aware of these.

March 9, 2020 11:07

@Boss Your experience with RS-232 sounds all too typical! Even today I find manufacturers of, say, wireless modules play fast and loose with both the logic levels and the signals they use for flow control. Fortunately most have now settled on RTS/CTS, but older equipment might (mis)use DTR/DSR instead, or as well, requiring 'spoofing' of connections to make the interfaced equipment work. As for the printer, I have nothing but admiration for the engineers who designed all this mechanical logic. The service engineers were a pretty impressive bunch too. I have the original user and service manuals for the Creed 75, as well as the 'special tools', which include the tuning fork shown, spring balances to check spring tensions, and a key to wind the mechanism round by hand. Use of that key was, I guess, the equivalent of 'single-stepping' a processor chip. I imagine that with a misalignment in any of the levers/cranks, that big motor could wreck the mechanism completely. The manual key must have been the service engineer's best friend!

March 9, 2020 11:07

@Bill Marshall Wow, treasure that tuning fork! Having completed my degree, I think that, at least for me, working with service engineers - or at least looking over their shoulders - provided a great foundation for design. They are a professional group of people! I also took every opportunity to dig in and fault-find; it gave me a great foundation to build upon. As for RS-232, the parity check bit also added to the confusion when there was the typical lack of documentation! Thanks for a very interesting post, really enjoyed it.

March 9, 2020 11:07

Great post, Bill! Reminded me also of an FPGA workshop we ran a few times, called ChipHack, where an introductory exercise was to build a UART transmitter and receiver in RTL.

https://github.com/embecosm/chiphack/blob/master/tutorials/uart/source/index.rst