An Introduction to

ASCII

Every generation hears the same thing from the previous generation: “You don’t know how lucky you are.” The factory floor is no different. The seemingly simple task of passing characters between two electronic devices was a real challenge in the early days. There was no standard way for a keyboard, printer, barcode scanner or display to encode and decode printable characters. And then, someone had a great idea.

ASCII resources

Want to learn more about ASCII?

Subscribe to our Automation Education email series to learn the ins and outs of ASCII and the top industrial protocols in a byte-size weekly format!

The History Behind ASCII

It was way back in 1961 when a bunch of white-shirted, black-tied engineers at IBM led the effort at the American Standards Association (ASA, the forerunner of today’s ANSI) to develop a single common way to interchange character data between computers. It was a mess before that. Vendors not only represented letters and numbers differently in their computers, but they used different numbers of bits. A lot of vendors used 6 bits (that’s just 64 different combinations!).

The IBM guys convinced everyone that all computers should represent letters, numbers and special characters in the same way. Their effort led to the American Standard Code for Information Interchange, or ASCII (pronounced AS-kee).

In 1968, President Lyndon Johnson signed a memorandum adopting ASCII as the standard for federal computers, but it wasn’t until Intel introduced eight-bit microprocessors in the early 1970s that eight-bit bytes and ASCII became common. Even then, IBM’s System/360 mainframes (what every Fortune 500 company used) stuck with IBM’s own EBCDIC standard (Extended Binary Coded Decimal Interchange Code) through the end of that decade; IBM didn’t fully embrace ASCII until it introduced the personal computer in 1981.

A Closer Look

ASCII defines 128 characters, each assigned a seven-bit integer, giving letters, numbers, symbols and control codes a single standard encoding.
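That character-to-integer mapping is visible directly in most programming languages. A quick Python illustration:

```python
# Each ASCII character maps to a 7-bit integer (0-127).
# Codes 0-31 are control codes; printable characters start at 32 (space).
for ch in "A1?":
    code = ord(ch)               # character -> integer code point
    assert code < 128            # ASCII always fits in seven bits
    print(f"{ch!r} = {code} = 0b{code:07b}")
# 'A' = 65 = 0b1000001
# '1' = 49 = 0b0110001
# '?' = 63 = 0b0111111
```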

ASCII was the technology used in almost every PC peripheral, and device vendors built custom control protocols on top of it using simple command and response structures.

Here are some examples of Command (CMD)/Response (RSP) strings.

CMD: Get all?
RSP: 79.8,44.3,186_c,ON,Error2

CMD: ACQ?
RSP: ACQ:ENZERO=5,VMIN=6,AVG=8.99

CMD: ACQ?ENZERO,VMIN
RSP: ACQ:ENZERO=5,VMIN=6
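These response formats are vendor-specific, but the KEY=VAL shape suggests how a client might decode them. A minimal Python sketch, assuming the colon, comma and equals conventions shown in the examples above:

```python
def parse_response(rsp: str) -> dict:
    """Parse a vendor-style ASCII response like 'ACQ:ENZERO=5,VMIN=6'
    into a dict of field names to string values. The format is modeled
    on the hypothetical examples above, not any real device."""
    # Split off the echoed command group ('ACQ:'), if one is present.
    head, sep, tail = rsp.partition(":")
    body = tail if sep else head
    fields = {}
    for item in body.split(","):
        key, _, value = item.partition("=")
        fields[key] = value
    return fields

print(parse_response("ACQ:ENZERO=5,VMIN=6,AVG=8.99"))
# {'ENZERO': '5', 'VMIN': '6', 'AVG': '8.99'}
```

A real driver would also validate the echoed command group and convert values to numeric types, but the string handling is this simple because the payload is plain ASCII.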

ASCII TCP vs Serial

Now that we know what ASCII is and what devices utilize the 7-bit character encoding, let’s talk about the two most common physical layers used to pass ASCII messages.

ASCII over Serial

For most of its life, ASCII communication occurred over a serial physical layer. Often that physical layer was RS-232, where the ASCII characters were serialized into bits and transmitted as signal voltages of roughly ±3 to ±13 VDC. The main limitations were that devices needed to be close together and that most communication was 1-to-1, which forced early PCs and PLCs to carry a lot of serial ports. Today, PLCs, like PCs, have largely done away with native serial ports.
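On a raw serial link, the protocol itself has to mark where one message ends; one common (but entirely vendor-specific) convention is terminating each command with CR/LF. A minimal Python sketch of that framing, with the actual serial write shown only as a hypothetical pyserial comment:

```python
TERMINATOR = b"\r\n"  # a common, but vendor-specific, message delimiter

def frame_command(cmd: str) -> bytes:
    """Encode a command as strict ASCII and append the terminator.
    encode('ascii') raises UnicodeEncodeError on any non-ASCII character,
    catching bad payloads before they ever hit the wire."""
    return cmd.encode("ascii") + TERMINATOR

# With a library such as pyserial, you might then write (port name and
# baud rate here are pure assumptions):
#   ser = serial.Serial("COM3", 9600)
#   ser.write(frame_command("ACQ?"))
print(frame_command("ACQ?"))  # b'ACQ?\r\n'
```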

ASCII over Ethernet

Ethernet TCP/IP supports longer cable runs and is more reliable than its serial counterpart, so serial devices using ASCII messages were among the first to swap physical layers. The same ASCII data payload was simply placed inside a standard TCP/IP packet. This offers the additional benefit of a network that supports multiple nodes.
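The swap can be sketched with nothing but Python’s standard socket module: a toy “device” answers one invented ASCII query over a localhost TCP connection. The ACQ? exchange mirrors the examples earlier in this article; the framing and port choice are assumptions:

```python
import socket
import threading

def serve_once(sock: socket.socket) -> None:
    """Toy ASCII device: answer one query, then close. The ACQ?
    command and its reply are hypothetical, per the examples above."""
    conn, _ = sock.accept()
    with conn:
        request = conn.recv(1024).decode("ascii").strip()
        if request == "ACQ?":
            conn.sendall(b"ACQ:ENZERO=5,VMIN=6\r\n")

# Listen on an ephemeral localhost port (port 0 lets the OS pick).
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# Client side: the same ASCII payload that once rode RS-232,
# now carried inside a TCP stream.
with socket.create_connection(server.getsockname()) as client:
    client.sendall(b"ACQ?\r\n")
    reply = client.recv(1024).decode("ascii").strip()
print(reply)  # ACQ:ENZERO=5,VMIN=6
```

Note that TCP is a byte stream with no message boundaries, which is exactly why the application layer still needs a delimiter convention like the CR/LF used here.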

Where Is ASCII Today?

ASCII is still all around us in RFID devices, barcode scanners and printers. Heck, it’s even used in weather stations to send data. It’s about as close as we have to a data format that can be decoded by any computer on the planet. ASCII was the most common character encoding on the Internet until Unicode, usually in its UTF-8 encoding, overtook it in the late 2000s. ASCII formed the foundation of Unicode: the first 128 Unicode code points are the ASCII characters, and UTF-8 was designed so that every valid ASCII file is also a valid UTF-8 file.

The Future of ASCII

If you’re an ASCII aficionado, don’t worry. Even after more than 60 years, ASCII isn’t going away. Strings will continue to move between all sorts of devices: controllers, printers, labelers, barcode readers, enterprise applications, cloud servers and more. As long as we need to move strings around the factory floor, as long as people need string type information to communicate, as long as barcode readers read barcodes and printers print strings, we’ll always have ASCII.

The pertinent question for all of us is: how are we going to move that ASCII data in the future? In the past, we used RS-232 and RS-485, sometimes with protocols like Modbus, to move data to PLCs. Those technologies worked, but they were slow by today’s standards. Much of that has since been replaced with USB, which has become the standard way to get barcode data into PLCs. USB is faster, more reliable and less expensive than the older serial links, but it is constrained by the cable-length and connectivity limitations inherent in the technology.

The better choice, of course, is Ethernet. Ethernet is the default physical layer for the factory floor. Anything and everything on the manufacturing floor will have Ethernet in the future, even sensors. Note that using Ethernet and TCP/IP requires implementing some sort of application layer to move ASCII data.

Test Tools

When you come across a device using ASCII commands to operate, it is advantageous to have tools that allow you to visualize the commands on the wire. For ASCII over Ethernet, our team uses SocketTest. For serial-based ASCII, we recommend MTTTY. Having both of these tools handy can be useful in the field to help resolve quite a few headaches. Best of all, they’re free.