Even More On Normalization of PLC Data Tags

I’ve written a number of articles in the past on normalizing PLC data, especially Allen-Bradley PLC data. I keep writing more about it for two reasons. One, it’s more critical today in this IoT world, where we’ve created digital garbage dumps by sending tons of raw data into the cloud. And two, possibly the most important feature of our new digitalization Software Toolkit (something we call Raptor) is its normalization capability.

Normalization is a funny word. Literally, it means to make normal. That sounds good, but what is normal? For most of us, it means usual and customary. I normally eat breakfast at 6 am. I normally go to church on Sunday. Each of us has our own normal activities.

Unfortunately, that’s also true in the automation world. Different systems and applications define what normal means to them in different ways. For example:

Database designers – If you’re an expert in relational databases, you’ll know that databases can be structured anywhere from first normal form to fifth normal form. What that really means is probably a mystery to you (it is to me), but as I understand it, normalization means keeping data organized into separate tables that other tables can refer to. Think of a single table holding the names of the states, so no other table has to store strings like “Michigan” or “California” and a state name only ever needs to be updated in one place (there’s a small sketch of this idea after the list).

System startup normalization – refers to the time a system needs to establish communications, read its data points successfully, and “normalize” before alarms, events, or historical logging are allowed to occur.

Point normalization – similar to system startup normalization, but at a micro scale. Some sensors produce wildly varying values for a short period after power-up before settling into their expected operational range/value (the settling sketch after the list covers both this and startup normalization).

Remote normalization – this is the process of bolting context (units, maximum, minimum, collection time/date, etc.) onto the data value so that a remote system can do its own normalization to its own satisfaction. If you can keep that context with the value, this works fine, but that’s been unusual for data on the factory floor (the payload sketch after the list shows the idea).

Imputed normalization – this is where a system makes assumptions about the value, its units, and any scaling. It assumes, for example, that a raw 45 really means 4.5 on a 0 to 50-degree temperature scale. That works great as long as the source system reliably supplies that value without ever changing the scaling (the scaling sketch after the list shows where this goes wrong).

Alarm normalization – some people call it alarm normalization when a value that has entered an alarm range returns to its pre-alarm value.
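
To make a few of these concrete, here is a minimal sketch of the states-table idea using Python’s built-in sqlite3. The table and column names are made up for illustration; the point is simply that “Michigan” is stored once and referenced everywhere else.

```python
import sqlite3

# Minimal sketch of the "states table" idea: store each state name once
# and let other tables refer to it by key instead of repeating the string.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE states (
        state_id   INTEGER PRIMARY KEY,
        state_name TEXT NOT NULL UNIQUE      -- "Michigan" lives here, once
    );
    CREATE TABLE plants (
        plant_id   INTEGER PRIMARY KEY,
        plant_name TEXT NOT NULL,
        state_id   INTEGER REFERENCES states(state_id)
    );
""")
con.execute("INSERT INTO states (state_name) VALUES ('Michigan'), ('California')")
con.execute("INSERT INTO plants (plant_name, state_id) VALUES ('Lansing Plant', 1)")

# Renaming a state touches exactly one row; every referring table picks it up.
for plant, state in con.execute(
        "SELECT p.plant_name, s.state_name "
        "FROM plants p JOIN states s ON s.state_id = p.state_id"):
    print(plant, state)
```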
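
Here is a rough sketch of the startup/point settling idea: suppress alarms and logging until communications are up and a point has delivered a few consecutive readings inside its expected range. The class name, threshold, and sample values are all invented for illustration.

```python
# Sketch of a startup/point "settling" rule: hold off alarms and logging
# until comms are up and a point has produced a few consecutive readings
# inside its expected range. Names and thresholds here are made up.
SETTLE_COUNT = 5                      # consecutive good reads required

class SettlingPoint:
    def __init__(self, low, high):
        self.low, self.high = low, high
        self.good_reads = 0

    def update(self, value, comms_ok):
        """Return True once the point is considered settled."""
        if not comms_ok or not (self.low <= value <= self.high):
            self.good_reads = 0       # any bad read restarts the clock
        else:
            self.good_reads += 1
        return self.good_reads >= SETTLE_COUNT

boiler_temp = SettlingPoint(low=0.0, high=50.0)
for raw in [997.0, -40.0, 21.4, 21.5, 21.5, 21.6, 21.6]:
    settled = boiler_temp.update(raw, comms_ok=True)
    print(raw, "settled" if settled else "suppress alarms/logging")
```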
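
Remote normalization is really just a matter of keeping the context glued to the value. A minimal sketch, assuming a JSON payload with made-up field names (this is not any particular product’s format):

```python
import json
from datetime import datetime, timezone

# Sketch of "remote normalization": ship the context along with the value
# so the receiving system can normalize to its own satisfaction.
# The field names are invented, not any particular product's payload format.
def tag_payload(name, value, units, lo, hi):
    return {
        "tag": name,
        "value": value,
        "units": units,
        "min": lo,                 # expected engineering range
        "max": hi,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

payload = tag_payload("Boiler_1.Temp", 45.0, "degC", 0.0, 50.0)
print(json.dumps(payload, indent=2))   # the consumer gets value *and* context
```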
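
Imputed normalization, by contrast, buries those assumptions in the consumer. A sketch, assuming the raw integer is tenths of a degree on a 0-to-50-degree scale (the scale factor is purely an assumption):

```python
# Sketch of "imputed normalization": the consumer hard-codes its assumption
# that the raw integer is tenths of a degree on a 0-to-50-degree scale,
# so a raw 45 is read as 4.5. The scale factor here is assumed, not supplied.
ASSUMED_SCALE = 0.1              # raw counts to degrees C, per the assumption

def impute_temperature(raw_counts):
    degrees = raw_counts * ASSUMED_SCALE
    if not 0.0 <= degrees <= 50.0:
        raise ValueError(f"{degrees} degC is outside the assumed 0-50 range")
    return degrees

print(impute_temperature(45))    # 4.5 degC, correct only while the assumption holds
# If the source PLC ever starts sending whole degrees, the same code silently
# reports a raw 45 (meaning 45 degC) as 4.5 degC; nothing in the data says otherwise.
```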

Normalization isn’t really sexy and it isn’t much talked about in polite company, but it really is the guts of Smart Manufacturing, Digitalization, and Industry 4.0. When data isn’t normalized or when it is improperly normalized, everything falls apart. Your highly paid, highly trained data scientist can’t generate any results if their data can’t be properly correlated. Normalization is the key to making that happen.

That’s why, when our Raptor software ingests EtherNet/IP data, PROFINET IO data, DNP3 data, or anything else, we normalize it so that when it reaches your enterprise application, AWS, MS Azure, or your cloud system, it can be used to solve real-world problems and increase the quality and efficiency of your manufacturing processes.