Local Vs. Cloud: Are You Going About Data Logging All Wrong?

The Surprising Answer the Tech Goliaths Don’t Want You To Know!

Introduction

There was a time in the not-too-distant past when data logging meant someone walked around the plant gathering USB sticks from data recorders and transferred the data to Microsoft Excel. Before that, they walked around the plant with a clipboard, carefully writing down values. When long-term storage was required, there were circular chart recorders and people whose job was to file those charts in rusty old metal filing cabinets.

Those clipboards are now used to level formerly wobbly table legs. The USB sticks were unceremoniously chucked into the nearest dumpster after the Stuxnet malware attack. The chart recorders (mostly) became museum relics. Faster than you can say “cloud computing,” manufacturers went from these archaic data collection methods to new technologies, devices and software systems that can collect and process unlimited amounts of manufacturing data.

Today, your Allen-Bradley (AB) PLC data can be logged directly into cloud applications, middleware, or local data loggers right on your production network – although the emphasis of many device vendors certainly seems to be on cloud data collection. It’s no secret what Amazon (AWS), Microsoft (Azure) or Google (GCP) think you should do – their cloud is the answer, regardless of the question. They desperately want to be integrated into your manufacturing operation, and to charge you every month for that integration, of course.

Many manufacturers have come to realize that “the Cloud” should not be the default data collection solution. For many data logging applications, local data logging and storage is a better answer. Using the Cloud in some applications is like driving a Shelby GT500 in a soapbox derby!

Local Data Logging Can Make a Lot of Sense

No one questions the value of manufacturing data, including the data in your AB PLC. Most of us understand that manufacturers use only a minuscule fraction of the data available in their manufacturing systems. More effective maintenance, higher productivity and quality, and quicker response to unexpected process upsets can all be obtained when you can get access to that data.

Much, if not all, of that can be achieved by having local data logging capabilities in your control engineer’s toolbox, to:

  • Track downtime – number of hours and downtime reason codes
  • Monitor performance – utilization vs. capacity, number of units per hour, number of unscheduled outages
  • Save product genealogy – serial numbers, version numbers, test results
  • Watch quality indicators – test results, key quality indicators
  • Troubleshoot process upsets – look at machine behavior prior to the process upset
  • Collect cost information – manufacturing time, materials used

For most manufacturers, especially smaller ones, none of this necessitates the cost and complexity of cloud computing. None of it requires complex mathematics, running machine learning applications on terabytes of data storage, or using any of those sophisticated and elaborate cloud applications. What’s needed is a good, local data logger and someone who can use it to extract the required information from the data.

One of the main impediments to any of this – local or cloud – is collecting data that is usable and manageable. Analog data is notoriously prone to misinterpretation. The same 50% tank level, for example, can be captured as 0.5, 5 (on a 0-10V input), 32768 (half of a 16-bit full-scale count) or some other value. Too often, the lack of scaling and normalization results in a digital garbage dump. There’s value in there, but it smells, and extracting it is tiring, laborious work. It often becomes nearly impossible to make rational decisions on the data. The ability to scale is critical to good data logging.
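To make the scaling point concrete, here is a minimal linear-scaling sketch in Python; the input ranges and raw values are illustrative assumptions, not the behavior of any particular PLC driver:

```python
def scale(raw, raw_min, raw_max, eng_min, eng_max):
    """Linearly map a raw input value to engineering units."""
    return eng_min + (raw - raw_min) * (eng_max - eng_min) / (raw_max - raw_min)

# The same 50% tank level arrives differently depending on the source:
print(scale(5.0, 0.0, 10.0, 0.0, 100.0))   # 0-10 V analog input -> 50.0
print(scale(32768, 0, 65535, 0.0, 100.0))  # 16-bit raw count -> ~50.0
```

Logging the scaled engineering value (here, percent of tank level) instead of the raw count keeps the stored data interpretable no matter which input card produced it.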

There’s also something to be said for owning your own data and having control of it on your own premises. One of the problems with some of the cloud data collection systems is that once they get control of your data, they own it. They provide you with access – for a fee, of course – because it’s not yours anymore. With a local data logger, what’s yours is yours and you manage and control it.

Cloud Data Collection Has Its Place

That’s not to say that cloud computing doesn’t have its place. It took a few years to reach the manufacturing floor, but it is now here in full force, and it provides a whole host of benefits to PLC control applications that meet certain criteria:

  • Massive amounts of data – cloud systems offer effectively unlimited storage
  • Unlimited data processing – these platforms offer real-time data analytics from a variety of sophisticated tools
  • Large corporate infrastructure support – for huge global organizations, cloud systems provide the infrastructure to easily collect and merge all that data from different sources
  • Long-term archival – when government regulations or your customers demand long-term data storage, cloud storage is much more effective
  • Data access – cloud storage simplifies access to your data around the globe
  • Reduce capital IT expenditures – cloud computing is not cheap, but it does lower your IT costs: personnel, hardware, software, infrastructure, and energy (a lot of it)

There are lots of reasons – good reasons – to move your automation data into a cloud database, whether it is your own or one run by the cloud companies. For many large, global manufacturers, it makes a lot of sense.

However, in many applications, it’s just not all that necessary and shouldn’t be your default choice.

What a Local Data Logger Can Do for You

A local data logger (aka Data Historian, Process Historian, or Operational Historian) is a software program that connects to process devices like Allen-Bradley PLCs to retrieve and record production and process data. Each value is timestamped and recorded in a time series database that stores it efficiently. The data logger often displays those trends over a time range (such as the last 8 hours, the last day, or the last year) and can export the time series data to other systems.

The two critical components of a good data logger are time and scaling. Data that isn’t timestamped isn’t nearly as valuable. Data loggers can get their time from a Real Time Clock (RTC), a network time server, a local master time server, the Internet, or one of many other sources, but time must be obtainable. Data that isn’t scaled properly results in those digital garbage dumps described earlier.
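As a sketch of the timestamp-plus-value idea, the following uses Python’s built-in sqlite3 as a stand-in for a time series store; the tag name, timestamps, and values are hypothetical, and a real data logger would use a purpose-built time series database:

```python
import sqlite3
import time

# Minimal stand-in for a time series store: every sample carries a timestamp.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE samples (ts REAL, tag TEXT, value REAL)")

def log_sample(tag, value, ts=None):
    """Record one timestamped, already-scaled sample."""
    db.execute("INSERT INTO samples VALUES (?, ?, ?)",
               (time.time() if ts is None else ts, tag, value))

log_sample("Tank1_Level_Pct", 50.0, ts=1700000000.0)
log_sample("Tank1_Level_Pct", 51.2, ts=1700000060.0)

# A trend over a time range is just an ordered query on the timestamp column.
rows = db.execute("SELECT ts, value FROM samples WHERE tag = ? ORDER BY ts",
                  ("Tank1_Level_Pct",)).fetchall()
```

Because every row carries its own timestamp, trending, export, and later troubleshooting all reduce to simple time-ordered queries.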

Besides time and scaling capabilities, good data loggers for the AB architectures should include these critical features:

  • Data collection from your legacy AB PLCs, including MicroLogix, PLC-5E and SLC 5/04 controllers
  • Simple alarming to alert users to values exceeding setpoints or below targets
  • Data scaling and unit conversion, tag data capture, tag data refinement, and tag data buffering with store-and-forward
  • Exporting time series data into your favorite external database
  • Triggering a set of data collections based on a trigger value
  • Initiating triggered data collection cyclically
  • Local charting and viewing of your data within the data logger
  • Discovery of connected PLCs and tags within those PLCs
  • Ability to transfer internal data to other data archives and databases
  • No reprogramming of any PLC
  • Platform independent – Docker/Module/Windows/Linux options

All these features are important to control engineers with AB PLC systems, but several are crucial. The ability to collect data from all your legacy controllers is essential – few of us have had the luxury of replacing all our old SLC, PLC-5 and MicroLogix controllers. Scaling data as described earlier is imperative, and the ability to trigger, alarm and chart values is critical to your success.
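The kind of simple alarming described above can be sketched in a few lines of Python; the tag name and limits here are illustrative assumptions, not defaults of any product:

```python
# Illustrative alarm limits for a hypothetical tank-level tag (percent)
HIGH_LIMIT = 80.0
LOW_LIMIT = 20.0

def check_alarm(tag, value):
    """Return an alarm string when a value exceeds a setpoint or falls below a target."""
    if value > HIGH_LIMIT:
        return f"{tag} HIGH: {value} > {HIGH_LIMIT}"
    if value < LOW_LIMIT:
        return f"{tag} LOW: {value} < {LOW_LIMIT}"
    return None

print(check_alarm("Tank1_Level_Pct", 85.0))  # alarm: value above the high limit
print(check_alarm("Tank1_Level_Pct", 50.0))  # None: value within limits
```

A real data logger evaluates checks like this against every incoming sample, so an out-of-range value is flagged the moment it is collected rather than discovered later in a spreadsheet.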

An Example of a Local Data Logger

The RTA Data Logger from the Raptor Data Services family is a perfect example of a small form factor, local data logger that can solve many application requirements of AB control engineers in manufacturing systems where cloud applications aren’t required. It operates locally – within your firewall – and it’s easily connected to one or more PLCs:

  • Supports all the legacy AB PLCs including MicroLogix, PLC-5E and SLC 5/04
  • Provides a way to do simple alarming on value changes
  • Scales your data to your requirements
  • Exports data collected to other systems for ongoing analysis
  • Triggers data collections based on some trigger value
  • Built-in charting tool to see live data in your browser
  • Ships in Windows, Docker, Linux, and module configurations

With the RTA Data Logger, you can quickly capitalize on the data you already have present in your organization. You can keep your data where it’s safe – inside your firewall.

Summary

Cloud computing is now firmly integrated into many manufacturers’ operations. The Cloud provides benefits to many applications, including managing millions of data points, bringing incredible toolsets to bear on that data, and providing safe, long-term storage for critical data. Despite what the tech Goliaths who want to stick a large straw into your company’s bank account say, the Cloud should not be the default data collection solution. There’s a time and a place for it – but there are plenty of applications where it’s just overkill.

Local data logging can grab data on demand from any of your AB PLCs, alarm on it, chart it, and send it to Microsoft Excel or your favorite database for more comprehensive analysis. A simple tool that does just that belongs in your control engineering toolbox.
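As a sketch of the export step, Python’s standard csv module can write logged samples into a file that Excel opens directly; the rows here are made-up sample data, not output from any particular logger:

```python
import csv

# Hypothetical (timestamp, tag, value) rows as a local logger might export them
rows = [
    (1700000000.0, "Tank1_Level_Pct", 50.0),
    (1700000060.0, "Tank1_Level_Pct", 51.2),
]

with open("tank1_trend.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "tag", "value"])  # header row for Excel
    writer.writerows(rows)
```

The same rows could just as easily be inserted into an external SQL database; CSV is simply the lowest-friction path into a spreadsheet.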

Monitoring of PLC data in all kinds of AB PLCs is easily accomplished with the RTA Data Logger, a tool that delivers peace of mind by providing real-time monitoring, logging, charting and alarming of your data locally and in your favorite database.

For more information, check out the RTA Data Logger, call 800-249-1612 or contact us.