We all know that in the not-too-distant past, data logging meant someone walking around the plant gathering USB sticks from data recorders and transferring the data to Microsoft Excel. Today, cloud computing and data logging are firmly integrated into many manufacturers’ operations. But despite what the goliaths who want to stick a large straw into your company’s bank account say, the cloud should not be the default data collection solution.
Who Benefits Most from Cloud Data Logging?
Cloud databases – whether your own or a third-party provider’s – allow organizations to manage millions of data points, bring powerful toolsets to bear on that data and provide safe, long-term storage for critical information.
For many large global manufacturers, the cloud makes a lot of sense, but many of us are not global manufacturers. In many applications, cloud data logging is simply unnecessary and shouldn’t be your default choice.
What Is the Advantage of Local Data Logging?
Local data logging can capture data on demand from controllers like Allen-Bradley PLCs, alert on it, chart it and send it to MS Excel or your favorite database for more comprehensive analysis. A simple tool that does just that belongs in every control engineer’s toolbox.
One of the main impediments to data logging – local or cloud – is collecting data that is usable and manageable. Analog data is notoriously prone to misinterpretation. The same 50% tank level, for example, can be captured as 0.5 (a normalized float), 5 (a 0-10V input), 32768 (a 16-bit full-scale count) or some other value.
Too often, the lack of scaling and normalization turns a data store into a digital garbage dump. There’s value in there, but it smells, and extracting it is a tiring, laborious effort; making rational decisions on the data becomes nearly impossible. The ability to scale is critical to good data logging.
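The fix is to normalize every raw value into engineering units at collection time. A minimal sketch of that idea, assuming a simple linear mapping (the helper name and ranges here are illustrative, not from any particular product):

```python
def scale_raw(raw, raw_min, raw_max, eng_min, eng_max):
    """Linearly map a raw input value to engineering units."""
    span = raw_max - raw_min
    if span == 0:
        raise ValueError("raw_max and raw_min must differ")
    return eng_min + (raw - raw_min) * (eng_max - eng_min) / span

# The same 50% tank level, regardless of how it arrives:
scale_raw(0.5, 0.0, 1.0, 0.0, 100.0)    # normalized float  -> 50.0
scale_raw(5.0, 0.0, 10.0, 0.0, 100.0)   # 0-10V input       -> 50.0
scale_raw(32768, 0, 65535, 0.0, 100.0)  # 16-bit raw count  -> ~50.0
```

Applied consistently at the point of capture, a mapping like this means every downstream consumer sees the same 0-100% tank level, whatever the source encoding.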
Data Logging Tools: Traditional Data Loggers vs. Historians
Historians satisfy a different need than data loggers. Where data loggers simply collect data, historians focus on collecting and retaining it in time-series databases for long-term analysis and reporting. Learn more about the technical differences between a data historian and a data logger.
There are two key design considerations for a historian: first, how rapidly it can collect time-series data, and second, how much storage it has to retain that data.
Unlike traditional data loggers, historians prioritize data collection and retention, with the exact requirements depending on the specific application. Collection intervals can vary from sub-millisecond to once per shift, job or month. Storage requirements can vary from kilobytes to terabytes.
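Those two considerations translate directly into a back-of-the-envelope sizing calculation. A rough sketch, assuming uncompressed samples of a fixed size (the numbers below are illustrative assumptions, not vendor specifications):

```python
def storage_estimate_bytes(tags, samples_per_sec, bytes_per_sample, retention_days):
    """Rough upper bound on historian storage, ignoring compression."""
    seconds = retention_days * 24 * 3600
    return tags * samples_per_sec * bytes_per_sample * seconds

# e.g., 100 tags sampled once per second at 16 bytes each, kept for 30 days:
storage_estimate_bytes(100, 1, 16, 30)  # -> 4,147,200,000 bytes (~4 GB)
```

Real historians compress time-series data heavily, so treat an estimate like this as a ceiling when choosing collection rates and retention periods.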
Good Data Logging: Key Features of a Historian
The indispensable features required of historians include:
- Support for a variety of data ingestion sources
- Flexible local preprocessing and filtering
- Custom and standardized data modeling
- Scalable integration with enterprise and cloud applications
- Publishing data models to enterprise or cloud systems
- Simple deployment and configuration
- Cybersecure operation
- Fast ingestion frequency
- Accurate timestamping
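To make the "flexible local preprocessing and filtering" item above concrete, one common technique is a deadband filter, which discards samples that haven’t changed meaningfully since the last recorded value. A minimal sketch (the class and threshold here are illustrative assumptions, not a specific product’s API):

```python
class DeadbandFilter:
    """Drop samples whose change from the last accepted value is
    smaller than the deadband - simple local preprocessing that
    cuts storage and publishing traffic before data leaves the plant."""

    def __init__(self, deadband):
        self.deadband = deadband
        self.last = None  # no sample accepted yet

    def accept(self, value):
        """Return True if this sample should be recorded."""
        if self.last is None or abs(value - self.last) >= self.deadband:
            self.last = value
            return True
        return False

f = DeadbandFilter(0.5)
[f.accept(v) for v in [10.0, 10.2, 10.6, 10.7, 12.0]]
# -> [True, False, True, False, True]
```

Filtering at the edge like this is what keeps a historian’s fast ingestion from flooding downstream enterprise or cloud systems with redundant points.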
What Is an Example of a Historian?
The RTConnect Allen-Bradley PLC Historian is easy to install and quickly configurable, offering the ideal set of features for most time-series data collection applications. Tags from multiple PLCs are captured, normalized and saved. User-defined models are filled and published on demand, without subscriptions, licensing constraints or reliance on third-party middleware. With configurable storage of up to 1 TB, a comprehensive suite of publishing protocols (SQL, HTTP, FTP, WebSockets, USB, MQTT and email) and direct integration with InfluxDB for visualization and analytics, the RTConnect Historian is an invaluable tool for plant floor operations, maintenance and process engineers.
To learn more about the RTConnect Historian email solutions@rtautomation.com or call and speak to an Enginerd at 262-436-9299.