What a Good Data Logger Looks Like

In the not-too-distant past, data logging meant someone walking the plant, gathering USB sticks from data recorders, and transferring the data into Microsoft Excel. Today, cloud computing and data logging are firmly integrated into many manufacturers’ operations. The cloud benefits many applications: it can manage millions of data points, bring incredible toolsets to bear on that data, and provide safe, long-term storage for critical data. But despite what the goliaths who want to stick a large straw into your company’s bank account will tell you, the cloud should not be your default data collection solution.

There are lots of reasons – good reasons – to move your automation data into a cloud database, whether it’s one you host yourself or one run by a cloud provider. For many large, global manufacturers, it makes a lot of sense. But in many applications, it’s simply not necessary and shouldn’t be your default choice.

Local data logging can grab data on demand from any of your A-B PLCs, alarm on it, chart it, and send it to MS Excel or your favorite database for more comprehensive analysis. A simple tool that does just that belongs in your control engineering toolbox.
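To make "grab data on demand" concrete, here is a minimal, generic sketch of local polling – not any particular product – using the open-source pycomm3 library to read a tag from a Logix-family A-B PLC and append it to a CSV file. The PLC address and tag name are placeholders; for legacy SLC/MicroLogix controllers, pycomm3 offers an SLCDriver instead.

```python
# Minimal local-logging sketch using pycomm3 (open source, not an RTA product).
# The PLC address and tag name below are hypothetical placeholders.
import csv
import time
from datetime import datetime, timezone

from pycomm3 import LogixDriver  # pip install pycomm3

PLC_ADDRESS = "192.168.1.10"   # hypothetical PLC IP address
TAG_NAME = "Tank1_Level"       # hypothetical tag name

with LogixDriver(PLC_ADDRESS) as plc, open("tank1_level.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for _ in range(60):                  # poll once per second for a minute
        result = plc.read(TAG_NAME)      # returns the tag's value, type, and any error
        writer.writerow([datetime.now(timezone.utc).isoformat(), result.value])
        time.sleep(1)
```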

One of the main impediments to data logging – local or cloud – is collecting data that is usable and manageable. Analog data is notoriously prone to misinterpretation. The same 50% tank level, for example, can be captured as 0.5, 5 (on a 0-10V input), 32768 (the midpoint of a 16-bit raw count), or some other value. Without scaling and normalization, the result is often a digital garbage dump. There’s value in there, but it smells, and extracting it is a tiring, laborious effort; it often becomes nearly impossible to make rational decisions based on the data. The ability to scale is critical to good data logging.
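As a simple illustration of that scaling step – a generic sketch, not any particular product’s implementation – a linear map from the raw input range to the engineering-unit range turns all of those different raw numbers into the same 50% reading. The ranges below are assumptions you would replace with your own.

```python
def scale_to_engineering(raw, raw_min, raw_max, eu_min, eu_max):
    """Linearly map a raw input value onto an engineering-unit range."""
    span = raw_max - raw_min
    if span == 0:
        raise ValueError("raw_min and raw_max must differ")
    return eu_min + (raw - raw_min) * (eu_max - eu_min) / span

# The same 50% tank level, arriving as very different raw numbers:
scale_to_engineering(5.0, 0.0, 10.0, 0.0, 100.0)    # 0-10 V input      -> 50.0 %
scale_to_engineering(32768, 0, 65535, 0.0, 100.0)   # 16-bit raw count  -> ~50.0 %
```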

The two critical components of a good data logger are time and this kind of scaling. Data that isn’t timestamped isn’t nearly as valuable. A data logger can get its time from a real-time clock (RTC), a network time (NTP) server, a local master clock, an internet time source, or any number of other ways – but time must be obtainable.
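Putting the two together, each logged sample can be stored as a record that carries both a UTC timestamp (from whatever time source the logger has) and the already-scaled engineering value. This is just a sketch; the field names are illustrative, not a standard.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LogRecord:
    tag: str            # source tag name
    timestamp_utc: str  # ISO 8601 timestamp from the logger's time source
    value_eu: float     # value already scaled to engineering units
    units: str          # engineering units, e.g. "%" or "degC"

sample = LogRecord("Tank1_Level", datetime.now(timezone.utc).isoformat(), 50.0, "%")
```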

The RTA data logger, known as the A-B PLC Historian, is a perfect example of a small-form-factor, local data historian that meets many of the application requirements of A-B control engineers in manufacturing systems where cloud applications aren’t required. It operates locally – within your firewall – and is easily connected to one or more PLCs:

  • Supports all the legacy A-B PLCs, including MicroLogix, PLC-5E, and SLC 5/04 controllers
  • Provides simple alarming on value changes (illustrated in the sketch after this list)
  • Scales your data to your requirements
  • Exports data collected to other systems for ongoing analysis
  • Triggers data collection based on a configured trigger value
  • Includes a built-in charting tool to view live EtherNet/IP, Modbus TCP, or other data in your browser
  • Ships in Windows Docker, Linux, and module configurations
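To illustrate the alarm-on-change and trigger ideas in the list above – a generic sketch, not the Historian’s actual logic – a logger can remember the last value it recorded for each tag and only log or alarm when a new reading moves outside a deadband around it. The deadband and tag name here are example values.

```python
# Generic change-detection sketch: record or alarm only when a value moves
# more than a deadband away from the last logged value.
last_logged = {}

def worth_logging(tag, value, deadband=0.5):
    """Return True when `value` has moved more than `deadband` since the last log."""
    previous = last_logged.get(tag)
    if previous is None or abs(value - previous) > deadband:
        last_logged[tag] = value
        return True
    return False

if worth_logging("Tank1_Level", 51.2):
    print("log / alarm: Tank1_Level changed significantly")
```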

With the A-B PLC Historian, you can quickly capitalize on the data you already have in your organization – and keep it where it’s safe, inside your firewall. The Historian makes it easy to monitor data in all kinds of A-B PLCs, delivering peace of mind through real-time monitoring, logging, charting, and alarming of your data, locally and in your favorite database.

For more information on the A-B PLC Historian, visit our website, call 800-249-1612, or email solutions@rtautomation.com.
