Shining a Light on Edge Computing in Industrial IoT

What you’ll learn

  • The concept of edge computing and how it benefits the IIoT.
  • Leveraging machine learning in edge computing.


Much of the talk surrounding edge computing occurs in the context of the industrial IoT, so let’s first define the term. Essentially, edge computing means that the data collected by a sensor or a group of sensors is processed at the source, without the raw data being transferred to a central computing or cloud resource. In some cases, the processed data (a much smaller dataset than the raw data) is passed on to the cloud for historical record-keeping or further processing, such as trend analysis.

Let’s clarify with an example. Think about an industrial camera used in a manufacturing environment for inspecting parts. Such a camera would combine a CMOS sensor with a processor and software to inspect the product and make a quality determination at the edge without having to transfer the data to the cloud. In a bottling plant, for instance, a smart edge device (in this case a camera) could read labels, verify if the bottle is full, and maybe even check the seal—all while the bottles are moving on the assembly line. 

In a PCB manufacturing assembly line, this type of camera can flag gross routing errors and other potential defects. Oftentimes, these cameras would be hooked up to a local industrial PC (to store historical data) but would not be connected to the internet to minimize security risks (see figure).

By enabling data to be processed closer to where it’s created, edge computing paves the way for smarter manufacturing processes.

This is classic edge computing: the data can and must be processed locally (for latency reasons), and the processing capability can be incorporated in a small form-factor device deployed on the actual assembly line (i.e., at the edge of the industrial network).

According to CB Insights, “Edge computing enables data to be processed closer to where it’s created (i.e., motors, pumps, generators, or other sensors), reducing the need to transfer data back and forth between the cloud.”4

Benefits of Edge Computing

Latency is defined as the time it takes for the data to travel from the sensor to the processor. In many industrial use cases, the total latency, which includes the processing and the return of the data to take a critical action, is important. Consider the case of an electronic barrier (also called a light curtain) around a welding machine. If anyone breaches the barrier, the welding needs to be turned off. Also, when something comes within X mm of the barrier, an alarm must go off to indicate the hazard.

In such a use case, there’s no reason to send the data generated by the light curtain over the network. A local processor can process both the breach data and the proximity data and trigger the appropriate response. Some of the data may be stored in memory and, at a predetermined time, be sent to the cloud for trend analysis.
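The decision logic described above can be sketched in a few lines. This is a hypothetical illustration, not a real safety controller: the proximity threshold, function names, and the in-memory buffer are all illustrative assumptions.

```python
# Hypothetical sketch of local light-curtain processing at the edge.
# Threshold and I/O hooks are illustrative assumptions, not a real safety spec.
ALARM_DISTANCE_MM = 50.0  # assumed proximity threshold for the alarm

def process_sample(beam_broken: bool, distance_mm: float) -> str:
    """Decide the action for one light-curtain sample, entirely locally."""
    if beam_broken:
        return "SHUTDOWN"   # barrier breached: turn the welder off immediately
    if distance_mm < ALARM_DISTANCE_MM:
        return "ALARM"      # something is near the barrier: sound the alarm
    return "OK"

# Data is buffered locally and sent to the cloud at a predetermined time,
# rather than streaming every sample over the network.
history = []

def handle(beam_broken: bool, distance_mm: float) -> str:
    action = process_sample(beam_broken, distance_mm)
    history.append((beam_broken, distance_mm, action))
    return action
```

The point of the sketch is that both the breach response and the proximity alarm are resolved by the local processor; only the accumulated `history` would ever leave the device, and only on a schedule.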

Having the processor local to the edge device not only minimizes latency, it also guarantees deterministic latency. Sending data over Ethernet to a processor that schedules the processing at an opportune time doesn’t work in a factory or process-industry setting, where latency must be deterministic at all times.

Think of an autonomous car as the ultimate edge device. The visual and radar data the vehicle collects must be processed within a set number of milliseconds, and action must be taken within a short, deterministic time window. This kind of data doesn’t lend itself to cloud processing.

Michael Clegg, a vice president at Supermicro,2 explained this in a very interesting way: “By processing incoming data at the edge, less information needs to be sent to the cloud and back. This also significantly reduces processing latency. A good analogy would be a popular pizza restaurant that opens smaller branches in more neighborhoods, since a pie baked at the main location would get cold on its way to a distant customer.”

Minimizing the Data Transfer

To understand how much data we are talking about, this is what an IDC whitepaper states: “By 2025, 175 zettabytes (or 175 trillion gigabytes) of data will be generated around the globe. Edge devices will create more than 90 zettabytes of that data.”3 Of course, this is for all data, not just in the industrial setting. Still, it underscores the magnitude of the potential data-transfer problem.

For an example of both latency and minimizing the data transfer, consider a vibration sensor on a motor. The frequency of vibration may be a good indicator of impending failure and can even flag an immediate problem that may necessitate a shutdown or alarm. The vibration data collected may be huge, and not all of it is worthwhile to store or analyze.

The vibration sensor can have a local processor that performs, for example, a fast Fourier transform (FFT) on the vibration data and immediately flags frequencies that may necessitate a shutdown, maintenance within a set time window, or other corrective action. By doing the processing locally, only relevant data needs to be transferred to the cloud for offline analysis.
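The FFT step above can be sketched with NumPy. This is a minimal illustration: the sample rate, the "danger band" of frequencies, and the function name are assumptions chosen for the example, and a real sensor would run an embedded FFT routine rather than NumPy.

```python
import numpy as np

FS = 10_000                    # sample rate in Hz (assumed)
DANGER_BAND = (120.0, 180.0)   # frequency band that warrants action (assumed)

def summarize_vibration(samples: np.ndarray) -> dict:
    """Run an FFT locally and reduce raw samples to a tiny summary record."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    flagged = DANGER_BAND[0] <= peak <= DANGER_BAND[1]
    # Only this summary, not the raw samples, needs to reach the cloud.
    return {"peak_hz": float(peak), "flagged": flagged}

# One second of a 150-Hz vibration, which falls inside the assumed danger band:
t = np.arange(FS) / FS
summary = summarize_vibration(np.sin(2 * np.pi * 150.0 * t))
```

Note the data reduction: 10,000 raw samples per second collapse to a two-field summary, which is what makes the cloud transfer tractable.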

While the summarized data can be sent to the cloud for later analysis, an anomalous frequency must be flagged immediately by the vibration sensor to avoid serious damage. Hence the latency limitation.

A case study done by IHS Markit on a Duke Energy industrial IoT system notes: “To collect vibration information, it can be necessary to capture anywhere from 10,000 to 100,000 samples per second for several seconds to obtain a good measure of the machine condition.”1 This data must be processed in real time using edge-processing systems, which can then narrow down the data to that pertaining to the health status of the machine.


Security Considerations

Security isn’t a clear-cut benefit of edge computing. While an air-gapped, physically secure edge device carries a lower security risk, an internet-connected or compromised edge device can substantially increase the attack surface.

As we all know, almost anything connected to a network can be hacked. While losing confidential data is certainly bothersome, imagine the havoc if edge devices that open or close a valve or operate alarms are hacked to operate in a potentially unsafe manner.

In a factory setting, edge devices are within a physically secure environment. If the only communication channel from the edge devices to the connected industrial PC is for a one-way data dump—e.g., a stream of vibration data frequency analysis for the day—the system is well-secured.

However, if the edge devices can be controlled or reconfigured from the industrial PC or a microPLC, then security depends on how well that system is protected against attacks. Many edge devices will have some kind of remote access and management capability. And an intelligent sensor in, say, an oil-pipeline pressure-monitoring application may be remote enough to invite physical tampering. In either case, an intelligent edge-computing device will need standard security protocols built in, via software or hardware.

Edge Computing Trends in Manufacturing

Leveraging Machine Learning

Soon we will see machine-learning algorithms brought to bear on the software controlling edge devices. Initially, a large amount of visual data (going back to the industrial camera example in the introduction) for both good and bad parts can be fed into a cloud-based machine-learning system to train the neural network. The training data is used to develop a good/bad classification algorithm. While training requires massive computing resources, the resulting inference algorithms are much smaller and can run on a low-power microprocessor.
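The cloud-train/edge-infer split described above can be illustrated with a deliberately tiny model. Everything here is an assumption for illustration: the weights stand in for parameters a cloud training run would produce, and a real deployment would use a quantized neural network, not a three-weight logistic classifier.

```python
import numpy as np

# Illustrative stand-ins for parameters produced by cloud-based training.
# Values are assumptions; in practice they would be downloaded to the device.
W = np.array([0.8, -0.5, 1.2])
B = -0.1

def classify_part(features: np.ndarray) -> str:
    """Tiny inference step: small enough for a low-power edge processor."""
    score = 1.0 / (1.0 + np.exp(-(features @ W + B)))  # logistic activation
    return "good" if score >= 0.5 else "bad"
```

The asymmetry is the point: training the model may take a GPU cluster, but running `classify_part` is a dot product and a sigmoid, which fits comfortably on a camera’s embedded processor.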

The only hitch in developing a deep-neural-network training algorithm for industrial applications is the availability of a large enough training dataset. For example, more than 350 million photos are uploaded every day on Facebook. That’s a huge training dataset. In any industrial setting, the training dataset is a small fraction of what’s available in the consumer world.

It remains to be seen how effective machine learning will be in creating inference algorithms for industrial systems. However, we will definitely see edge-computing devices leveraging the new machine-learning technologies in the near future.

Separating Useful Data from Just Data

As we start collecting terabytes, petabytes, and beyond, it’s becoming clear that not all data is worth transmitting, storing, and analyzing. In the industrial setting, there’s a cost to keeping data and running algorithms to correlate this humongous dataset with a specific event of interest (e.g., a failure or an energy-efficiency gain).

We know from experience that some datasets correlate to what we’re interested in: predicting failure via vibration frequency, predicting optimum energy usage via current and phase, predicting optimum performance of a motor via variations in speed, to give a few examples. These select datasets can be stored and even transmitted to the cloud for further analysis and to develop optimum inference algorithms.

Some datasets could be deemed as having low value for either prediction or optimization and may be discarded at the edge once the necessary action is taken. 

Rise of Predictive Maintenance

One interesting case study for predictive maintenance is an Australian startup called Ping Monitor, which has developed an acoustic wind-turbine monitoring system. The system sits on the base of a wind turbine and measures the sound generated by the turbine blades as they rotate. Leveraging these acoustics, the company has an AI-based system that can predict any failure or anomalous behavior of the wind turbine. An edge device like this saves on operating costs and inspection costs. By scheduling maintenance ahead of time, it also optimizes energy production.5

While predictive maintenance was always touted as the killer app for the industrial IoT, so far it has failed to live up to its promise. As edge devices grow in sophistication and incorporate more processing and sensing components, they’ll likely play an increasing role in finally delivering the elusive goal of predictive maintenance.


Edge computing is indeed the latest trend that everyone is talking about, but this is more than just a passing fad. This type of distributed computing—at the edge and in the cloud when it makes sense—is here to stay. Some data, especially in the industrial setting, is best processed and acted upon locally. Equally, some datasets are best analyzed in conjunction with a vast amount of historical data.

Since it’s abundantly clear that not all smart devices need access to the cloud to make sense of and act upon their data, edge computing will exist in parallel with cloud computing for the long haul.

Suhel Dhanani is Director of Business Development, Industrial & Healthcare Business Unit, at Maxim Integrated.






