5+1 things driving IIoT and Industry 4.0: wireless connectivity (I)
Connecting thousands of CPS in a factory will demand a sound concept of data concentration and distribution. Wireless connections will help to reduce costs and downtimes. You don’t need to be a mentalist to predict that wireless connectivity will increasingly replace wired technologies. But which wireless technology will win the race? And which factors (like frequency, data rate, robustness, etc.) will be core criteria in the automation industry? Given the plenty of technologies, which all have their pros and cons, these questions are hard to answer. I will try to bring a little light into the jungle of wireless options by giving you a rather cursory summary. You can find detailed and often brilliant explanations of most technical terms on the internet.
The first part of this article will cover...
The specific challenges for wireless communication in an industrial environment
When talking about automation, I need to avoid misunderstandings because there are two scopes of industrial automation:
Factory automation (you could also say “discrete factory automation”) is about controlling machines that produce or manipulate countable items. The control is typically strictly sequential and cyclical. This type of automation has a high demand for low latencies (reaction time from sensing to acting) and deterministic cycle times. On the other hand, the distances between controller and sensors or actuators are often far under 50 m. Due to these demands, even wired connections could not be realised with traditional Ethernet (which does not even define a latency maximum). Highly deterministic standards like Profinet or IO-Link with cycle times far below 1 ms were created to comply with these demands.
The scope of process automation (or “process control”) is monitoring, controlling and logging the flow of material and its transformation, mostly in large-scale production plants (e.g. the chemical industry). SCADA (Supervisory Control and Data Acquisition) is a significant domain of process automation. In contrast to factory automation, there is very seldom a demand for low latency, and if there are cyclical operations and communication, they mostly have slow cycles (>100 ms). Sensors and actuators (like valves) can be distributed over a large area (several square kilometres), often too far away for an Ethernet line (max. 100 m). This also demands low-energy devices, sometimes operated by battery or energy harvesting.
Apart from these classical industrial automation fields, there is a third scope called “building automation”. Wireless and wired standards for building automation need different considerations, and I will not cover them in this article.
Latency
Whenever you are controlling something (like driving a car), the time between sensing a particular condition (e.g. a red traffic light) and reacting (e.g. stopping your car) is essential. Imagine a signal travelling from your brain to your foot took too much time: you could not stop the car early enough to prevent an accident. The delay between a sensed input and the resulting action is called latency. If a wireless connection is used for transmitting inputs or outputs of a controller, the latency must often be below 10 ms, sometimes even below 1 ms. At the very least, you must know the maximum allowed latency for your control application and choose a transmission standard that does specify a latency maximum. Classical Ethernet, e.g., does not specify latency. Note that in network technology, the definition of latency is slightly different: there, it is the elapsed time a message needs to travel from the sender to the receiver.
Deterministic Cycle Time
Industrial automation follows the concept of "finite-state machines" (FSM). At a given point in time, a controller (e.g. a PLC) is in a specific state. According to the inputs (sensors, switches, etc.) and following pre-defined rules, it can change to another state. Each of these states has pre-defined outputs (levels, commands) for the actuators (relays, valves, lamps, etc.). Checking the inputs, deciding about state changes (following the rules) and setting the outputs according to the current state is done cyclically (see this article). The cycle time's jitter must be a known and deterministic value to enable the coordination of multiple controllers, to guarantee that mechanical constraints are met, or for safety reasons. That means none of the involved IO connections (wireless or wired) should use stochastic mechanisms (like collision detection, where the next state is a function of probability). Typical cycle times of industrial controllers range from less than 1 ms to 100 ms.
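To make the FSM scan cycle concrete, here is a minimal sketch in Python. The states, events and outputs are hypothetical examples, not taken from any real controller:

```python
# Minimal sketch of a cyclic finite-state machine (FSM) scan, as done by a PLC.
# States, events and outputs below are invented for illustration.

TRANSITIONS = {
    # (current_state, event) -> next_state, following pre-defined rules
    ("idle", "start"): "running",
    ("running", "stop"): "idle",
    ("running", "fault"): "error",
    ("error", "reset"): "idle",
}

OUTPUTS = {
    # each state has pre-defined outputs for the actuators
    "idle":    {"motor": False, "lamp": False},
    "running": {"motor": True,  "lamp": False},
    "error":   {"motor": False, "lamp": True},
}

def scan_cycle(state, events):
    """One controller cycle: read inputs, apply rules, set outputs."""
    for event in events:
        state = TRANSITIONS.get((state, event), state)  # unknown events are ignored
    return state, OUTPUTS[state]

state = "idle"
state, out = scan_cycle(state, ["start"])  # -> "running", motor on
state, out = scan_cycle(state, ["fault"])  # -> "error", lamp on
```

A real PLC repeats this scan with a fixed, low-jitter period, which is exactly why any IO connection in the loop must behave deterministically.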
Distance
In digital wireless connections, the maximum distance is directly related to the bit error rate (BER), which increases with distance. The maximum acceptable BER depends on the data rate (you could also say on the bandwidth). For speech, e.g., a BER of 0.1% (10⁻³) is still tolerable whereas, for high-speed data, the BER demand can be smaller than 10⁻⁶. Communication distance is always limited as the energy used for transmission gets lost over distance (the received power decreases with 1/d², with d = distance). That results in a direct relation between transmission output power (which is part of a transmitter's power consumption) and the maximum signal range. Interference also limits the possible distance: it can be seen as noise, and the signal-to-noise ratio (SNR), which drops as the signal fades over distance, defines the BER. While signal fading due to the atmosphere's absorption is another reason for the limit of the LOS (line of sight) range, there are many more sources of interference in real-world transmission scenarios:
Shadowing by and reflection at obstacles, refraction at the surfaces of penetrated objects, scattering and diffraction all increase the BER (reducing bandwidth and data rate), partially because they result in several signal paths with differing (dispersing) run times (overlapping of time-shifted messages at the receiver). As a rule of thumb, sub-GHz UHF signals are much more immune to these interferences than GHz (UHF and SHF) signals. The so-called "free-space loss" of transmission power is also a function of frequency: the higher the frequency, the more significant the loss. A free-sight connection without interference has a 2.67-fold higher range at 900 MHz than at 2.4 GHz (with both transmitters using the same energy and antennas with the same gain); the Friis equation can calculate this. Other effects are more surprising: the diffraction angle of a 900 MHz signal at a building's edge is much larger than that of a 2.4 GHz signal. This is why 900 MHz is said to "bend around buildings".
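The 2.67-fold figure above can be reproduced with the Friis transmission equation. A small sketch (idealised free-space conditions, unity antenna gains assumed):

```python
import math

def friis_received_power(p_tx, gain_tx, gain_rx, freq_hz, distance_m):
    """Free-space received power (W) per the Friis transmission equation:
    P_rx = P_tx * G_tx * G_rx * (lambda / (4 * pi * d))^2"""
    c = 3e8  # speed of light, m/s
    wavelength = c / freq_hz
    return p_tx * gain_tx * gain_rx * (wavelength / (4 * math.pi * distance_m)) ** 2

# At the same distance, 900 MHz delivers (2400/900)^2 ≈ 7.1 times the power
# of 2.4 GHz. Since received power falls with 1/d^2, the distance at which
# the same receive power is reached scales inversely with frequency:
range_ratio = 2.4e9 / 900e6
print(round(range_ratio, 2))  # 2.67, the factor quoted in the text
```

This is an idealisation: in real plants, the shadowing, reflection and diffraction effects described above dominate long before free-space loss alone sets the limit.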
Sometimes co-existing senders need to limit their transmissions to defined timeslices (time-multiplexing protocols). The signal run time needs to be much smaller than such timeslices to keep all senders' messages apart at the receiver's location, so the length of the timeslices also limits the distance. This is another example (besides the defined data rate) of how protocol definitions can influence the range. As the signal modulation type also impacts the BER, it influences the range (or data rate) as well.
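A rough back-of-the-envelope sketch of the timeslice limit: the one-way signal run time d/c must fit comfortably inside a slot. The slot length and the guard fraction below are illustrative assumptions, not values from any specific standard:

```python
# How the signal run time ties timeslice length to maximum distance.
# The 10 % guard fraction is a hypothetical example value.
C = 3e8  # radio propagation speed, m/s (speed of light)

def max_distance(timeslot_s, guard_fraction=0.1):
    """Distance at which the one-way run time still fits into the
    guard portion of a timeslice."""
    return C * timeslot_s * guard_fraction

# A 1 ms slot with a 10 % guard time tolerates roughly 30 km of range:
print(round(max_distance(1e-3)))  # 30000 (metres)
```

So for short slots (fast cycles), the protocol itself caps the range long before the link budget does.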
As a matter of course, the quality and gain of the antennas used significantly impact the range. On the other hand, we need to keep in mind that the required antenna size (length) increases with the wavelength (the reciprocal of the frequency). And do not forget that the receivers' selectivity and sensitivity, combined with well-designed antennas, are crucial for the wireless range.
Network topology does, of course, impact the transmission range: in a mesh network, the overall distance is the sum of many small transmission ranges. In contrast, in a star-type network, the range is defined between the central transceiver (gateway) and the endpoint (e.g. a sensor).
Low Energy
Everything I've summarised in the chapter above leads to the conclusion that a sub-GHz transmitter can be operated at a much lower energy consumption than a 2.4 GHz transmitter.
When talking about low energy, the average data rate is critical. If you transmit small packets of data and then switch off the transmitter, this may result in devices that can be operated by battery for years.
This is where network topology becomes essential. A one-way end-to-end connection (e.g. a sensor transmitting directly to a gateway) can switch off the transmitter between transmissions. This is not the case with a mesh topology, where each node must keep its receiving radio switched on all the time to catch its neighbours' messages and forward them.
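The "battery for years" claim is easy to sanity-check with a duty-cycle estimate. All currents and timings below are hypothetical example values for a duty-cycled end-to-end sensor node:

```python
# Back-of-the-envelope battery-life estimate for a duty-cycled sensor node.
# Currents, burst length and interval are invented example values.

def battery_life_years(capacity_mah, tx_ma, tx_s, sleep_ua, interval_s):
    """Average a short transmit burst over a long sleep interval."""
    avg_ma = (tx_ma * tx_s + (sleep_ua / 1000.0) * (interval_s - tx_s)) / interval_s
    return capacity_mah / avg_ma / (24 * 365)

# 2400 mAh cell, 25 mA transmit burst for 0.1 s once a minute, 2 uA sleep:
print(round(battery_life_years(2400, 25.0, 0.1, 2.0, 60), 1))  # ~6.3 years
```

The same node in a mesh role, with the receiver permanently on at a few milliamps, would drain the same cell within weeks, which is exactly the topology trade-off described above.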
Robustness
EMC (electromagnetic compatibility) in home applications focusses on a low emission of interfering signals and demands only a medium immunity against such signals. In an industrial environment, things are different: you cannot stop a welding machine or inverters from emitting high-power electromagnetic fields. Thus EMC needs to focus on a high immunity. Industrial norms like EN 61131-2 for controller hardware define a high standard for such devices to guarantee failure-free operation despite substantial electromagnetic interference (also called "electromagnetic immunity"). The problem for wireless connections in such an environment is evident: you need to analyse the local electromagnetic field conditions before deciding to use a radio connection.
Typically you will have fewer problems with GHz (especially SHF) connections because most industrial RFI sources (except microwave ovens) emit high-power RFI at frequencies far below 1 GHz (see this paper for more details). But RF immunity is just one aspect of robustness. Other aspects have to cope with the phenomena mentioned in the paragraph "Distance" (shadowing, reflection, refraction, scattering and diffraction). Some of these problems can be solved, e.g. by using multiple antennas ("MIMO"). Some wireless standards, like 5G, include MIMO techniques. Other ways to make a wireless network robust against these issues are high redundancy, as in mesh topologies, or a decreased BER thanks to highly sophisticated modulation techniques.
Keep in mind that there are external causes for wireless connections to break down, and there might be no way to significantly reduce the risk. Strong solar flares can be one reason for such breakdowns.
There is no wireless protocol that fulfils the demands of functional safety because, due to EMI and other aspects, message transmission is always probabilistic. The "black channel" technique helps achieve a deterministic system (as demanded by functional safety). This concept is not new: Ethernet-based industrial communication protocols like PROFIsafe have been using it for many years. A black channel approach uses a standard network technology without changing it or making any safety-relevant assumptions about this network. It puts another protocol on top of the network protocol, which covers all safety aspects. One important aspect is setting time limits for successful message transmission and using "safe states" as a reaction to missing valid messages. This concept can also be used in wireless networks to add functional safety. As long as the latency jitter is small, such a system will not need to enter safe states too often.
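The time-limit-plus-safe-state idea can be sketched in a few lines. This is a simplified illustration of the black channel principle, not PROFIsafe or any real standard; the class name, the sequence-number check and the timeout value are my own assumptions:

```python
import time

# Sketch of a safety layer on top of an arbitrary (possibly wireless) "black
# channel" transport: accept only ordered, checksum-valid messages, and demand
# the safe state when no valid message arrives within a time limit.

class SafetyChannel:
    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.expected_seq = 0
        self.last_valid = time.monotonic()

    def on_message(self, seq, crc_ok):
        """Count a message as valid only if it is in order and passes the CRC."""
        if crc_ok and seq == self.expected_seq:
            self.expected_seq += 1
            self.last_valid = time.monotonic()

    def safe_state_required(self):
        """True when the time limit for a valid message has been exceeded."""
        return time.monotonic() - self.last_valid > self.timeout_s
```

Note that the underlying network is never trusted: corrupted, duplicated, reordered or lost messages all simply fail the checks, and the timeout converts that probabilistic behaviour into a deterministic reaction.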
Security has many aspects. One of them is the protection of data from being read or changed by intruders. In wired connections, this aspect can be controlled by strictly limiting physical access to the cables. That is something easy to do for local assets. As soon as wires leave your local property or you are using radio connections, there is no secure way to limit access. That's the reason why wired internet connections use secure protocols like HTTPS. Encryption, signatures and certificates are some keywords for securing the freely accessible transmission of confidential data. Please read our series of articles about industrial security. While these security technologies have been proven to work well (the internet is a great test platform), you need to consider the race between hackers and security experts, who continuously improve their methods. That is why wireless components should always include the possibility to be updated.
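As a small taste of the "signatures" keyword: a message authentication code lets a receiver detect any tampering with a radio message. A minimal sketch using Python's standard library; the shared key and the payload are hypothetical:

```python
import hashlib
import hmac

# Minimal sketch of message authentication with an HMAC, one of the building
# blocks behind "encryption, signatures and certificates" mentioned above.
# Key and payload are invented example values.

KEY = b"shared-secret-key"  # in practice: provisioned per device, kept secret

def sign(payload: bytes) -> bytes:
    return hmac.new(KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, signature: bytes) -> bool:
    # compare_digest avoids leaking information through timing side channels
    return hmac.compare_digest(sign(payload), signature)

msg = b'{"sensor": 7, "value": 23.5}'
sig = sign(msg)
print(verify(msg, sig))          # True
print(verify(msg + b"x", sig))   # False: a tampered payload fails verification
```

Real industrial protocols add key management, replay protection and usually encryption on top, but the core idea is the same: the radio channel itself is never trusted.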
Another aspect of security is preventing intruders from sabotaging your data traffic. While hacking secured connections (like Wi-Fi or Bluetooth) is very hard to do, it is straightforward to compromise wireless communication by emitting high-power electromagnetic fields in the proper frequency spectrum. On the other hand, such an act of sabotage is easily detectable, and the source can be located well. So this would work for a short period, but one would not expect a constant violation of RF regulations. Still, it is one reason not to use wireless connections for security-relevant data (like motion sensors for intrusion detection).
Finally, any industrial solution must comply with economic aspects: How easy is it to implement? Is there a high availability of components now and in the future? How well does it fit into the existing infrastructure? How significant is the investment, and how quick is the ROI? Will there be recurrent expenses like tariff rates?
Some networks like 5G, Sigfox, NB-IoT etc. are run by third-party companies charging fees for their service. Others (like private 5G in Europe) might confront you with periodical frequency allocation fees payable to the authorities.
I hope you liked this first part. Part 2 of this article will give a summary of several specific wireless standards.