It’s fair to say that Industry 4.0 arrived just in time. Almost overnight, we had to place enormous global reliance on digital backbones and operational data. Put simply, we couldn’t go to the machines, so we had to bring the machines to us.
Most of the remote and immersive technology we have relied on over the past two years already existed; we just weren’t using it. So we could argue that this has been a shift driven by technological adoption. The impressive technology you used to see at trade shows evolved very rapidly from long-term novelties into short-term must-haves. In fact, had the pandemic not happened, it is likely that smart glasses and AR would have remained medium-term aspirations.
Augmented reality, virtual reality and digital twins were already part of the engineering lexicon. Their use cases were obvious and highly desirable, as were their benefits, but industry carries a lot of inertia, and it takes a while for new approaches like these to become established.
The new normal
Soon after the global lockdown took shape, a little panic quite understandably crept in. All of a sudden, face-to-face contact was impossible, and processes that had been taken for granted moved online. Field maintenance was being undertaken with a smartphone and a headset, training was delivered virtually, and design reviews took place online.
Then companies started investing in higher-quality cameras, better lighting, interactive software, and clearer remote processes. CAD drawings were retrieved from the archives and digital twins were created of machines that had been in the field for years. Suddenly, everything went virtual and even immersive. Entire machines can now be projected onto your desk, and you can strip them down component by component with the customer looking on.
What makes it even more impressive is the speed at which this all took place. Normally, we would have to wait years for this tech to gain this level of saturation. A two-year adoption of tech on this scale is almost unheard of.
This new normal has been a revelation. But it’s not a new normal anymore; now it’s just normal. And this is all thanks to data and the vastly expanded ways of collecting it, analysing it, formatting it and then, importantly, sharing it.
In essence, this new normal is the process of getting the right data, in the right format, into the right hands at the right time, so the right decision can be made – ideally in real time. This is what efficiency and yield are built on. Overall equipment effectiveness (OEE) is rising, and the total cost of ownership (TCO) is falling, all thanks to more pertinent information.
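As a concrete illustration, OEE is conventionally calculated as the product of three factors: availability, performance and quality. The sketch below uses the standard three-factor formula with purely hypothetical shift figures.

```python
# Illustrative OEE calculation using the conventional three-factor formula:
# OEE = availability x performance x quality. All figures are hypothetical.

def oee(availability: float, performance: float, quality: float) -> float:
    """Overall equipment effectiveness as a fraction (0.0 to 1.0)."""
    return availability * performance * quality

# Hypothetical shift: the machine ran 7.2 of 8 scheduled hours (availability),
# produced at 95% of rated speed (performance), with 98% good parts (quality).
score = oee(availability=7.2 / 8, performance=0.95, quality=0.98)
print(f"OEE: {score:.1%}")  # prints "OEE: 83.8%"
```

Better data sharpens each of the three inputs, which is precisely why richer, timelier information moves the headline figure.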
Whether you have digitally transformed, converged your IT and operational technology, or simply added a communication gateway or IIoT-capable device to one of your machines or production lines, you are now in the digital realm and ready to exploit what it has to offer.
The Industrial Internet of Things (IIoT) has brought everything closer together and made it far more accessible. Although a device may only be transmitting a simple yes/no signal, when factored into the holistic data pool, it provides value. It gives a snippet of information upon which decisions can be made.
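Even a bare yes/no signal becomes shareable, decision-ready data once it is identified and timestamped. A minimal sketch, with field names that are purely illustrative rather than taken from any particular IIoT standard:

```python
# A minimal sketch of packaging a binary machine status as shareable data.
# Field names ("device", "running", "timestamp") are illustrative only.
import json
from datetime import datetime, timezone

def make_reading(device_id: str, running: bool) -> str:
    """Wrap a yes/no machine status in a timestamped JSON payload."""
    return json.dumps({
        "device": device_id,
        "running": running,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

# A hypothetical press reporting that it is running.
payload = make_reading("press-7", running=True)
print(payload)
```

Pooled with thousands of similar payloads, even this one-bit signal contributes to the bigger picture.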
With more data comes greater confidence, and this greater confidence gives us the comfort factor to pass some decision making over to new tools based around artificial intelligence. The advent of big data, the cloud, and machine learning has led to even more robust algorithms and logic solving and, as a result, more trustworthy decisions.
Before AI could be properly exploited in the field, the picture was based on minimal data and heavily pixelated; and you certainly wouldn’t have trusted a machine to make sense of it. You needed an engineer with experience and domain expertise. But today’s 8K clarity, backed by terabytes of metadata and a complete lack of ambiguity, means we have a lot more faith in software’s capabilities to make the right decision. And this clearer picture is because we have more data from more devices.
The human question
Much like automation, this IIoT-fed capability will remove the necessity to use humans for mundane, repetitive, easy tasks. Instead, we can be deployed in roles where our creativity and subjectivity add value. It doesn’t mean the labour pool is drying up; it’s just changing, with all the indicators pointing to high demand for a digitally savvy workforce that can manage and maintain these new digital data creators and decision-makers.
Supply and demand
Anyone who has bought a car in the last 12 months will tell you that there has been an industry-wide options cull. All of the features and gadgets that created the USPs and pushed you closer to buying are simply not available. Just as we’ve reached the pinnacle of technological adoption and deployment, we have a chip-supply issue.
The chip supply problem was born from a perfect storm of occurrences that all played a part in disrupting buyers and electronics designers across the globe. The pandemic was the obvious antagonist, with social distancing and a drop in manufacturing being juxtaposed with a sudden massive demand for laptops, webcams, and headsets. The car industry then complicated things by slashing sales forecasts, and then suddenly revising them up again. We had flights grounded, ships grounded, natural disasters in the US and Asia, and a factory fire in Japan. All of which was topped off with posturing and trade embargoes between two of the world’s larger nations.
The chip shortage is not a short-term issue either, with many companies making concessions and getting by with what they have – especially with lead times for ICs and IC-based devices now being measured in months and even years, as opposed to weeks.
So, what’s around the corner? Firstly, there will be a massive rationalisation of our data capabilities. There is such a thing as too much data, and even though we are throwing analytics at it, there is still too much to use all at once. This data will still be collected, but clever software will sift and divert it, only letting the useful stuff through.
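That sifting can start right at the edge. One common approach – sketched here with illustrative names and thresholds, not any particular vendor’s software – is a deadband filter that only forwards a reading when it differs meaningfully from the last value sent:

```python
# A minimal sketch of edge-side data sifting: a deadband filter that only
# forwards a sensor reading when it differs from the last forwarded value
# by more than a threshold. Names and threshold are illustrative.

def deadband_filter(readings, threshold):
    """Yield only readings that change by more than `threshold`."""
    last = None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            last = value
            yield value

# Hypothetical temperature samples: most are near-duplicates.
raw = [20.0, 20.1, 20.05, 21.5, 21.6, 25.0, 25.05]
useful = list(deadband_filter(raw, threshold=1.0))
print(useful)  # prints [20.0, 21.5, 25.0]
```

Everything is still measured, but only the readings that carry new information travel upstream – exactly the “letting the useful stuff through” described above.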
Many of the globe’s big automation players are defining data audits as a vital stage within a digital transformation. They are also conscious of people becoming overwhelmed. It’s analogous to the launch of more capable PLCs, which became so feature-heavy that simplicity and reduced front-end feature sets soon became a selling point.
Intelligence at the endpoints, such as that delivered by IoT-capable hardware, is still relatively untapped. We have edge computing, but this tends to be handled by bigger IC brains; there is nothing stopping this decision-making migrating even nearer to the edge, down to device level. But we’ve only just got used to computers making major decisions, and it will take a while for commodity-level devices to be trusted.
As we emerge from the pandemic, we may also see a stall in deployment, coupled with the chip issues. But this isn’t necessarily a backward step. A bit of reflection on what has been a roller-coaster ride will set a slightly less undulating path going forward. We’ll certainly see the odd bump, but our faith in our capability to accept change and our trust in the technology delivering it are now anchored on much firmer ground.