Sponsored by: NVIDIA

Making the most of Edge Computing and AI

by billd700

Reduced latency and real-time analytics are just some of the benefits of edge computing, which is finding applications across industry, automotive and consumer electronics.

The emergence of the digital economy means we rely on data more than ever.

On the factory floor, critical production information such as vibration, temperature and humidity is collected via sensors, with subsequent analysis driving more informed maintenance strategies.

Meanwhile, in our daily lives, dependence on smart devices in an always-on society means we are consuming data at every turn.

This proliferation of connectivity has been underpinned by cloud computing. Data is transmitted to a virtual space, where it is stored and processed at a hosted facility. This approach provides multiple benefits – scalability, flexibility and cost-effectiveness, to name but a few.

But there is a problem. Shifting data in large volumes from multiple endpoints to the cloud takes time and energy. These 'journeys' might only be measured in milliseconds, but that lag can cause problems for some real-time applications. So, other data strategies are sought.
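The trade-off can be made concrete with a back-of-envelope calculation. All figures below are illustrative assumptions, not measurements:

```python
# Back-of-envelope latency budget (illustrative numbers only).
CLOUD_RTT_MS = 40.0     # assumed network round trip to a hosted facility
CLOUD_COMPUTE_MS = 5.0  # assumed server-side processing time
EDGE_COMPUTE_MS = 8.0   # assumed on-device processing time (slower chip)

cloud_total = CLOUD_RTT_MS + CLOUD_COMPUTE_MS  # 45 ms per decision
edge_total = EDGE_COMPUTE_MS                   # 8 ms: no network hop

# A 100 Hz control loop has a 10 ms budget per cycle: the cloud path
# misses it even though the cloud-side compute is faster than the edge chip.
budget_ms = 10.0
cloud_ok = cloud_total <= budget_ms  # False
edge_ok = edge_total <= budget_ms    # True
```

Even with generous assumptions about the network, the round trip alone can consume the whole timing budget of a tight control loop.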

Edge computing in action

Enter edge computing. Here, storage and processing are delivered at the 'edge' of the network – physically closer to users, devices and data sources. This infrastructure can reduce traffic latency and enable real-time analytics.

For some ultra-time-sensitive applications, such as autonomous vehicles, patient monitoring and smart city traffic systems, improving the response time of endpoint devices is critical to performance.

Indeed, edge computing is a rapidly growing solution for data management in today's hyper-connected world. The global edge computing market was worth $11.99 billion in 2022 and is expected to grow to $15.96 billion in 2023, rising to $139.58 billion by 2030.

According to Gartner, by 2025, as much as 75 per cent of enterprise-generated data will be created and processed outside of a traditional centralised data centre or cloud. If this prediction holds true, it is fair to say that the era of edge computing will be well and truly upon us.

Increasingly, edge computing's ability to make ultra-fast decisions, often on the device where the data is created, is set to supercharge the AI revolution. Now, with no need to shift data from one point to another and back again, latency is reduced to an absolute minimum, opening up new use cases and applications.

System on a Chip explained

Critical to such performance are system-on-chip (SoC) architectures. Rather than a large board with a collection of discrete components, an SoC is an integrated circuit that combines the functionality of an electronic system onto a single chip. This typically includes a CPU, GPU, memory, I/O interfaces, peripherals, networking and power management, all contained within a highly integrated system.

SoCs hold enormous potential for edge computing because they can be highly customised to provide a unique combination of power efficiency, integration, performance, and security. These technical characteristics are well-suited to edge devices and applications with diverse and often resource-constrained requirements.

Suddenly, designers and engineers can overcome the challenges of integrating AI capabilities into sensors at the edge, incorporating multiple components and functions into a single chip, making devices smaller and more power efficient. Many edge AI applications require real-time or low-latency processing of sensor data. SoCs provide the necessary computing power for these tasks while minimising delays in data processing. This is particularly important for applications where split-second decisions are crucial.
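To illustrate how little is involved in a local decision, here is a minimal sketch of on-device inference in Python. The weights, bias and features are hypothetical stand-ins for a real pre-trained model:

```python
import math

# Hypothetical pre-trained weights for a tiny on-device classifier
# (illustrative values, not taken from any real model).
WEIGHTS = [0.8, -0.5, 1.2]  # vibration, temperature, humidity
BIAS = -0.3

def classify_reading(features):
    """Run inference locally on the edge device: a single
    logistic-regression step, with no cloud round trip required."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    probability = 1.0 / (1.0 + math.exp(-z))
    return probability > 0.5, probability

# A reading arrives from the sensor bus and is classified in place,
# so decision latency is just the compute time on the SoC.
alert, p = classify_reading([1.4, 0.2, 0.1])
```

A production model would be larger, but the shape of the win is the same: the data never leaves the chip before the decision is made.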

Meanwhile, edge AI sensors can communicate with other devices or cloud services, relaying information as required and independently from their low-latency operation. Most SoCs offer various connectivity options, such as Wi-Fi, Bluetooth, 5G and sensor interfaces, to facilitate seamless data transfer and integration with external systems.

Edge computing use cases

So, where is edge computing finding the most significant uptake? Perhaps the single most suitable end use is autonomous vehicles – where there is a mission-critical need for on-vehicle edge computing from IoT sensors detecting a broad range of parameters, including road conditions and proximity to other vehicles and pedestrians. Faster and more accurate decision-making based on real-time data will drive the requirement for more precise navigation and obstacle detection.

For future intelligent transport systems, edge computing will be a critical foundation technology for Vehicle-to-Everything (V2X) architecture, which organises communication and interaction between cars, pedestrians and smart city infrastructure. It helps address the challenges of dynamic handover between communication channels and of keeping latency low enough for information to be shared across vehicles. Maintenance is another application area: edge computing is deployed for battery monitoring and predictive methodologies, with sensors aggregating a host of conditions in real time and notifying the car's owner of any patterns or trends outside the norm.
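That battery-monitoring pattern can be sketched with a rolling window of readings, flagging any value far outside the recent norm. The window size and threshold below are illustrative assumptions, and the class is hypothetical:

```python
from collections import deque
from statistics import mean, stdev

class BatteryMonitor:
    """Hypothetical sketch of on-vehicle edge monitoring: keep a
    rolling window of readings and flag values that deviate from
    the recent norm."""

    def __init__(self, window=20, threshold=3.0):
        self.readings = deque(maxlen=window)  # recent readings only
        self.threshold = threshold            # deviation, in std devs

    def update(self, value):
        """Return True if the new reading deviates from the rolling
        mean by more than `threshold` standard deviations."""
        anomalous = False
        if len(self.readings) >= 2:
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                anomalous = True  # e.g. notify the owner, log the event
        self.readings.append(value)
        return anomalous
```

Because the window lives on the device, only the rare anomaly (not the raw sensor stream) ever needs to be transmitted.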

Meanwhile, reliance on cloud-based centralised data infrastructure can significantly impact wearables' power consumption and battery life. It is far better to perform AI operations as close to the end user as possible — literally on their wrist. With self-learning AI, training a neural network and allowing inference on the edge, less processing power is required, and the battery life of wearable devices can be extended.

Performing the operations locally also overcomes latency issues, with faster response times boosting the user experience and opening up the opportunity for new functionality. There is also a reliability factor that comes into play: relinquishing the need for two-way data flows across the cloud reduces the chance of network failure issues, as well as mitigating some security concerns.

Robotics is another area of opportunity. Many factories have embraced automation and digitalisation to streamline operations and boost quality and repeatability. Robots are vital to this transformation, as they have become cheaper, more flexible and easier to integrate and operate on the shop floor. Now, the emergence of edge computing could further enhance robotics performance. For example, edge-based computer vision techniques mean object detection could be performed locally on devices such as cameras and sensors, improving latency and reducing bandwidth consumption. The same approach could perform image classification, supporting quality control on manufacturing production lines, and identify unexpected events or patterns in data sets that highlight potential equipment malfunctions.
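A minimal sketch of that quality-control idea: classify a frame locally so that only the verdict, not the image, leaves the device. The thresholds are hypothetical, and the grayscale frame is represented as nested lists of pixel intensities for simplicity:

```python
def inspect_frame(frame, bright_threshold=200, defect_ratio=0.05):
    """Classify a grayscale frame (rows of 0-255 pixel values) on the
    device: flag it as defective if too many pixels exceed the
    brightness threshold, e.g. glare from a scratched surface."""
    total = sum(len(row) for row in frame)
    bright = sum(1 for row in frame for px in row if px > bright_threshold)
    return bright / total > defect_ratio

# The camera keeps the raw frames on-device and transmits only a
# pass/fail flag per part - a few bytes instead of a full image.
frame = [[30, 32, 31, 250],
         [29, 30, 255, 248]]
defective = inspect_frame(frame)
```

A real inspection system would use a trained vision model rather than a fixed threshold, but the bandwidth argument is identical: the heavy data stays where it is created.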

Security and privacy considerations

These are all positive use cases for edge computing in our hyper-connected world. However, adopting such architecture brings complex privacy and security considerations to the fore. A recent call for submissions on edge computing security and privacy from the Journal of Cloud Computing identified three challenges arising from the distributed and heterogeneous nature of the architecture, each requiring more collaboration and research:

  • Edge nodes close to the users can receive large amounts of privacy-sensitive data, with severe consequences if data is leaked.
  • Compared with data centres, edge nodes have limited resources, so it is more challenging to support complex security mechanisms.
  • High mobility of devices and users means the edge computing environment is very fluid, making it easier for attackers to gain access and more difficult to design security rules across overlapping domains.

These factors present challenges that must be considered when instigating best-practice cyber security initiatives to secure the perimeter edge.

A bright future for a flexible approach

In conclusion, it is clear that edge computing, delivered through SoCs and other systems and components, will play a crucial role in the growing deployment of advanced technologies such as AI. Its low latency, flexibility and scalability make it highly suitable for a broad range of applications, as long as security and privacy have been built in.

While edge computing will not replace the need for centralised data management architecture, it will undoubtedly provide new opportunities and alternatives for design engineers to build practical and cost-effective IoT systems where real-time analyses matter.

The author has a background in electronics and electrical engineering, with a keen eye for innovation and how things work.