
From State Machine to Machine Learning – what did MCUs ever do for us?


It is no exaggeration to say that Moore's Law kickstarted the microcontroller: the integration of MOSFETs led to the first arithmetic logic units, which in turn led to the first fully integrated 'computer'. Gordon Moore's early forecast of how transistor integration would continue is what gave the eponymous law its name.

The microcontroller, or MCU, is a first cousin to the microprocessor, or MPU. The latter gets all the glory, while the former does all the work. It has been that way since their common origin, the 4-bit Intel 4004. This device inspired engineers at Texas Instruments to design the TMS 1000, credited as the world's first true MCU. It differentiated itself from those early MPUs not by bus size, but by the way it integrated almost everything it needed to operate into a single device. The early MPUs needed support from external memory (and still do), but the MCU carried its memory with it. As such, the first MCU was also the first System-on-a-Chip (SoC), but more on that later.

The integration of the CPU, memory and I/O on a single device empowered the embedded electronics sector, which now underpins every vertical market today. In fact, the automotive industry was the first end market to adopt the MCU at scale, as vehicle electrification began in the mid-1970s.

The world wouldn't look the same today without the embedded electronics sector, and embedded electronics couldn't exist without the MCU. It started as a relatively simple, programmable controller used to bit-bang its input and output pins to control other parts of a system. Without really losing sight of its roots, it has evolved over the last 50 years to take centre stage in almost every electronic product on the market today.

The scalability of the MCU architecture is unmatched by any other class of component. Consequently, there is probably no task for which an MCU isn't applicable. But for some, the MCU remains a simple digital controller, little more than glue logic with some clever peripherals, like serial digital interfaces or a PWM output. Perhaps it is time to reconsider the not-so-lowly MCU and see what it can really do.

"I'm an analogue engineer, MCUs aren't for me!"

In the past, this was a common criticism of the MCU's ability to handle analogue signals well. It probably stems from the historical difficulty of implementing high-quality analogue circuitry in a CMOS process, or from the effects of high-frequency digital clocks on sensitive analogue signals.

Today the majority of analogue integrated circuits are implemented in CMOS, and the semiconductor industry has been able to control noise at the transistor level for many process generations. Analogue ICs even use aggressive CMOS nodes now. In many ways, the latest digital devices behave more like analogue ones, because of their speed of operation and their demands on signal integrity.

Integrating analogue functions in CMOS is no longer a barrier, so what about integrating them alongside digital functions? Most MCUs feature some element of mixed-signal operation, often that takes the form of integrated ADCs and DACs. The ADuCM family of MCUs (786-3372) from Analog Devices, for example, features multi-channel 24-bit Sigma-Delta ADCs.

The potential barrier here is moving between domains; analogue to digital and digital to analogue. But once in the digital domain, engineers can do so much more with the data, such as apply filters, search for patterns and weed out anomalies. The key to this is the software support provided to implement these complex analogue functions in firmware. Increasingly, MCUs now use 32-bit processing cores that include highly capable DSP instructions, often with dedicated hardware to accelerate those instructions. The Arm Cortex-M family of cores is the most widely licensed and deployed instruction set architecture within MCUs. Many of the cores now include floating-point units (FPUs) and DSP instructions, making them able to handle extremely complex digital filter algorithms. More recently, the Digital Signal Controller, or DSC, has emerged based on the combination of DSP and MCU functionality into a single device. These are applied to applications that need to monitor and react to analogue signals, such as motor control.

"I'm an RF engineer, MCUs aren't for me!"

This is another common misconception, based on the specific nature of RF design. RF has often been seen as a 'black art', and there is a long history of RF designers working in shadowed huddles, casting incantations over PCBs to make them work.

Fast forward to today and the picture is very different. Of course, the vast majority of RF is still shrouded in secret knowledge selectively passed down to only the worthy. For the rest of us, the main aim is to implement some form of standard wireless communications protocol, such as Bluetooth, Wi-Fi or similar. For these applications, simpler is better. This is where the wireless MCU comes in.

Even for RF engineers, these devices can provide a simpler design option, thanks to the flexibility of the platform. The CC1310 (168-4935) from Texas Instruments features a sub-1GHz RF front-end able to operate at a range of frequencies, including 315, 433, 470, 500, 779, 868, 915, and 920 MHz with 50 kHz to 5 MHz channel spacing. This means it can be used in devices that operate as part of a wireless sensor network running protocols such as 6LoWPAN, Wireless M-Bus, KNX Systems, Wi-SUN or a proprietary protocol.

The use of multiple protocols is endemic in the IoT, which often means creating networks with devices that use dissimilar wireless interfaces. The approach here is to use something generally referred to as a gateway, which bridges these different interfaces. Designing something operating across multiple protocols could challenge even the wisest wireless wizard, but the development of multi-protocol wireless MCUs means they needn't worry.

An example here is the EFR32MG22 family of wireless MCUs from Silicon Labs (200-9664). These devices have been designed to operate at 2.4 GHz running protocols including Zigbee, Thread and Bluetooth, often on the same device and sometimes at the same time.

Perhaps the most beneficial feature of wireless MCUs is the level of software support offered by the manufacturers. This typically includes a pre-verified protocol stack, provided free of charge and optimised for the specific device being used. This really can cut the design cycle drastically, and integrated RF front-ends usually make the certification process much more straightforward, too.

"I'm a hardware engineer, MCUs aren't for me!"

This is perhaps a criticism levelled at the need to configure an MCU through software. For engineers more familiar with using discrete digital and analogue devices, moving so much of the design into the software domain may be slightly daunting. However, the level of support for software development is now so high that hardware engineers – who are really the primary target market for MCUs – should always consider using an MCU first.

One of the main reasons MCUs are popular in the embedded domain is that they are so hardware-centric. The processing core has become more important over time, as the industry has evolved from 4-bit architectures to 32-bit instruction sets. Fundamentally, though, it is still there to support the hardwired peripherals included in the devices.

Because of this, the way the core interfaces to the peripherals has seen a lot of development over recent decades. And because one of the main technical demands coming out of the embedded domain is for low operating power, the techniques employed by manufacturers to achieve low power are now quite sophisticated. The effect of this is that the core is now able to spend the majority of its time in a deep sleep mode, doing little or nothing most of the time.

While the lines between MCU and MPU cores may be blurring, sleep modes remain one of the main differentiating features. To preserve battery power, some MCUs can clock- or power-gate large parts of their functionality and put the core itself into a very low-power mode. Waking on interrupt is a common technique, which only wakes the core of the device when it receives an external stimulus. However, even this has consequences, so the trend is now towards making the peripherals more autonomous, allowing them to handle more on their own, without waking the core at all.

For example, many of the PIC and AVR MCUs from Microchip, including the PIC16LF18875 (905-3050) come equipped with 'core independent peripherals'. This a general term used to describe peripherals that can be configured to carry out repetitive or triggered functions without the intervention of the CPU. This can include sensing and signal conditioning, waveform generation or user interface monitoring, as well as others.

The ability to implement a system-wide low-power scheme through a few lines of code should make hardware engineers happy. Couple that with the fact that all the 'wiring' between functions is handled for you, through an on-chip communications backbone, and system development becomes simpler.

Some MCUs now favour the hardware designer by providing even more configurability over the peripherals' functionality. The PSoC family of MCUs from Cypress (now Infineon) provides fine-grained control over I/Os as well as programmable analogue and digital functions.

"I'm a computer scientist, MCUs aren't for me!"

This is a new one. Computer scientists tend to be abstracted from the 'bare bones' of the computing system they use to run their code. That code is typically very compute-intensive, so their main concern is about raw horsepower. In this respect, the performance of the average high-performance MPU exceeds that of even the best MCU.

At least, that used to be the case. The trend for multicore devices has entered the MCU domain, which means many MCUs now feature high-performance processing cores alongside a more modest microcontroller core. Currently, we are seeing a lot of the 'intelligence' migrate from the core of the network to the edge, where embedded systems live. Consequently, MCUs with more processing performance are coming onto the market. In the most recent examples, these devices are able to run inference engines, which puts artificial intelligence and machine learning capabilities into small, remote sensors.

What this means in practice is that the programming skills exhibited by computer scientists are now in demand in the embedded domain. Machine learning at the edge is going to be huge; support for edge processing is growing, and the skills needed are in high demand. If you're a computer scientist working on AI and ML, chances are you will be rubbing shoulders with MCUs in the near future.

When is an MCU not an MCU?

This raises an interesting point. If MCUs are taking on more processing capabilities, are they still MCUs? There is no clear definition of what an MCU is or isn't, but going back to those early examples, we could say that if it integrates a processing core, permanent storage and some flexible I/O then it is an MCU. If we take that as a definition, then SoCs become a subset (or superset?) of the MCU.

Other super/subsets exist. We’ve already discussed the concept of the DSC (Digital Signal Controller). That shows that the semiconductor industry isn't shy about taking the key parts of MCU technology and mixing it with the best of the rest to create something new.

Let's take machine learning as an example. Putting inferencing engines on MCUs is a brand-new area of exploration, but it makes perfect sense. MCUs are the interface to the real world, and it is real-world data that artificial intelligence needs to interact with. The challenge is, inference engines are normally developed on processing platforms with practically limitless compute and memory resources. How do we port these large algorithms to resource-constrained MCUs? And if we do, are they still MCUs?

Part of the answer lies in optimising the process of creating an inference engine and then porting that optimised version to a software framework designed to run on a processing core that can be realistically integrated into an MCU.

This is where organisations like the tinyML Foundation come in: a non-profit initiative focused on getting machine learning technology onto ultra-low-power MCUs at the very edge of the network. On the software side, TensorFlow Lite for Microcontrollers is an open-source framework for executing deep learning algorithms on small devices.

On the hardware side, the likes of NXP are developing crossover MCUs. Like the DSC, the crossover MCU brings in elements suited to a specific domain. In this case, that's machine learning.

The future of MCUs

It is likely that we will see continued integration of functions into MCU architectures. There are already nearly 8000 MCUs available from RS and every one of them has its own unique features. That doesn’t mean there isn't room for more.

It is easy to see that MCUs will eventually become the dominant device in embedded systems. Hardly any product doesn't feature at least one, and many have several. At selling prices as low as £1 (or less) for so much functionality, it is hard to argue against their wider use.

The trend towards hybridisation, like the DSC and the crossover MCU, indicates that this diversity is only going to increase. It presents an interesting thought: at some point there will be an MCU suitable for almost any conceivable application. At that point, the need for SoCs, ASICs and even FPGAs could be questionable. Typically, these devices are used when there is no standard product available, but with such diversity and the expectation of even more, that may become a thing of the past.

One thing is certain: the reign of the MCU is set to continue for many years to come.

DesignSpark is your go-to engineering design platform from RS Components, providing free CAD software, online resources, and design support. Our flagship CAD software includes DesignSpark PCB, DesignSpark Mechanical and DesignSpark Electrical.
