
Automating PCB Inspection with Arduino UNO Q Part 1: Introduction

by Andrew Back

Arduino UNO Q and Edge Impulse

Automating inspection of printed circuit boards with Arduino UNO Q and Edge Impulse.

In this series of articles, we will take a look at how the unique capabilities of Arduino UNO Q (066-5593) may be used to seamlessly integrate AI and control of physical inputs and outputs, with user-friendly training of AI/ML models courtesy of the Edge Impulse software platform.

Note that machine learning (ML) is regarded as a subset of artificial intelligence (AI) and this series of articles may use both terms interchangeably.

The Application

Solder joints on PCB

Visual inspection of printed circuit boards (PCBs) is an important part of manufacturing quality control and can be implemented at numerous stages of the assembly process, including bare board and solder paste inspection, pre and post-reflow, and through-hole soldering where present.

Manual inspection can very quickly become time-consuming and error-prone, hence is only really practical with prototypes and small production runs, with volume manufacture typically using some form of automated optical inspection (AOI).

There are various machine vision techniques which may be utilised for AOI, such as pattern matching, feature-based and machine learning algorithms, with the latter being the most powerful given the inherent flexibility of ML, together with the many significant advances which have been made in related hardware and software technologies.

One such advance is the availability of hardware which can be used to embed AI at the edge with compelling size, weight, power and cost (SWaP-C), thereby removing a dependency on larger, expensive, power-hungry and remote compute resources. At the same time, AI models are increasingly being developed with embedded use in mind, and software platforms are available which greatly simplify the task of training these models and deploying them to embedded hardware.

Enter Arduino UNO Q and Edge Impulse.

Arduino UNO Q

Close-up of the Arduino UNO Q

Arduino UNO Q is an intelligent prototyping platform which features a hybrid design that combines a Linux-capable microprocessor (MPU) with a real-time microcontroller (MCU), thereby providing the best of both worlds. Greater than the sum of its parts, UNO Q sets about this with a particular focus on convenience, modularity and enabling the rapid prototyping of sophisticated solutions.

Hardware

Qualcomm Dragonwing QRB2210 System-on-Chip

The MPU capability is provided by a Qualcomm Dragonwing QRB2210 System-on-Chip (SoC) with a 4-core Arm Cortex-A53 complex running at 2.0 GHz. The SoC also features an Adreno 702 GPU running at 845 MHz, plus dual image signal processors (ISPs), and is equipped with either 2 GB RAM + 16 GB flash or 4 GB RAM + 32 GB flash.

ST STM32U585 with an Arm Cortex-M33 CPU

The MCU capability, meanwhile, is provided by an ST STM32U585 with an Arm Cortex-M33 CPU clocked at up to 160 MHz.

A USB-C port provides USB 3.1 with role-switching capabilities, support for DisplayPort Alt-Mode and USB Power Delivery (5V/3A contract only). A keyboard, monitor, mouse and other peripherals may be connected via a USB-C hub.

Arduino UNO headers

The instantly familiar Arduino UNO headers on the top side of the board provide access to analogue and digital I/O, and are compatible with a wide range of Arduino UNO shields.

A 2x5 header provides access to a UART console, plus inputs for forced USB boot and PMIC reset.

High speed headers

High-speed headers on the underside provide a mixture of camera/display interfacing, SDIO, GPIO and power. Whereas the classic Arduino headers use 3.3 V logic, these are a mixture of 3.3 V and 1.8 V, depending upon whether they are connected to the MCU or MPU.

WCBN3536A (Qualcomm WCN3980) wireless module

Lastly, a WCBN3536A (Qualcomm WCN3980) wireless module provides support for 802.11a/b/g/n/ac (dual-band) + Bluetooth 5.1.

Software

Arduino Software - Motion detection example

Motion detection example.

The MPU runs Debian Linux and can be interacted with and programmed much like any other Linux system, with a vast array of tools, languages, frameworks and more available.

The MCU runs the Zephyr real-time OS (RTOS) with the Arduino Core API, which means that it can run sketches just like a classic Arduino board and can be programmed using the familiar Arduino IDE.

The two halves of the platform are integrated via the Arduino Bridge, a remote procedure call (RPC) library which simplifies the task of creating applications which span the MPU and MCU.
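To give a feel for the remote procedure call pattern the Bridge implements, the following is a minimal pure-Python sketch of name-to-handler dispatch with serialised arguments. Note that the names used here (RpcBridge, register, call) are purely illustrative assumptions and are not the actual Arduino Bridge API — consult the Arduino documentation for the real interface.

```python
# Conceptual sketch of the RPC pattern an MPU/MCU bridge implements.
# Names here (RpcBridge, register, call) are illustrative only.
import json


class RpcBridge:
    """Minimal name -> handler dispatcher with JSON-serialised arguments."""

    def __init__(self):
        self._handlers = {}

    def register(self, name, handler):
        # On real hardware, registration would advertise the procedure
        # to the other processor over a transport (e.g. a serial link).
        self._handlers[name] = handler

    def call(self, name, *args):
        # Simulate the request/response round trip: arguments are
        # serialised, dispatched to the named handler, result returned.
        payload = json.loads(json.dumps({"method": name, "args": list(args)}))
        handler = self._handlers[payload["method"]]
        return handler(*payload["args"])


# Example: the "MCU side" registers a procedure the "MPU side" can call.
bridge = RpcBridge()
bridge.register("set_led", lambda state: f"LED {'on' if state else 'off'}")
print(bridge.call("set_led", True))  # -> LED on
```

The serialisation step matters: because arguments cross a processor boundary, only simple, serialisable types can be passed, which is also a practical constraint when designing applications that span the two halves of the board.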

This is further built upon by Arduino "Bricks", which are building blocks written in Python that may be packaged like a Python module or as a Docker container. The latter is used when deploying AI models and other similarly complex components which may have many dependencies.

Arduino “Bricks”

Bricks in App Lab.

A unified development environment is then provided by Arduino App Lab, which comes pre-loaded on UNO Q and enables applications to be created which integrate bricks and custom code, and execute across both the MPU and MCU. This can be used with an UNO Q connected locally via USB or remotely via WiFi, and also takes care of upgrading the device software and firmware.

Arduino App Lab

While it would be entirely possible to develop MCU code using the classic Arduino IDE and then separately develop Python or other code running on the MPU, App Lab provides a convenient single interface where sketch, Python code and other assets may be developed and run. It also expedites development by making it easy to add bricks and use examples as a starting point.

So we have an eminently compact Linux platform which is capable of running containerised AI models, an MCU for real-time workloads and interfacing with physical inputs and outputs, plus a feature-rich development environment which provides instant access to bricks and examples.

Next, we need to figure out how we’re going to train our ML model, which can be a laborious and not always straightforward task if we need to collect many images, process these and optimise for our application and target hardware. Fortunately, we have some help here also.

Edge Impulse

EDGE Impulse - Face Detection

Edge Impulse is an AI development platform which simplifies the task of training models and deploying them to devices. It provides a web-based studio where users are guided through data collection and processing, to model training and deployment, without having to write code. It includes support for Arduino UNO Q and deploying models to either the Qualcomm Dragonwing Arm processor complex or its Adreno 702 GPU.
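Once a trained object-detection model is running on the device, application code typically post-processes its output, for example keeping only detections above a confidence threshold. The following is an illustrative pure-Python sketch; the result structure and field names are simplified assumptions for this example, not the exact format returned by the Edge Impulse SDK.

```python
# Illustrative post-processing of object-detection output of the kind a
# deployed inspection model might produce. The result structure below is
# an assumed, simplified format, not the exact SDK output.

CONFIDENCE_THRESHOLD = 0.6  # discard low-confidence detections


def filter_detections(result, threshold=CONFIDENCE_THRESHOLD):
    """Keep only bounding boxes at or above the confidence threshold."""
    return [bb for bb in result["bounding_boxes"] if bb["value"] >= threshold]


# Example: two candidate solder-joint defects, one below the threshold.
result = {
    "bounding_boxes": [
        {"label": "solder_bridge", "value": 0.92, "x": 14, "y": 30},
        {"label": "solder_bridge", "value": 0.41, "x": 80, "y": 12},
    ]
}
defects = filter_detections(result)
print(len(defects))  # -> 1
```

Picking a sensible threshold is part of tuning an inspection application: too low and false positives trigger unnecessary rework, too high and genuine defects slip through.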

Next Steps

In Part 2, we’ll take a look at setting up the development hardware, before then getting up and running with Arduino App Lab and Edge Impulse, and preparing PCBAs for model training and testing.

  — Andrew Back

Open source (hardware and software!) advocate, Treasurer and Director of the Free and Open Source Silicon Foundation, organiser of Wuthering Bytes technology festival and founder of the Open Source Hardware User Group.