
Sponsored by: NVIDIA
Industrial AI - Part 5: Getting Started with Jetson Nano / Orin, and Why Octopus Brains are ML Marvels.
Getting Started with NVIDIA Jetson Nano.
Many engineers are familiar with Arduino and Raspberry Pi boards, sometimes referred to as ‘Single Board Computers’ (SBCs). The general difference is that the former are microcontrollers, and the latter are microprocessors. In essence, an Arduino is better for controlling servos, switches, lights, etc. (and you can switch it on/off abruptly, without shutting it down!), whereas with a Raspberry Pi the emphasis is more on processing: going online, using APIs and suchlike. Both run on what you’d call a CPU (Central Processing Unit) chip.
A Graphics Processing Unit, or GPU, has many more cores, but each handles smaller, more discrete tasks - in parallel. GPUs originated in the gaming industry, where a lot of pixels in a 3D world needed to be rendered simultaneously. The key difference between the two is that GPUs excel at applying the same operation across large amounts of data at once, and as such are ideal for Machine Learning.
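As a rough illustration of that ‘same operation over lots of data’ idea, here’s a sketch using NumPy. NumPy vectorises the operation on the CPU; on a Jetson, the equivalent CUDA/CuPy call would fan the work out across the GPU’s many cores. The function names here are my own, purely for illustration:

```python
import numpy as np

# 'CPU-style' loop: brighten one pixel at a time, one after another.
def brighten_loop(pixels, amount):
    out = []
    for p in pixels:
        out.append(min(p + amount, 255))  # clamp at max brightness
    return out

# 'GPU-style' data-parallel version: the same operation applied to
# every pixel at once. NumPy vectorises this on the CPU; on a Jetson
# the equivalent CuPy/CUDA call runs across thousands of GPU cores.
def brighten_parallel(pixels, amount):
    return np.minimum(np.asarray(pixels) + amount, 255)

pixels = [10, 100, 200, 250]
print(brighten_loop(pixels, 20))      # [30, 120, 220, 255]
print(list(brighten_parallel(pixels, 20)))
```

Both produce the same result; the point is that the second form expresses the operation over the *whole* array, which is exactly the shape of work a GPU accelerates.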
Aside from graphics rendering, GPUs are also excellent at language translation, which is why ‘Large Language Models’ feature heavily in AI/ML applications. Translation was one of the first applications of the ‘multi-mini-parallel-core’ architecture of GPUs: it requires not only vast repositories of words, but also vast sets of ‘rules’ about how we put words next to one another to make a coherent sentence.
The reason for this preamble is that the Jetson Nano “C100” (252-0055) Dev Kit from NVIDIA is somewhat of an ‘all-in-one’, with a significant number of GPIO pins to allow for a wider range of peripherals: 3V3 and 5V power rails, and I2C, UART and SPI interfaces (as well as the 2 ribbon-cable camera inputs, and Ethernet and USB ports as standard). However, like a Raspberry Pi, it needs to be ‘shut down’ properly, so it’s not as robust as a ‘bulletproof’ Arduino microcontroller, which you can simply ‘turn off’.
'Jetson Nano / Orin' vs 'Raspberry Pi + AI Hat'.
Cards on the table, I’m of course blogging about the Jetson Nano / Orin 'universe' here as it’s what I’m being paid to do. However, objectively, the reason for using a Jetson system is that it can not only run ‘inference’ (executing a trained model on new data), but can also *train* (or refine and improve) a given machine learning model on the device itself - hence ‘Development Kit’. With a Raspberry Pi you’d need to train the model elsewhere (on a laptop/computer); you can then run it on a Raspberry Pi + AI Hat, but you can’t change the model as easily ‘on the go’ as with a Jetson.
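To make the training-vs-inference distinction concrete, here’s a deliberately tiny, framework-free sketch - a toy perceptron with entirely made-up ‘apple’ numbers, not anything NVIDIA ships:

```python
# Toy 'good apple vs bad apple' classifier on two made-up features
# (say, redness and firmness, each scaled 0-1). A bias input of 1.0
# is appended so the model can learn a decision threshold.

def infer(weights, features):
    """Inference: apply frozen weights to new data - the cheap step,
    which a Raspberry Pi + AI Hat handles happily."""
    score = sum(w * f for w, f in zip(weights, features + [1.0]))
    return "good" if score > 0 else "bad"

def train(weights, examples, lr=0.1, epochs=20):
    """Training: repeatedly nudge the weights on labelled examples -
    the compute-heavy step a Jetson can also do, on-device."""
    for _ in range(epochs):
        for features, label in examples:
            target = 1 if label == "good" else -1
            score = sum(w * f for w, f in zip(weights, features + [1.0]))
            if (score > 0) != (target > 0):  # misclassified: update
                weights = [w + lr * target * f
                           for w, f in zip(weights, features + [1.0])]
    return weights

examples = [([0.9, 0.8], "good"), ([0.2, 0.1], "bad"),
            ([0.8, 0.9], "good"), ([0.1, 0.3], "bad")]
weights = train([0.0, 0.0, 0.0], examples)
print(infer(weights, [0.85, 0.9]))  # expect: good
```

The `train` loop is the part a Pi-only setup would have to do elsewhere; `infer` is all it needs at runtime.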
For example, if you’ve trained your model on 1000 photos of ‘perfect apples’ and 1000 photos of ‘bad apples’, and your model is unlikely to need updating - after all, apples are not going to ‘evolve’ or ‘iterate’ anytime soon - you’re good. Like Makoto in Japan with his cucumbers, load it onto a Raspberry Pi + AI Hat, and away you go!
Conversely, if like Lion Vision you have a ‘moving feast’ situation - where batteries keep changing, products keep appearing in new enclosures, and other ‘confusing’ things enter the consumer market - you’ll need to keep re-training and adapting the model, so having a dynamic system like a Jetson is perhaps advisable...
Above: NVIDIA Jetson Nano, “C100”, selling for under £180. (RS Online - see full range).
So the choice is yours - but as this whole series is about Training AND Running machine learning models, under circumstances where the model needs updating and refining with new data regularly (as would be the case in most professional situations, where improving quality/throughput/accuracy is an ongoing goal), the Jetson is the natural fit here.
Above: Unscrewing and unclipping the Machine Learning ‘execute’ top PCB board, from the ML ‘training’ base PCB board.
Taking apart the Dev Kit: the larger ‘base board’ is what’s used for training/changes to the model, and the ‘Machine Learning board’ is affixed to it with some clips and two screws for security. It’s a simple operation to remove, and in principle, if you had the ‘dock’, you could do away with the Dev ‘base board’ once trained, if you so wished (or perhaps because you needed a smaller package size).
Multiple ‘Sensory’ Inputs.
Above: Image 1: Jetson Nano, with Raspberry Pi camera on the left, and a Temp/RH% sensor; CO2 Sensor; and Thermocouple, on the right. Image 2: GPIO Pin Outs for Jetson Nano. Link.
As mentioned earlier, the Jetson Nano can connect to a wider range of peripherals - a few examples are pictured above. Where I suspect it’ll get interesting is seeing how multiple ‘inputs’ combine to build more nuanced models. After all, we as humans try to use ‘all our 5 senses’ [and then some] to make intelligent decisions… AI just needs to ‘catch up’ to multi-faceted inputs, and this gear seems well-equipped for such needs!
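As a hedged sketch of how multiple inputs might be combined, the snippet below flattens a set of hypothetical sensor readings into the fixed-order numeric vector an ML model would expect. On real hardware the values would come from I2C/UART reads over the GPIO header, not a hard-coded dictionary:

```python
# Hypothetical snapshot from sensors like those pictured above - on a
# real Jetson these would be I2C/UART reads, not hard-coded values.
def read_sensors():
    return {"temp_c": 24.5, "humidity_pct": 41.0,
            "co2_ppm": 650, "thermocouple_c": 102.3}

def to_feature_vector(readings, order=("temp_c", "humidity_pct",
                                       "co2_ppm", "thermocouple_c")):
    """Flatten named readings into the fixed-order numeric vector a
    machine-learning model expects as input - the 'fusion' step that
    lets one model reason over several 'senses' at once."""
    return [float(readings[key]) for key in order]

features = to_feature_vector(read_sensors())
print(features)  # [24.5, 41.0, 650.0, 102.3]
```

Keeping the feature order fixed matters: a trained model only ‘knows’ its inputs by position, so every reading must always land in the same slot.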
Why You Should Marvel at Octopuses Having 9 Brains.
I was speaking with Johnny Núñez from NVIDIA in preparation for some future AI/ML projects we are cooking up, and I asked, ‘so what’s a big USP that the Jetson Orin [the newer version of the Jetson Nano] has?’. He considered this, and although there were various, the stand-out for me was the ‘networking’ capability - which in my terms is a bit like ‘daisy chaining’, in that you can have ‘one-to-many’ connected together.
Anyway, weeks later (whilst writing this article series), my son was enthusiastically telling me that an Octopus has 9 brains - 1 ‘main brain’ in its head, and 1 in each of its tentacles. Because I’d been mulling away trying to find a nice metaphor for what Johnny had told me, it instantly struck me that this was a perfect example. (I checked with Johnny in case this was too crude, and he said it was good enough - so take it with a pinch of salt, but the point is to get your imagination thinking in new ways!)
(It also turns out they can lose and regrow a tentacle! More on the relevance of that later.)
Above: Octopus with 9 Brains. Image Credits: Scientific American.
What I love about this is that it reminded me a bit of Terminator 2: when the ‘bad’ robot plunges an iron pole through the ‘good’ robot, there is a scene where his circuits ‘short out’, and he sort of ‘dies’. The ‘bad’ robot begins its rampage of terror, unimpeded by the ‘good’ robot - but then the ‘good’ robot re-routes its power, reboots, removes its kebabbed pole, and goes on [spoiler] to save the day. Although somewhat a Sci-Fi blockbuster movie moment, this is based in some fact.
Above: Scene where ‘good’ robot, reroutes power, after sustaining critical damage. Link/Credit: YouTube.
I worked in California in 2008, and one of the projects was designing ‘hot swappable’ parts for Data Centres, for a big player in the industry. What sounds ‘easy’ is actually very hard: changing a part of a computer whilst it is powered on (ie ‘hot’) requires all sorts of clever engineering to stop it short-circuiting and creating issues, etc.
Anyway, my point being that the Jetson Orin offers some interesting opportunities:
1. To create ‘Primary’ and ‘Secondary’ ML systems.
Considering an Octopus’s ‘main brain’ and its ‘tentacles’, one might assume this is because their tentacles are so complex (hundreds of suckers, sensors, muscles, and more) that they need their own ‘intelligence’ running locally. It would perhaps be less efficient to run all that ‘processing’ over a ‘long nervous system connection’. Nature is inherently efficient, so although I’m not an expert, my guess would be that this is why they evolved this way.
Other metaphors may include a ‘Conductor’ and the ‘members of an Orchestra’, although this may be a bit contentious, as it implies ego - which Jetson computers don’t worry about so much! When you consider networks of this kind, it also means that issues like Data Privacy might be improved: a local AI might parse images of a sensitive nature, compile the output, and relay only that to the ‘main computer’ (which is connected to the web) - effectively ‘isolating’ the sensitive/private from the public.
Likewise, it might be that some ML algorithm is so ‘top secret’ that the ‘main computer’ can only know the ‘output’. Let’s say you’re making KFC chicken with a robot: the people cooking the chicken do not need to ‘see’ what ‘secret blend of herbs and spices’ is being combined in the room next door. They just get the ‘resultant’ spice mix.
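The ‘octopus’ pattern could be sketched like this - all the names and data here are hypothetical, purely to show that the ‘main brain’ only ever sees summaries, never the raw (sensitive) inputs:

```python
# Hypothetical 'tentacle' node: sees the raw (sensitive) images,
# but only ever reports an aggregate summary upstream.
def secondary_node(raw_images):
    flagged = sum(1 for img in raw_images if "sensitive" in img)
    return {"images_seen": len(raw_images), "flagged": flagged}

# Hypothetical 'main brain': receives summaries only - it never
# handles raw data, which stays on the secondary device, so the
# web-connected node is isolated from the private material.
def primary_node(summaries):
    return {"total_seen": sum(s["images_seen"] for s in summaries),
            "total_flagged": sum(s["flagged"] for s in summaries)}

batch_a = ["photo1", "sensitive_photo2", "photo3"]
batch_b = ["sensitive_photo4"]
report = primary_node([secondary_node(batch_a), secondary_node(batch_b)])
print(report)  # {'total_seen': 4, 'total_flagged': 2}
```

In a real deployment the ‘secondary’ nodes would be separate Jetson units passing summaries over the network; the structure of who-sees-what is the point here.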
2. Redundancy, Failure and Hot Swaps.
If you imagine that an Octopus can lose a tentacle, so too an ML system can ‘lose’ (or perhaps just ‘temporarily lose’) a ‘Secondary’ Jetson unit - perhaps because it needs to be repaired, upgraded, maintained, or have new inputs added - but the rest of the system can operate regardless.
This also opens up a lot of possibilities for ‘backups’, so you might have 2 ML Jetsons running in case one fails - say, for a flight controller system. You’d be able to land with one broken and repair it, while the remaining one does all the tasks alone.
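A minimal sketch of that failover idea - the worker functions are invented for illustration, and a real system would of course involve health checks, watchdogs, and much more:

```python
# Try each redundant worker in turn; if one fails ('loses a
# tentacle'), the next takes over, and the task still completes.
def run_with_failover(task, workers):
    last_error = None
    for worker in workers:
        try:
            return worker(task)
        except Exception as err:  # this worker is down - try the next
            last_error = err
    raise RuntimeError("all workers failed") from last_error

def healthy_worker(task):
    return f"handled: {task}"

def broken_worker(task):
    raise IOError("unit offline for repair")

print(run_with_failover("stabilise", [broken_worker, healthy_worker]))
# handled: stabilise
```

The caller never needs to know which unit did the work - only that *some* unit did, which is the essence of redundancy.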
Orchestrated Machine Learning, Inspired by Nature.
I find this an interesting aspect of the evolution of ML - to consider the ‘orchestrated’ nature of things. Even as humans, we are increasingly considering the possibility of a ‘gut intelligence’ outside of our heads. This mindset-shift is new to me at present, but I think it will only become more commonplace as we expand our creative affinity for ‘thinking more like a robot’ - or indeed, realise that, as is so often the case, Mother Nature has been working on this stuff for eons. We’re just catching up to the marvels of biology. One just hopes we realise the ever-increasing urgency to protect such a wondrous and sublime intelligence, which will surely inspire us to expand our consciousness and humanity if we care for it accordingly.
Industrial AI Blog Series Contents:
Part 1: Lion Vision, AI vs Automation, and Why a Game of ‘Go’ Changed Everything.
Part 2: “Dirty, Dangerous, Difficult & Dull” - The Case for Ethical AI Automation.
Part 3: Key ML Terminology: Are You 'Sorting Ingredients' or 'Baking Cakes'?
Part 4: ML Lessons from Lion Vision. AI Failures, and ‘Sensing Like A Robot’.
Part 5: Getting Started with Jetson Nano / Orin. And Why Octopus Brains are ML Marvels.
Part 6: A *Shiny* Idea, Whilst at Lion Vision: “Hi Vis Batteries”. And Why You Need Underdog Engineers.
Are you looking for additional information on Lion Vision?