
Useless Machine: Make a Mess & have Fun! Part 3

In Part 1, I described the base part of a rather useless machine. Part 2 covered the top part of it and described the preparation of the RevPi Core (installation of libraries, etc.). In this last part, I explain the central control software written in Python and show how I used OpenCV to calculate the fluid levels. Please read Part 1 to understand why a useless machine can have a purpose and what the UM2 is all about.

Although OpenCV is a mighty library, I ended up with over 1700 lines of Python code (see attached file). One reason is the GUI. I wanted to give the user an impression of what is going on behind the scenes, so the 7” display shows the only input (besides the command buttons) coming from the camera. OpenCV provides functions to display pixel arrays (i.e. pictures) and to manipulate them by drawing 2D geometric shapes and text. Using these features, I could overlay the calculated results of the level detection. The result looks like this screenshot:

[Image: screenshot02, the camera view with the level-detection overlay]
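Purely as an illustration of these drawing functions (this is not code from UM01.py, and all coordinates, colours and texts are placeholders), such an overlay could be produced like this:

import cv2
import numpy as np

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # stand-in for a captured camera frame

# white-bordered analysis box around one container (example coordinates)
x, y, w, h = 200, 150, 60, 220
cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), 1)

# cyan-bordered result box plus a yellow text overlay with the calculated level
cv2.rectangle(frame, (300, 150), (360, 370), (255, 255, 0), 1)      # BGR: cyan
cv2.putText(frame, "level: 42 %", (300, 140),
            cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 255), 1)        # BGR: yellow

cv2.imshow("UM2", frame)
cv2.waitKey(0)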

You can see a lot of interior in the background. I thought it would be much better to test the UM2 against a realistic environment instead of a nice single-coloured backdrop. There is no object recognition involved, but the average pixel values do influence the camera's white balance and exposure.

I’m using fixed areas of the picture for level detection, which I can safely do because the camera is rigidly mounted in its position, so the objects will always appear at the same pixels of a picture frame. You can see the areas I use for level detection: they are the white-bordered boxes. I take these areas and mask them with three colour masks (one for green, one for red and one for blue). The result for the left field is tinted green, scaled to the size of the cyan-bordered boxes and finally copied inside one of these boxes. The procedure is analogous for the oil.
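A minimal sketch of this masking step for one green water column, with example area coordinates and HSV limits (the real values come from the setup screen shown further down):

import cv2
import numpy as np

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)      # stand-in for the camera frame

# fixed white-bordered analysis area (example coordinates)
x, y, w, h = 200, 150, 60, 220
roi = frame[y:y + h, x:x + w]

# colour mask for green in HSV space (example limits)
hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
lower_green = np.array([40, 80, 50])
upper_green = np.array([85, 255, 255])
mask = cv2.inRange(hsv, lower_green, upper_green)       # 255 where a pixel counts as green

# tint the detected pixels green, scale to the cyan result box and copy it into the frame
tinted = np.zeros_like(roi)
tinted[mask > 0] = (0, 255, 0)
box_x, box_y, box_w, box_h = 300, 150, 60, 220          # position/size of the cyan box
frame[box_y:box_y + box_h, box_x:box_x + box_w] = cv2.resize(tinted, (box_w, box_h))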

The number of coloured pixels divided by the total number of box pixels gives the filling grade in %. For the middle container, I need three masks (resulting in three cyan-bordered boxes). The blue colour mask results in two little blue fields, which represent the blue marker lines. I let OpenCV find their circumference, smooth them (see the tiny black lines in the middle overlay) and take the average vertical pixel position (= Y-value) between the horizontal lines of the circumference as the marker position. The yellow lines in the overlay represent the levels found. These levels are used to divide the middle white-bordered area into three parts, which enables me to calculate the percentage of red and green filling in each of the three sections.
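The percentage and marker-line calculations could be sketched as follows (assuming OpenCV 4; the contour handling here is simplified to bounding boxes instead of the smoothed circumferences described above):

import cv2

def fill_percentage(mask):
    # coloured pixels divided by the total number of box pixels, in %
    return 100.0 * cv2.countNonZero(mask) / mask.size

def marker_y_positions(blue_mask):
    # find the outlines of the two little blue fields and take the vertical
    # centre of each bounding box as the marker position (Y-value)
    contours, _ = cv2.findContours(blue_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return sorted(y + h // 2 for (x, y, w, h) in map(cv2.boundingRect, contours))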

So I end up with eight numeric level values to control the UM2.

Things could be so easy, but reality is not. It is full of pitfalls and perfidy…

I knew from the UM1 experiment that you should choose a camera that lets you switch off its auto white balance and auto exposure. These functions may be excellent for regular webcam use, but they integrate over the whole picture and thus might drastically change the pixel values of the fluids and make you highly dependent on ambient light conditions (one of the reasons for the UM1 disaster I mentioned in Part 1). This project also needs a camera with adjustable focal length and focus because I need to capture all three containers from just 30 to 80 cm distance at a 90° angle.

So I chose the ELP 1080P, a full-HD two-megapixel camera with 30 fps and a maximum resolution of 1920 x 1080. It has a 2.8-12 mm varifocal lens. My first tests used a Linux driver that offered manual control of white balance and exposure.

That could have been fine, but it turned out that the driver is undocumented and buggy. Manually setting the white balance results in a horribly yellow-tinted picture, and setting the exposure value is nearly impossible because the value is not a linear function but more or less randomly jumps from over- to underexposure.
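For reference, this is the kind of call one would normally try from OpenCV to switch the automatic functions off; with this camera and driver the calls were accepted but behaved as described above, so treat the property values purely as an illustration:

import cv2

cap = cv2.VideoCapture(0)                      # device index of the USB camera

cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 1)         # 1 = manual, 3 = auto on many V4L2 drivers
cap.set(cv2.CAP_PROP_EXPOSURE, 100)            # exposure in driver-specific units
cap.set(cv2.CAP_PROP_AUTO_WB, 0)               # disable auto white balance (if supported)
cap.set(cv2.CAP_PROP_WB_TEMPERATURE, 4500)     # colour temperature in Kelvin (if supported)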

As I did not like the idea of buying a dozen more Chinese cameras only to find out that their drivers were sloppily programmed, I decided to live with auto exposure and auto white balance switched on. This, in turn, caused me to write several hundred more lines of code to get a UI-driven setup that compensates for all possible ambient light problems. This is what the screen looks like in setup mode:

[Image: screenshot01, the screen in setup mode]

The additional white numbers show setup values. Hovering the mouse over one of them and turning the scroll wheel changes its value. They mainly define the colour masks’ parameters. As I work in the HSV colour system (“hue”, “saturation”, “value”), each mask must define upper and lower limits for these three colour values. One of the value triplets displays the colour values of the original picture’s pixel under the cursor (the example shows 30, 113, 158), which helps me find the correct limits.

I can also move the white boxes which mark the analysed picture areas horizontally to the left or right by double-clicking on the position of the container where the left side of the box shall be. With this functionality, I can move an area out of any light reflections caused by potentially intense light from the sides (see the right side of the red tank to get what I mean).
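Both interactions can be handled with an OpenCV mouse callback; the following is only a rough sketch with invented names, and whether the scroll-wheel event is delivered at all depends on the OpenCV GUI backend:

import cv2
import numpy as np

params = {"h_low": 40}                        # one setup value changed with the scroll wheel
state = {"cursor_hsv": (0, 0, 0), "box_x": 200}

def on_mouse(event, x, y, flags, frame_hsv):
    if event == cv2.EVENT_MOUSEMOVE:
        state["cursor_hsv"] = tuple(int(v) for v in frame_hsv[y, x])   # e.g. (30, 113, 158)
    elif event == cv2.EVENT_MOUSEWHEEL:
        params["h_low"] += 1 if flags > 0 else -1    # wheel direction is encoded in flags
    elif event == cv2.EVENT_LBUTTONDBLCLK:
        state["box_x"] = x                           # new left edge of the analysis box

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)    # stand-in for the camera frame
frame_hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
cv2.namedWindow("UM2")
cv2.setMouseCallback("UM2", on_mouse, frame_hsv)
cv2.imshow("UM2", frame)
cv2.waitKey(0)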

Several keyboard letters act as switches for manually turning valves and pumps on and for saving the setup values, which gives me the freedom to test several different system conditions. As I can directly observe the results of changing the parameters, it is not rocket science to adapt these values to the current ambient light conditions. Maybe I will introduce learning algorithms in a later version to supersede this manual process.
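The keyboard switches fit naturally into the main loop via cv2.waitKey(); the letters and the file name below are examples only, not necessarily those used in UM01.py:

import cv2
import json
import numpy as np

params = {"h_low": 40, "h_high": 85}          # example setup values
pump_water_on = False
frame = np.zeros((480, 640, 3), dtype=np.uint8)

while True:
    cv2.imshow("UM2", frame)                  # in the UM2 this is the frame with all overlays
    key = cv2.waitKey(200) & 0xFF             # also paces the cycle
    if key == ord('p'):
        pump_water_on = not pump_water_on     # manual switch for the water pump
    elif key == ord('s'):
        with open("setup.json", "w") as f:    # save the current setup values
            json.dump(params, f)
    elif key == ord('q'):
        break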

There are additional yellow text overlays for the machine state and health (like CPU core temperature and CPU frequency). White text boxes show any prompts to the user. The black box with arrows and red or green letters “P” and “V” shows which pump and valve is currently switched on.
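The RevPi Core is based on a Raspberry Pi Compute Module, so such health values can usually be read from the standard Linux sysfs files; the paths below are an assumption, not a quote from UM01.py:

def cpu_temperature():
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read()) / 1000.0                  # millidegrees Celsius -> °C

def cpu_frequency():
    with open("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq") as f:
        return int(f.read()) / 1000.0                  # kHz -> MHz

print("CPU: %.1f C at %.0f MHz" % (cpu_temperature(), cpu_frequency()))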

The control algorithm is a classical state machine (I show the actual state as a yellow text overlay in the upper right corner) running with a 200 ms cycle time. This gives the camera and OpenCV enough time to do their work and leaves me with five cycles per second, which is enough to react to the process values. There are several “sub-state-machines” which coordinate the spectacular light show effects.
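Stripped of all details, that loop looks roughly like the sketch below; the state names and the two helper functions are stand-ins for the real OpenCV and I/O code:

import time

CYCLE_TIME = 0.2                                # 200 ms -> five cycles per second

def measure_levels():
    # stand-in for the OpenCV part: eight level values plus a health flag
    return [0.0] * 8, True

def switch_all_outputs_off():
    # stand-in for writing the pump/valve outputs in the process image
    pass

state = "IDLE"
while True:
    start = time.time()
    levels, healthy = measure_levels()
    if not healthy:
        state = "FAULT"                         # camera problem: stop everything
    if state == "FAULT":
        switch_all_outputs_off()
    elif state == "IDLE":
        pass                                    # wait for a user command
    # ... further states and the light-show sub-state-machines ...
    time.sleep(max(0.0, CYCLE_TIME - (time.time() - start)))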

The OpenCV calculations not only deliver the eight actual level values but also provide a kind of “health value” for them. Whenever I cannot detect the blue section divider lines, or when I discover a zero level in the outer containers, I assume that there is some camera problem. This observation is converted into a Boolean health value. The state machine immediately enters a fault state, switches all pumps off and closes the valves. It then displays a warning message with the most likely reason for this interruption:

[Image: the warning message shown on the display]
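The health check itself can be as simple as the following sketch; the function and argument names are invented for illustration:

def levels_healthy(marker_positions, left_level, right_level):
    # both blue divider lines must have been found, and the outer containers
    # must not report an (implausible) zero level
    if len(marker_positions) != 2:
        return False
    if left_level <= 0.0 or right_level <= 0.0:
        return False
    return True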

There is another safety level built in: before switching on any of the valves and pumps, I always check some safety rules. If the actual level values do not comply with these rules, I do not transition to the “on” states but instead change into a safety-violation state. For example, the pump and valve for emptying the oil of the middle container can only be switched on if the upper outlet is not in contact with water; therefore the green level of the middle of the three sections must be zero.
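Expressed as code, such a rule could look like the sketch below; the dictionary key is an invented name for one of the eight level values:

def may_empty_oil(levels):
    # the upper outlet of the middle container must not touch water,
    # i.e. the green level of the middle section has to be zero
    return levels["middle_mid_green"] == 0.0

# before switching pump and valve on:
# if not may_empty_oil(levels):
#     state = "SAFETY_VIOLATION"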

Any illogical level combination (e.g. the upper and lower sections show levels higher than zero, but the middle section shows a zero level for one fluid) results in a transition to the “messy state”, where the machine is halted and the user must restart it after resolving the situation.
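And the corresponding plausibility check, again with invented key names:

def is_messy(levels, colour):
    # upper and lower sections of the middle container show the fluid,
    # but the middle section claims a zero level for it
    return (levels["middle_top_" + colour] > 0.0
            and levels["middle_bot_" + colour] > 0.0
            and levels["middle_mid_" + colour] == 0.0)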

Using a software watchdog

Without mistakes, software writing would be boring, right? It’s debugging that makes things exciting! But what if a mistake is not forgiven by the system and ends in a system crash just after switching on the water pump? The pump continues sucking fluid out of the middle container regardless of whether it is green water or red oil. The mess would be perfect, and a full-day cleaning session could be the result.

So better to introduce some kind of supervision. This is what my software watchdog does. For those of you who know that the RevPi’s processor provides a hardware watchdog: yes, it does indeed, but I could not use it here because its timeout period is several seconds, which is much too slow. So I added a little Python script which observes one of the unused digital outputs in the process image. If this output does not change its state at least every 500 ms (the watchdog’s cycle time), the watchdog fires: it sends a reset command to the PiControl backplane bus driver, which forces all outputs to “off” (assumed to be the safe state).
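A minimal sketch of such a watchdog, assuming the revpimodio2 library and an unused digital output named WatchdogBit in the PiCtory configuration (both are assumptions; the original wd.py sends the reset command to the PiControl driver instead of merely clearing the outputs):

import time
import revpimodio2

TIMEOUT = 0.5                                    # watchdog cycle time in seconds

rpi = revpimodio2.RevPiModIO()
rpi.readprocimg()
last = rpi.io.WatchdogBit.value

while True:
    time.sleep(TIMEOUT)
    rpi.readprocimg()
    if rpi.io.WatchdogBit.value == last:         # control program did not toggle the bit
        rpi.setdefaultvalues()                   # force all outputs to their safe defaults
        rpi.writeprocimg()
        break
    last = rpi.io.WatchdogBit.value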

The watchdog is a short and simple script which can be tested 100% and thus can be declared safe. It is started together with the control program, which in turn is responsible for toggling the observed digital output every 200 ms cycle, thus preventing the watchdog from firing. So if my complex Python control program should ever get stuck (or is forced to stop by the user), my little watchdog will react within at most half a second and shut off all valves and pumps.

Please find the code for the control program ("UM01.py") and the watchdog ("wd.py") attached to this article. If you want to follow any of the above instructions and run into problems, please do not hesitate to leave your questions or comments below. I will try to answer them thoroughly.

Would you like to watch the machine working? Well, here it is:

Volker de Haas started electronics and computing with a KIM1 and machine language in the 70s. Then FORTRAN, PASCAL, BASIC, C, MUMPS. Developed complex digital circuits and analogue electronics for neuroscience labs (and his MD grade). Later: database engineering, C++, C#, industrial hard- and software developer (transport, automotive, automation). Designed and constructed the open-source PLC / IPC "Revolution Pi". Now offering advanced development and exceptional exhibits.