Autonomous Car on Digilent Zybo Z7
You have probably heard about self-driving cars or autonomous vehicles. But what are they? An autonomous car, also known as a driverless car, can guide itself without human intervention. Autonomous driving technology involves connectivity, sensor fusion, and deep learning, and it must meet the highest safety requirements. A Field Programmable Gate Array (FPGA) offers both high-performance computing and high reliability, so FPGAs are used to run vehicle-related algorithms that determine how the car functions and responds to obstacles.
Xilinx has recently released automotive-grade Zynq 7000 SoCs. The combined hardware and software programmability of the Zynq 7000 SoC architecture allows developers to integrate a complete Advanced Driver Assistance System (ADAS) imaging flow, from sensing through environmental characterization to feature implementation, into a single device.
Students from the University POLITEHNICA of Bucharest, Romania, used the Digilent Zybo Z7 to create a prototype of an autonomous car. The autonomous car is an OpenCV and Embedded Linux powered car with the following key features:
- Drive between lane lines
- Stop at obstacles
- Detect road signs
- Execute smart commands from RFID cards
- Navigate crossroads
Let’s dive in and see how they developed this amazing project.
The car consists of the following parts:
- Digilent Pcam 5C camera sensor
- Digilent Zybo Z7, powered by a (non-automotive-grade) Xilinx Zynq 7000 SoC
- RFID scanner
- 3-axis Accelerometer
- USB WiFi Adapter
- High speed & high torque metal gear servo
- Dual DC motor driver
- 5V Voltage regulator
- 6V Voltage regulator
- 2200mAh Li-Po battery
The hardware platform is built around a two-level plastic car frame, with all the additional mounting hardware and supports designed in CAD and 3D printed out of PLA plastic. The frame also includes two brushed DC motors with ample torque to power the car at a decent speed. The steering system resembles that of a go-kart (Ackermann steering), with the servo pushing one wheel hub, which transfers the motion to the second one via a pushrod. The steering system is fully 3D printed and requires minimal assembly. The frame itself requires additional holes and mounting points, depending on how you want to place the different components and sensors.
The front wheels steer left and right like on a normal car. The camera is mounted at the front of the car, with a fisheye lens added on top of the camera sensor to extend the field of view. An RFID scanner is mounted at the bottom of the vehicle, a sonar is mounted in the front bumper to prevent collisions, and the accelerometer is mounted on top of the vehicle.
The overall design block diagram is presented below.
Video pipelines and video processing - lane detection
The raw data coming from the Pcam 5C over the MIPI D-PHY lanes is interpreted and processed by the MIPI_D_PHY_RX block and the MIPI CSI-2 Receiver, and the resulting Bayer-format stream is then passed through an AXI_Bayer_to_RGB block, outputting a more usable AXI Stream signal. The resulting AXI Stream is split into two identical streams, each used for a different purpose. One of them is left untouched at 720p, 60 Hz, and the other is put through a series of image processing steps to obtain a grayscale image suitable for lane detection.
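The grayscale-conversion step is implemented in hardware, but the per-pixel arithmetic it performs can be sketched in a few lines. This is only an illustration, assuming the common BT.601 luma weights; the actual RTL pipeline may use different coefficients:

```python
def rgb_to_gray(r, g, b):
    """Approximate an RGB-to-grayscale stage with BT.601 luma
    weights; a hardware pipeline computes something equivalent
    for every pixel of the AXI Stream."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return min(255, max(0, round(y)))  # clamp to 8-bit range

# Pure red, pure green, and white pixels:
print(rgb_to_gray(255, 0, 0))      # 76
print(rgb_to_gray(0, 255, 0))      # 150
print(rgb_to_gray(255, 255, 255))  # 255
```

A thresholding or edge-detection stage would then operate on this single-channel image to pick out the lane lines.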
Sensor data acquisition and processing - RFID, Accelerometer, and sonar
The accelerometer and RFID reader are both I2C enabled and connected through the same bus and to the Processing System directly. The sonar is connected via its PWM output to a block that computes the PWM duty cycle and offers the resulting data to the Processing System via an AXI4-Lite link.
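To show what the duty-cycle measurement means to software, here is a sketch of converting an echo pulse width into a distance, assuming an HC-SR04-style sonar whose echo pulse is proportional to the round-trip time of sound. The function name and constants are illustrative, not the project's actual code:

```python
def pulse_to_distance_cm(pulse_us):
    """Convert a sonar echo pulse width (microseconds) into a
    distance in centimeters. The sound travels to the obstacle
    and back, so the one-way distance is half the round trip."""
    SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature
    return pulse_us * SPEED_OF_SOUND_CM_PER_US / 2

# A ~583 us pulse corresponds to an obstacle roughly 10 cm away.
print(round(pulse_to_distance_cm(583), 1))
```

In the actual design, the pulse width arrives as a count from the PWM-measurement block over AXI4-Lite, and the Processing System applies a scaling of this form.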
Motors and steering control
The ‘motors and steering’ controller, internally called Motion, is a block that manages the dual DC motor driver and a servo that directly controls the front steering. The block communicates with the Processing System via an AXI4-Lite link, and internally it consists of three PWM generators with synthesis-time customizable resolution and frequency parameters. The values chosen are: motors - 2x 16-bit, 100 kHz; servo - 1x 12-bit, 50 Hz. The servo requires a specific frequency and duty cycle to function properly.
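The servo numbers can be checked with a quick calculation: at 50 Hz the PWM period is 20 ms, and a 12-bit counter divides it into 4096 steps. A sketch of mapping a steering angle to a PWM compare value, assuming the common 1-2 ms hobby-servo pulse convention (the function and angle range are illustrative, not the project's actual driver code):

```python
def servo_compare_value(angle_deg, resolution_bits=12, freq_hz=50):
    """Compute the PWM compare value for a hobby servo, assuming
    the usual 1 ms (0 deg) to 2 ms (180 deg) pulse-width convention.
    With a 12-bit counter at 50 Hz, the 20 ms period spans
    2**12 = 4096 counts."""
    period_ms = 1000.0 / freq_hz            # 20 ms at 50 Hz
    counts_total = 2 ** resolution_bits     # 4096 counts per period
    pulse_ms = 1.0 + (angle_deg / 180.0)    # 1..2 ms pulse width
    return round(counts_total * pulse_ms / period_ms)

print(servo_compare_value(0))    # 1.0 ms -> 205 counts
print(servo_compare_value(90))   # 1.5 ms -> 307 counts
print(servo_compare_value(180))  # 2.0 ms -> 410 counts
```

The narrow 1-2 ms window out of a 20 ms period is exactly why the servo needs its own dedicated frequency and duty-cycle settings.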
Petalinux embedded Linux distribution
The project is centered around an Embedded Linux distribution from Xilinx, PetaLinux 2017.4. The Linux OS acts as a common ground between hardware and software and manages the processes. The modified kernel provides support for the physical devices, including the camera, sonar, motors, and servo, and supports the WiFi adapter via a modified USB driver. Xilinx has already included a Zynq I2C driver, used to connect with the camera, accelerometer, and RFID reader.
Main control application
The main control application is split into the following components:
- Lane component: positions the car on the correct path of the track and adjusts its speed to match the road.
- Sign component: detects stop signs.
- RFID component: correctly detects and stores RFID cards placed at key points of the road.
- Display component: displays relevant images to the user.
- Configuration/Calibration component: lets the user set important parameters through a separate file, so the program does not need to be recompiled; the configuration file can be overwritten to match the current road conditions.
All components (except the configuration/calibration one) run in loops, and each iteration corresponds to one frame.
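That per-frame structure can be sketched as follows; every name here (control_loop, the stub components) is a hypothetical stand-in for the project's components, not its actual API:

```python
def control_loop(frames, lane, sign, rfid, display):
    """Run one pass of each component per camera frame, as in the
    main control application: one loop iteration == one frame."""
    log = []
    for frame in frames:
        steering, speed = lane(frame)    # lane component: path + speed
        if sign(frame):                  # sign component: stop sign found
            speed = 0                    #   -> stop the car
        card = rfid(frame)               # RFID component: smart commands
        display(frame, steering, speed)  # display component
        log.append((steering, speed, card))
    return log

# Tiny dry run with stub components; the second "frame" contains a stop sign.
out = control_loop(
    frames=[1, 2],
    lane=lambda f: (0.1 * f, 30),
    sign=lambda f: f == 2,
    rfid=lambda f: None,
    display=lambda *args: None,
)
print(out)
```

The real components also carry state between frames (e.g. stored RFID commands), but the frame-synchronous loop is the shared skeleton.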
The project is open source, and the source files can be downloaded from the Digilent Design Contest website.