
Autonomous Car on Digilent Zybo Z7

You have probably heard about self-driving cars or autonomous vehicles. But what are they? An autonomous car, also known as a driverless car, can guide itself without human intervention. Autonomous driving technology involves connectivity, sensor fusion, and deep learning, and it must meet the most stringent safety requirements. A Field Programmable Gate Array (FPGA) offers high-performance computing combined with high reliability, which is why FPGAs are used to run vehicle-related algorithms that determine how the car behaves and responds to obstacles.

Xilinx has recently released automotive-grade Zynq 7000 SoCs. The combined hardware and software programmability of the Zynq 7000 SoC architecture allows developers to integrate a complete Advanced Driver Assistance System (ADAS) imaging flow, from sensing through environmental characterization to feature implementation, into a single device.

Students from the University POLITEHNICA of Bucharest, Romania, used the Digilent Zybo Z7 to create a prototype of an autonomous car. The vehicle is powered by OpenCV and Embedded Linux and has the following key features:

  1. Drive between lane lines
  2. Stop at obstacles
  3. Detect road signs
  4. Execute smart commands from RFID cards
  5. Navigate crossroads

[Image: Zybo autonomous car]

Let’s dive in and see how they developed this amazing project.

The car consists of the following parts.  

  1. Digilent Pcam 5C camera sensor (174-1555)
  2. Digilent Zybo Z7 (164-3487), powered by the non-automotive-grade Xilinx Zynq 7000 SoC
  3. Sonar 
  4. RFID scanner 
  5. 3-axis Accelerometer
  6. USB WiFi Adapter
  7. High speed & high torque metal gear servo
  8. Dual DC motor driver
  9. 5V Voltage regulator
  10. 6V Voltage regulator
  11. 2200mAh Li-Po battery  

The hardware platform is built around a plastic car frame on 2 levels, with all the additional mounting hardware and supports designed in CAD and 3D printed out of PLA plastic. The frame also includes 2 brushed DC electric motors with ample torque to power the car at a decent speed. The steering system resembles that of a go-kart (Ackermann steering), with the servo pushing one wheel hub, which in turn transfers the motion to the second hub via a pushrod. The steering system is fully 3D printed and requires minimal assembly. The frame itself requires additional holes and mounting points, depending on how you want to place the different components and sensors.

The front wheels steer left and right like those of a normal car. The camera is mounted at the front of the car, with a fisheye lens added on top of the camera sensor to extend the field of view. An RFID scanner is mounted at the bottom of the vehicle, a sonar is mounted in the front bumper to prevent collisions, and the accelerometer is mounted on top of the vehicle.

[Image: Camera]

[Image: Car design]

[Image: Pmod ACL accelerometer]

[Image: RFID scanner]

System Design

The overall design block diagram is presented below. 

[Image: System block diagram]

Hardware Design

Video pipelines and video processing - lane detection

The raw data coming from the Pcam 5C over the MIPI PHY lanes is interpreted and processed by the MIPI_D_PHY_RX and MIPI CSI-2 Receiver blocks, and the resulting Bayer-format stream is then passed through an AXI_Bayer_to_RGB block, producing a more usable AXI Stream signal. The resulting AXI Stream is split into two identical streams. One of them is left untouched at 720p, 60 Hz, while the other is put through a series of image processing steps to obtain a grayscale image suitable for lane detection.
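For readers more familiar with software than FPGA fabric, the sketch below shows a rough OpenCV equivalent of such a grayscale preprocessing stage. In the actual project these steps are implemented in hardware, and the particular filter chain and threshold values shown here are assumptions for illustration only.

```cpp
// Illustrative software equivalent of the grayscale preprocessing stage,
// using OpenCV. The real project performs these steps in FPGA fabric; the
// exact filter chain and threshold values here are assumptions.
#include <opencv2/opencv.hpp>

cv::Mat preprocess_for_lanes(const cv::Mat& frame_rgb)
{
    cv::Mat gray, blurred, edges;

    // Convert the RGB stream to a single-channel grayscale image.
    cv::cvtColor(frame_rgb, gray, cv::COLOR_RGB2GRAY);

    // Smooth out sensor noise before edge extraction.
    cv::GaussianBlur(gray, blurred, cv::Size(5, 5), 0);

    // Keep only strong gradients, which typically correspond to lane markings.
    cv::Canny(blurred, edges, 50, 150);

    return edges;
}
```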

Sensor data acquisition and processing - RFID, Accelerometer, and sonar

The accelerometer and RFID reader are both I2C devices and are connected directly to the Processing System over the same bus. The sonar's PWM output is connected to a block that measures the PWM duty cycle and provides the resulting data to the Processing System over an AXI4-Lite link.
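As a rough illustration of how that sonar data could be consumed from Linux user space, here is a minimal sketch that maps the capture block's registers through /dev/mem and converts the measured pulse high-time into a distance. The base address, register offset, and the 58 µs-per-centimetre scaling are all assumptions rather than the project's actual values, which come from the Vivado address editor and the specific sonar used.

```cpp
// Minimal sketch of reading the sonar measurement from the PWM-capture
// block over AXI4-Lite via /dev/mem. Addresses and scaling are assumptions.
#include <cstdint>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

constexpr off_t    SONAR_BASE_ADDR = 0x43C00000; // hypothetical AXI4-Lite base
constexpr uint32_t PULSE_WIDTH_REG = 0x0;        // hypothetical register offset

double read_sonar_distance_cm()
{
    int fd = open("/dev/mem", O_RDONLY | O_SYNC);
    if (fd < 0) return -1.0;

    volatile uint32_t* regs = static_cast<volatile uint32_t*>(
        mmap(nullptr, 4096, PROT_READ, MAP_SHARED, fd, SONAR_BASE_ADDR));
    if (regs == MAP_FAILED) { close(fd); return -1.0; }

    // Assume the capture block reports the high-time of the sonar's PWM
    // output in microseconds; many ultrasonic sensors encode roughly
    // 58 us per centimetre of range, but the actual scaling depends on
    // the sensor and the capture block's implementation.
    uint32_t pulse_width_us = regs[PULSE_WIDTH_REG / sizeof(uint32_t)];
    double distance_cm = pulse_width_us / 58.0;

    munmap(const_cast<uint32_t*>(regs), 4096);
    close(fd);
    return distance_cm;
}
```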

Motors and steering control

The ‘motors and steering’ controller, internally called Motion, is a block that manages the dual motor driver and a servo that directly controls the front steering. The main block communicates with the Processing System over an AXI4-Lite link, and internally it consists of three PWM generators with synthesis-time customizable resolution and frequency parameters. The values chosen are: motors, 2x 16-bit, 100 kHz drivers; servo, 1x 12-bit, 50 Hz driver. The servo requires a specific frequency and duty cycle to function properly.
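A hedged sketch of how the software side might translate speed and steering commands into compare values for these PWM generators is shown below. The register offsets and the write_reg() helper are hypothetical; only the resolutions and frequencies follow the figures stated above, and the 1.0-2.0 ms servo pulse range is the usual hobby-servo convention rather than a value taken from the project.

```cpp
// Sketch of converting high-level commands into PWM compare values for the
// Motion block. Register layout and write_reg() are hypothetical.
#include <algorithm>
#include <cstdint>

// Hypothetical helper; in the real design this would write to the Motion
// block's AXI4-Lite registers (e.g. via /dev/mem as sketched earlier).
void write_reg(uint32_t offset, uint32_t value) { (void)offset; (void)value; }

// Motors: 16-bit resolution at 100 kHz, so duty is expressed in 0..65535.
void set_motor_speed(double fraction /* 0.0 .. 1.0 */)
{
    uint32_t duty = static_cast<uint32_t>(std::clamp(fraction, 0.0, 1.0) * 65535.0);
    write_reg(/* MOTOR_LEFT_DUTY  */ 0x04, duty);
    write_reg(/* MOTOR_RIGHT_DUTY */ 0x08, duty);
}

// Servo: 12-bit resolution at 50 Hz (20 ms period). A typical hobby servo
// expects a 1.0-2.0 ms pulse, i.e. roughly 5-10% duty cycle.
void set_steering(double pulse_ms /* 1.0 .. 2.0 */)
{
    double duty_fraction = std::clamp(pulse_ms, 1.0, 2.0) / 20.0;
    uint32_t duty = static_cast<uint32_t>(duty_fraction * 4095.0);
    write_reg(/* SERVO_DUTY */ 0x0C, duty);
}
```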

Software Design

Petalinux embedded Linux distribution

The project is centered around an embedded Linux distribution built with Xilinx PetaLinux 2017.4. The Linux OS acts as a common ground between hardware and software and manages the processes. The modified kernel provides support for the physical devices, including the camera, sonar, motors, and servo, and for the WiFi adapter via a modified USB driver. Xilinx already provides the Zynq I2C driver, which is used to communicate with the camera, accelerometer, and RFID reader.
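To illustrate how such I2C devices are typically accessed from user space through the kernel's i2c-dev interface, here is a small sketch that reads a single accelerometer register. The bus number and the 0x1D device address are assumptions (the Pmod ACL's ADXL345 can respond at 0x1D or 0x53 depending on wiring), and the register index is left to the caller.

```cpp
// Minimal sketch of reading one register from the accelerometer through
// the Linux i2c-dev interface exposed by the Zynq I2C driver. Bus number
// and device address are assumptions.
#include <cstdint>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/i2c-dev.h>

int read_accel_register(uint8_t reg, uint8_t& value)
{
    int fd = open("/dev/i2c-0", O_RDWR);      // assumed bus number
    if (fd < 0) return -1;

    if (ioctl(fd, I2C_SLAVE, 0x1D) < 0) {     // assumed device address
        close(fd);
        return -1;
    }

    // Standard register read: write the register index, then read one byte.
    if (write(fd, &reg, 1) != 1 || read(fd, &value, 1) != 1) {
        close(fd);
        return -1;
    }

    close(fd);
    return 0;
}
```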

Main control application

  1. Lane component: position the car on the right path of the track and adjust its speed corresponding to the road.
  2. Sign component: detect stop signs.
  3. RFID component: correctly detect and store RFID cards placed in key parts of the road.
  4. Display component: display relevant images to the user.
  5. Configuration/Calibration component: let the user set important parameters in a separate file, without recompiling the program, and overwrite the configuration file to match the current road conditions.

All components (except the configuration/calibration) run in loops. Each iteration corresponds to one frame.
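A minimal sketch of how these per-frame loops could be organised into a single control loop is shown below. The component functions and the Configuration structure are illustrative stubs, not the project's actual API, and the capture device index is an assumption.

```cpp
// Sketch of how the per-frame component loops described above might fit
// together. All component functions here are illustrative stubs.
#include <opencv2/opencv.hpp>

struct Configuration { double base_speed = 0.3; };   // hypothetical calibration data

// Hypothetical component hooks; the real project runs richer logic here.
void process_lane(const cv::Mat&, const Configuration&) {}
void process_signs(const cv::Mat&, const Configuration&) {}
void process_rfid() {}
void update_display(const cv::Mat&) {}

int main()
{
    cv::VideoCapture camera(0);            // capture device index is an assumption
    Configuration cfg;                     // loaded from a separate file in the project

    while (camera.isOpened()) {
        cv::Mat frame;
        if (!camera.read(frame)) break;    // one loop iteration per frame

        process_lane(frame, cfg);          // position the car and adjust speed
        process_signs(frame, cfg);         // detect stop signs
        process_rfid();                    // poll for RFID cards at key points
        update_display(frame);             // show relevant images to the user
    }
    return 0;
}
```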

Project Details

The project is open source, and the source files can be downloaded from the Digilent Design Contest website.
