
DroNet: Learning to fly by driving

This repository contains the code used to train and evaluate DroNet, a convolutional neural network that can safely drive a drone along the streets of a city.


If you use DroNet in an academic context, please cite the following publication:

Paper: DroNet: Learning to Fly by Driving

Video: YouTube

	@article{Loquercio_2018,
	doi = {10.1109/lra.2018.2795643},
	year = 2018,
	author = {Antonio Loquercio and Ana Isabel Maqueda and Carlos R. Del Blanco and Davide Scaramuzza},
	title = {DroNet: Learning to Fly by Driving},
	journal = {{IEEE} Robotics and Automation Letters}
	}


Because flying a drone in an urban environment is dangerous, collecting training data with a drone is impractical. For that reason, DroNet learns how to fly by imitating the behavior of manned vehicles that are already integrated into such an environment. It produces a steering angle and a collision probability for the current input image captured by a forward-looking camera. These high-level predictions are then translated into control commands so that the drone keeps navigating while avoiding obstacles.
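The translation from network outputs to commands can be sketched as below. This is a hedged illustration, not the repository's exact controller: the low-pass coefficients alpha and beta, the maximum speed v_max, and the function name update_commands are all assumptions for the sketch.

```python
def update_commands(v_prev, yaw_prev, steer_pred, coll_prob,
                    v_max=0.3, alpha=0.7, beta=0.5):
    """Low-pass filter the raw network predictions into smooth commands.

    Forward speed shrinks as the collision probability grows, and the
    steering prediction in [-1, 1] is blended into the yaw command.
    All coefficients here are illustrative, not the repository's values.
    """
    v = (1.0 - alpha) * v_prev + alpha * (1.0 - coll_prob) * v_max
    yaw = (1.0 - beta) * yaw_prev + beta * steer_pred
    return v, yaw
```

With a collision probability of 1 the commanded speed decays toward zero, so the drone brakes in front of obstacles; with a probability of 0 it accelerates toward v_max.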


DroNet has been designed as a forked CNN that predicts, from a single 200×200 gray-scale frame, a steering angle and a collision probability. The shared part of the architecture consists of a fast ResNet-8 with 3 residual blocks, followed by dropout and a ReLU non-linearity. After them, the network branches into two separate fully-connected layers, one to carry out steering prediction, and the other to infer the collision probability. See the paper for more details.
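As an illustration, a model with this forked shape can be sketched with the Keras functional API (here via tf.keras). Filter counts, kernel sizes, and the exact layer ordering are assumptions for the sketch, not the repository's definition:

```python
from tensorflow.keras import layers, Model, Input

def res_block(x, filters, strides):
    # Residual block: two 3x3 convs, with a 1x1 conv on the shortcut
    # so its shape matches the main path.
    shortcut = layers.Conv2D(filters, 1, strides=strides, padding='same')(x)
    y = layers.BatchNormalization()(x)
    y = layers.Activation('relu')(y)
    y = layers.Conv2D(filters, 3, strides=strides, padding='same')(y)
    y = layers.BatchNormalization()(y)
    y = layers.Activation('relu')(y)
    y = layers.Conv2D(filters, 3, padding='same')(y)
    return layers.add([shortcut, y])

inp = Input(shape=(200, 200, 1))                # single gray-scale frame
x = layers.Conv2D(32, 5, strides=2, padding='same')(inp)
x = layers.MaxPooling2D(3, strides=2)(x)
for f in (32, 64, 128):                          # three residual blocks
    x = res_block(x, f, strides=2)
x = layers.Flatten()(x)
x = layers.ReLU()(x)
x = layers.Dropout(0.5)(x)

# The network forks here into the two task heads.
steer = layers.Dense(1, name='steering')(x)                         # regression
coll = layers.Dense(1, activation='sigmoid', name='collision')(x)   # classification
model = Model(inp, [steer, coll])
```

Both heads share all convolutional features, so a single forward pass yields the two quantities needed by the controller.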



To learn steering angles, we use the publicly available Udacity dataset, which provides several hours of video recorded from a car. To learn the probability of collision, we additionally recorded an outdoor dataset by riding a bicycle through the streets of our city.


Running the code

Software requirements

This code has been tested on Ubuntu 14.04 with Python 3.4.


  • TensorFlow 1.5.0
  • Keras 2.1.4 (Make sure that the Keras version is correct!)
  • NumPy 1.12.1
  • OpenCV 3.1.0
  • scikit-learn 0.18.1
  • Python gflags

Data preparation

Steering data

Once the Udacity dataset is downloaded, extract the contents of each rosbag/experiment. Follow the instructions from the udacity-driving-reader to dump images + CSV. After extraction, you will only need the center/ folder and the interpolated.csv file from each experiment to create the steering dataset. To prepare the data in the format required by our implementation, follow these steps:

  1. Rename the center/ folder to images/.
  2. Process the interpolated.csv file to extract only the data corresponding to the central camera. It is important to sync the images with the corresponding steering angles by matching their timestamps. You can use the script to accomplish this step.
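The matching step can be sketched as below. This assumes interpolated.csv has timestamp, frame_id, and angle columns (as produced by udacity-driving-reader); the toy values and the nearest-timestamp strategy are illustrative only:

```python
import numpy as np
import pandas as pd

# Toy stand-in for interpolated.csv (real files have many more columns/rows).
rows = pd.DataFrame({
    'timestamp': [100, 100, 105, 110, 110],
    'frame_id':  ['left_camera', 'center_camera', 'center_camera',
                  'left_camera', 'center_camera'],
    'angle':     [0.10, 0.12, 0.15, 0.20, 0.21],
})

# Step 2a: keep only the central camera.
center = rows[rows['frame_id'] == 'center_camera']

# Step 2b: match each image timestamp (parsed from the images/ filenames)
# to the nearest steering record.
image_ts = np.array([101, 106, 109])
dist = np.abs(center['timestamp'].to_numpy()[None, :] - image_ts[:, None])
synced = center.iloc[dist.argmin(axis=1)][['timestamp', 'angle']]
```

The resulting table pairs every image with one steering angle and can be written out in whatever per-experiment format the training code expects.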

The final structure of the steering dataset should look like this:
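The tree from the original README is not reproduced here; an illustrative layout, where the experiment directory names and the steering file name are assumptions, might look like:

```
training/
    HMB_002/
        images/
        sync_steering.txt
    ...
validation/
    HMB_1/
        images/
        sync_steering.txt
testing/
    HMB_3/
        images/
        sync_steering.txt
```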


*Since the Udacity dataset does not provide validation experiments, we split HMB_1: the first 3900 samples are used for training and the rest for validation.

Collision data

The collision dataset is ready to use after downloading. It can be downloaded directly from here. Its structure is as follows:
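The tree from the original README is not reproduced here; an illustrative layout, where the sequence directory names and the label file name are assumptions, might look like:

```
training/
    sequence_001/
        images/
        labels.txt
    ...
validation/
    sequence_010/
        images/
        labels.txt
testing/
    sequence_012/
        images/
        labels.txt
```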


Finally, merge the training/, validation/ and testing/ directories from both datasets, and use them to train and evaluate DroNet.
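The merge can be scripted, for example with shutil. In this sketch the source directory names udacity/ and collision/ and the destination dataset/ are assumptions; adapt them to wherever you extracted the two datasets:

```python
import shutil
from pathlib import Path

# Merge the steering and collision splits into a single dataset tree.
# 'udacity' and 'collision' are assumed names for the two downloaded datasets.
for split in ('training', 'validation', 'testing'):
    dst = Path('dataset') / split
    dst.mkdir(parents=True, exist_ok=True)
    for src_root in (Path('udacity') / split, Path('collision') / split):
        if not src_root.is_dir():
            continue  # skip missing sources in this sketch
        for experiment in src_root.iterdir():
            # Each experiment directory keeps its own images/ and label file.
            shutil.move(str(experiment), str(dst / experiment.name))
```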

Training DroNet

  1. Train DroNet from scratch:
python [flags]

Use [flags] to set the batch size, number of epochs, dataset directories, etc. Check the training script for a description of each flag and the default values we used in our experiments.


python --experiment_rootdir='./model/test_1' --train_dir='../training' --val_dir='../validation' --batch_size=16 --epochs=150 --log_rate=25 
  1. Fine-tune DroNet by loading a pre-trained model:
python --restore_model=True --experiment_rootdir='./model' --weights_fname='model_weights.h5'

The pre-trained model must be in the directory you indicate in --experiment_rootdir.

Evaluating DroNet

We evaluate our model on the testing data from each dataset.
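For reference, the kind of metrics reported in the paper (RMSE and explained variance for the steering regression, F1 for the collision classification) can be computed as below; the toy arrays and the helper name steering_metrics are made up for the sketch:

```python
import numpy as np
from sklearn.metrics import f1_score

def steering_metrics(y_true, y_pred):
    # Root-mean-square error and explained variance of the residuals.
    rmse = float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
    eva = 1.0 - np.var(y_true - y_pred) / np.var(y_true)
    return rmse, eva

steer_true = np.array([0.0, 0.1, -0.2, 0.3])
rmse, eva = steering_metrics(steer_true, steer_true)   # perfect prediction
coll_f1 = f1_score([0, 1, 1, 0], [0, 1, 1, 0])          # perfect classification
```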

python [flags]


python --experiment_rootdir='./model' --weights_fname='model_weights.h5' --test_dir='../testing' 

Download our trained model

A folder containing the trained model that we used in our real-world experiments can be downloaded here: best_model. To evaluate it, use the following command:

python --experiment_rootdir=PATH_TO_FOLDER --weights_fname=best_weights.h5 --test_dir='../testing' 

Test DroNet on a real robot

In the folder drone_control you can find instructions on how to use DroNet to fly a Bebop drone.
