NEAT: Neural Attention Fields for End-to-End Autonomous Driving

Paper | Supplementary | Video | Talk | Poster | Slides

This repository is for the ICCV 2021 paper NEAT: Neural Attention Fields for End-to-End Autonomous Driving. If you find our code or paper useful, please cite:

@inproceedings{Chitta2021ICCV,
  author = {Chitta, Kashyap and Prakash, Aditya and Geiger, Andreas},
  title = {NEAT: Neural Attention Fields for End-to-End Autonomous Driving},
  booktitle = {International Conference on Computer Vision (ICCV)},
  year = {2021}
}

Setup

Please follow the installation instructions from our TransFuser repository to set up the CARLA simulator. The conda environment required for NEAT can be installed via:

conda env create -f environment.yml
conda install pytorch torchvision torchaudio cudatoolkit=11.1 -c pytorch -c nvidia

For running the AIM-VA baseline, you will additionally need to install MMCV and MMSegmentation.

pip install mmcv-full -f https://download.openmmlab.com/mmcv/dist/cu111/torch1.9.0/index.html
pip install mmsegmentation
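
A quick way to confirm the setup is to check the installed packages from Python. This check is a suggestion rather than part of the original instructions; mmcv and mmseg are only needed for the AIM-VA baseline.

import torch
print(torch.__version__)           # should report a CUDA 11.1 build, e.g. '1.9.0+cu111'
print(torch.cuda.is_available())   # True if a GPU is visible

import mmcv                        # only needed for the AIM-VA baseline
import mmseg
print(mmcv.__version__, mmseg.__version__)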

Data Generation

The training data is generated using leaderboard/team_code/auto_pilot.py. Data generation requires routes and scenarios. Each route is defined by a sequence of waypoints (and optionally a weather condition) that the agent needs to follow. Each scenario is defined by a trigger transform (location and orientation) and other actors present in that scenario (optional). We provide several routes and scenarios under leaderboard/data/. The TransFuser repository and leaderboard repository provide additional routes and scenario files.
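
For reference, route files are small XML documents in the CARLA leaderboard format. The sketch below reads the waypoints of each route with the standard library; the file name is illustrative, and the attribute names follow the 0.9.10 leaderboard convention.

import xml.etree.ElementTree as ET

# File name is illustrative; route files live under leaderboard/data/.
tree = ET.parse('leaderboard/data/evaluation_routes/routes_town05_long.xml')
for route in tree.getroot().iter('route'):
    print('Route', route.get('id'), 'in town', route.get('town'))
    for wp in route.iter('waypoint'):
        # Each waypoint is a transform: x/y/z position plus yaw heading.
        print(' ', wp.get('x'), wp.get('y'), wp.get('yaw'))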

Running a CARLA Server

With Display

./CarlaUE4.sh --world-port=2000 -opengl

Without Display

Without Docker:

SDL_VIDEODRIVER=offscreen SDL_HINT_CUDA_DEVICE=0 ./CarlaUE4.sh --world-port=2000 -opengl

With Docker:

Instructions for setting up Docker are available here. Pull the Docker image of CARLA 0.9.10.1:

docker pull carlasim/carla:0.9.10.1

Docker 18:

docker run -it --rm -p 2000-2002:2000-2002 --runtime=nvidia -e NVIDIA_VISIBLE_DEVICES=0 carlasim/carla:0.9.10.1 ./CarlaUE4.sh --world-port=2000 -opengl

Docker 19:

docker run -it --rm --net=host --gpus '"device=0"' carlasim/carla:0.9.10.1 ./CarlaUE4.sh --world-port=2000 -opengl

If the Docker container doesn't start properly, add another environment variable: -e SDL_AUDIODRIVER=dsp.
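
Whether launched directly or via Docker, the server can be sanity-checked with the CARLA Python API. A minimal sketch, assuming the carla package from CARLA 0.9.10.1 is on your PYTHONPATH:

import carla

# Host and port must match the --world-port used to launch the server.
client = carla.Client('localhost', 2000)
client.set_timeout(10.0)  # seconds before the connection attempt fails
print('Server version:', client.get_server_version())
print('Loaded map:', client.get_world().get_map().name)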

Running the Autopilot

Once the CARLA server is running, roll out the autopilot to start data generation.

./leaderboard/scripts/run_evaluation.sh

The expert agent used for data generation is defined in leaderboard/team_code/auto_pilot.py. The variables that need to be set are specified in leaderboard/scripts/run_evaluation.sh. The expert agent is originally based on the autopilot from this codebase.
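
For orientation, agents in leaderboard/team_code follow the leaderboard's AutonomousAgent interface. Below is a minimal sketch of that interface; the sensor configuration and control values are placeholders, not those of the actual expert.

import carla
from leaderboard.autoagents.autonomous_agent import AutonomousAgent

def get_entry_point():
    # The leaderboard uses this to locate the agent class in the file.
    return 'MinimalAgent'

class MinimalAgent(AutonomousAgent):
    def setup(self, path_to_conf_file):
        # Runs once before the route starts; load models and configs here.
        pass

    def sensors(self):
        # Placeholder sensor suite; the expert defines its own.
        return [{'type': 'sensor.camera.rgb',
                 'x': 1.3, 'y': 0.0, 'z': 2.3,
                 'roll': 0.0, 'pitch': 0.0, 'yaw': 0.0,
                 'width': 400, 'height': 300, 'fov': 100, 'id': 'rgb'}]

    def run_step(self, input_data, timestamp):
        # Runs every simulation step; must return a carla.VehicleControl.
        return carla.VehicleControl(throttle=0.0, steer=0.0, brake=1.0)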

Training

The training code and pretrained models are provided below.

mkdir model_ckpt
wget https://s3.eu-central-1.amazonaws.com/avg-projects/neat/models.zip -P model_ckpt
unzip model_ckpt/models.zip -d model_ckpt/
rm model_ckpt/models.zip

There are 5 pretrained models provided in model_ckpt/.

Additional baselines are available in the TransFuser repository.
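
To verify a download, a checkpoint can be inspected with PyTorch. A minimal sketch, assuming the files are plain PyTorch state dicts; the path below is illustrative, so adjust it to one of the unzipped models.

import torch

# Load a downloaded checkpoint on CPU and list its parameter tensors.
state_dict = torch.load('model_ckpt/neat/model.pth', map_location='cpu')
for name, tensor in state_dict.items():
    print(name, tuple(tensor.shape))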

Evaluation

Spin up a CARLA server (as described above) and run the desired agent. The required variables need to be set in leaderboard/scripts/run_evaluation.sh.

CUDA_VISIBLE_DEVICES=0 ./leaderboard/scripts/run_evaluation.sh
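
After the run completes, the leaderboard writes its results as JSON to the endpoint configured in run_evaluation.sh. A minimal sketch for inspecting it, assuming the file is named results.json; the exact schema varies across leaderboard versions.

import json

# The path is illustrative; use the results file configured in run_evaluation.sh.
with open('results.json') as f:
    results = json.load(f)

# Print the top-level structure before digging into individual route records.
print(list(results.keys()))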

Acknowledgements

This implementation primarily extends the cvpr2021 branch of our TransFuser repository.

If you found our work interesting, check out the code for some of our group's more recent work on CARLA.
