AirSim is a simulator for drones (and soon other vehicles) built on Unreal Engine. It is open source, cross-platform, and supports hardware-in-loop simulation with popular flight controllers such as Pixhawk for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment you want.
Our goal is to develop AirSim as a platform for AI research, for experimenting with deep learning, computer vision, and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform-independent way.
Check out the quick 1.5-minute demo.
This project is under heavy development. The current release is in beta and all APIs are subject to change. The next major features currently in the works are standalone mode, several API enhancements, and Python client support. We welcome contributions!
To get the best experience you will need a Pixhawk or compatible device and an RC controller. This enables "hardware-in-loop simulation" for a more realistic experience. Follow these instructions on how to get one, set it up, and what the alternatives are.
There are two ways to get AirSim working on your machine. Click on the links below and follow the instructions.
- Walkthrough Demo Video
- AirSim Setup Video (shows you all the setup steps)
If you have a Pixhawk flight controller (or compatible device) and a remote control, you can manually control the drone in the simulator and fly around.
There are two ways you can generate training data from AirSim for deep learning. The easiest way is to simply press the Record button in the lower-right corner. This starts writing the pose and images for each frame.
If you would like more data logging capabilities and other features, file a feature request or contribute changes. The data logging code is pretty simple and you can modify it to your heart's content.
A more complex way to generate training data is by writing client code that uses our APIs. This allows you to be in full control of how, what, where and when you want to log data. See the next section for more details.
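As a rough sketch of what API-based logging can look like, the snippet below captures a camera image and the ground-truth pose on each frame and writes them to disk. It assumes the Python client (the `airsim` package mentioned above as still in the works) and calls such as `MultirotorClient`, `getMultirotorState`, and `simGetImages`; treat the names as assumptions and adapt them to whichever client library you use.

```python
# Sketch of API-based data logging, assuming the Python client ("airsim" package).
import os
import time
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()

out_dir = "recorded_data"
os.makedirs(out_dir, exist_ok=True)

for frame in range(100):
    # Ground-truth pose of the vehicle at this frame
    state = client.getMultirotorState()
    pos = state.kinematics_estimated.position

    # Request a compressed (PNG) scene image from camera "0"
    responses = client.simGetImages([
        airsim.ImageRequest("0", airsim.ImageType.Scene, False, True)
    ])
    png = responses[0]

    # Write image and pose side by side so they can be paired later
    with open(os.path.join(out_dir, "frame_%04d.png" % frame), "wb") as f:
        f.write(png.image_data_uint8)
    with open(os.path.join(out_dir, "poses.txt"), "a") as f:
        f.write("%d\t%f\t%f\t%f\n" % (frame, pos.x_val, pos.y_val, pos.z_val))

    time.sleep(0.1)
```

Because this loop is ordinary client code, you control the frame rate, camera selection, and on-disk format yourself, which is exactly the flexibility the Record button does not give you.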
For MavLink-enabled drones, you can also use our Log Viewer to visualize the data streams.
You can also play back recorded logs for a side-by-side comparison between the real drone and the simulator.
AirSim exposes easy-to-use APIs to retrieve data from the drone, including ground truth, sensor data, and various images. It also exposes APIs to control the drone in a platform-independent way. This lets the same code control different drone platforms, for example Pixhawk or DJI Matrice, without changes and without having to learn the details of each platform's internal protocols.
These APIs are also available as part of a separate, independent cross-platform library, so you can deploy them on an offboard computer on your vehicle. This way you can write and test your code in the simulator and later execute it on a real drone. Transfer learning and related research is one of our focus areas. See custom drones.
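For a flavor of the control side, here is a minimal flight sketch, again assuming the Python client (`airsim` package) and calls like `enableApiControl`, `takeoffAsync`, and `moveToPositionAsync`; the point is that the same script can drive the simulator or a supported flight controller such as Pixhawk without changes.

```python
# Minimal platform-independent control sketch (assumed Python client names).
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)   # hand control to the API instead of the RC
client.armDisarm(True)

client.takeoffAsync().join()

# Fly to (x=10, y=0, z=-5) at 3 m/s; AirSim uses NED coordinates,
# so z = -5 means 5 meters above the starting altitude.
client.moveToPositionAsync(10, 0, -5, 3).join()

client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```

The `.join()` calls block until each maneuver completes, which keeps the example simple; the async variants also allow issuing commands without waiting.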
You can find additional technical details in our paper (preprint). Please cite it as:
@inproceedings{airsim2017fsr,
  author    = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor},
  title     = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles},
  year      = {2017},
  booktitle = {Field and Service Robotics},
  eprint    = {arXiv:1705.05065},
  url       = {https://arxiv.org/abs/1705.05065}
}
We welcome contributions to help advance research frontiers.
Join the AirSim group at Facebook to stay up to date or ask any questions.
If you run into problems, check the FAQ and feel free to post issues on the AirSim GitHub repository.
This project is released under the MIT License. Please review the License file for more details.