The assignment is slightly different from the localization project due to some computational constraints. Unfortunately, SLAM is not fast enough to run in real time on board the Raspberry Pi. Instead, you will implement SLAM to run on pre-recorded flight data (lists of keypoints, descriptors, and IR data). Your SLAM will run offline on these data, and you can judge the correctness of your implementation by using animate_slam.py to view an animation of the flight. We will provide you with the animation produced by our solution code for comparison.
If you try to run your SLAM program offline on the drone, you will find that it takes up to 15 minutes! Instead, we recommend developing your solution on the department machines, which have OpenCV, NumPy, and Matplotlib pre-installed and take only a few moments to run SLAM on the sample data. Alternatively, you are welcome to install these dependencies on your own computer.
After you have implemented your solution offline, you may optionally use your solution code to generate maps on board the drone, then fly the drone localizing over the map. To do this, you may use a modified version of the localization code called MATL (mapping and then localization), which will be provided for you. See the more detailed instructions below.
We will provide you with the following files:
slam.py
student_slam_helper.py
utils.py
map_data.txt
animate_slam.py
animation.mp4
utils.py contains fully implemented helper code that you are welcome to read but do not need to edit. You may, however, edit the number of particles in slam.py to experiment with the speed/accuracy tradeoff for FastSLAM. You should find that 15 particles are plenty. utils.py also contains add_landmark and update_landmark as promised, as well as some other helper code.

map_data.txt contains data from a previous flight: the keypoints, descriptors, and IR data from each camera frame. The helper code in slam.py will read the data from map_data.txt, create a FastSLAM object imported from student_slam_helper.py (your code), and run the data through your SLAM implementation.

slam_data.txt will hold the data generated by your SLAM implementation, including particle poses, landmark poses, and currently observed feature poses. slam_helper.py will write these data as it runs through the saved flight data with your SLAM implementation. You can then use animate_slam.py to view the animation from your SLAM.

animation.mp4 is the animation generated by our solution code, with which you can compare your own animations.
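To build intuition for the speed/accuracy tradeoff in the particle count: each FastSLAM particle carries its own map, so every extra particle multiplies the per-frame work, while resampling concentrates particles on likely poses. The sketch below is illustrative only (the function name and particle representation are made up for this example, not part of the provided code) and shows one common resampling scheme, systematic resampling with NumPy:

```python
import numpy as np

def resample(particles, weights, rng=np.random.default_rng(0)):
    """Systematic resampling: keep the particle count fixed while
    concentrating copies where the weights are large.
    (Illustrative sketch; not the interface used by the course code.)"""
    n = len(particles)
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                                   # normalize weights
    positions = (rng.random() + np.arange(n)) / n  # n evenly spaced draws
    indices = np.searchsorted(np.cumsum(w), positions)
    return [particles[i] for i in indices]

# A heavily weighted particle is duplicated; unlikely ones are dropped.
particles = ["a", "b", "c", "d"]
weights = [0.7, 0.1, 0.1, 0.1]
print(resample(particles, weights))
```

With more particles the filter covers more pose hypotheses, but every camera frame costs proportionally more landmark updates, which is why a small number like 15 can be the sweet spot here.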
The only thing left for you to do is implement the missing parts of the slam_helper file, which are indicated with TODOs. The intended functionality of each missing method is described in its docstring. You will find that you have already implemented much of the functionality of SLAM in your localization code.
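As a rough mental model before you dive into the TODOs: in FastSLAM, each particle carries a pose hypothesis, a weight, and its own copy of the landmark map. The skeleton below is only a sketch of that structure (the real interface and attribute names are dictated by the provided slam.py and the docstrings, so treat everything here as hypothetical):

```python
class Particle:
    """Illustrative FastSLAM particle: one pose hypothesis plus
    a private landmark map. Names are hypothetical, not the course API."""

    def __init__(self, pose):
        self.pose = pose       # e.g. (x, y, z, yaw)
        self.landmarks = []    # each particle owns its own map
        self.weight = 1.0      # updated from measurement likelihoods

# On each frame, a FastSLAM implementation roughly:
#   1. propagates every particle's pose with the motion model,
#   2. matches observed features against that particle's landmarks,
#   3. updates matched landmarks / adds new ones, reweighting the particle,
#   4. resamples particles according to the weights.
p = Particle((0.0, 0.0, 0.3, 0.0))
print(p.pose, p.weight, len(p.landmarks))
```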
As in the particle filter assignment, developing SLAM offboard will require the libraries NumPy, OpenCV, and Matplotlib. Again, you are welcome to install these dependencies on your own machine or to use the department machines. The easiest way to work on this project is probably to work over SSH from your laptop and use XQuartz (this is what the -Y flag enables when you type ssh -Y), which will let you view animations over SSH!
You should develop your SLAM implementation on the department machines or your own computer using the provided helper code. To test your implementation, first run slam.py, which runs the sample data through your code in slam_helper.py (this takes 1-2 minutes), and then run animate_slam.py, which reads slam_data.txt and creates your animation. If your animation closely follows the one from the solution code, you've done a great job!
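Eyeballing the two animations is usually enough, but if you want a quick numeric sanity check, you can compare pose traces directly. The arrays below are stand-ins for whatever poses you extract (the actual format of slam_data.txt is defined by the helper code, so treat this purely as a sketch):

```python
import numpy as np

def trajectory_rmse(est, ref):
    """Root-mean-square error between two (N, 2) pose traces.
    Illustrative only; parsing slam_data.txt is left to the helper code."""
    est = np.asarray(est, dtype=float)
    ref = np.asarray(ref, dtype=float)
    return float(np.sqrt(np.mean(np.sum((est - ref) ** 2, axis=1))))

# Stand-in pose traces; in practice you would extract your estimated
# path and a reference path at matching frames.
mine = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.2)]
ref  = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
print(trajectory_rmse(mine, ref))  # small drift -> small RMSE
```

A steadily growing error here would show up in the animation as your trajectory drifting away from the solution's.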
The checkoff for this project is simple: run your slam_helper.py implementation on the sample data for a TA and show the TA the corresponding animation. An easy way to do this is to log in to your account on the department machines with ssh -Y; if you have XQuartz installed on your computer, you can then run your animation from the terminal over SSH.
On-Board Offline on the Drone
SLAM can run on board the Raspberry Pi, but it is quite slow, so we have only gotten it to work offline: you fly, record data, land, and then build a map and localize over it. Details are in the operation manual, and there are still a lot of rough edges.
Please be sure to push your finished project directory to GitHub Classroom to hand in. For your final handin, you should have edited all of the following files:
student_slam_helper.py
student_localization_helper.py
student_compute_displacement.py
student_particle_filter.py