Motor Control and Planning

Participants: Daniel Rasmussen, Christian Denk, Cristian Axenie, Michael Mathieu, Nicolai Waniek

The goal of this project was to build flying robots, get them airborne, and control them using eDVS data. One or two eDVS sensors were attached to each drone, one looking downwards and one looking ahead; due to time limitations, only the downward-looking eDVS was used. The project collaborated closely with the Object Detection and Tracking subproject to locate a specific mark (an arrow) on the ground, hover above it, and orient the drone along the mark.
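Events from the eDVS reach the offboard side as a raw byte stream over the WiFi link. The following is a minimal sketch of how such a stream could be decoded, assuming the common two-byte eDVS event encoding (first byte 1yyyyyyy, second byte pxxxxxxx) and a hypothetical UDP bridge on port 8991; both are assumptions, not the project's actual configuration.

    import socket

    # Hypothetical endpoint: the eDVS-to-WiFi bridge on the drone is assumed
    # to forward raw event bytes to this UDP port; adjust to the real setup.
    EDVS_PORT = 8991

    def decode_events(buf):
        """Decode two-byte eDVS events (assumed format: byte0 = 1yyyyyyy,
        byte1 = pxxxxxxx). Returns a list of (x, y, polarity) tuples."""
        events, i = [], 0
        while i + 1 < len(buf):
            b0, b1 = buf[i], buf[i + 1]
            if not (b0 & 0x80):     # resync: the first event byte has its MSB set
                i += 1
                continue
            y = b0 & 0x7F
            x = b1 & 0x7F
            pol = (b1 >> 7) & 0x01
            events.append((x, y, pol))
            i += 2
        return events

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", EDVS_PORT))
    while True:
        data, _ = sock.recvfrom(4096)
        for x, y, pol in decode_events(data):
            pass  # hand events to tracking / alignment (see controller sketch below)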


Goals and current status:

  • remotely controlled drones (achieved)
  • eDVS on drone (achieved)
  • eDVS data is streamed offboard using a WiFi link (achieved)
  • control the drone position with offboard controllers (partially achieved)
  • completely autonomous flight (not achieved)
  • drone + SpiNNaker for object recognition / tracking (partially achieved)
  • collaboration with a ground robot / OmniRob? (not achieved)
  • solve SLAM with eDVS data from drone (not achieved)
  • world "dronination" (40 %) ;-)


The experimentation platform we provide is a large-payload quadrotor system based on PX4, an open-source, open-hardware project that aims to provide a high-end autopilot controller at low cost and high availability.
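Offboard control of the drone was only partially achieved. As a rough illustration of how offboard commands can reach a PX4-based vehicle, the sketch below streams yaw-rate setpoints over MAVLink using pymavlink; the connection string, system IDs, type mask, and rates are assumptions and would need to be adapted to the actual link (PX4 additionally requires arming and switching to OFFBOARD mode, which is not shown).

    import time
    from pymavlink import mavutil

    # Hypothetical MAVLink endpoint for the drone's telemetry/WiFi bridge.
    link = mavutil.mavlink_connection('udpout:192.168.1.10:14550')

    def send_yaw_rate(yaw_rate, thrust=0.5):
        """Stream a SET_ATTITUDE_TARGET message that commands only the body
        yaw rate and thrust (roll/pitch rates and attitude are masked out)."""
        type_mask = 0b10000011  # ignore roll rate, pitch rate, and attitude
        link.mav.set_attitude_target_send(
            0,                      # time_boot_ms (0 = let the autopilot timestamp)
            1, 1,                   # target system / component (assumed IDs)
            type_mask,
            [1.0, 0.0, 0.0, 0.0],   # attitude quaternion (ignored by the mask)
            0.0, 0.0,               # body roll / pitch rate (ignored)
            yaw_rate,               # body yaw rate [rad/s]
            thrust)                 # normalized thrust

    # PX4 expects offboard setpoints at a steady rate (> 2 Hz); 20 Hz here.
    while True:
        send_yaw_rate(0.2)
        time.sleep(0.05)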

Figure: drone architecture.

Information on the hardware and software infrastructure:


Current status:

  • drones are flying under remote control
  • drones are equipped with one eDVS
  • eDVS data is streamed offboard using a WiFi link
  • a simple yaw controller was implemented that aligns the drone along a bar on the ground (made from tape); a sketch of such a controller is given after this list
  • eDVS data may be streamed to an HMAX implementation running on SpiNNaker to detect more complex marks (e.g., the arrow)
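The yaw controller itself is not reproduced here. The following is a minimal sketch of how alignment along the taped bar could be computed from decoded (x, y, polarity) events, using the principal axis of a batch of recent event coordinates and a proportional yaw-rate command; the gain, sign convention, and minimum event count are assumptions to be tuned on the real vehicle.

    import math

    def bar_angle(events):
        """Estimate the orientation of the taped bar from a batch of recent
        events by taking the principal axis of the event coordinates."""
        n = len(events)
        if n < 10:                  # too few events for a stable estimate
            return None
        mx = sum(e[0] for e in events) / n
        my = sum(e[1] for e in events) / n
        sxx = sum((e[0] - mx) ** 2 for e in events) / n
        syy = sum((e[1] - my) ** 2 for e in events) / n
        sxy = sum((e[0] - mx) * (e[1] - my) for e in events) / n
        # orientation of the dominant eigenvector of the 2x2 covariance matrix
        return 0.5 * math.atan2(2.0 * sxy, sxx - syy)

    def yaw_rate_command(events, gain=1.5):
        """P-controller: rotate the drone until the bar is aligned with the
        sensor's x-axis (estimated angle driven to zero)."""
        angle = bar_angle(events)
        if angle is None:
            return 0.0              # no reliable estimate: hold heading
        return -gain * angle        # yaw-rate setpoint in rad/s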