
This page has been deprecated and will be archived. Please go to https://www.bitcraze.io/.

Crazyflie vision system setup

As explained on the vision main page, the Crazyflie vision system is modular and composed of several programs: the image-processing detector, the controller, the set-point GUI, the visualizer and the Crazyflie client.

This page explains step by step how to get everything working. It is based on the Windows Kinect SDK implementation of the detector, using the setup Bitcraze had at the Maker Faire Bay Area 2015.

Hardware requirements

The system requires:

- A Kinect for Windows v2 (Kinect 2) sensor
- A Windows PC running the Kinect SDK and the image-processing detector
- A Linux PC running the controller, the set-point GUI, the visualizer and the Crazyflie client
- A Crazyflie equipped with reflective markers (see below)

The reason for having two PCs is that the controller and the clients do not yet work properly on Windows, while the vision algorithm requires functionality that is not yet available in the Kinect Linux driver (the depth to 3D coordinate conversion).

Hardware setup

The Kinect 2 should be placed on the ground, facing up, in the middle of the room (or at least fairly far from the walls). Avoid having reflective objects on the ceiling.

The Crazyflie should be equipped with markers so it can be detected by the detection program. We put a “donut” in the middle and three dots under motors M2, M3 and M4. The dimensions of the middle ring are shown in the first image.

For the markers we use a retro-reflective adhesive sheet intended to be stuck on fabric (it was found at Panduro here in Sweden; similar products should be available in hobby and sports shops).

Software setup

We will assume the Linux PC has the IP 192.168.0.1 and the Windows PC the IP 192.168.0.2.

Image Processing

The image processing runs on the Windows computer.

First of all, you should be able to open and compile the Kinect for Windows SDK example.

Then you can clone the Windows detector and open it in Visual Studio.

Install all dependencies listed in the README. You may have to change the project configuration to point to the libraries on your system.

The vision detector listens on port 1213; make sure to set up the Windows firewall accordingly when the program starts.
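A quick way to verify from the Linux PC that the detector is reachable (and that the firewall rule is in place) is a plain TCP connection test. The sketch below only checks that the port accepts connections; it does not speak the detector's protocol, and the Windows PC address is the one assumed above.

# check_detector_port.py -- run on the Linux PC.
# Only checks that a TCP connection to the detector port can be opened
# (detector running and Windows firewall opened); it does not speak the
# detector's own protocol.
import socket

WINDOWS_PC = "192.168.0.2"   # Windows PC running the Kinect detector
DETECTOR_PORT = 1213

try:
    with socket.create_connection((WINDOWS_PC, DETECTOR_PORT), timeout=2):
        print("Detector port is reachable")
except OSError as exc:
    print("Cannot reach the detector, check that it is running and that "
          "the Windows firewall allows port 1213:", exc)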

You have to stop the program from Visual Studio; closing the detector window currently does nothing.

When it is working you should see the detector output windows and the Crazyflie can be detected.

Control, Set-points and Visualizer

These scripts run on the Linux computer. To start, clone the crazyflie-vision git repository.

The controller is a client to all other parts of the system. You can check the IPs in the control/kctrl.py file:
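As a rough, hypothetical illustration of what this configuration looks like (the variable names and socket types below are assumptions; check the actual file), the addresses end up in ZMQ connect calls along these lines:

# Hypothetical sketch only -- the real names, ports and socket types are
# defined in control/kctrl.py in the crazyflie-vision repository.
import zmq

VISION_HOST = "192.168.0.2"   # Windows PC running the Kinect detector
CLIENT_HOST = "127.0.0.1"     # the client runs on the same Linux PC

context = zmq.Context()

# Position stream from the vision detector (assumed to be a ZMQ socket
# on the port the detector listens on).
vision_socket = context.socket(zmq.SUB)
vision_socket.connect("tcp://%s:1213" % VISION_HOST)
vision_socket.setsockopt_string(zmq.SUBSCRIBE, "")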

Once the IPs are correctly configured, the controller, set-point GUI and visualizer can be launched. Only the controller is required to fly; the others can optionally be launched as well.

Client

The client runs on the Linux computer.

Note: Currently, in order for scaling/control to work with the ZMQ input in the client, the Vision MUX has to be used. It is still pretty unstable, but it can be enabled by following these steps:
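For context, the client's ZMQ input lets an external program, in this case the controller (via the Vision MUX), push control values to the Crazyflie. The sketch below only illustrates that idea; the port number and the JSON message layout are assumptions and should be checked against the client's ZMQ input documentation.

# Hypothetical example of pushing a control value to the client's ZMQ
# input. The port (1212) and the JSON keys are assumptions -- verify
# them against the client documentation.
import zmq

context = zmq.Context()
sender = context.socket(zmq.PUSH)
sender.connect("tcp://127.0.0.1:1212")  # client on the same Linux PC

sender.send_json({
    "version": 1,
    "ctrl": {
        "roll": 0.0,     # degrees
        "pitch": 0.0,    # degrees
        "yaw": 0.0,      # degrees/s
        "thrust": 50.0,  # percent
    },
})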

Working on the system

When everything is set up you can fly the Crazyflie into the Kinect's view manually by keeping the Alt 1 button pressed, and release it to let the Kinect take control. The set-point GUI allows you to change the set-points and to run some pre-programmed patterns.
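As an illustration of what a pre-programmed pattern amounts to, the snippet below steps a position set-point around a small square. Everything about it (the port, the message layout, the coordinate convention) is a guess for illustration only; the real set-point interface is defined by the controller in the crazyflie-vision repository.

# Illustration only -- port, message layout and coordinates are guesses;
# the real interface is defined by the controller in crazyflie-vision.
import time
import zmq

context = zmq.Context()
setpoints = context.socket(zmq.PUSH)
setpoints.connect("tcp://127.0.0.1:5124")  # assumed controller port

# Fly a 0.4 m square, 1 m above the Kinect, one corner every 3 seconds.
square = [(-0.2, -0.2), (0.2, -0.2), (0.2, 0.2), (-0.2, 0.2)]
for x, y in square:
    setpoints.send_json({"set-point": {"x": x, "y": y, "z": 1.0}})
    time.sleep(3)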

One advantage of using ZMQ is that we can stop and restart parts of the system without restarting everything. If you are quick, it is even possible to restart the controller while the Crazyflie is flying.
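This works because ZMQ sockets reconnect transparently when the peer disappears and comes back. A subscriber like the toy loop below (the address is a placeholder) keeps running while the process publishing on the other end is stopped and restarted.

# Toy illustration: a ZMQ SUB socket survives publisher restarts because
# the underlying connection is re-established automatically.
import zmq

context = zmq.Context()
sub = context.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:7777")  # placeholder address
sub.setsockopt_string(zmq.SUBSCRIBE, "")

while True:
    # Blocks until a message arrives; keeps working across restarts of
    # the publisher on the other end.
    print(sub.recv_string())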