Initial import of the localization and pheromone systems
gestom committed Aug 14, 2015
1 parent fac963d commit 1cbdcae
Showing 80 changed files with 13,525 additions and 0 deletions.
73 changes: 73 additions & 0 deletions Localization/README.md
@@ -0,0 +1,73 @@
<html>
<head/>
<body>
<h3>What is SwarmCon?</h3>

SwarmCon is a minimalistic version of the <a href="http://www.youtube.com/watch?v=KgKrN8_EmUA">WhyCon</a> localization system intended for swarm applications.
While the core of SwarmCon is the same as that of WhyCon, it was designed to have as few dependencies as possible.
Moreover, SwarmCon is intended especially for external localization of ground-robot swarms.
Thus, unlike WhyCon, SwarmCon can distinguish between individual robots and calculate their headings.

WhyCon was first presented at the ICRA 2013 conference and later in the JINT journal.
If you use this software in a publication, you must cite WhyCon using the references in the provided cite.bib file.
The full URL to this file is: https://raw.github.com/lrse/whycon/master/cite.bib.

<h3>To use it</h3>

To start with the software:
<ol>
<li>Install the <a href="#libraries">SDL libraries</a>.</li>
<li>Download the software from GitHub and go to the <b>src</b> directory.</li>
<li>Adjust the camera resolution in the <b>main/swarmcon.cpp</b>.</li>
<li>Compile the software - just type <b>make</b>.</li>
<li>Download, resize and print one circular <a href="pattern.pdf">pattern</a>.</li>
<li>Try a test run - you need to run the binary from the <b>bin</b> directory. Type <b>./swarmcon /dev/videoX 1</b>, where X is the number of the camera and 1 tells the system to track one pattern.</li>
<li>You should see the image with some numbers below the circle. Pressing <b>D</b> shows the segmentation result.</li>
<li>Open <b>localhost:6666</b> in your browser; you should see the circle position there.</li>
</ol>
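The test run above can be sketched as a small wrapper script. The device path and the <b>./swarmcon /dev/videoX 1</b> invocation come from the steps above; the helper names (<b>check_camera</b>, <b>run_test</b>) are introduced here purely for illustration and are not part of SwarmCon.

```shell
# Minimal sketch of the test run above; check_camera/run_test are
# illustrative names, not part of SwarmCon itself.
check_camera() {
  # V4L2 cameras show up as character devices under /dev
  [ -c "$1" ]
}

run_test() {
  dev="$1"
  if check_camera "$dev"; then
    echo "launching: ./swarmcon $dev 1"
    # ./swarmcon "$dev" 1   # run this from the bin directory
  else
    echo "no camera at $dev" >&2
    return 1
  fi
}

# Example: run_test /dev/video0
```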

To set up a coordinate system and use more robots:
<ol>
<li>Calibrate your camera using the MATLAB (or Octave) calibration toolbox and put the resulting <b>Calib_Results.m</b> in the <b>etc</b> directory.</li>
<li>Go to the <b>etc</b> directory and call <b>create.sh N M</b> to generate patterns for <b>NxM</b> robots.</li>
<li>Print the generated file <b>pattern_n_m.pdf</b>, put the elliptical markers on your robots, and place the remaining four circular markers at the corners of their operation space.</li>
<li>Modify the dimensions of the operation space in <b>main/swarmcon.cpp</b>.</li>
<li>Adjust the circle diameter in <b>main/swarmcon.cpp</b>; the default diameter is 30 mm.</li>
<li>Call <b>make</b> to recompile, then mount your camera above your swarm, facing down.</li>
<li>Go to the <b>bin</b> directory and run <b>./swarmcon /dev/videoX Y</b>, where X is the number of your camera and Y is the number of patterns you want to track, i.e., Y=NxM+4.</li>
<li>Once all the patterns are found, press <b>a</b> and the four outermost patterns will be used to calculate the coordinate system.</li>
<li>Each pattern will be annotated with four numbers: ID, heading, and x,y position in mm.</li>
</ol>
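The pattern count passed to <b>swarmcon</b> follows directly from the swarm size: NxM robot markers plus the four corner markers. A one-line sketch (<b>pattern_count</b> is a name introduced here for illustration):

```shell
# Y = N*M + 4: one marker per robot plus the four corner markers
pattern_count() {
  echo $(( $1 * $2 + 4 ))
}

Y=$(pattern_count 3 2)   # e.g. a 3x2 swarm
echo "./swarmcon /dev/video0 $Y"
```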

For postprocessing:

<ol>
<li>Processing a saved video rather than the live camera feed is likely to provide more precise results.</li>
<li>To create a video, simply create an <b>output</b> directory in the place where you run <b>swarmcon</b>.</li>
<li>If your camera supports MJPEG, the system will create a video in the <b>output</b> directory.</li>
<li>You can then run <b>swarmcon videofile Y</b> to process that video in the same way as when using the camera.</li>
<li>If your camera does not support MJPEG, the system will save the video feed as a series of bitmaps that you can process later as well.</li>
</ol>
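As described above, recording hinges only on the presence of an <b>output</b> directory where the binary is started. A minimal sketch (the video filename below is a placeholder, and <b>setup_recording</b> is a name introduced here):

```shell
# Recording is enabled simply by creating an output/ directory in the
# directory where swarmcon is started (see the list above).
setup_recording() {
  mkdir -p output && echo "recording enabled"
}
setup_recording
# Replay later instead of the live camera (filename is a placeholder):
# ./swarmcon output/video.avi 10
```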

Furthermore:

<ol>
<li>Running the system as <b>./swarmcon /dev/videoX Y nogui</b> causes text-only output.</li>
<li>Pressing <b>h</b> displays help.</li>
<li>Pressing <b>+</b> or <b>-</b> changes the number of localized patterns.</li>
</ol>

<hr>
<h4>Dependencies</h4><a NAME="libraries"></a>

All of the following libraries are likely available through your system's package manager.

<ul>
<li><b>libSDL-1.2</b> for the graphical user interface.</li>
<li><b>libSDL_ttf-2.0</b> for rendering text in the GUI.</li>
<li><b>libncurses5-dev</b> for printing to the terminal.</li>
</ul>

</body>
</html>
69 changes: 69 additions & 0 deletions Localization/SwarmCon.wiki/manual.html
@@ -0,0 +1,69 @@
<html>
<head/>
<body>
<h3>What is SwarmCon?</h3>

SwarmCon is a minimalistic version of the <a href="http://www.youtube.com/watch?v=KgKrN8_EmUA">WhyCon</a> localization system intended for swarm applications.
While the core of SwarmCon is the same as that of WhyCon, it was designed to have as few dependencies as possible.
Moreover, SwarmCon is intended especially for external localization of ground-robot swarms.
Thus, unlike WhyCon, SwarmCon can distinguish between individual robots and calculate their headings.

<h3>To use it</h3>

To start with the software:
<ol>
<li>Install the <a href="#libraries">SDL libraries</a>.</li>
<li>Download the software from GitHub and go to the <b>src</b> directory.</li>
<li>Adjust the camera resolution in the <b>main/swarmcon.cpp</b>.</li>
<li>Compile the software - just type <b>make</b>.</li>
<li>Print one circular <a href="pattern.pdf">pattern</a>.</li>
<li>Try a test run - you need to run the binary from the <b>bin</b> directory. Type <b>./swarmcon /dev/videoX 1</b>, where X is the number of the camera and 1 tells the system to track one pattern.</li>
<li>You should see the image with some numbers below the circle. Pressing <b>D</b> shows the segmentation result.</li>
<li>Open <b>localhost:6666</b> in your browser; you should see the circle position there.</li>
</ol>

To set up a coordinate system and use more robots:
<ol>
<li>Calibrate your camera using the MATLAB (or Octave) calibration toolbox and put the resulting <b>Calib_Results.m</b> in the <b>etc</b> directory.</li>
<li>Go to the <b>etc</b> directory and call <b>create.sh N M</b> to generate patterns for <b>NxM</b> robots.</li>
<li>Print the generated file <b>pattern_n_m.pdf</b>, put the elliptical markers on your robots, and place the remaining four circular markers at the corners of their operation space.</li>
<li>Modify the dimensions of the operation space in <b>main/swarmcon.cpp</b>.</li>
<li>Adjust the circle diameter in <b>main/swarmcon.cpp</b>; the default diameter is 30 mm.</li>
<li>Call <b>make</b> to recompile, then mount your camera above your swarm, facing down.</li>
<li>Go to the <b>bin</b> directory and run <b>./swarmcon /dev/videoX Y</b>, where X is the number of your camera and Y is the number of patterns you want to track, i.e., Y=NxM+4.</li>
<li>Once all the patterns are found, press <b>a</b> and the four outermost patterns will be used to calculate the coordinate system.</li>
<li>Each pattern will be annotated with four numbers: ID, heading, and x,y position in mm.</li>
</ol>

For postprocessing:

<ol>
<li>Processing a saved video rather than the live camera feed is likely to provide more precise results.</li>
<li>To create a video, simply create an <b>output</b> directory in the place where you run <b>swarmcon</b>.</li>
<li>If your camera supports MJPEG, the system will create a video in the <b>output</b> directory.</li>
<li>You can then run <b>swarmcon videofile Y</b> to process that video in the same way as when using the camera.</li>
<li>If your camera does not support MJPEG, the system will save the video feed as a series of bitmaps that you can process later as well.</li>
</ol>

Furthermore:

<ol>
<li>Running the system as <b>./swarmcon /dev/videoX Y nogui</b> causes text-only output.</li>
<li>Pressing <b>h</b> displays help.</li>
<li>Pressing <b>+</b> or <b>-</b> changes the number of localized patterns.</li>
</ol>

<hr>
<h4>Dependencies</h4><a NAME="libraries"></a>

All of the following libraries are likely available through your system's package manager.

<ul>
<li><b>libSDL-1.2</b> for the graphical user interface.</li>
<li><b>libSDL_ttf-2.0</b> for rendering text in the GUI.</li>
<li><b>libncurses5-dev</b> for printing to the terminal.</li>
</ul>

</body>
</html>
Empty file added Localization/bin/.gitignore
Empty file.
111 changes: 111 additions & 0 deletions Localization/etc/Calib_Results.m
@@ -0,0 +1,111 @@
% Intrinsic and Extrinsic Camera Parameters
%
% This script file can be directly executed under Matlab to recover the camera intrinsic and extrinsic parameters.
% IMPORTANT: This file contains neither the structure of the calibration objects nor the image coordinates of the calibration points.
% All those complementary variables are saved in the complete matlab data file Calib_Results.mat.
% For more information regarding the calibration model visit http://www.vision.caltech.edu/bouguetj/calib_doc/


%-- Focal length:
fc = [ 1920 ; 1080 ];

%-- Principal point:
cc = [ 960 ; 540 ];

%-- Skew coefficient:
alpha_c = 0.000000000000000;

%-- Distortion coefficients:
kc = [ 0.0 ; 0.0 ; 0.0 ; 0.0 ; 0.000000000000000 ];

%-- Focal length uncertainty:
fc_error = [ 1.337241671636265 ; 1.373609160684873 ];

%-- Principal point uncertainty:
cc_error = [ 2.961432073094887 ; 2.710510772307604 ];

%-- Skew coefficient uncertainty:
alpha_c_error = 0.000000000000000;

%-- Distortion coefficients uncertainty:
kc_error = [ 0.009227658503406 ; 0.027828590090207 ; 0.001242661782870 ; 0.001286528905763 ; 0.000000000000000 ];

%-- Image size:
nx = 960;
ny = 720;


%-- Various other variables (may be ignored if you do not use the Matlab Calibration Toolbox):
%-- Those variables are used to control which intrinsic parameters should be optimized

n_ima = 10; % Number of calibration images
est_fc = [ 1 ; 1 ]; % Estimation indicator of the two focal variables
est_aspect_ratio = 1; % Estimation indicator of the aspect ratio fc(2)/fc(1)
center_optim = 1; % Estimation indicator of the principal point
est_alpha = 0; % Estimation indicator of the skew coefficient
est_dist = [ 1 ; 1 ; 1 ; 1 ; 0 ]; % Estimation indicator of the distortion coefficients


%-- Extrinsic parameters:
%-- The rotation (omc_kk) and the translation (Tc_kk) vectors for every calibration image and their uncertainties

%-- Image #1:
omc_1 = [ 2.816883e+00 ; -6.339079e-02 ; -7.385877e-03 ];
Tc_1 = [ -1.166320e+02 ; 6.494460e+01 ; 3.096426e+02 ];
omc_error_1 = [ 3.832989e-03 ; 1.075295e-03 ; 5.394879e-03 ];
Tc_error_1 = [ 1.162721e+00 ; 1.103736e+00 ; 8.646448e-01 ];

%-- Image #2:
omc_2 = [ 2.680203e+00 ; -1.788707e-01 ; -1.099674e+00 ];
Tc_2 = [ -3.886641e+01 ; 6.788083e+01 ; 4.084600e+02 ];
omc_error_2 = [ 3.676747e-03 ; 1.959543e-03 ; 5.077331e-03 ];
Tc_error_2 = [ 1.498691e+00 ; 1.389178e+00 ; 7.524030e-01 ];

%-- Image #3:
omc_3 = [ 2.521936e+00 ; -6.093381e-01 ; -9.915656e-01 ];
Tc_3 = [ -5.439891e+01 ; 9.225411e+01 ; 4.182277e+02 ];
omc_error_3 = [ 3.746584e-03 ; 2.033379e-03 ; 5.069570e-03 ];
Tc_error_3 = [ 1.536173e+00 ; 1.434969e+00 ; 8.687819e-01 ];

%-- Image #4:
omc_4 = [ 2.261140e+00 ; -1.073346e+00 ; -8.385664e-01 ];
Tc_4 = [ -2.809717e+01 ; 8.441936e+01 ; 3.934485e+02 ];
omc_error_4 = [ 3.593168e-03 ; 2.304807e-03 ; 4.948294e-03 ];
Tc_error_4 = [ 1.445248e+00 ; 1.351259e+00 ; 8.417232e-01 ];

%-- Image #5:
omc_5 = [ NaN ; NaN ; NaN ];
Tc_5 = [ NaN ; NaN ; NaN ];
omc_error_5 = [ NaN ; NaN ; NaN ];
Tc_error_5 = [ NaN ; NaN ; NaN ];

%-- Image #6:
omc_6 = [ 1.697437e+00 ; -1.692372e+00 ; -5.425349e-01 ];
Tc_6 = [ 3.849384e+01 ; 6.336176e+01 ; 3.313479e+02 ];
omc_error_6 = [ 3.041744e-03 ; 2.858415e-03 ; 4.616433e-03 ];
Tc_error_6 = [ 1.244274e+00 ; 1.139952e+00 ; 7.936462e-01 ];

%-- Image #7:
omc_7 = [ 2.674554e+00 ; -1.982755e-01 ; -1.096465e+00 ];
Tc_7 = [ -3.175432e+01 ; 7.701946e+01 ; 3.988479e+02 ];
omc_error_7 = [ 3.648009e-03 ; 1.887203e-03 ; 5.019335e-03 ];
Tc_error_7 = [ 1.466694e+00 ; 1.355682e+00 ; 7.399453e-01 ];

%-- Image #8:
omc_8 = [ 2.678649e+00 ; 2.662843e-01 ; 1.116575e+00 ];
Tc_8 = [ -1.345220e+02 ; 1.919394e+01 ; 2.645979e+02 ];
omc_error_8 = [ 4.189608e-03 ; 1.751801e-03 ; 5.364955e-03 ];
Tc_error_8 = [ 1.001054e+00 ; 9.700498e-01 ; 9.529841e-01 ];

%-- Image #9:
omc_9 = [ 2.021862e+00 ; 1.251700e-03 ; 9.813894e-02 ];
Tc_9 = [ -1.133378e+02 ; 9.576570e+01 ; 2.864940e+02 ];
omc_error_9 = [ 3.409664e-03 ; 2.031696e-03 ; 3.963686e-03 ];
Tc_error_9 = [ 1.101395e+00 ; 1.051048e+00 ; 9.007983e-01 ];

%-- Image #10:
omc_10 = [ -1.566072e+00 ; -1.474860e+00 ; -8.910361e-01 ];
Tc_10 = [ -7.813928e+01 ; -3.170304e+01 ; 1.845087e+02 ];
omc_error_10 = [ 2.543392e-03 ; 3.752532e-03 ; 3.687803e-03 ];
Tc_error_10 = [ 6.874330e-01 ; 6.464403e-01 ; 5.459849e-01 ];

46 changes: 46 additions & 0 deletions Localization/etc/create.sh
@@ -0,0 +1,46 @@
#!/bin/bash
# Usage: create.sh N M - generates calibration and robot patterns for an NxM swarm
rm -f ID.txt
convert -size 2970x2100 xc:white -density 10x10 -units pixelspercentimeter -fill white result.png
h=$1
v=$2
for j in $(seq 0 3);do
echo -ne "Creating calibration patterns $j of 4\r"
x=$(($j/2*2470+250))
y=$(($j%2*1600+250));
#ix=$(($j/2*15+40));
#iy=$(($j%2*30+40));
ix=50;
iy=50;
convert result.png \
-fill white -stroke black -draw "ellipse $x,$y 200,200 0,360" \
-fill black -stroke none -draw "ellipse $x,$y 150,150 0,360" \
-fill white -stroke none -draw "ellipse $x,$y $ix,$iy 0,360" \
-fill black -stroke none -draw "line $(($x-200)),$y $(($x-190)),$y" \
-fill black -stroke none -draw "line $(($x+200)),$y $(($x+190)),$y" \
-fill black -stroke none -draw "line $x,$(($y+200)) $x,$(($y+190))" \
-fill black -stroke none -draw "line $x,$(($y-200)) $x,$(($y-190))" \
-fill black -stroke none -draw "point $x,$y" \
result.png
done

id=0
for j in $(seq 0 $(($v-1)));do
for i in $(seq 0 $(($h-1)));do
echo -ne "Creating actual robot patterns $(($i+$j*$h)) of $(($v*$h))\r"
sj=60/$v
si=70/$v
x=$(($i*400+650));
y=$(($j*400+250));
ix=$((($j*10+5)*$sj/10+30));
iy=$((($i*10+5)*$si/10+30));
convert result.png \
-fill white -stroke black -draw "ellipse $x,$y 200,200 0,360" \
-fill black -stroke none -draw "ellipse $x,$y 130,160 0,360" \
-fill white -stroke none -draw "ellipse $x,$(($y+10)) $ix,$iy 0,360" \
result.png
r0=$(printf %.3f $(echo $ix/130|bc -l))
r1=$(printf %.3f $(echo $iy/160|bc -l))
echo $id $r1 $r0 >>ID.txt
id=$(($id+1))
done
done
convert result.png -density 100x100 result_$h\_$v.pdf
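For reference, the inner-ellipse axes that encode each robot's ID in the loop above reduce to the following integer arithmetic. The function name <b>inner_axes</b> is introduced here for illustration; the constants 60, 70, and 30 are taken from the script itself.

```shell
# Inner ellipse axes (ix, iy) for robot (i, j) in a grid with v rows,
# mirroring the ix/iy arithmetic of the loop above. Shell arithmetic is
# integer-only, exactly as in create.sh.
inner_axes() {
  i="$1"; j="$2"; v="$3"
  echo $(( (j*10+5)*60/v/10 + 30 )) $(( (i*10+5)*70/v/10 + 30 ))
}
inner_axes 0 0 2
```

The ratios of these axes to the outer ellipse, ix/130 and iy/160, are what the script writes to ID.txt to identify each robot.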
Binary file added Localization/etc/font.ttf
Binary file not shown.
Empty file added Localization/obj/.gitignore
Empty file.
68 changes: 68 additions & 0 deletions Localization/src/Makefile
@@ -0,0 +1,68 @@
include Mk/local.Mk
SUBDIRS+=common
SUBDIRS+=camera
SUBDIRS+=imageproc
SUBDIRS+=gui

OBJS=$(wildcard ../obj/*.o)
LXXLIBS+=-L/usr/local/lib
LXXLIBS+=-lpthread

COPY_ETC_CMD=../etc/copy.sh
COPY_ETC_SRC=../etc
COPY_ETC_DST=../bin

swarmcon: all obj
$(MAKE) -C main $@; \
$(CXX) $(CXXDEFINE) -o ../bin/$@ $(OBJS) main/$@.o $(CXXFLAGS) $(LXXLIBS) -lSDL -lSDL_ttf

all:
echo $(OPSYS)
@for i in $(SUBDIRS) ;\
do \
echo "making" all "in $(CURRENT_DIR)/$$i..."; \
$(MAKE) -C $$i all; \
done

gui: all
$(MAKE) -C gui all; \
cp gui/*.o ../obj; \

cleangui:
$(MAKE) -C gui clean; \
rm -f ../obj/CGui.o

obj: all
echo "Copy objs"
@for i in $(SUBDIRS) ;\
do \
echo "copying all in $(CURRENT_DIR)/$$i..."; \
cp $$i/*.o ../obj; \
done

../bin/.etc:
echo "Copy dirs from etc"
$(COPY_ETC_CMD) $(COPY_ETC_SRC) $(COPY_ETC_DST)
touch ../bin/.etc

cleanetc:
rm -f ../bin/.etc

#cp -r ../etc/$$i ../bin;
forceetc:
echo "Copy dirs from etc"
$(COPY_ETC_CMD) $(COPY_ETC_SRC) $(COPY_ETC_DST)
touch ../bin/.etc

clean: cleangui
echo $(OPSYS)
@for i in $(SUBDIRS) ;\
do \
echo "cleaning" all "in $(CURRENT_DIR)/$$i..."; \
$(MAKE) -C $$i clean; \
done
$(MAKE) -C main clean; \
echo "cleaning all objs"
rm -f ../obj/*.o
echo "cleaning binaries"
rm -f ../bin/*
2 changes: 2 additions & 0 deletions Localization/src/Mk/local.Mk
@@ -0,0 +1,2 @@
CXXFLAGS+=-Wall -ggdb
#CXXFLAGS+=-Wall -O4 -march=native -D_FILE_OFFSET_BITS=64
