Advanced line-following robot, using a custom deep neural network for curve shape prediction. The robot can recognize a straight line and speed up.
- Video 1 of working robot: motoko video 1
- Video 2 of working robot: motoko video 2
- CPU: STM32F303 (ARM Cortex-M4F, 72 MHz)
- with SIMD/DSP instructions, good to have for deep learning
- IMU: LSM303 (or something like that; yeah, they change it more often than socks)
- motors: Pololu HP 1:30 micro metal gearmotors, with magnetic encoders
- motor driver: TI DRV8834
- line sensors: phototransistors for visible light (not IR)
- I am using white LEDs, so the line can be colored, not only black
- obstacle detection:
- common IR LEDs, and a laser (still not working as I wish: too slow, but high range)
- battery: LiPo 2S, 150 mAh, from Dualsky
Some photos from assembling this devilish device:
PCB design files are in pcb, ready to send to your favorite manufacturer
Chassis: there are a few 3D printed parts in 3d print
- the ADC reads the line sensors; quadratic interpolation is used for precise line position estimation
- a PD controller is used for steering
- two PIDs are used for motor control (encoder data are filtered by low-pass filters)
- the target speed is estimated by a neural network, using line shape classification
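The quadratic interpolation step can be sketched like this; a parabola is fitted through the strongest sensor reading and its two neighbours, and its vertex gives a sub-sensor position. Function and array names here are illustrative, not the robot's actual API:

```cpp
#include <cassert>
#include <cstddef>

// Sub-sensor line position estimate via quadratic (parabolic) interpolation
// around the peak reading. Returns position in "sensor index" units,
// with fractional resolution.
float line_position(const int *adc, size_t count)
{
    // find the sensor with the maximum reading
    size_t best = 0;
    for (size_t i = 1; i < count; i++)
        if (adc[i] > adc[best])
            best = i;

    // edge sensors have no neighbour on one side: return the integer position
    if (best == 0 || best == count - 1)
        return (float)best;

    // fit a parabola through (best-1, best, best+1);
    // its vertex offset lies in (-0.5, 0.5) around the peak sensor
    float y0 = (float)adc[best - 1];
    float y1 = (float)adc[best];
    float y2 = (float)adc[best + 1];
    float denom = y0 - 2.0f*y1 + y2;
    float offset = (denom != 0.0f) ? 0.5f*(y0 - y2)/denom : 0.0f;

    return (float)best + offset;
}
```

For example, with the sample readings [3, 40, 500, 1000, 700, 50, 0, 0] (the values shown in the gui.json example), the peak sensor is index 3 and the vertex shifts it toward its right neighbour, to 3.125.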
A tiny convolutional neural network is used, with architecture: IN1x8x8 - C4x3x3 - P2x2 - C8x3x3 - P2x2 - FC5.
A custom embedded CNN framework is used:
```cpp
LineNetwork::LineNetwork()
    : EmbeddedNet()
{
    input_shape.w = 8;
    input_shape.h = 8;
    input_shape.d = 1;

    output_shape.w = 1;
    output_shape.h = 1;
    output_shape.d = 5;

    layers[0] = new EmbeddedNetConvolutionLayer(layer_0_shape, layer_0_input_shape, layer_0_output_shape, layer_0_weights, layer_0_bias, layer_0_weights_range, layer_0_bias_range);
    layers[1] = new EmbeddedNetMaxPoolingLayer(layer_1_shape, layer_1_input_shape, layer_1_output_shape);
    layers[2] = new EmbeddedNetConvolutionLayer(layer_2_shape, layer_2_input_shape, layer_2_output_shape, layer_2_weights, layer_2_bias, layer_2_weights_range, layer_2_bias_range);
    layers[3] = new EmbeddedNetMaxPoolingLayer(layer_3_shape, layer_3_input_shape, layer_3_output_shape);
    layers[4] = new EmbeddedNetFcLayer(layer_4_shape, layer_4_input_shape, layer_4_output_shape, layer_4_weights, layer_4_bias, layer_4_weights_range, layer_4_bias_range);

    layers_count = 5;
    allocate_buffer();
}
```
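The layer sizes implied by the architecture string can be traced step by step. The padding is not stated above; this sketch assumes stride-1 convolutions with "same" zero padding and 2x2 max pooling with stride 2, which is what makes the sizes divide cleanly:

```cpp
#include <cassert>

// Shape bookkeeping for IN1x8x8 - C4x3x3 - P2x2 - C8x3x3 - P2x2 - FC5.
struct Shape { int w, h, d; };

// 3x3 convolution, stride 1, "same" padding: spatial size is preserved
Shape conv3x3_same(Shape in, int kernels) { return {in.w, in.h, kernels}; }

// 2x2 max pooling, stride 2: spatial size halves
Shape pool2x2(Shape in) { return {in.w / 2, in.h / 2, in.d}; }

Shape trace()
{
    Shape s = {8, 8, 1};        // IN  1x8x8
    s = conv3x3_same(s, 4);     // C4x3x3 -> 4x8x8
    s = pool2x2(s);             // P2x2   -> 4x4x4
    s = conv3x3_same(s, 8);     // C8x3x3 -> 8x4x4
    s = pool2x2(s);             // P2x2   -> 8x2x2
    return s;                   // FC5 sees 2*2*8 = 32 inputs, 5 class outputs
}
```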
PIDs in differential (incremental) form are used, with anti-windup and a derivative kick avoidance trick:

u(n) = u(n-1) + k0*e(n) + k1*e(n-1) + k2*e(n-2)

where
- e(n) is the error
- u(n) is the controller output
- k0 = kp + ki + kd
- k1 = -kp - 2kd
- k2 = kd

To avoid derivative kick on setpoint steps, the derivative part acts on the plant output x(n) instead of the error, so in the code the kd terms above become -kd*(x(n) - 2x(n-1) + x(n-2)).
```cpp
float PID::process(float error, float plant_output)
{
    // shift error and plant output history
    e1 = e0;
    e0 = error;

    x2 = x1;
    x1 = x0;
    x0 = plant_output;

    // incremental PID; derivative acts on the plant output, not the error
    u += k0*e0 + k1*e1 - kd*(x0 - 2*x1 + x2);

    // clamping the accumulated output provides the anti-windup
    if (u > limit)
        u = limit;
    if (u < -limit)
        u = -limit;

    return u;
}
```
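A minimal self-contained version of this controller, closed around a simulated first-order plant (a crude motor model), shows how it is driven each control step. The gains, plant constant and output limit here are illustrative values, not the robot's actual tuning:

```cpp
#include <cassert>
#include <cmath>

// Standalone incremental PID with anti-windup and derivative-on-measurement,
// mirroring the PID::process() above.
class PID
{
public:
    PID(float kp, float ki, float kd, float limit)
        : kd(kd), limit(limit)
    {
        k0 = kp + ki + kd;
        k1 = -kp - 2.0f*kd;
    }

    float process(float error, float plant_output)
    {
        e1 = e0; e0 = error;                  // shift error history
        x2 = x1; x1 = x0; x0 = plant_output;  // shift plant output history

        // incremental PID; derivative acts on the measurement
        u += k0*e0 + k1*e1 - kd*(x0 - 2.0f*x1 + x2);

        // clamping the accumulated output is the anti-windup
        if (u >  limit) u =  limit;
        if (u < -limit) u = -limit;
        return u;
    }

private:
    float k0, k1, kd, limit;
    float u = 0, e0 = 0, e1 = 0, x0 = 0, x1 = 0, x2 = 0;
};

// step-response simulation of a first-order plant: x(n+1) = x(n) + 0.1*(u(n) - x(n))
float simulate_step(int steps)
{
    PID pid(0.5f, 0.1f, 0.05f, 2.0f);   // kp, ki, kd, limit (illustrative)
    float x = 0.0f;
    for (int n = 0; n < steps; n++)
    {
        float u = pid.process(1.0f - x, x);  // setpoint = 1.0
        x += 0.1f*(u - x);
    }
    return x;
}
```

Because of the integral term, the plant output settles at the setpoint; the clamp only matters when the commanded output would exceed the actuator range.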
Motor controller step response:
There is a debug application written in Python. Data received from the robot are visualized in a custom GUI written using OpenGL. Data are transferred in JSON format.
The robot GUI layout is also defined in JSON; to modify the GUI design, only this file needs to be changed. Example of the gui.json file format:
"widgets" :
{
"type" : "label",
"position" : [0.0, 0.0, 0.0],
"color" : [1.0, 0.0, 0.0],
"font_size" : 0.1,
"widgets" : [
{
"type" : "bar_graph",
"position" : [-2.0, 1.25, 0.0],
"color" : [0.5, 1.0, 0.0],
"bar_color" : [0.8, 0.8, 0.8],
"font_size" : 0.05,
"width" : 1.5,
"height" : 0.5,
"filled" : false,
"enlight_max_value" : false,
"frame_width" : 0.01,
"label" : "LINE SENSORS",
"min_value" : 0.0,
"max_value" : 1024.0,
"variable" : {
"name" : "line_sensors",
"value" : [3, 40, 500, 1000, 700, 50, 0, 0]
}
},
...