Autonomous Following & Obstacle Avoidance Drone

System Overview

License: MIT · Platform: Raspberry Pi 4B · Language: C++

This repository contains the companion computer software for an autonomous drone that visually tracks a sport climber and avoids obstacles (part of an engineering thesis). The system runs on a Raspberry Pi 4B, using a hybrid tracking algorithm (KCF + YOLO‑Pose) to achieve real‑time performance (20 FPS) with high accuracy (IoU = 0.76). It fuses vision with ultrasonic distance data to generate safe velocity commands sent to the flight controller via MAVLink.


Key Features

  • Real‑time robust object tracking using a hybrid approach:
    • KCF (Kernelized Correlation Filter) for high‑speed tracking between detections.
    • YOLO‑Pose (TensorFlow Lite, INT8 quantized, XNNPACK accelerated) for periodic re‑detection and pose estimation.
  • Vision pipeline:
    • Camera capture with Libcamera.
    • ROI reduction based on previous track position to minimise false positives.
    • Kalman filter for smoothing bounding box coordinates.
  • Obstacle avoidance:
    • Reads 6 ultrasonic distance sensors via the flight controller (MAVLink).
    • Reactive potential‑field method to generate avoidance velocities.
  • Velocity command generation:
    • Computes desired velocities based on bounding box position and size (visual servoing).
    • Combines tracking and avoidance commands with dynamic limits.
  • Multi‑threaded C++ architecture:
    • Separate threads for camera I/O, tracking, obstacle processing, and MAVLink communication.
    • Lock‑free queues for inter‑thread data passing.
  • Extensive validation:
    • Tested on a custom dataset of climbing videos.
    • HIL simulations on actual Raspberry Pi hardware.
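The visual-servoing idea behind the velocity command generation can be sketched in a few lines: steer so the bounding box is centered in the frame, and regulate distance via box height. This is a minimal illustrative sketch, not the project's controller; the frame size, `TARGET_H`, and all gains are assumed values.

```cpp
#include <algorithm>

// Frame geometry and controller gains (all values are illustrative assumptions).
constexpr double FRAME_W = 640, FRAME_H = 480;
constexpr double TARGET_H = 200;   // desired box height, used as a distance proxy
constexpr double K_LAT = 1.5, K_FWD = 0.01, K_VERT = 1.0;
constexpr double V_MAX = 2.0;      // m/s velocity limit

struct Box { double x, y, w, h; };   // top-left corner + size, pixels
struct Vel { double vx, vy, vz; };   // body-frame velocity command, m/s

// Proportional visual servoing: drive the box center toward the frame
// center and its height toward TARGET_H (a bigger box means the climber
// is too close, so the forward command goes negative).
Vel follow_velocity(const Box& b) {
    double ex = (b.x + b.w / 2 - FRAME_W / 2) / FRAME_W;  // lateral error
    double ey = (b.y + b.h / 2 - FRAME_H / 2) / FRAME_H;  // vertical error
    double ez = TARGET_H - b.h;                           // distance proxy
    auto clamp = [](double v) { return std::clamp(v, -V_MAX, V_MAX); };
    return Vel{clamp(K_FWD * ez),     // forward/backward
               clamp(K_LAT * ex),     // left/right
               clamp(-K_VERT * ey)};  // up/down (image y grows downward)
}
```

A perfectly centered box at the target height yields a zero command; deviations produce clamped proportional corrections, matching the "dynamic limits" mentioned above.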

Validation & Results

The high‑level control system was evaluated on the target hardware (Raspberry Pi 4B) using both recorded climbing footage and hardware‑in‑the‑loop (HIL) simulations. Below is a summary of the key performance metrics and observations.

Performance Metrics

| Metric | Value | Conditions |
| --- | --- | --- |
| Average processing speed | 20 FPS | Full pipeline (capture + KCF + periodic YOLO) |
| Tracking accuracy (IoU) | 0.76 | Average over 4 diverse climbing sequences |
| YOLO‑Pose inference time | 250 ms | INT8 quantized, TensorFlow Lite + XNNPACK |
| KCF tracking time | < 10 ms | 400×400 pixel ROI |
| Detection confidence | > 90% | When ROI limited to predicted bounding box |
| Obstacle reaction latency | ≈ 15 ms | From distance sensor read to velocity command |

Tracking Accuracy

The hybrid tracker (KCF + YOLO‑Pose) was evaluated on four real‑world climbing videos, manually annotated with ground‑truth bounding boxes (100 frames per video, sampled every 5th frame). The mean Intersection over Union (IoU) across all sequences reached 0.76, indicating strong alignment with the true climber position.
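For reference, the IoU metric reported above is the ratio of the overlap area to the union area of the predicted and ground-truth boxes. A self-contained sketch (the `Box` layout is an assumption, not the project's type):

```cpp
#include <algorithm>

struct Box { double x, y, w, h; };  // top-left corner + size, in pixels

// Intersection over Union of two axis-aligned boxes, in [0, 1].
double iou(const Box& a, const Box& b) {
    double ix = std::max(0.0, std::min(a.x + a.w, b.x + b.w) - std::max(a.x, b.x));
    double iy = std::max(0.0, std::min(a.y + a.h, b.y + b.h) - std::max(a.y, b.y));
    double inter = ix * iy;                       // overlap area
    double uni = a.w * a.h + b.w * b.h - inter;   // union area
    return uni > 0.0 ? inter / uni : 0.0;
}
```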


Figure 1: Successful tracking under challenging conditions – partial occlusion, top‑down view, low light, and fast motion. Predicted bounding box (white) vs. ground truth (red).

In the rare cases where the tracker drifted, YOLO‑Pose re‑detection reliably recovered the target within 1–2 detection cycles.
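The interplay between the fast tracker and the periodic re-detection can be sketched as the following control flow. This is a hardware-free sketch: `FastTracker` and `PoseDetector` are stubs standing in for the real KCF tracker and the TFLite YOLO‑Pose model, and `REDETECT_PERIOD` is an assumed name and value.

```cpp
#include <optional>

// Axis-aligned bounding box in pixel coordinates.
struct Box { int x, y, w, h; };

// Stand-ins for the real components (OpenCV KCF and TFLite YOLO-Pose).
struct FastTracker {                 // fast frame-to-frame tracker (KCF role)
    Box state{};
    void init(const Box& b) { state = b; }
    std::optional<Box> update() { return state; }  // may fail on drift
};
struct PoseDetector {                // slow, accurate re-detector (YOLO-Pose role)
    Box detect() { return Box{100, 100, 50, 120}; }
};

constexpr int REDETECT_PERIOD = 30;  // run YOLO-Pose every N frames (assumed)

// One iteration of the hybrid loop: a cheap KCF update on every frame,
// an expensive re-detection periodically or whenever the tracker loses
// the target, which re-seeds the fast tracker and recovers from drift.
Box step(int frame_idx, FastTracker& kcf, PoseDetector& yolo) {
    std::optional<Box> box;
    if (frame_idx % REDETECT_PERIOD != 0)
        box = kcf.update();              // fast path, < 10 ms
    if (frame_idx % REDETECT_PERIOD == 0 || !box) {
        Box det = yolo.detect();         // slow path, ~250 ms
        kcf.init(det);                   // re-seed the fast tracker
        box = det;
    }
    return *box;
}
```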

Real‑Time Performance

On the Raspberry Pi 4B, the entire pipeline runs at 20 FPS, satisfying the real‑time requirement for reactive control. Key optimizations include:

  • ROI cropping – KCF operates only on a 400×400 region around the last known position.
  • INT8 quantization of YOLO‑Pose reduces inference time from ~800 ms (float32) to 250 ms.
  • XNNPACK delegate accelerated TensorFlow Lite inference.
  • Multi‑threading – separate threads for capture, tracking, and MAVLink communication prevent blocking.
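The ROI cropping step reduces to centering a fixed-size window on the last known box and clamping it to the frame. A minimal sketch (the `search_roi` helper and `Rect` type are hypothetical names, with the 400×400 size from the text):

```cpp
#include <algorithm>

struct Rect { int x, y, w, h; };  // top-left corner + size, pixels

// Compute a fixed-size search ROI centered on the last known box,
// clamped so it stays fully inside the frame.
Rect search_roi(const Rect& last, int frame_w, int frame_h, int roi = 400) {
    int cx = last.x + last.w / 2;
    int cy = last.y + last.h / 2;
    int x = std::clamp(cx - roi / 2, 0, std::max(0, frame_w - roi));
    int y = std::clamp(cy - roi / 2, 0, std::max(0, frame_h - roi));
    return Rect{x, y, std::min(roi, frame_w), std::min(roi, frame_h)};
}
```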

Obstacle Avoidance Validation

The reactive obstacle avoidance module was tested with simulated distance sensor data injected into the HIL environment. When a virtual obstacle approached from the left, the system correctly generated a rightward velocity correction within 15 ms (including sensor emulation, data transfer, and command generation). The potential‑field method ensures smooth avoidance while respecting the drone’s velocity limits.
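The repulsive part of the potential-field method can be sketched as follows. The sensor layout (six sensors at 60° spacing, 0° = forward, +90° = left) and all gains are assumptions for illustration, not the project's calibrated values.

```cpp
#include <array>
#include <cmath>

constexpr double PI = 3.14159265358979323846;

// Six ultrasonic sensors at 60-degree spacing in the body frame
// (0 deg = forward, +90 deg = left; layout and gains are assumptions).
constexpr std::array<double, 6> SENSOR_ANGLE_DEG = {0, 60, 120, 180, 240, 300};
constexpr double K_REP = 0.5;   // repulsion gain (assumed)
constexpr double D_SAFE = 2.0;  // obstacles beyond this range are ignored, m

struct Vel { double vx, vy; };  // body-frame velocity correction, m/s

// Reactive potential field: each obstacle inside D_SAFE pushes the drone
// away along the sensor axis, with a force that grows sharply as the
// distance shrinks, then the contributions are summed.
Vel avoidance_velocity(const std::array<double, 6>& dist_m) {
    Vel v{0.0, 0.0};
    for (std::size_t i = 0; i < dist_m.size(); ++i) {
        double d = dist_m[i];
        if (d <= 0.0 || d >= D_SAFE) continue;  // out of range: no push
        double mag = K_REP * (1.0 / d - 1.0 / D_SAFE) / (d * d);
        double a = SENSOR_ANGLE_DEG[i] * PI / 180.0;
        v.vx -= mag * std::cos(a);              // push away from the obstacle
        v.vy -= mag * std::sin(a);
    }
    return v;
}
```

With this sign convention, an obstacle detected on a left-facing sensor yields a negative (rightward) lateral velocity, matching the HIL behavior described above.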


Hardware

  • Companion Computer: Raspberry Pi 4B (2GB+ RAM)
  • Camera: Raspberry Pi Camera Module 3 (CSI interface)
  • Flight Controller: MAVLink‑compatible flight controller (e.g. FreeRTOS_FlightController)
  • Distance Sensors: 6× HC‑SR04 (read by flight controller)

Software Architecture

Companion Computer Architecture
Figure 2: Multi‑threaded architecture with data flow between vision, obstacle avoidance, and MAVLink threads.

| Thread | Responsibility | Communication |
| --- | --- | --- |
| Camera | Capture frames, convert to OpenCV format | Pushes frames to `frame_queue` |
| Tracker | Run KCF + periodic YOLO‑Pose, Kalman filter | Outputs bounding box to … |
| Detection | Run YOLO‑Pose detection | Outputs detection data to the tracker |
| Obstacle | Process ultrasonic data from FC | Generates avoidance velocities |
| Comm (Rx/Tx) | Send velocity commands, receive sensor data | Bidirectional with FC |
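The inter-thread hand-off pattern behind `frame_queue` can be sketched as a producer/consumer channel. The real system uses lock-free queues; the mutex-based `Channel` below is a simplified stand-in that shows the same data flow.

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <utility>

// Simplified producer/consumer channel between threads. The project uses
// lock-free queues; a mutex + condition variable keeps this sketch short.
template <typename T>
class Channel {
public:
    void push(T v) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(std::move(v)); }
        cv_.notify_one();
    }
    T pop() {  // blocks until an item is available
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [this] { return !q_.empty(); });
        T v = std::move(q_.front());
        q_.pop();
        return v;
    }
private:
    std::queue<T> q_;
    std::mutex m_;
    std::condition_variable cv_;
};
```

In this picture, the camera thread pushes frames into the channel and the tracker thread pops them, so a slow YOLO‑Pose inference never blocks capture.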

Installation & Build

Prerequisites

  • Raspberry Pi OS (Bullseye or later)
  • CMake (≥ 3.13)
  • OpenCV (≥ 4.5)
  • TensorFlow Lite v2.10+
  • Libcamera & libcamera‑apps

Build

```sh
git clone https://github.com/mboiar/AutonomousFollowingAndAvoidance.git
cd AutonomousFollowingAndAvoidance
mkdir build && cd build
cmake ..
make -j4
```

Cross-compilation

  1. Set the cross‑compilation tools path (`tools`) and the target filesystem root (`rootfs_dir`).
  2. `cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_TOOLCHAIN_FILE=./cross-compilation-tools/PI.cmake --no-warn-unused-cli -S . -B build`
  3. `cmake --build build --target all`
