
ArUco-Based Robotic Arm Tracking System


PROJECT OVERVIEW:
This project involved developing a vision-guided robotic arm system capable of detecting and tracking ArUco fiducial markers in real time for use in explosive-ordnance scanning and automated inspection scenarios. The robotic arm, powered by a Raspberry Pi 4 and a PCA9685 servo controller, combines computer vision, embedded control, and IoT integration to identify hazardous zones, scan vehicles, and autonomously align to targets while minimizing human exposure to potential threats.

The system was implemented using Python 3, OpenCV, Flask, and Pygame, integrating a camera feedback loop, servo motion control, and data logging through a web interface. This cyber-physical system embodies the fusion of embedded hardware, real-time image processing, and networked robotics, and is designed to assist in secure border and checkpoint inspections.

KEY TECHNICAL LEARNING:
  1. Real-Time Vision Processing:
    Mastered ArUco marker tracking with camera calibration and pose estimation, sustaining detection at 15 FPS under variable lighting conditions (see Sketch 1 after this list).

  2. Servo Coordination via PWM Control:
    Drove five servo channels through a PCA9685, whose hardware-timed PWM generation holds a consistent 50 Hz signal regardless of CPU load, improving positional repeatability to within ±1.5° (Sketch 2).

  3. Multithreaded Robotics Control:
    Designed a multi-threaded architecture in which camera capture, joystick input, and motion tasks run concurrently without blocking, leveraging Python's threading module with daemon threads and queues for synchronization (Sketch 3).

  4. Closed-Loop Feedback Design:
    Implemented an early PID-based smoothing loop on the commanded servo angles to prevent jerk and overshoot, achieving stable positioning during continuous target tracking (Sketch 4).

  5. Hardware-Software Co-Design:
    Integrated hardware interrupts for limit switches and implemented an emergency-stop ISR with under 50 ms latency, combining embedded reliability with high-level software flexibility (Sketch 5).

  6. Data Logging and Analytics:
    Built a scan-history logger that records each detection event (timestamp, marker ID, duration) to an Excel workbook, then visualized the data on a web dashboard through Flask endpoints (/graph, /history) (Sketch 6).

  7. System Validation and Benchmarking:
    Evaluated performance metrics including camera-to-servo latency (~120 ms), detection accuracy (a 100% success rate across 50 trials), and motion stability via photogrammetry validation (Sketch 7).
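
IMPLEMENTATION SKETCHES:
The following minimal Python sketches illustrate the techniques above. File names, pin numbers, channel assignments, and gains are illustrative assumptions, not the project's exact code.

Sketch 1: Real-Time Vision Processing. A minimal detection and pose-estimation loop, written against the pre-4.7 cv2.aruco API that ships with opencv-contrib-python on Raspberry Pi OS (newer OpenCV releases replace it with cv2.aruco.ArucoDetector). The calibration file, its key names, the dictionary choice, and the marker size are assumptions.

import cv2
import numpy as np

# Intrinsics from a prior chessboard calibration; "calib.npz" and its key
# names are illustrative assumptions.
calib = np.load("calib.npz")
camera_matrix, dist_coeffs = calib["camera_matrix"], calib["dist_coeffs"]

dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()
MARKER_SIDE_M = 0.05  # marker side length in metres (assumption)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)
    if ids is not None:
        # Rotation/translation of each marker in the camera frame.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_SIDE_M, camera_matrix, dist_coeffs)
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()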
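Sketch 2: Servo Coordination via PWM Control. A sketch of the five-channel servo setup using Adafruit's ServoKit driver for the PCA9685, which defaults to 50 Hz; the channel assignment and pulse range are assumptions to be tuned per servo model.

from adafruit_servokit import ServoKit

# ServoKit drives the PCA9685 over I2C; the chip's internal oscillator
# times the pulses, so output stays steady regardless of Linux scheduling.
kit = ServoKit(channels=16)  # frequency defaults to 50 Hz

for ch in range(5):  # five arm joints on channels 0-4 (assumption)
    # Typical hobby-servo pulse range in microseconds; tune per servo.
    kit.servo[ch].set_pulse_width_range(500, 2500)

def move_joint(channel, angle_deg):
    """Command one joint, clamped to the servo's 0-180 degree range."""
    kit.servo[channel].angle = max(0.0, min(180.0, angle_deg))

move_joint(0, 90)  # centre the base joint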
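Sketch 3: Multithreaded Robotics Control. A skeleton of the concurrent architecture; the capture, joystick, and servo calls are stand-ins for the real OpenCV, Pygame, and PCA9685 calls.

import queue
import threading
import time

frame_q = queue.Queue(maxsize=1)  # one-slot mailbox holding only the newest frame
cmd_q = queue.Queue()             # motion commands from joystick or vision

def camera_task():
    while True:
        frame = object()  # stand-in for a cv2.VideoCapture read
        if frame_q.full():
            try:
                frame_q.get_nowait()  # drop the stale frame instead of blocking
            except queue.Empty:
                pass
        frame_q.put(frame)
        time.sleep(1 / 15)  # ~15 FPS capture cadence

def joystick_task():
    while True:
        cmd_q.put(("jog", 1.0))  # stand-in for a Pygame joystick poll
        time.sleep(0.05)

def motion_task():
    while True:
        cmd = cmd_q.get()  # blocks until a command arrives; no busy-waiting
        _ = cmd            # stand-in for the PCA9685 servo write

# Daemon threads die with the main process, so Ctrl-C still exits cleanly.
for task in (camera_task, joystick_task, motion_task):
    threading.Thread(target=task, daemon=True).start()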
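Sketch 4: Closed-Loop Feedback Design. A minimal PID smoother for the commanded angle; the gains and the 15 Hz tick rate are illustrative and would need tuning on the actual arm.

class PID:
    """Textbook PID; gains here are illustrative, not tuned values."""
    def __init__(self, kp=0.5, ki=0.02, kd=0.01):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, target, current, dt):
        err = target - current
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = PID()
angle, target, dt = 90.0, 120.0, 1 / 15  # one tick per camera frame
for _ in range(30):
    # Nudge toward the marker each tick instead of jumping to it, then clamp.
    angle = max(0.0, min(180.0, angle + pid.step(target, angle, dt)))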
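Sketch 5: Hardware-Software Co-Design. An edge-triggered emergency stop using RPi.GPIO's callback mechanism; the BCM pin number and debounce window are assumptions, and the motion loop is expected to check stop_event before every servo write.

import threading
import RPi.GPIO as GPIO

ESTOP_PIN = 17  # BCM pin wired to the e-stop / limit switch (assumption)
stop_event = threading.Event()  # polled by the motion thread each tick

GPIO.setmode(GPIO.BCM)
GPIO.setup(ESTOP_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def emergency_stop(channel):
    # RPi.GPIO fires this callback on its own thread within milliseconds
    # of the falling edge, comfortably inside the 50 ms budget above.
    stop_event.set()

# 20 ms debounce window for the mechanical switch (assumption).
GPIO.add_event_detect(ESTOP_PIN, GPIO.FALLING,
                      callback=emergency_stop, bouncetime=20)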
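Sketch 6: Data Logging and Analytics. A sketch of the Excel logger and two Flask routes matching the /history and /graph endpoints named above; the file name and column names are assumptions, and pandas needs openpyxl installed for .xlsx I/O.

from datetime import datetime

import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)
LOG_PATH = "scan_history.xlsx"  # illustrative file name

def log_event(marker_id, duration_s):
    """Append one detection event (timestamp, ID, duration) to the workbook."""
    row = {"timestamp": datetime.now().isoformat(),
           "marker_id": marker_id, "duration_s": duration_s}
    try:
        df = pd.concat([pd.read_excel(LOG_PATH), pd.DataFrame([row])],
                       ignore_index=True)
    except FileNotFoundError:
        df = pd.DataFrame([row])
    df.to_excel(LOG_PATH, index=False)

@app.route("/history")
def history():
    # Full event log as JSON records for the dashboard table.
    return jsonify(pd.read_excel(LOG_PATH).to_dict(orient="records"))

@app.route("/graph")
def graph():
    counts = pd.read_excel(LOG_PATH)["marker_id"].value_counts()
    # Detections per marker ID, ready for a bar chart on the dashboard.
    return jsonify({str(k): int(v) for k, v in counts.items()})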
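Sketch 7: System Validation and Benchmarking. One way to reproduce the latency figure: time each capture-plus-detect cycle over 50 trials. The servo write is omitted here, so this measures only the vision side of the ~120 ms pipeline.

import time

import cv2

cap = cv2.VideoCapture(0)
dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

latencies_ms = []
for _ in range(50):  # 50 trials, matching the benchmark above
    t0 = time.perf_counter()
    ok, frame = cap.read()
    if not ok:
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)
    # The PCA9685 servo write would go here; omitting it means this
    # measures the capture+detect portion of the pipeline only.
    latencies_ms.append((time.perf_counter() - t0) * 1000)
cap.release()

if latencies_ms:
    print(f"mean capture+detect latency: "
          f"{sum(latencies_ms) / len(latencies_ms):.1f} ms")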
