sajjad ahmed shaaz
AI/ML · Robotics · Finalist

NIDAR

Autonomous multi-UAV precision agriculture system

5th nationally

YOLO · OpenCV · MAVLink · ArduPilot · Python · ROS
[Image: NIDAR cover]

Dual-UAV pipeline built under JUMTC. Scan drone runs YOLO-based crop health inference onboard; geo-tagged fertilization targets are relayed over MAVLink telemetry to a spray drone executing closed-loop waypoint spraying — fully autonomous, end-to-end.


What NIDAR actually does

Most precision agriculture demos are exactly that — demos. A single drone, a controlled field, good lighting. NIDAR was built to be an end-to-end autonomous pipeline: two drones, no human intervention once launched.

The scan drone flies a pre-planned grid over a field, running a YOLO model onboard in real time to identify areas that need fertilization — stressed crops, discolouration, growth anomalies. Each detected target gets geo-tagged using the drone's GPS and relayed over MAVLink telemetry to a base station. The spray drone receives these targets as a live waypoint mission and executes closed-loop spraying.
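The geo-tagging step amounts to projecting a detection's pixel position through the camera geometry onto the ground. A minimal sketch of that projection, assuming a nadir-pointing camera, flat terrain, and illustrative field-of-view values (none of the parameter values below are from the actual system):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, for small-offset conversion

def pixel_to_gps(det_cx, det_cy, img_w, img_h,
                 drone_lat, drone_lon, alt_m,
                 hfov_deg=62.2, vfov_deg=48.8):
    """Project a detection's pixel centre to a ground GPS coordinate,
    assuming the camera points straight down and terrain is flat."""
    # Ground footprint of the full image at this altitude
    ground_w = 2 * alt_m * math.tan(math.radians(hfov_deg) / 2)
    ground_h = 2 * alt_m * math.tan(math.radians(vfov_deg) / 2)
    # Offset of the detection from the image centre, in metres
    dx = (det_cx / img_w - 0.5) * ground_w   # east offset
    dy = (0.5 - det_cy / img_h) * ground_h   # north offset (image y grows downward)
    # Convert metre offsets to degrees of latitude/longitude
    dlat = math.degrees(dy / EARTH_RADIUS_M)
    dlon = math.degrees(dx / (EARTH_RADIUS_M * math.cos(math.radians(drone_lat))))
    return drone_lat + dlat, drone_lon + dlon

# A detection dead-centre in the frame maps to the drone's own position
lat, lon = pixel_to_gps(320, 240, 640, 480, 17.3850, 78.4867, 30.0)
```

In practice the result would be packed into a MAVLink message (e.g. a mission item) for the base station; heading and gimbal angle corrections are omitted here for brevity.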

The hard engineering bits

Coordinating two independent flight controllers over a shared telemetry channel was the central challenge. ArduPilot has no native multi-vehicle synchronisation primitive — we built our own lightweight state machine on top of MAVLink to sequence the handoff: scan complete → targets validated → spray mission uploaded → spray drone armed and launched.
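The handoff sequencer can be sketched as a forward-only state machine that ignores out-of-order telemetry. This is an illustrative reconstruction, not the team's actual code; the state and event names are hypothetical:

```python
from enum import Enum, auto

class MissionState(Enum):
    SCANNING = auto()
    TARGETS_VALIDATED = auto()
    SPRAY_UPLOADED = auto()
    SPRAYING = auto()
    DONE = auto()

class HandoffCoordinator:
    """Sequences the scan->spray handoff over a shared telemetry channel.

    Each state can only be left by one specific event, so duplicate or
    out-of-order messages on the link are simply rejected.
    """
    EXPECTED_EVENT = {
        MissionState.SCANNING:          "SCAN_COMPLETE",
        MissionState.TARGETS_VALIDATED: "MISSION_UPLOADED",
        MissionState.SPRAY_UPLOADED:    "SPRAY_ARMED",
        MissionState.SPRAYING:          "SPRAY_COMPLETE",
    }
    NEXT_STATE = {
        MissionState.SCANNING:          MissionState.TARGETS_VALIDATED,
        MissionState.TARGETS_VALIDATED: MissionState.SPRAY_UPLOADED,
        MissionState.SPRAY_UPLOADED:    MissionState.SPRAYING,
        MissionState.SPRAYING:          MissionState.DONE,
    }

    def __init__(self):
        self.state = MissionState.SCANNING

    def handle(self, event):
        """Advance the mission if `event` is the one this state expects."""
        if self.EXPECTED_EVENT.get(self.state) != event:
            return False  # out-of-order or duplicate telemetry: ignore
        self.state = self.NEXT_STATE[self.state]
        return True
```

The key design choice is that the machine never moves backwards: a retransmitted `SCAN_COMPLETE` after the spray mission has been uploaded is a no-op rather than a mission reset.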

The YOLO model also had to run onboard the scan drone in real time. We couldn't relay frames to the ground for inference because latency would have made the geo-tagging inaccurate at flight speed. We quantised the model to run on the companion computer's CPU at acceptable frame rates — this was my first real exposure to on-device model optimisation, and it directly seeded the interest that led to the compression work later.
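The core idea behind that kind of quantisation can be shown in a few lines: store weights as int8 plus a float scale, trading a small reconstruction error for a 4x memory reduction and faster integer arithmetic. A toy NumPy sketch of symmetric per-tensor int8 quantisation (not the actual toolchain used, which would typically be a framework-level pass):

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor quantisation: float32 -> (int8, scale)."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximation of the original float32 weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
q, s = quantize_int8(w)
# Worst-case round-off is half a quantisation step (scale / 2)
max_err = float(np.abs(dequantize(q, s) - w).max())
```

Framework tooling adds per-channel scales, activation calibration, and quantised kernels on top of this, but the storage/accuracy trade-off is the same.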

Wind was a consistent enemy during testing. A waypoint that makes perfect sense on a map becomes difficult to hit when the drone is drifting 0.5 m laterally in a gust.

The competition and what placing 5th meant

We placed 5th nationally, competing against teams from IITs and established robotics clubs. We were first-years.

I'm not sure I fully processed it at the time. The night before the final demo we were still debugging the MAVLink handshake — one of those situations where you're not sure if the thing will work until it does. When both drones completed the autonomous sequence without intervention in front of the judges, there was a specific kind of quiet in the team that I haven't quite felt since.

What I took away: autonomous systems fail at integration points, not in isolation. Each drone worked perfectly in unit tests. The hard problems only appeared when we tried to make them talk to each other.