Professional aviation surveillance relies on a specific piece of hardware: the EO/IR (Electro-Optical/Infrared) gimbal. These are the gyro-stabilized turrets you see on the nose of police helicopters or military drones, capable of keeping a rock-solid lock on a target regardless of how the aircraft maneuvers.
I wanted to replicate this capability to help track aircraft from the ground—building a tool that allows a consumer camera to lock onto and follow a target with similar stability, but without the defense-contractor budget.
The core of this build is a generic PTZ (Pan-Tilt-Zoom) camera, the kind typically used for streaming church services or campus lectures.

While cost-effective, these cameras present a major engineering challenge for tracking any object, let alone aircraft. Their motors are designed for slow, damped pans across a stage, not for tracking a jet moving at 300 knots. The mechanical and electronic latency is significant; if you simply tell the camera to “follow that plane,” by the time the motors react, the target has often moved out of the frame.
To make this hardware viable, the heavy lifting has to move from mechanics to mathematics.
I built a custom control loop to bridge the gap between the camera’s sluggish motors and the dynamic speed of the targets. The stack fuses three main concepts to help the system maintain a visual lock (a sketch of how they fit together follows the list):
Visual Processing (OpenCV & CSRT)
Prediction (Kalman Filter)
Control (PID + Feed-Forward Loop)
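To make that concrete, here is a minimal sketch of the loop in Python with OpenCV. It is illustrative rather than the project's actual code: the PID gains, the frame period, and the send_pan_tilt_speed() helper are placeholder assumptions (consumer PTZ cameras typically take speed commands over VISCA-over-IP), and the CSRT constructor name varies between OpenCV builds.

```python
import cv2
import numpy as np

def send_pan_tilt_speed(pan, tilt):
    # Hypothetical stand-in for the camera's control interface
    # (typically VISCA over IP); replace with real motor commands.
    print(f"pan={pan:+.2f} tilt={tilt:+.2f}")

class PID:
    """Textbook PID controller; the gains below are placeholders to tune."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Visual processing: CSRT tracker
# (cv2.legacy.TrackerCSRT_create on some OpenCV builds)
tracker = cv2.TrackerCSRT_create()

# Prediction: constant-velocity Kalman filter over pixel position.
# State is [x, y, vx, vy]; measurement is [x, y].
dt = 1 / 30.0  # assumed frame period
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                [0, 1, 0, dt],
                                [0, 0, 1,  0],
                                [0, 0, 0,  1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32)

pan_pid = PID(0.6, 0.05, 0.15)   # placeholder gains
tilt_pid = PID(0.6, 0.05, 0.15)
FF_GAIN = 0.8                    # feed-forward gain on predicted velocity

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
tracker.init(frame, cv2.selectROI("select target", frame))
h, w = frame.shape[:2]

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Predict first: the command should chase where the target will be,
    # not where it was one motor-latency ago.
    cx, cy, vx, vy = kf.predict().flatten()
    found, (x, y, bw, bh) = tracker.update(frame)
    if found:
        kf.correct(np.array([[x + bw / 2], [y + bh / 2]], np.float32))

    # PID drives the predicted position toward frame center; the
    # feed-forward term adds the target's velocity so the motors keep
    # moving even when the centering error is momentarily zero.
    send_pan_tilt_speed(pan_pid.update(cx - w / 2, dt) + FF_GAIN * vx,
                        tilt_pid.update(cy - h / 2, dt) + FF_GAIN * vy)
```

The ordering is the key detail: the Kalman prediction runs before the tracker measurement, and the command is computed against the predicted position plus a velocity feed-forward term, so the motors are always steering toward where the target will be when they finally respond.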
Even with a tuned PID loop, the plastic gears in a consumer PTZ camera have physical limitations. There is always some mechanical play that causes jitter at high zoom levels, and the onboard control electronics (including their damping/smoothing algorithms) introduce latency that prevents perfect mechanical stabilization.
To solve this, I implemented a digital stabilization layer—essentially a “virtual gimbal.” The software crops slightly into the sensor and shifts the image frame-by-frame to counteract the jitter the mechanics leave behind. The result is an incredibly stable image that mimics the expensive mechanical stabilization of professional EO/IR turrets.
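Here is a minimal sketch of one way to build that layer, assuming the residual motion between frames is a small translation: measure it with phase correlation and slide a slightly smaller crop window to cancel it. The margin size and decay factor are illustrative, and a production version would also account for commanded pan/tilt motion.

```python
import cv2
import numpy as np

MARGIN = 64             # pixels sacrificed around the edge for headroom
offset = np.zeros(2)    # accumulated content motion (x, y)
prev_gray = None

def stabilize(frame):
    """Crop-and-shift stabilizer: cancels small residual frame motion."""
    global prev_gray, offset
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    if prev_gray is not None:
        # Sub-pixel translation between consecutive frames; the sign
        # convention may need flipping depending on your OpenCV version.
        (dx, dy), _response = cv2.phaseCorrelate(prev_gray, gray)
        offset += (dx, dy)
        offset *= 0.95  # decay back to center so deliberate pans aren't fought
    prev_gray = gray

    # Move the crop window with the measured motion so the scene stays put.
    h, w = frame.shape[:2]
    x = int(np.clip(MARGIN + offset[0], 0, 2 * MARGIN))
    y = int(np.clip(MARGIN + offset[1], 0, 2 * MARGIN))
    return frame[y:y + h - 2 * MARGIN, x:x + w - 2 * MARGIN]
```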
An optical lock is useful, but context is better. Since the system knows the camera’s precise azimuth and elevation, I can correlate the visual data with live ADS-B telemetry. When tracking a target, the system queries local ADS-B traffic to find an aircraft whose position matches that line of sight. The data comes from a local ADS-B receiver, with a script monitoring tar1090.
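A sketch of that correlation step is below. The tar1090 URL and the camera’s site coordinates are placeholders to adapt to your setup; the field names (lat, lon, alt_baro, flight) follow the aircraft.json format that tar1090 and readsb/dump1090 serve.

```python
import math
import requests

TAR1090_URL = "http://localhost/tar1090/data/aircraft.json"  # assumed path
CAM_LAT, CAM_LON, CAM_ALT_M = 51.5, -0.1, 30.0  # placeholder receiver site

def az_el(lat, lon, alt_m):
    """Approximate azimuth/elevation from the camera to an aircraft.
    A local-tangent-plane approximation is adequate at visual ranges."""
    north = math.radians(lat - CAM_LAT) * 6371000.0
    east = math.radians(lon - CAM_LON) * 6371000.0 * math.cos(math.radians(CAM_LAT))
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    elevation = math.degrees(math.atan2(alt_m - CAM_ALT_M, math.hypot(north, east)))
    return azimuth, elevation

def match_aircraft(cam_az, cam_el, tolerance_deg=3.0):
    """Return the ADS-B contact closest to where the camera is pointing."""
    traffic = requests.get(TAR1090_URL, timeout=2).json().get("aircraft", [])
    best, best_err = None, tolerance_deg
    for ac in traffic:
        if "lat" not in ac or "lon" not in ac:
            continue  # no position decoded yet
        alt_ft = ac.get("alt_baro") or ac.get("alt_geom")
        if alt_ft is None or alt_ft == "ground":
            continue  # alt_baro is the string "ground" for taxiing aircraft
        az, el = az_el(ac["lat"], ac["lon"], alt_ft * 0.3048)
        # Shortest angular distance, wrapping azimuth across 0/360
        err = math.hypot((az - cam_az + 180) % 360 - 180, el - cam_el)
        if err < best_err:
            best, best_err = ac, err
    return best  # dict with hex code, callsign ("flight"), altitude, etc.
```

If a contact falls inside the tolerance cone, its callsign and altitude can be overlaid on the stabilized video.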
This project is an experiment in sousveillance—monitoring the monitors. It involves taking the technologies used for ISR (Intelligence, Surveillance, and Reconnaissance) and adapting them for civilian use. By understanding how these tracking systems work, we gain a better understanding of the airspace above us and the tools often used to watch it.
This project is available on GitHub under an MIT license.