
Project SkyWatch (a.k.a. Wescam at Home)

Professional aviation surveillance relies on a specific piece of hardware: the EO/IR (Electro-Optical/Infra-Red) gimbal. These are the gyro-stabilized turrets you see on the nose of police helicopters or military drones, capable of keeping a rock-solid lock on a target regardless of how the aircraft maneuvers.

I wanted to replicate this capability to help track aircraft from the ground—building a tool that allows a consumer camera to lock onto and follow a target with similar stability, but without the defense-contractor budget.

The Hardware Constraint

The core of this build is a generic PTZ (Pan-Tilt-Zoom) camera, the kind typically used for streaming church services or campus lectures.

[Screenshot: Amazon product listing for the AVKANS AI Auto Tracking NDI camera, priced at $389.00 — a matte black 20x PTZ unit with NDI HX3 branding, HDMI/SDI/USB3.0 connectivity, marketed for church worship, events, and social-media livestreaming.]
The camera in question: an AVKANS LV20N, a knockoff of a 20x zoom PTZOptics unit

While cost-effective, these cameras present a major engineering challenge for tracking any object, let alone aircraft. Their motors are designed for slow, dampened pans across a stage, not for tracking a jet moving at 300 knots. The mechanical and electronic latency is significant; if you simply tell the camera to “follow that plane,” by the time the motors react, the target has often moved out of the frame.

To make this hardware viable, the heavy lifting has to move from mechanics to mathematics.

The Software Stack

I built a custom control loop to bridge the gap between the camera’s sluggish motors and the dynamic speed of the targets. The stack fuses three main concepts to help the system maintain a visual lock:

Visual Processing (OpenCV & CSRT)

Prediction (Kalman Filter)

Control (PID + Feed-Forward Loop)

The “Virtual” Gimbal

Even with a tuned PID loop, the plastic gears in a consumer PTZ camera have physical limitations. There is always some mechanical play, which shows up as jitter at high zoom levels, and the onboard control electronics (including their damping/smoothing algorithms) introduce latency that prevents perfect mechanical stabilization.

To solve this, I implemented a digital stabilization layer—essentially a “virtual gimbal.” The software crops into the sensor slightly and shifts the image frame-by-frame to counteract the imperfect mechanical stabilization. The result is an incredibly stable image that mimics the expensive mechanical stabilization of professional EO/IR turrets.
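The crop-and-shift idea reduces to a few lines. This is a minimal sketch with hypothetical names, not the project’s stabilizer (which presumably also smooths the crop center over time); it just shows how re-centering a crop window absorbs residual jitter:

```python
import numpy as np

def virtual_gimbal_crop(frame, target_xy, crop_w, crop_h):
    """Re-center a crop window on the target, clamped to the sensor bounds.

    Mechanical play moves the target a few pixels between frames; shifting
    the crop window by the same amount cancels that motion without touching
    the motors. Returns the cropped view and the (x0, y0) offset used.
    """
    h, w = frame.shape[:2]
    cx, cy = int(round(target_xy[0])), int(round(target_xy[1]))
    x0 = min(max(cx - crop_w // 2, 0), w - crop_w)
    y0 = min(max(cy - crop_h // 2, 0), h - crop_h)
    return frame[y0:y0 + crop_h, x0:x0 + crop_w], (x0, y0)
```

The margin sacrificed by the crop is the stabilization budget: a 1600x900 window inside a 1920x1080 sensor can soak up roughly ±160 px of horizontal jitter before the window hits the edge.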

Demonstration: tracking a media helicopter, toggling on digital stabilization shortly after acquiring a lock.

Data Fusion

An optical lock is useful, but context is better. Since the system knows the camera’s precise azimuth and elevation, I can correlate the visual data with live ADS-B telemetry: when tracking a target, the system queries local ADS-B traffic for an aircraft along that line of sight. The telemetry comes from a local ADS-B receiver, with a script monitoring its tar1090 feed.
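The correlation is straightforward geometry. Below is a sketch under stated assumptions: the dictionary keys are hypothetical stand-ins for whatever the tar1090-monitoring script extracts, and the flat-earth approximation is mine — reasonable only at the short ranges a ground camera covers:

```python
import math

EARTH_R = 6371000.0  # mean Earth radius, meters

def az_el_to(cam_lat, cam_lon, cam_alt_m, ac_lat, ac_lon, ac_alt_m):
    """Camera-relative azimuth/elevation of an aircraft, in degrees.

    Flat-earth approximation: fine for ranges of a few tens of km."""
    north = math.radians(ac_lat - cam_lat) * EARTH_R
    east = math.radians(ac_lon - cam_lon) * EARTH_R * math.cos(math.radians(cam_lat))
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    ground = math.hypot(north, east)
    elevation = math.degrees(math.atan2(ac_alt_m - cam_alt_m, ground))
    return azimuth, elevation

def match_aircraft(cam_az, cam_el, cam_pos, aircraft, tol_deg=3.0):
    """Pick the ADS-B contact whose bearing best matches the camera pointing."""
    cam_lat, cam_lon, cam_alt = cam_pos
    best, best_err = None, tol_deg
    for ac in aircraft:  # each: {"id", "lat", "lon", "alt_m"} (hypothetical keys)
        az, el = az_el_to(cam_lat, cam_lon, cam_alt,
                          ac["lat"], ac["lon"], ac["alt_m"])
        az_err = abs((az - cam_az + 180.0) % 360.0 - 180.0)  # wrap-safe
        err = max(az_err, abs(el - cam_el))
        if err < best_err:
            best, best_err = ac, err
    return best
```

With this, a visual lock at a given pan/tilt angle resolves to a hex code and callsign: the sousveillance equivalent of a sensor operator’s target annotation.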

Why Build This?

This project is an experiment in sousveillance—monitoring the monitors. It involves taking the technologies used for ISR (Intelligence, Surveillance, and Reconnaissance) and adapting them for civilian use. By understanding how these tracking systems work, we gain a better understanding of the airspace above us and the tools often used to watch it.

This project is available on GitHub under an MIT license.