AR Recovery HUD — How It Works
Experimental

Point Your Phone, See the Rocket
You're close. The map says you're standing within a hundred meters of where the rocket came down, but the field is shoulder-high in dry grass and the rocket is somewhere in there. You raise your phone, and a small targeting reticle settles in the live camera view exactly where the rocket's GPS coordinate sits in the world. You walk toward it. The reticle stays put as the camera pans across the landscape.
That is what the AR Recovery HUD does. It takes the rocket's GPS position and draws it back into the camera image using the device's own sensors, so the dot you're looking at is anchored to a place in the world rather than to a place on the screen.
The Idea
A modern smartphone is a small inertial reference unit with a camera bolted to it. Three sensor groups inside it give us, together, everything we need to project a GPS coordinate back onto a video frame:
- A magnetometer that measures the Earth's magnetic field and tells the phone which way is north.
- An accelerometer that measures the gravity vector and tells the phone which way is down.
- A gyroscope that measures how quickly the phone is rotating, which lets the system smooth and stabilize the other two.
Fused together, these give a continuously updated attitude — the phone's full orientation in a world reference frame, expressed as a 3×3 rotation matrix that captures yaw (which way the phone is facing), pitch (how far up or down it's tilted), and roll (how far it's turned around its long axis), all at once and without the gimbal-lock failures of simpler heading-and-tilt approaches.
We ask the operating system for an attitude in a frame anchored to true north and gravity — X axis pointing north, Z axis pointing up. When that's available, the rocket's bearing and elevation from your position can be turned into a unit vector in that frame, rotated through the phone's attitude matrix into the camera's body frame, and projected onto the image plane using the standard pinhole camera model. The result is a pixel coordinate where the rocket "is" in the current camera frame.
That's the whole trick. The reticle is not floating on the screen — it's pinned to a real-world point and re-projected at every screen refresh, sixty times per second.
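The pipeline above — bearing and elevation to a world-frame unit vector, a rotation into the camera frame, then a pinhole projection — can be sketched in a few lines. This is a minimal illustration, not the app's code: the ENU world frame (X east, Y north, Z up), the x-right/y-down/z-forward camera convention, and the focal-length numbers are all assumptions chosen for clarity.

```python
import math

def world_dir(bearing_deg, elev_deg):
    """Unit vector toward the target in an ENU world frame
    (X east, Y north, Z up -- an assumption for this sketch)."""
    b, e = math.radians(bearing_deg), math.radians(elev_deg)
    return (math.sin(b) * math.cos(e),
            math.cos(b) * math.cos(e),
            math.sin(e))

def project(rows, d, fx, fy, cx, cy):
    """Pinhole projection. `rows` are the camera's right, down, and
    forward axes expressed in world coordinates (a world-to-camera
    rotation written out row by row)."""
    x = sum(r * w for r, w in zip(rows[0], d))
    y = sum(r * w for r, w in zip(rows[1], d))
    z = sum(r * w for r, w in zip(rows[2], d))
    if z <= 0:
        return None  # target is behind the camera: draw nothing
    return (fx * x / z + cx, fy * y / z + cy)

# Camera held level, pointing due north:
R = ((1, 0, 0),   # camera right   = east
     (0, 0, -1),  # camera down    = toward the ground
     (0, 1, 0))   # camera forward = north
print(project(R, world_dir(0, 0), 1000, 1000, 320, 240))  # -> (320.0, 240.0)
```

The `z <= 0` guard is the honest-failure behavior described below: when the target is behind the camera there is no valid pixel, so nothing is drawn.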
What You See and Why
The targeting reticle. A diamond at the rocket's projected position in the camera view. As you pan, tilt, or rotate the phone, the camera image moves underneath the reticle and the reticle stays on the rocket. If the rocket falls outside the camera's field of view, the reticle goes with it — out of the frame — rather than sliding to the edge of the screen pretending to point. We'd rather show you nothing than show you a lie.
The GPS uncertainty ellipse. A dashed shape around the reticle representing the area on the ground where the rocket might actually be, given the accuracy of its last GPS fix. It's drawn as an ellipse, not a circle, because we're projecting a flat circular area on the ground onto a tilted camera view — the same reason a coin lying on a table looks like a circle from straight above and an ellipse from across the room. As you tilt the phone toward the horizon, the ellipse compresses; as you tilt down, it opens up toward a full circle. The shape is doing the perspective math for you.
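The coin-on-a-table effect reduces to one factor: foreshortening by the sine of the tilt angle. A small-angle sketch, with an assumed focal length in pixels — the cross-range axis of the ground circle is unaffected, while the along-range axis compresses as the view flattens toward the horizon:

```python
import math

def ellipse_axes_px(radius_m, distance_m, tilt_down_deg, focal_px):
    """Approximate on-screen semi-axes of a ground circle of radius_m
    seen at distance_m, with the camera tilted tilt_down_deg below the
    horizon. Small-angle approximation for illustration only."""
    ang = radius_m / distance_m          # angular radius in radians
    major = focal_px * ang               # cross-range axis: full size
    minor = major * math.sin(math.radians(tilt_down_deg))  # foreshortened
    return major, minor

print(ellipse_axes_px(5, 50, 90, 1000))  # straight down: a full circle
print(ellipse_axes_px(5, 50, 10, 1000))  # near the horizon: flat ellipse
```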
The compass ribbon. A scrolling strip along the top showing your current heading in degrees, with a small triangle marking the rocket's bearing. The reticle and the ribbon answer related but different questions. The reticle shows the rocket if it's in the camera view; the ribbon tells you which way to turn when it isn't. One glance at the ribbon — is the triangle to the left or right of center? — tells you which way to swing the phone to bring the reticle into view.
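The left-or-right question the ribbon answers is a signed angular difference, wrapped so it never tells you to turn the long way around. A sketch of that wrap (the function name is ours, not the app's):

```python
def turn_offset(heading_deg, bearing_deg):
    """Signed angle from current heading to the target bearing,
    wrapped to (-180, 180]: positive means swing right, negative left."""
    d = (bearing_deg - heading_deg) % 360
    return d - 360 if d > 180 else d

print(turn_offset(350, 10))  # 20  -> triangle right of center, turn right
print(turn_offset(10, 350))  # -20 -> triangle left of center, turn left
```

Without the wrap, a heading of 350° and a bearing of 10° would read as a 340° difference and send you spinning the wrong way.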
The direction strip. A small panel on the right with the distance to the rocket and a directional arrow. When the rocket is within about seven meters, the strip says "nearby" rather than offering a direction. At that range, the GPS error on both ends — yours and the rocket's — is larger than the gap between you, and the bearing between two such points is no longer a meaningful number.
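The range-and-bearing computation behind the strip, and the "nearby" cutoff, can be sketched like this. The flat-earth (equirectangular) approximation is fine at recovery distances; the `NEARBY_M` constant and function names are illustrative, not taken from the app:

```python
import math

NEARBY_M = 7.0  # assumed cutoff, per the "about seven meters" above

def range_and_bearing(lat1, lon1, lat2, lon2):
    """Distance (m) and true bearing (deg) from point 1 to point 2,
    using a flat-earth approximation valid over short distances."""
    r = 6_371_000.0  # mean Earth radius, meters
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    north, east = dlat * r, dlon * r
    dist = math.hypot(north, east)
    if dist < NEARBY_M:
        return dist, None  # bearing between overlapping GPS fixes is noise
    return dist, math.degrees(math.atan2(east, north)) % 360

print(range_and_bearing(43.0, -105.0, 43.001, -105.0))  # ~111 m, bearing 0.0
```

Returning `None` instead of a bearing is the point: below the cutoff, any arrow drawn from these numbers would be random.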
Color theme. Tap the HUD to cycle through four color schemes: green, amber, cyan, and white. This is not just decoration. Bright sun on a phone screen washes out cyan and white; dusk or dense shade is gentler with amber. Pick the one that you can actually read in the light you're walking through.
How to Read It
When the reticle is centered and the distance is shrinking, walk straight. When the compass-ribbon triangle drifts off center, the rocket has moved out of frame — pan the phone to bring the triangle back to the middle and the reticle will reappear. When the distance strip says "nearby," stop walking and look around inside the GPS uncertainty ellipse on the ground.
The reticle's position in the frame is the trustworthy quantity. Its exact placement on a single object is not — the rocket's last reported GPS coordinate is a fix with some accuracy floor of its own, typically a few meters in good sky view. The ellipse around the reticle is your visualization of that floor. Treat it as the search area, not the search target.
When It Earns Its Keep
The HUD shines on the final approach — the last hundred meters or so, when the map has gotten you to the right field but the rocket itself is hidden in tall grass, brush, or behind low terrain. A camera view with a world-anchored target is more intuitive at that scale than a top-down map: you're already looking at the world, and now the world is annotated.
It is also the right tool when the launch site has trees, boulders, or other vertical features that can hide a rocket on the ground but can't hide its GPS. You sweep the phone across the area, and the reticle drops onto the right tree-line, the right side of the boulder, the right corner of the brushpile.
For the bulk of the walk in — anything beyond a few hundred meters — the Recovery Map is a better tool. The HUD is for finding, not for navigating.
What This Isn't
It is not an arrow that points at your rocket. Other AR-finder apps use a screen-locked arrow that always points roughly toward a target. We don't do that, because an arrow doesn't tell you whether the target is fifty meters away or behind a hill. The reticle is a projected point in the world. It only appears when the rocket is in the camera frame; otherwise the compass ribbon takes over the directional job.
It is not immune to magnetic interference. The magnetometer is a small sensor in a small phone, and it's reading a small magnetic field. Steel rebar in a launch pad, the engine block of a parked vehicle, a metal toolbox, or even the steel in your boots can deflect the heading by 10° to 30°.
Calibration is also outside the app's control. We ask iOS for an attitude anchored to true north and gravity, but if true-north calibration isn't available the system silently falls back to magnetic north — off from true north by your local declination, up to ~15° in much of North America. Worse, with no usable compass at all, it falls back to an arbitrary horizontal reference, and bearings stop being meaningful in absolute terms. The only fixes are walking a few meters away from the interference and letting the phone re-settle, or performing iOS's figure-8 calibration motion.
It is not three-dimensional in the way a video game is. The reticle's vertical placement uses the rocket's height above its ground — a Kalman-fused estimate from the onboard barometer, accelerometer, and GPS, updated with every telemetry packet. That number is good. What the math then assumes is that you are standing at the rocket's ground level. If the rocket landed in a valley below the hill you're walking on, or up on a ridge above you, the elevation angle is off by the difference. On terrain that's roughly flat between you and the landing area, the reticle is where the rocket is. On terrain that isn't, the reticle is too high or too low by the relative-elevation error.
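The size of that relative-elevation error is just trigonometry, and it is worth seeing the numbers. A hypothetical sketch of the elevation angle the reticle uses:

```python
import math

def elevation_angle_deg(horiz_dist_m, height_above_me_m):
    """Angle above (+) or below (-) the horizon at which a target sits,
    given horizontal distance and its height relative to the observer."""
    return math.degrees(math.atan2(height_above_me_m, horiz_dist_m))

# Rocket 100 m away, assumed to be at my ground level:
print(elevation_angle_deg(100, 0))    # 0.0 -- reticle on the horizon
# Same rocket, actually in a dip 10 m below me:
print(elevation_angle_deg(100, -10))  # about -5.7 degrees lower
```

At 100 m, a 10 m dip between you and the landing site shifts the true elevation by almost six degrees — a visible chunk of a phone camera's vertical field of view.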
It does not know about trees. The rocket's reported height is what its sensors measured. If it transmitted while caught in a tree at twenty meters AGL, that height is in the data and the reticle uses it. If telemetry stopped before it stopped moving and the last good packet had it near the ground, the reticle is on the ground regardless of where the rocket actually settled.
It is not a precision sight. GPS is good to a few meters in open sky. The phone's attitude estimate is good to a degree or two when the magnetometer is healthy. At a hundred meters out, those errors stack up to a several-meter cone of uncertainty around the reticle. The dashed ellipse is showing you that cone in projection. Use it to pick a search area; don't expect it to land on a single bush.
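How those two error sources stack can be put in numbers. A rough root-sum-square sketch, with illustrative figures (5 m of combined GPS error, 2° of attitude error) rather than measured ones:

```python
import math

def reticle_uncertainty_m(distance_m, gps_err_m, heading_err_deg):
    """Rough combined ground-plane error at the target: position error
    RSS'd with the cross-range error a heading error produces at range."""
    cross = distance_m * math.radians(heading_err_deg)  # small-angle arc
    return math.hypot(gps_err_m, cross)

print(round(reticle_uncertainty_m(100, 5, 2), 1))  # roughly 6 m at 100 m out
```

Note the heading term grows linearly with distance: at 100 m a 2° attitude error alone contributes about 3.5 m of cross-range error, which is why the HUD is a final-approach tool rather than a long-range one.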
What You Need
- A device with a working rear camera, magnetometer, and motion sensors (any modern iPhone)
- Camera permission, granted the first time you open the HUD
- GPS location services enabled
- An active telemetry connection to a flight computer with a GPS receiver (TeleMega, TeleMetrum, or TeleGPS)
The HUD works without any network connection — Bluetooth telemetry from your TeleBT ground station is the only data link it needs.
This is an experimental feature. We'd particularly like to hear how it behaves around metal launch hardware, in different light conditions, and on the very last few meters of a recovery walk.