Keypoint Analysis

How It Works

Our algorithm uses ORB (Oriented FAST and Rotated BRIEF) feature detection to find distinctive points in each image. These keypoints are then matched between consecutive frames using a brute-force matcher with Lowe's ratio test for quality filtering.
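
Below is a minimal sketch of this detection-and-matching stage using OpenCV; the function name `match_keypoints` is illustrative, and the default parameter values simply mirror the ones listed under Algorithm Steps.

```python
import cv2

def match_keypoints(prev_gray, curr_gray, max_features=10000, ratio=0.75):
    """Detect ORB keypoints in two consecutive grayscale frames and match
    them with a brute-force Hamming matcher plus Lowe's ratio test."""
    orb = cv2.ORB_create(nfeatures=max_features)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)

    # ORB descriptors are binary, so Hamming distance is the appropriate
    # metric; crossCheck stays off because we use knnMatch with k=2.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    candidate_pairs = matcher.knnMatch(des1, des2, k=2)

    # Lowe's ratio test: keep a match only if it is clearly better than
    # the second-best candidate for the same keypoint.
    good_matches = []
    for pair in candidate_pairs:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good_matches.append(pair[0])
    return kp1, kp2, good_matches
```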

To improve accuracy, we apply a cloud mask that filters out bright areas (typically clouds) and focuses on ground features like coastlines, mountains, cities, and rivers. The RANSAC algorithm removes outlier matches, and the remaining valid matches are used to calculate the displacement in pixels.
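
The masking and RANSAC stages could look roughly like the OpenCV sketch below; the brightness cutoff of 180 matches the step list further down, but the homography model and reprojection threshold are assumptions rather than the project's exact settings.

```python
import cv2
import numpy as np

def cloud_mask(gray, brightness_threshold=180):
    """Mask out bright (likely cloud) pixels so feature detection only
    considers darker ground features.
    Usage: kp, des = orb.detectAndCompute(gray, cloud_mask(gray))"""
    return np.where(gray < brightness_threshold, 255, 0).astype(np.uint8)

def ransac_inliers(kp1, kp2, matches, reproj_threshold=5.0):
    """Keep only matches consistent with a single homography between the
    two frames (requires at least four matches)."""
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    _, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, reproj_threshold)
    return [m for m, keep in zip(matches, inlier_mask.ravel()) if keep]
```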

The green lines in the images below show matched keypoints between two consecutive photos. The pixel displacement, combined with the Ground Sample Distance (GSD) calculated from the ISS altitude, gives us the actual distance traveled, which we divide by the time interval to get the orbital speed.
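
As a hedged sketch of that final conversion, assuming the GSD is expressed in centimetres per pixel and the frames are 5 seconds apart (the unit choice is an assumption, not taken from the project code):

```python
def speed_km_per_s(median_displacement_px, gsd_cm_per_px, interval_s=5.0):
    """Convert a median pixel displacement into an orbital speed estimate.

    gsd_cm_per_px is the Ground Sample Distance: how many centimetres of
    ground one image pixel covers at the current ISS altitude."""
    distance_km = (median_displacement_px * gsd_cm_per_px) / 100_000  # cm -> km
    return distance_km / interval_s
```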

Algorithm Steps

1. Image Capture: Photos are taken every 5 seconds using the Raspberry Pi HQ Camera.

2. Cloud Masking: Bright pixels (value > 180) are masked out to focus on ground features.

3. Feature Detection: ORB detects up to 10,000 keypoints per image.

4. Feature Matching: Keypoints are matched using Hamming distance with Lowe's ratio test (threshold 0.75).

5. Outlier Removal: RANSAC removes geometric outliers, then an IQR filter removes statistical outliers.

6. Speed Calculation: speed = median pixel displacement × GSD ÷ time interval, converted to km/s (see the sketch after this list).
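
Putting steps 5 and 6 together, here is a rough sketch of the statistical filtering and the final median-based estimate; the 1.5 × IQR fence, the helper names, and the commented usage are illustrative assumptions, with `speed_km_per_s` referring to the earlier sketch.

```python
import numpy as np

def displacements_px(kp1, kp2, matches):
    """Euclidean pixel distance between each pair of matched keypoints."""
    return [
        float(np.hypot(kp2[m.trainIdx].pt[0] - kp1[m.queryIdx].pt[0],
                       kp2[m.trainIdx].pt[1] - kp1[m.queryIdx].pt[1]))
        for m in matches
    ]

def iqr_filter(values):
    """Drop statistical outliers outside the 1.5 x IQR fence."""
    q1, q3 = np.percentile(values, [25, 75])
    fence = 1.5 * (q3 - q1)
    return [v for v in values if q1 - fence <= v <= q3 + fence]

# Illustrative usage (variable names are hypothetical):
# shifts = iqr_filter(displacements_px(kp1, kp2, inlier_matches))
# speed = speed_km_per_s(np.median(shifts), gsd_cm_per_px)
```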

Keypoint Comparisons