This project focuses on the automatic detection of craters on the lunar surface using descent video transmitted by Blue Ghost, Firefly Aerospace's lunar lander, which successfully landed on the Moon in March 2025.
Craters are critical features of the lunar terrain, and their detection is important for navigation, hazard avoidance, and landing site evaluation.
The detection pipeline leverages Meta's Segment Anything Model (SAM), a state-of-the-art foundation model for image segmentation.
SAM allows us to generate pixel-accurate masks for objects in images without prior training on crater-specific datasets.
These masks are further processed to identify elliptical crater-like shapes, enabling robust crater localization in video frames.
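SAM's automatic mask generator returns one dictionary per mask, including a boolean `segmentation` array and a precomputed `area` in pixels. Before ellipse fitting, implausibly small or large regions can be discarded. A minimal sketch of such a pre-filter (the area bounds are illustrative placeholders, not the project's tuned values):

```python
def prefilter_masks(masks, min_area=200, max_area=50_000):
    """Keep SAM masks whose pixel area is plausible for a crater.

    `masks` is the list of dicts produced by
    SamAutomaticMaskGenerator.generate(); each dict carries a boolean
    'segmentation' array and an integer 'area' in pixels.
    The area bounds here are illustrative, not tuned values.
    """
    return [m for m in masks if min_area <= m["area"] <= max_area]


# Tiny usage example with hand-built stand-ins for SAM's output:
fake_masks = [
    {"area": 50, "segmentation": None},      # too small: dropped
    {"area": 1_000, "segmentation": None},   # plausible: kept
    {"area": 90_000, "segmentation": None},  # too large: dropped
]
kept = prefilter_masks(fake_masks)
```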
- Frame Extraction – Process video frames sequentially from the Blue Ghost descent video.
- Segmentation with SAM – Apply the Segment Anything Model to generate segmentation masks.
- Ellipse Filtering – Fit ellipses to segmented regions to detect crater-like structures.
- Visualization – Annotate frames with crater masks and detected ellipses.
- Export – Save results as annotated video and CSV files containing crater parameters.
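The ellipse-filtering step above can be sketched as a crater-likeness test on fitted ellipse parameters. In the pipeline, the axis lengths would come from OpenCV's `cv2.fitEllipse` on each mask contour; the thresholds below are illustrative assumptions:

```python
def is_crater_like(axis_a, axis_b, min_axis=10.0, max_aspect=2.5):
    """Heuristic test for whether a fitted ellipse looks like a crater.

    Craters imaged during descent appear as near-elliptical rims, so we
    reject degenerate fits (tiny axes) and very elongated ellipses.
    The axis lengths are as returned by cv2.fitEllipse; the thresholds
    are illustrative, not the project's tuned values.
    """
    major, minor = max(axis_a, axis_b), min(axis_a, axis_b)
    if minor < min_axis:                 # degenerate or too small to be a rim
        return False
    return major / minor <= max_aspect   # near-circular when seen from above
```

For example, a 40x30 px ellipse passes, while a 100x5 px sliver is rejected.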
- Segment Anything Model (SAM) by Meta
- FastSAM (optional, lightweight real-time alternative)
- OpenCV for video processing and ellipse fitting
- PyTorch for deep learning model execution
- Matplotlib for visualization
- Pandas for structured crater data export
- Annotated Video: Original Blue Ghost descent video with detected crater overlays.
- CSV Report: Per-frame crater center coordinates, major/minor axis lengths, and ellipse orientation angles.
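The CSV export can be sketched with Pandas; the column names and values below are illustrative placeholders chosen to mirror the parameters listed above:

```python
import pandas as pd

# One row per detected crater; columns mirror the report fields above.
# Column names and values here are illustrative, not the project's schema.
rows = [
    {"frame": 0, "cx": 312.4, "cy": 188.9, "major": 42.7, "minor": 38.1, "angle_deg": 14.2},
    {"frame": 1, "cx": 315.0, "cy": 190.2, "major": 43.1, "minor": 38.6, "angle_deg": 13.8},
]
df = pd.DataFrame(rows)
csv_text = df.to_csv(index=False)  # or df.to_csv("craters.csv", index=False)
```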
The ultimate goal of this project is to contribute toward autonomous lunar landing systems by enabling real-time crater detection from descent imagery.
By leveraging the Blue Ghost mission data, this project demonstrates how AI-powered segmentation can enhance landing safety, hazard avoidance, and lunar exploration mission planning.