For the past 15 years, NASA's Mars Reconnaissance Orbiter has circled the red planet, studying its climate and geology. Every day the orbiter returns a treasure trove of images and other sensor data that NASA scientists use to search for safe landing sites for rovers and to understand the distribution of water ice on the planet. Photos of the planet's craters are of particular interest to scientists because they can provide a window into the planet's deep history. NASA engineers are still working on a Mars sample return mission; without rocks that would help them calibrate remote satellite data against surface conditions, they have to do a lot of educated guesswork when determining the age and composition of each crater.
For now, they need other ways to reveal this information. One proven method is to extrapolate the age of the planet's oldest craters from the characteristics of its most recent ones. Since scientists can pin down the age of some recent impact sites to within a few years, or even weeks, they can use them as benchmarks to determine the age and composition of much older craters. The problem is finding them. Combing through a planet's worth of image data for the telltale signs of a fresh impact is tedious work, but it's exactly the kind of problem AI was designed to solve.
Late last year, NASA researchers used a machine-learning algorithm to discover fresh Martian craters for the first time. The AI found dozens of them hiding in image data from the Mars Reconnaissance Orbiter, revealing a promising new way to study the planets in our solar system. "From a scientific point of view, it's exciting because it increases our knowledge of these features," says Kiri Wagstaff, a computer scientist at NASA's Jet Propulsion Laboratory and one of the leaders of the research team. "The data was there all along, we just hadn't seen it ourselves."
The Mars Reconnaissance Orbiter is equipped with three cameras, but Wagstaff and her colleagues trained their AI using images only from the Context and HiRISE imagers. The Context camera is a relatively low-resolution grayscale imager, while HiRISE uses the largest reflecting telescope ever sent to deep space to produce images with roughly three times the resolution of those used on Google Maps.
First, the researchers fed the AI nearly 7,000 orbital photos of Mars – some containing already-confirmed craters and others without any – to train the algorithm to detect fresh impacts. Once the classifier could accurately detect the craters in the training set, Wagstaff and her team loaded the algorithm onto a supercomputer at the Jet Propulsion Laboratory and used it to scan a database of more than 112,000 orbiter images.
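The basic recipe described here – label a set of image patches as crater or no-crater, train a classifier on them, then run it over the archive – can be sketched in a few lines. This is a toy illustration only: it uses simple logistic regression on raw pixels and synthetic patches as a stand-in for the convolutional network and real Context-camera data the JPL team worked with, and every function and threshold below is an assumption for illustration.

```python
import numpy as np

def make_patch(has_crater, size=16, rng=None):
    """Synthetic stand-in for a Context-camera image cutout: bright dusty
    terrain, with a dark blast mark in the middle if a fresh crater is present."""
    rng = rng or np.random.default_rng()
    patch = rng.uniform(0.6, 0.9, (size, size))   # bright Martian dust
    if has_crater:
        c = size // 2
        patch[c - 2:c + 3, c - 2:c + 3] *= 0.3    # dark central spot
    return patch

class PatchClassifier:
    """Logistic regression on raw pixels -- a toy stand-in for the
    convolutional network the team actually trained."""

    def fit(self, patches, labels, epochs=300, lr=1.0):
        X = np.stack([p.ravel() for p in patches])
        self.mu = X.mean(axis=0)          # remember the mean for prediction
        X = X - self.mu                   # center the pixel features
        y = np.asarray(labels, dtype=float)
        self.w = np.zeros(X.shape[1])
        self.b = 0.0
        for _ in range(epochs):
            p = 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))  # sigmoid
            g = (p - y) / len(y)          # cross-entropy gradient
            self.w -= lr * (X.T @ g)
            self.b -= lr * g.sum()
        return self

    def predict(self, patch):
        """True if the patch looks like it contains a fresh crater."""
        return (patch.ravel() - self.mu) @ self.w + self.b > 0

# Train on a small labeled set, then classify unseen patches.
rng = np.random.default_rng(42)
train_patches = [make_patch(i % 2 == 1, rng=rng) for i in range(40)]
labels = [i % 2 for i in range(40)]
clf = PatchClassifier().fit(train_patches, labels)
print(clf.predict(make_patch(True, rng=rng)))   # crater patch
print(clf.predict(make_patch(False, rng=rng)))  # plain terrain
```

In the real pipeline, the trained model would then be run over every image in the 112,000-image archive, which is the scaling problem Wagstaff describes below.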
“There is nothing new about the underlying machine learning technology,” says Wagstaff. “We used a fairly standard convolutional network to analyze the image data, but being able to apply it at scale remains a challenge. It was one of the things we had to struggle with here.”
The most recent craters on Mars are small, sometimes just a few feet in diameter, which means they appear as dark, pixelated spots in the images. If the algorithm compares the image of a candidate crater with an earlier photo of the same area and finds that the dark spot is absent, there is a good chance it has found a new crater. The date of the earlier image also helps establish a timeline for when the impact occurred.
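The before/after comparison described above is a form of change detection. A minimal sketch, assuming the two images are already registered to the same terrain, and using made-up thresholds rather than NASA's actual criteria:

```python
import numpy as np

def new_dark_spot(before, after, darkening=0.25, min_pixels=4):
    """Flag a candidate fresh impact: enough pixels darkened substantially
    between an earlier and a later image of the same terrain.
    The `darkening` and `min_pixels` thresholds are illustrative only."""
    diff = before.astype(float) - after.astype(float)  # positive where the scene darkened
    changed = diff > darkening
    return changed.sum() >= min_pixels

before = np.full((8, 8), 0.8)        # bright dusty terrain, earlier pass
after = before.copy()
after[3:5, 3:5] = 0.2                # fresh dark blast mark in the later pass
print(new_dark_spot(before, after))  # True -> candidate new crater
print(new_dark_spot(before, before)) # False -> no change between passes
```

If the dark spot is present in the later image but absent from the earlier one, the impact must have occurred between the two acquisition dates, which is how the timeline mentioned above is bracketed.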
Once the AI identified promising candidates, NASA researchers were able to make follow-up observations with the orbiter’s high-resolution camera to confirm the craters actually existed. Last August, the team received its first confirmation when the orbiter photographed a cluster of craters that had been identified by the algorithm. It was the first time that an AI had discovered a crater on another planet. “There was no guarantee that there would be new things,” Wagstaff says. “But there were a lot of them, and one of our big questions is, what makes them harder to find?”