NASA’s Perseverance Mars Rover Takes the Wheel, Completes First Drive Planned by AI
NASA’s Perseverance rover rolled roughly 700 feet across the rim of Jezero Crater on December 8, 2025, and covered another 800 feet a few days later, but these were no ordinary drives. They marked the first time on another planet that generative artificial intelligence handled route planning on its own, selecting a safe course without human specialists on Earth plotting it first.
For years, human teams at NASA’s Jet Propulsion Laboratory have handled this work themselves. Planners study orbital images, inspection maps, and rover status updates before meticulously laying out fixed waypoints spaced approximately 330 feet apart. Commands are then sent all the way to Mars, a journey of roughly 140 million miles, and the team must wait for the signal to make the round trip before the next set of instructions can go out.
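To put that distance in perspective, here is a rough back-of-the-envelope calculation in Python of the signal delay at the 140 million-mile figure cited above. The numbers are illustrative only; the actual Earth-Mars distance and delay vary throughout the year.

```python
# Back-of-the-envelope estimate of the signal delay at the Earth-Mars
# distance quoted above (~140 million miles). Illustrative figures only.

SPEED_OF_LIGHT_MPS = 299_792_458   # meters per second
MILES_TO_METERS = 1_609.344

distance_m = 140_000_000 * MILES_TO_METERS

one_way_delay_min = distance_m / SPEED_OF_LIGHT_MPS / 60
round_trip_min = 2 * one_way_delay_min

print(f"One-way light time: {one_way_delay_min:.1f} minutes")
print(f"Command round trip: {round_trip_min:.1f} minutes")
```

At that distance a single command round trip takes on the order of 25 minutes, which is why each waiting period between instruction sets eats so heavily into a driving day.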
Perseverance then goes about its business, carefully executing each stage of the drive, pausing to relay data back, and waiting for the next round of instructions from Earth. The approach keeps the mission safe, but it limits how far the rover can travel in a day and demands hours of meticulous analysis from engineers back on Earth.
So the engineers tried a new approach and turned to generative AI, starting with vision-language models from Anthropic’s Claude family. The model examined the same data human planners use: high-resolution images from the HiRISE camera on NASA’s Mars Reconnaissance Orbiter, along with detailed slope information from digital elevation models. From that data, the AI identified the major terrain features, such as bedrock exposures, rocky outcrops, boulder clusters that could spell trouble, and shifting sand ripples. It then generated a continuous route, marked by waypoints where the rover could receive new instructions.
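As a toy illustration of what "turning a hazard map into a chain of safe waypoints" means, the sketch below runs a simple breadth-first search over a small grid where hazardous cells are blocked off. This is not JPL code or the model's actual method, just a minimal example of hazard-aware route planning.

```python
# Toy hazard-aware route planner: breadth-first search over a grid.
# Purely illustrative; not the approach used by JPL or the AI model.

from collections import deque

def plan_route(hazard_grid, start, goal):
    """Return a list of (row, col) waypoints from start to goal that
    avoids cells marked 1 (hazard), or None if no safe path exists."""
    rows, cols = len(hazard_grid), len(hazard_grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk back through parents to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and hazard_grid[nxt[0]][nxt[1]] == 0
                    and nxt not in parents):
                parents[nxt] = cell
                queue.append(nxt)
    return None

# Tiny hazard map: 1 marks boulder clusters or sand ripples, 0 is safe.
grid = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(plan_route(grid, start=(0, 0), goal=(4, 4)))
```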
Safety remained a key concern, so before sending any instructions to the rover, the team ran the entire plan through a digital twin, a virtual copy of Perseverance built at JPL. The simulation checked almost 500,000 telemetry points to verify that the plan was consistent with the rover’s actual flight software and would not run it into trouble. Only after that check passed were the instructions transmitted over the Deep Space Network.
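Conceptually, the pre-uplink check replays the plan against simulated telemetry and flags anything that violates the rover's limits. The sketch below shows the general idea; the field names and limits are hypothetical, not JPL values, and the real validation involved hundreds of thousands of samples.

```python
# Hedged sketch of a pre-uplink validation pass: scan simulated telemetry
# for constraint violations before a plan is cleared for transmission.
# Field names and limits are hypothetical, not actual mission values.

def validate_plan(telemetry_samples, max_slope_deg=25.0, max_tilt_deg=30.0):
    """Return a list of violations found in the simulated telemetry;
    an empty list means the plan passes this (toy) check."""
    violations = []
    for i, sample in enumerate(telemetry_samples):
        if sample.get("slope_deg", 0.0) > max_slope_deg:
            violations.append(f"sample {i}: slope {sample['slope_deg']} deg too steep")
        if sample.get("tilt_deg", 0.0) > max_tilt_deg:
            violations.append(f"sample {i}: tilt {sample['tilt_deg']} deg exceeds limit")
    return violations

# Two fake samples for illustration (the real run checked ~500,000).
samples = [{"slope_deg": 12.0, "tilt_deg": 8.0},
           {"slope_deg": 27.5, "tilt_deg": 9.0}]
print(validate_plan(samples))
```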
On sol 1707 (a Martian day), the first drive covered 689 feet. The second, on sol 1709, was a bit longer at 807 feet and took about two and a half hours. The rover’s navigation cameras captured the traverse, showing Perseverance moving steadily over the crater rim, and engineers later reviewed the footage to see how closely the AI’s plan matched the path the rover actually took.
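A quick sanity check on the second drive's average pace, using only the figures reported above (807 feet in roughly 2.5 hours); this is simple arithmetic, and the rover's actual speed varied along the route.

```python
# Average pace of the sol 1709 drive, from the reported figures above.
FEET_TO_METERS = 0.3048

distance_ft = 807
duration_hr = 2.5

avg_ft_per_hr = distance_ft / duration_hr
avg_m_per_hr = avg_ft_per_hr * FEET_TO_METERS

print(f"Average pace: {avg_ft_per_hr:.0f} ft/hr (~{avg_m_per_hr:.0f} m/hr)")
```

That works out to roughly 320 feet (about 100 meters) per hour, averaged over the whole drive including pauses.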
Vandi Verma, one of the space roboticists working on these systems at JPL, said the real breakthrough is that generative AI can make off-world driving far easier: it can describe what the terrain looks like, pinpoint where the rover is, and choose the safest path forward.