Technology

A New Algorithm Supports Autonomous Vehicles in Locating Themselves

An algorithm enables machines to teach themselves how to recognize landscapes, even as the seasons change. Autonomous systems can easily become disoriented in the absence of GPS. A new algorithm developed at Caltech now allows autonomous systems to recognize their location simply by looking at the terrain around them – and, for the first time, the technology works regardless of seasonal changes in that terrain.

The details of the process were published in Science Robotics, a journal of the American Association for the Advancement of Science (AAAS). The general method, known as visual terrain-relative navigation (VTRN), was first developed in the 1960s: autonomous systems locate themselves by comparing the nearby terrain to high-resolution satellite images.
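
As a rough illustration of that general idea, the sketch below matches a downward-looking camera frame against a georeferenced satellite map using normalized cross-correlation. This is a minimal sketch of classical VTRN, not the Caltech system; the function name `locate_in_map` and the omitted map-to-world conversion are placeholders for this example.

```python
# Minimal sketch of the basic VTRN idea: slide the vehicle's downward-looking
# camera view across a georeferenced satellite map and take the best match.
import cv2
import numpy as np

def locate_in_map(satellite_map: np.ndarray, camera_view: np.ndarray):
    """Return the (x, y) pixel offset of the camera view within the map,
    plus the normalized correlation score of the match.

    Both inputs are grayscale images (uint8 or float32), with the camera
    view smaller than the map."""
    # Normalized cross-correlation tolerates brightness shifts, but not
    # large appearance changes such as snow cover or leaf-off foliage.
    scores = cv2.matchTemplate(satellite_map, camera_view, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_xy = cv2.minMaxLoc(scores)
    return best_xy, best_score

# In a real system, a known map-pixel-to-ground transform would convert
# best_xy into world coordinates; that step is omitted here.
```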

The issue is that the current generation of VTRN requires the terrain it is looking at to closely match the images in its database. Anything that changes or obscures the terrain, such as snow cover or fallen leaves, causes the images to mismatch and confuses the system. As a result, VTRN systems are easily thrown off unless a database of landscape images exists for every conceivable condition.

To overcome this challenge, a team from the lab of Soon-Jo Chung, Bren Professor of Aerospace and Control and Dynamical Systems and research scientist at JPL, which Caltech manages for NASA, used deep learning and artificial intelligence (AI) to remove seasonal content that impedes current VTRN systems.

“The rule of thumb is that both images — the one from the satellite and the one from the autonomous vehicle — must have identical content for current techniques to work; the differences that they can handle are about what can be accomplished with an Instagram filter that changes the hues of an image,” says Anthony Fragoso (MS ’14, Ph.D. ’18), lecturer and staff scientist, and lead author of the Science Robotics paper. “However, in real systems, things change dramatically depending on the season because the images no longer contain the same objects and can no longer be directly compared.”

[Image: New algorithm helps autonomous vehicles find themselves, summer or winter.]

The method, which Chung and Fragoso devised in collaboration with graduate student Connor Lee (BS ’17, MS ’19) and undergraduate student Austin McCoy, is known as “self-supervised learning.” Unlike most computer-vision strategies, which rely on human annotators carefully curating large data sets to teach an algorithm how to recognize what it sees, this one lets the algorithm teach itself. The AI looks for patterns in images by teasing out details and features that humans would likely miss.
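
The article does not spell out the team's architecture, but one common self-supervised formulation trains an encoder so that two views of the same terrain, say summer and winter, map to nearby feature vectors without any human labels. The sketch below, written in PyTorch, is purely illustrative: the `TerrainEncoder` network, the contrastive loss, and all hyperparameters are assumptions, not the published method.

```python
# Illustrative self-supervised sketch: learn season-invariant terrain
# features by pulling together embeddings of co-registered summer/winter
# patches of the same location (InfoNCE-style contrastive learning).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TerrainEncoder(nn.Module):
    """Tiny CNN that turns a grayscale terrain patch into a unit-length
    feature vector."""
    def __init__(self, dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=1)

def contrastive_loss(z_summer, z_winter, temperature: float = 0.1):
    """Each summer patch should match its own winter counterpart more
    closely than any other patch in the batch -- no labels needed."""
    logits = z_summer @ z_winter.t() / temperature
    targets = torch.arange(z_summer.size(0))
    return F.cross_entropy(logits, targets)

# One illustrative training step on a batch of co-registered patch pairs.
encoder = TerrainEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
summer = torch.randn(8, 1, 64, 64)  # stand-ins for real image patches
winter = torch.randn(8, 1, 64, 64)
loss = contrastive_loss(encoder(summer), encoder(winter))
opt.zero_grad(); loss.backward(); opt.step()
```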

Using the new system in conjunction with the current generation of VTRN yields far more accurate localization. In one experiment, the researchers attempted to localize images of summer foliage against winter leaf-off imagery using a correlation-based VTRN technique; performance was no better than a coin flip, with navigation failing in half of the attempts. Incorporating the new algorithm into the VTRN, by contrast, performed far better: 92 percent of attempts were correctly matched, and the remaining 8 percent could be flagged as problematic in advance and then handled with other established navigation techniques.
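
That ability to flag problematic matches in advance is itself a design point. Building on the `locate_in_map` sketch above, one simple way to do it is to reject low-confidence correlation peaks and hand control to a fallback navigation source; the threshold value here is invented for illustration and is not from the paper.

```python
# Sketch of confidence gating: trust VTRN only when the correlation peak is
# strong, otherwise defer to another navigation technique (e.g., inertial
# dead reckoning). Reuses locate_in_map from the earlier sketch.
def localize_with_fallback(satellite_map, camera_view, min_score: float = 0.7):
    """Return a pixel fix when the match is confident, or None to signal
    that the caller should fall back to another navigation method."""
    best_xy, best_score = locate_in_map(satellite_map, camera_view)
    if best_score < min_score:
        return None  # flagged as problematic in advance; do not trust it
    return best_xy
```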

“Computers can detect even the smallest trend and find obscure patterns that our eyes cannot see,” says Lee. According to him, VTRN had been on the verge of becoming an infeasible technology in common but difficult environments. “In resolving this problem, we saved decades of work.”

The system has applications for space missions in addition to its utility for autonomous drones on Earth. For the first time on Mars, the entry, descent, and landing (EDL) system on JPL’s Mars 2020 Perseverance rover mission used VTRN to land at Jezero Crater, a site previously considered too dangerous for a safe entry.

“A certain amount of autonomous driving is required with rovers like Perseverance,” Chung says, “because transmissions take seven minutes to travel between Earth and Mars, and there is no GPS on Mars.” The team considered the Martian polar regions, which also undergo intense seasonal changes and have conditions similar to Earth’s, and how the new system could help with scientific objectives such as the search for water.

Fragoso, Lee, and Chung will next expand the technology to account for changes in weather: fog, rain, snow, and so on. If they are successful, their work could help improve navigation systems for self-driving cars. The project was funded by The Boeing Company and the National Science Foundation. McCoy participated through Caltech’s Summer Undergraduate Research Fellowship program.