Inclement weather, particularly rain and snow, threatens to stop autonomous vehicles in their tracks. That’s because precipitation covers cameras critical to the cars’ self-awareness and tricks sensors into perceiving obstacles that aren’t there. Plus, bad weather tends to obscure road signage and structures that normally serve as navigational landmarks.
Fortunately, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Lincoln Laboratory are on the case. In a paper that will be published in the journal IEEE Robotics and Automation Letters later this month and presented in May at the International Conference on Robotics and Automation (ICRA), they describe a system that uses ground-penetrating radar (GPR) to send very high frequency (VHF) electromagnetic pulses underground, measuring an area’s particular combination of pipes, roots, rocks, dirt, and other features. Those readings build up a basemap, contributing to a three-dimensional, GPS-tagged subterranean database that an onboard computer can later correlate against live scans to pinpoint the vehicle’s position.
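For a rough sense of what the mapping step involves, the sketch below is a simplification rather than the team’s code; the class name, the one-dimensional trace format, and the use of along-track distance as a stand-in for GPS tags are all assumptions. It simply records each radar reading alongside the position where it was taken:

```python
# Minimal sketch (illustrative, not the MIT/Lincoln Laboratory implementation):
# recording GPS-tagged ground-penetrating-radar traces into a prior map
# during a mapping drive. The trace format (a 1-D depth profile of
# reflection amplitudes) is an assumption.
import numpy as np

class SubsurfaceMap:
    """One GPR depth profile per surveyed position along the route."""

    def __init__(self):
        self.positions = []   # metres travelled along the route (stand-in for GPS tags)
        self.traces = []      # reflection-amplitude vectors measured at those positions

    def add_scan(self, position_m, trace):
        # Append one radar ping together with the position where it was measured.
        self.positions.append(float(position_m))
        self.traces.append(np.asarray(trace, dtype=float))

    def as_arrays(self):
        """Return the map as an (N,) array of positions and an (N, D) stack of traces."""
        return np.array(self.positions), np.vstack(self.traces)

# During a mapping pass, each ping is appended as the vehicle moves down the road.
prior_map = SubsurfaceMap()
for step in range(5):
    simulated_trace = np.random.normal(size=64)   # stand-in for a real GPR reading
    prior_map.add_scan(position_m=0.25 * step, trace=simulated_trace)
```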
According to paper lead author and CSAIL Ph.D. student Teddy Ort, it’s the first time developers of self-driving systems have employed ground-penetrating radar, which has previously been used in fields like construction planning, landmine detection, and lunar exploration. “If you or I grabbed a shovel and dug it into the ground, all we’re going to see is a bunch of dirt,” he said. “But [localizing ground-penetrating radar] can quantify the specific elements there and compare that to the map it’s already created so that it knows exactly where it is, without needing cameras or lasers.”
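The localization step Ort describes amounts to matching the trace the radar sees right now against the stored map and picking the best fit. The toy sketch below illustrates that idea; the similarity measure (normalized correlation), the data shapes, and the function name are assumptions, not the authors’ actual algorithm:

```python
# Minimal sketch (not the authors' algorithm): localizing by comparing a live
# GPR trace against a previously recorded, position-tagged map of traces.
import numpy as np

def localize(live_trace, map_positions, map_traces):
    """Return the map position whose stored trace best matches the live scan.

    map_positions: (N,) along-track positions recorded during mapping
    map_traces:    (N, D) one depth profile of reflection amplitudes per position
    live_trace:    (D,) the profile measured right now
    """
    # Normalize each trace so the comparison is a Pearson-style correlation.
    live = (live_trace - live_trace.mean()) / (live_trace.std() + 1e-9)
    stored = (map_traces - map_traces.mean(axis=1, keepdims=True)) / (
        map_traces.std(axis=1, keepdims=True) + 1e-9
    )
    scores = stored @ live / live.size          # one correlation score per map entry
    return map_positions[np.argmax(scores)], scores.max()

# Toy example: a 3-entry map and a noisy re-observation of the middle entry.
rng = np.random.default_rng(0)
map_traces = rng.normal(size=(3, 64))
map_positions = np.array([0.0, 0.5, 1.0])       # metres along the route
live = map_traces[1] + 0.1 * rng.normal(size=64)
print(localize(live, map_positions, map_traces))  # -> (0.5, score close to 1)
```

A real system would presumably fuse such matches with the vehicle’s motion over time, but the core idea is the one in the quote: compare the current subsurface scan against the map built earlier.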
The researchers found that on a closed country road, the navigation system’s average margin of error in snowy conditions was about an inch compared with clear weather. The GPR had a bit more trouble in rain, since the precipitation caused more water to soak into the ground and created a disparity between the original readings and current conditions, but even then it was off by an average of only 5.5 inches. More impressively, over a six-month testing period, the team never had to take the wheel.
Ort and coauthors note that the approach wouldn’t work entirely on its own since it can’t detect things aboveground. Also, the GPR data sets are currently difficult to stitch together because of aboveground factors like multi-lane roads and intersections, and the current hardware is too bulky and wide to fit into most commercial vehicles.
But they say that the GPR could easily be extended to highways and other high-speed areas and that its ability to localize in bad weather means it could possibly be coupled with existing approaches, like cameras and lidar. Another advantage? The system’s underground maps tend to hold up better over time than maps created using vision or lidar, since the features of an aboveground map are much more likely to change. As an added bonus, they take up roughly 20% less space than the traditional 2D sensor maps that many companies use for their cars.
MIT spinout WaveSense, which came out of stealth in August 2018, is already working to commercialize the system. It’s using a version that Lincoln Laboratory researchers demonstrated could keep an SUV within centimeters of its lane on a road freshly coated with snow. That system was first developed for military vehicles operating in regions with poor or nonexistent road markings.
“Our work demonstrates that this approach is actually a practical way to help self-driving cars navigate poor weather without actually having to be able to ‘see’ in the traditional sense using laser scanners or cameras,” said senior author and MIT professor Daniela Rus.