
Experts Alarmed by Videos of Tesla Full Self-Driving Totally Screwing Up

So much for Tesla’s claim to have the world’s most advanced self-driving technology. Videos of Teslas running the company’s beta Full Self-Driving (FSD) software malfunctioning in alarming ways have been reviewed by a panel of specialists assembled by The Washington Post. They concluded, predictably, that the driver assistance technology is likely causing more harm than good.

Tesla was obliged to issue a software update to around 54,000 of its cars running the company’s hallmark FSD software in early February 2022 because the cars would roll through stop signs without coming to a complete stop. However, a growing number of videos showing Tesla drivers’ cars going berserk in self-driving mode suggest that FSD’s problems extend far beyond rolling stops.

After viewing six such videos, WaPo’s team of experts concluded that FSD may be too unsafe to deploy on public roadways. The six experts, whose backgrounds include self-driving research, autonomous vehicle safety analysis, and self-driving development, offered one of their more alarming analyses of a clip in which a Tesla fails to slow down enough as a pedestrian crosses light rail tracks: the video shows FSD’s difficulty recognizing pedestrian walk signs, and the system doesn’t seem to understand that pedestrians may step off sidewalks and into the roadway.

Another of the more upsetting videos reviewed by WaPo’s panel shows FSD prompting its driver to take control of the vehicle after the system became confused by a truck partially blocking the street. When the driver does so, he struggles to wrest control back from the driver assistance program and must repeatedly jerk the wheel sharply. “I’m taking over,” the driver declares as he yanks the steering wheel. “I’m — I’m making an effort.”

“At that point, it’s unclear who precisely is in charge,” said Andrew Maynard, an Arizona State University professor who works at the school’s Risk Innovation Lab. “There’s an unusual problem here when the driver and the car appear to have a brief battle for control. There appear to be circumstances in which both the driver and the vehicle may lose control at some point.” Maynard added that the failure in the video is “significant” because it illustrates potential issues with “the human driver’s ability to assure safety” when the system breaks down.

Along with gathering the experts’ assessments of the dangers posed by Tesla’s nascent FSD, WaPo spoke with the drivers behind the videos to confirm their veracity. One of them, Chris from Fenton, Michigan, said that after using his Tesla’s driver assistance program for about a year, he believes it will be another decade before the software is truly ready for the road. Expert criticism of Tesla has become almost a cottage industry, but it’s striking to hear a Tesla owner concede that the company’s self-driving mode isn’t ready for public consumption.