Musk said no self-driving Tesla had ever crashed. Regulators counted 8

Elon Musk has long used his mighty Twitter megaphone to push the idea that Tesla’s automated driving software is not only safe, it’s safer than anything a human driver could achieve.

That campaign kicked off last fall when the electric carmaker expanded its Full Self-Driving “beta” program from a few thousand people to a fleet that now exceeds 100,000. The $12,000 feature purportedly enables a Tesla to drive itself on city and neighborhood streets, changing lanes, making turns and obeying traffic signs and signals.

As critics chastised Musk for testing experimental technology on public roads without trained safety drivers as backups, Santa Monica investment manager and Tesla booster Ross Gerber was among the allies who jumped to his defense.

“There have been no accidents or injuries since the FSD beta release,” he tweeted in January. “Not one. Not a single one.”

To which Musk responded with a single word: “Right.”

In fact, by that time dozens of drivers had already filed safety complaints with the National Highway Traffic Safety Administration over incidents involving Full Self-Driving, and at least eight of them involved crashes. The complaints are public record, available in a database on NHTSA’s website.

One driver reported that FSD “rolled right into a truck,” then pulled left into median poles while accelerating, causing a crash.

“The car went into the wrong lane” with FSD engaged, “and I was hit by another driver in the lane next to my car,” said another.

YouTube and Twitter are full of videos exposing FSD misbehavior, including a recently posted video that appears to show a Tesla steering itself into the path of an oncoming train. The driver yanks the steering wheel to avoid a head-on collision.

It’s almost impossible for anyone but Tesla to say how many FSD-related crashes, injuries or deaths have occurred; NHTSA is investigating several recent fatal crashes in which FSD may have been involved. The agency recently ordered automakers to report serious crashes involving automated and semi-automated technology, but it has not yet released crash-by-crash details to the public.

Robot car companies such as Cruise, Waymo, Argo and Zoox equip their vehicles with over-the-air software that immediately reports crashes to the company. Tesla pioneered such software in passenger cars, but the company, which does not maintain a media relations office, did not respond to questions about whether it receives automated crash reports from cars running FSD. Automakers without over-the-air software must rely on public reports and on communications with drivers and service centers to judge whether a report to NHTSA is necessary.

Attempts to reach Musk were also unsuccessful.

Gerber said he was not aware of the crash reports in the NHTSA database when he posted his tweet, but believed the company would have known of any collisions. “Due to the fact that Tesla records everything that happens, Tesla is aware of every incident,” he said. He said it was possible the drivers were to blame for the crashes, but he had not reviewed the reports himself.

There are no accurate public statistics on automated-car crashes, because the police who write up accident reports have only drivers’ statements to go on. “We’re not experts in how to extract that kind of data,” said Amber Davis, spokeswoman for the California Highway Patrol. “At the end of the day, we ask for the best memories of how [a crash] happened.”

Only Tesla knows what data a Tesla vehicle’s automated driving system collects and transmits back to headquarters, said Mahmood Hikmet, head of research and development at autonomous transportation company Ohmio. He said Musk’s definition of a crash or accident may differ from how an insurance company or the average person might define it. NHTSA requires crash reports for fully or partially automated vehicles only if someone is injured, an air bag deploys or a car must be towed.

The FSD crash reports were first unearthed by FSD critic Taylor Ogan, who runs Snow Bull Capital, a China-focused hedge fund. The Times downloaded and evaluated the data separately to verify Ogan’s findings.

The data, covering the period from Jan. 1, 2021, to Jan. 16, 2022, show dozens of safety complaints about FSD, including many reports of phantom braking, in which a car’s automatic emergency braking system slams on the brakes for no apparent reason.
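The verification step The Times performed can be approximated with a short script: download the public complaint data and filter the narratives for FSD-related language. Below is a minimal sketch in Python, assuming the complaints have been exported to a local CSV file; the filename complaints.csv and the column names make, description and crash are hypothetical placeholders, not NHTSA’s actual schema.

```python
import csv

# Terms that suggest a complaint narrative involves Full Self-Driving.
FSD_TERMS = ("full self-driving", "full self driving", "fsd")

def is_fsd_related(narrative: str) -> bool:
    """Return True if the complaint narrative mentions FSD."""
    text = narrative.lower()
    return any(term in text for term in FSD_TERMS)

# NOTE: "complaints.csv" and the column names ("make", "description",
# "crash") are hypothetical placeholders for a local export of the
# public complaint data, not NHTSA's actual schema.
fsd_complaints = []
with open("complaints.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Keep only Tesla complaints whose narrative mentions FSD.
        if row["make"].strip().upper() == "TESLA" and is_fsd_related(row["description"]):
            fsd_complaints.append(row)

# Count the subset flagged as involving a crash.
crashes = [r for r in fsd_complaints if r["crash"].strip().upper() == "Y"]
print(f"{len(fsd_complaints)} FSD-related complaints, {len(crashes)} involving a crash")
```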

Below are excerpts from the eight accident reports in which FSD was involved:

  • Southampton, New York: A Model 3 traveling at 60 mph collided with an SUV parked on the side of the road. The Tesla drove itself “directly through the side of the SUV, tearing off the car’s mirror.” The driver called Tesla to say “our car went crazy.”
  • Houston: A Model 3 was traveling at 35 mph “when the car suddenly jumped the curb, causing damage to the bumper, wheel and a flat tire.” The accident “appeared to be caused by a discolored patch on the road that gave the FSD the false perception of an obstacle it was trying to avoid.” Rejecting a warranty claim, a Tesla service center charged $2,332.37 and said it would not return the car until the bill was paid.
  • Brea: “As I was turning left, the car went into the wrong lane and another driver hit me in the lane next to my car.” The car “on its own took control and forced itself into the wrong lane … putting everyone involved at risk. The car is badly damaged on the driver’s side.”
  • Collettsville, N.C.: “The road curved to the left and as the car took the curve it swung too wide and ran off the road… The right side of the car went up and over the start of the rock incline. The right front tire blew out and only the side air bags (both sides) deployed. The car traveled about 500 meters down the road and then shut off.” Estimated damage was $28,000 to $30,000.
  • Troy, Mo.: A Tesla was turning a curve when “suddenly, about 40% into the curve, the Model Y straightened its wheel and crossed the center line into the direct path of the oncoming vehicle. As I attempted to return the vehicle to my lane, I lost control and went into a ditch and through the woods, causing significant damage to the vehicle.”
  • Jackson, Mo.: A Model 3 “rolled right into a truck, then pulled left into the median poles while accelerating and the FSD would not turn off…. Owned this car for 11 days when our accident occurred.”
  • Hercules, California: “Phantom braking” caused the Tesla to stop suddenly and “the vehicle behind me didn’t react.” A rear-end collision caused “serious damage to the vehicle.”
  • Dallas: “I was driving with full self-driving assist… a car was in my blind spot, so I tried to take over the car by pulling the steering wheel. The car sounded an alarm saying it was going to hit the left median. I think I was struggling with the car to regain control of the car and ended up hitting the left median, which bounc[ed] the car to the right, hitting the median.”

Critics say the name Full Self-Driving is a misnomer, and that no car available for sale to an individual in the United States can drive itself. FSD “is totally a fantasy,” said New York University professor Meredith Broussard, author of the book “Artificial Unintelligence,” published by MIT Press. “And it’s a safety nightmare.”

California regulations prohibit a company from advertising a car as fully autonomous when it is not. The state Department of Motor Vehicles is conducting a review of Tesla’s marketing, a review well into its second year.

DMV chief Steve Gordon has declined to speak publicly about the matter since May 2021. On Wednesday, the department said, “The review is ongoing. We’ll let you know when we have something to share.”
