NHTSA concludes Tesla Autopilot investigation after linking the system to 14 deaths


The National Highway Traffic Safety Administration (NHTSA) has concluded its investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes, including 13 fatal incidents that resulted in 14 deaths. The agency determined that these accidents occurred because drivers misused the system.

However, NHTSA also found that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” In other words, the software did not prioritize driver attentiveness. Drivers using Autopilot or the company’s Full Self-Driving technology were not sufficiently engaged because Tesla did not adequately ensure that they kept their attention on the driving task.

The agency investigated nearly 1,000 crashes that occurred between January 2018 and August 2023, which resulted in a total of 29 deaths. NHTSA found there was “insufficient data to evaluate” roughly half (489) of those crashes. In other incidents, the other party was at fault or the Tesla driver was not using Autopilot at the time.

The most serious were 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path,” and these were often linked to Autopilot or FSD. These incidents resulted in 14 deaths and 49 injuries. The agency found that in 78 of them, drivers had enough time to react but failed to do so: even with at least five seconds to maneuver, they did not brake or steer to avoid the hazard.

This is where the complaints about the software come into play. NHTSA says drivers simply became too complacent, assuming the system would handle any hazards, and by the time they reacted it was already too late. “Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” the agency writes. The mismatch between driver expectations and Autopilot’s operating capabilities resulted in a “critical safety gap” that led to “foreseeable misuse and avoidable crashes.”

NHTSA also took umbrage at Autopilot’s branding, calling it misleading and suggesting it leads drivers to assume the software has full control. That is why competing companies tend to use branding with terms like “driver assist,” whereas “Autopilot” implies an autonomous pilot. California’s attorney general and the state Department of Motor Vehicles are also investigating Tesla for deceptive branding and marketing.

Tesla, for its part, has said it warns customers to stay attentive while using Autopilot and FSD, according to The Verge. The company says the software includes regular prompts that remind drivers to keep their hands on the wheel and eyes on the road. NHTSA and other safety groups say those warnings don’t go far enough and are not sufficient to deter misuse. Despite those assessments, CEO Elon Musk recently promised the company will keep going “balls to the wall for autonomy.”

The findings may represent only a small fraction of the actual number of Autopilot- and FSD-related crashes. NHTSA noted that “gaps in Tesla’s telematics data create uncertainty” about the real crash rate of vehicles operating with Autopilot engaged. That is because Tesla only collects data from certain types of crashes; NHTSA says the company receives data on only about 18 percent of crashes reported to police.

With all of this in mind, the agency has opened another investigation into Tesla. This one looks at the over-the-air software fix issued in December, after two million vehicles were recalled. NHTSA will evaluate whether Tesla’s Autopilot recall fix is effective enough.


