Tesla’s Full Self-Driving Labeled “Occasionally Dangerously Inept” in Independent 1,000-Mile Test

A new independent test of Tesla’s Full Self-Driving (FSD) system has found that the technology is “occasionally dangerously inept.” The test, conducted by AMCI Testing, involved driving a Tesla Model 3 equipped with FSD for 1,000 miles on a variety of roads and in different weather conditions.

The testers found that FSD could handle basic driving tasks such as staying in its lane and maintaining a safe following distance. However, the system also made a number of mistakes that could have led to accidents. For example, FSD had difficulty navigating complex intersections and merging into traffic, and it struggled to detect pedestrians and other objects in the road.

In one instance, FSD nearly caused an accident when it failed to stop for a red light; the car came to a stop only after the driver intervened. In another, FSD drove too close to a pedestrian who was crossing the street.

The findings fell far short of the flawless performance expected of a self-driving car. AMCI drivers had to intervene more than 75 times while FSD was active, an average of roughly one intervention every 13 miles. While the system navigated certain scenarios effectively, such as yielding to oncoming traffic on narrow roads, several critical failures during testing raised red flags about the technology’s readiness for the real world.

Alarming FSD Failures Highlight Safety Concerns

One of the most alarming incidents occurred when the Model 3 ran a red light at night, even though the vehicle’s cameras detected the light change. In another situation, while navigating a rural two-lane road, the car veered over a double yellow line and into oncoming traffic, forcing the driver to take over to avoid a collision. There were also unnecessary stops, such as when the vehicle braked at a green light even though the traffic ahead was accelerating.

Guy Mangiamele, Director of AMCI Testing, commented on the unpredictability of the system: “What’s most disconcerting and unpredictable is that you may watch FSD successfully negotiate a specific scenario many times, often on the same stretch of road or intersection, only to have it inexplicably fail the next time.”

AMCI’s analysis points to a growing concern with driver complacency. David Stokols, CEO of AMCI Global, emphasized the public’s trust in such systems and the danger of that trust being misplaced. “Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue as proven in the test results,” Stokols said.

Tesla’s Robotaxi Launch Raises Questions Amid Autonomy Concerns

These results come at a critical time for Tesla, as the company prepares to unveil its long-awaited Robotaxi at an event on October 10. CEO Elon Musk has repeatedly touted the system’s capabilities, claiming that Tesla’s vehicles can drive autonomously anywhere without relying on pre-mapped data, instead using advanced camera systems to assess conditions and make decisions on the fly. However, recent reports from Bloomberg and Tesla hacker Green The Only suggest that Tesla has been collecting data in the Los Angeles area, where the Robotaxi launch event will be held, to better prepare the system for specific driving conditions.

Keen-eyed Redditors have even spotted test vehicles in the area, along with a bright yellow prototype that might be a two-door Cybercab.

Despite Tesla’s impressive advancements in autonomous driving, the results from AMCI Testing show that more work is needed to eliminate unpredictable system failures. As Tesla moves forward with its ambitious Robotaxi plans, the company will need to prove that its technology can handle the complexities of real-world driving consistently and safely, especially as fully autonomous driving inches closer to becoming a reality.
