How bad is Tesla's full self driving feature, actually? Third-party testing bodes ill
According to testing firm AMCI, Tesla’s FSD software can’t drive more than 13 miles without needing intervention.
We’re just weeks out from Tesla’s big RoboTaxi presentation, where the automaker's self-driving shuttle will be revealed, and independent testing firm AMCI Testing has some bad news that could hang over the event like a cloud. AMCI just completed what it claims is “the most extensive real world test” of Tesla’s Full Self Driving (FSD) software, ostensibly the technology that would underpin the RoboTaxi's driverless tech, and the results are not confidence inspiring.
AMCI says its test covered over 1,000 miles of use and, in short, showed that the performance of Tesla’s FSD software is “suspect.” This isn’t the first time Tesla has caught criticism for FSD; the software has been a source of controversy for the automaker for years. Tesla has dealt with everything from being called out by the California DMV for false advertising to being investigated by NHTSA.
There have been so many incidents involving Tesla’s Autopilot and FSD that we had to build a megathread to keep track of them all. It's worth noting that Tesla claims FSD is still in "beta," so it's incomplete, but the company also sells the feature as a five-figure option on its current lineup of EVs, allowing owners to opt into being, essentially, real-world test dummies for the system. Owners must acknowledge that the system requires driver oversight and is not, as its name implies, a fully self-driving system today. Still, Tesla is essentially offloading the kind of testing other automakers conduct scientifically, with engineers and oversight, onto customers in the real world. And AMCI’s findings on how reliable FSD is—or rather, is not—are just the latest road bump for Tesla and FSD.
AMCI says it conducted its tests in a Tesla Model 3 with FSD versions 12.5.1 and 12.5.3 across four different driving environments: city streets, rural two-lane highways, mountain roads, and freeways. AMCI was impressed with FSD’s ability to rely solely on cameras. (Tesla is the only automaker whose driver assistance systems of FSD's ambition operate using only cameras and, essentially, short-distance parking sensors, rather than a more complex—and expensive—combination of cameras, sensors, radar, and lidar, which can paint a much clearer picture with more redundancies than Tesla's camera array.) However, AMCI found that, on average, when operating FSD, human intervention is required at least once every 13 miles to maintain safe operation.
“With all hands-free augmented driving systems, and even more so with driverless autonomous vehicles, there is a compact of trust between the technology and the public. When this technology is offered, the public is largely unaware of the caveats (such as monitor or supervise) and the tech is considered empirically foolproof. Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue as proven in the test results,” said David Stokols, CEO of AMCI Testing’s parent company, AMCI Global. “Although it positively impresses in some circumstances, you simply cannot reliably rely on the accuracy or reasoning behind its responses.”
You can see the full results of the test for yourself, but here is the gist from AMCI:
- More than 1,000 miles driven
- City streets, two-lane highways, mountain roads, and freeways
- Day and night operation; backlit to full-frontal sun
- 2024 Model 3 Performance with Hardware 4
- Full Self Driving (Supervised) Profile Setting: Assertive
- Surprisingly capable, while simultaneously problematic (and occasionally dangerously inept)
- The confidence (and often, competence) with which it undertakes complex driving tasks lulls users into believing that it is a thinking machine—with its decisions and performance based on a sophisticated assessment of risk (and the user’s wellbeing)
If 13-mile intervals between instances where a driver must grab the wheel or tap the brakes sounds pretty good to you, consider that it's not just the number of interventions required that matters, but the way those situations unfold. AMCI’s final point is the most eyebrow-raising (emphasis theirs): “When errors occur, they are occasionally sudden, dramatic, and dangerous; in those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident—or possibly a fatality.”
To back up its report, AMCI released three videos showing some of the instances in which FSD performed unsafely. Tesla has yet to publicly respond to this report, though we wouldn’t hold our breath for that. Again, the automaker can fall back on the idea that the software is still in development. Common sense, however, suggests that putting a feature with the FSD name and purported future self-driving capabilities into the hands of regular people now is risky, because the decisions the system makes, or flubs, can have dire consequences, and AMCI's testing shows that FSD's shortcomings rear their heads quite often.