  Tesla recalls every car with Autopilot as feds say it’s too easily misused


    Karlston


    Inadequate driver monitoring means that Autopilot is far too easily misused.

    More than 2 million Tesla electric vehicles are subject to a new safety recall today. At issue is the much-criticized Autopilot driver-assistance feature, more specifically the Autosteer component.


    At one time, Tesla claimed that Autosteer cut crashes by 40 percent, a statistic that turned out to be completely false once a third party analyzed the data. Now, following an ongoing engineering analysis by the National Highway Traffic Safety Administration's Office of Defects Investigation, which found that Tesla's driver monitoring is inadequate and that the system could lead to "foreseeable misuse," the automaker has finally reacted.


    Autopilot is Tesla's name for a suite of advanced driver-assistance systems, but the two principal components are "traffic-aware cruise control" and Autosteer. The former maintains the car's speed relative to the vehicle in front, and the latter reads lane markers on the road and keeps the car between them. The system was originally based on one supplied by Mobileye, but that relationship broke down when Mobileye dropped Tesla as a customer, citing its concern that Tesla was "pushing the envelope in terms of safety."


    Since then, Tesla has developed its own system, removing sensors like forward-looking radar to save costs. However, its vision-only approach has led to hundreds of complaints to NHTSA detailing incidents in which the cars' cameras registered false positives while driving and the vehicles slammed on the brakes.


    A particular problem with Autopilot has been the mixed messages from the automaker. While its website states that "[c]urrent Autopilot features require active driver supervision and do not make the vehicle autonomous," Tesla CEO Elon Musk has repeatedly given the impression that the system is autonomous, particularly in TV interviews with mainstream news outlets. (Tesla's website also still hosts a video claiming that the system can drive itself.)


    (There are also reports that Musk operates his own Teslas in a partially autonomous state with no driver monitoring at all.)


    Since Autopilot's release, other automakers have brought advanced partially automated driving systems to market, including General Motors' Super Cruise, Ford's BlueCruise, and BMW's Driving Assistance Professional. Unlike Autopilot, these systems have a more tightly controlled operational design domain; they will only activate on restricted-access highways that have been GPS-mapped, and all three include a dedicated infrared camera that uses gaze tracking to ensure the driver is looking at the road ahead.


    By contrast, until relatively recently, Tesla merely used a torque sensor on the steering column, which was easily defeated by hanging something heavy, like a water bottle, off the steering wheel rim. More recently, the company has claimed that a wide-angle camera built into the rearview mirror is capable of driver monitoring, although it apparently does not work well enough to prevent a giant stuffed bear from being recognized as a human driver.

    That it took this long is a regulatory failure

    Ars has reported on a litany of Tesla Autopilot safety flaws over the years. For instance, the company has used inadequate driver monitoring systems that could easily be defeated, allowing a Tesla to drive on public roads with no human in the driver's seat.


    Almost a dozen Teslas, operating under Autopilot, have crashed into emergency vehicles at the side of a road. And despite Musk's repeated claims that his company builds the safest cars available, Autopilot was implicated in 273 crashes between July 2021 and May 2022. Earlier this year, the German publication Handelsblatt found that Tesla was aware of more than 3,000 customer complaints about Autopilot's unsafe behavior.


    At the start of 2023, an ongoing lawsuit brought by relatives of engineer Walter Huang, who died in 2018 when his Tesla Model X smashed into a highway gore while operating under Autopilot, revealed that Tesla staged a widely viewed demo in 2016, which emails showed Musk personally oversaw. And in January, we learned that the US Department of Justice has also been investigating Autopilot.


    In November of this year, a Florida court ruled that there is "reasonable evidence" to conclude that Musk and Tesla knew of Autopilot defects but refused to fix them. And last week, the former Tesla employee and whistleblower who leaked thousands of accident reports to Handelsblatt told the BBC that Tesla was conducting "experiments" on public roads.


    Many of these issues had been flagged by the National Transportation Safety Board, but unlike NHTSA, NTSB has no regulatory authority and cannot order an automaker to recall a dangerous product.

    Yes, it’s a recall, even if the fix is software

    It's important to note that this is an official safety recall, even if the fix is a software update. How the fix happens is immaterial to NHTSA's safety recall process; the point is that the public and owners are notified that there is a safety defect and that there is a remedy.


    In this case, the update "will incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged, which includes keeping their hands on the steering wheel and paying attention to the roadway."


    Additionally, some Teslas will get more prominent visual alerts on the screen, and there will be "additional checks upon engaging Autosteer and while using the feature outside controlled access highways and when approaching traffic controls" that will lock users out of activating Autopilot if they fail to "demonstrate continuous and sustained driving responsibility while the feature is engaged."


    NHTSA says that its investigation will remain open so the agency can monitor whether or not Tesla's proposed fixes actually solve the problem.


    Source

