
[Image: Interior of a Tesla Model 3 with Full Self-Driving activated. Nurphoto | Getty Images]
The National Highway Traffic Safety Administration has escalated an investigation into Tesla's "Full Self-Driving" system, according to filings published Thursday on the agency's website.
The probe into Tesla’s FSD is looking into possible safety defects that make it risky for drivers to use in fog, glaring sun or other “reduced roadway visibility conditions.”
The investigation, which started last year, involves 3.2 million Tesla vehicles, including Model S, X, 3, Y and Cybertruck EVs that can use the company’s FSD-branded driver assistance systems, according to a filing on the agency’s website.
The agency wrote that Tesla FSD may sometimes fail "to detect and/or warn the driver appropriately under degraded visibility conditions such as glare and airborne obscurants."
In crashes reviewed by the agency, Tesla’s system “did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.”
The probe has been elevated to an "engineering analysis" after a string of complaints about crashes in which FSD was in use within 30 seconds of the collision, including one in which a Tesla driver using FSD struck and killed a pedestrian.
Tesla did not immediately respond to a request for comment.