
Tesla's purported hands-free 'Elon Mode' draws scrutiny from US regulators

Jan 24, 2024

The discovery of a secret Tesla Autopilot configuration that allows the self-driving system to operate without driver attention isn't sitting well with US regulators, who have issued a special order to gather more information for their ongoing investigation.

Hidden in the depths of Tesla's software, the National Highway Traffic Safety Administration (NHTSA) contends, is a secret configuration that a Tesla hacker who goes by greentheonly discovered in June and dubbed "Elon Mode."

According to the NHTSA's letter [PDF] and special order sent to Elon Musk's car company in July but only published to the Administration's website this week, regulators are worried that public knowledge of Elon Mode may lead to more Tesla owners trying to activate it, and more safety issues.

"The resulting relaxation of controls designed to ensure that the driver remain engaged in the dynamic driving task could lead to greater driver inattention and failure of the driver to properly supervise Autopilot," the NHTSA asserted.

In its special order, the NHTSA is requesting dates that Elon Mode – which it is calling the "subject software update" – was introduced to Tesla's engineering vehicles and consumer vehicles. It also wants to know the software/firmware version that first contained the subject software update, as well as the number of vehicles in which the update is installed and/or activated in both groups. The agency also wants to know how easy it is to activate Elon Mode and any other differences in functionality that come with the configuration.

According to greentheonly, the feature is buried pretty deeply in Tesla's software such that the average Tesla owner wouldn't be able to turn it on. "Even regular people that do rooting cannot affect this," greentheonly told The Register in an email. "That said I know of at least three cars that have this unofficially and probably a handful more that are capable, but not using this at the time."

An NHTSA spokesperson told us the agency declined to share additional details, citing its ongoing investigation into the matter.

The NHTSA gave Tesla a deadline of August 25 to respond, and the company has done so, but the regulator is keeping the response private due to the presence of confidential business information.

The NHTSA has been investigating Tesla Autopilot since 2021 over concerns that the advanced driver assistance system (ADAS) was making drivers behave unsafely, largely because of the way Tesla markets it as "Autopilot" and its beta product as "Full Self-Driving" (FSD) when in reality it's no better than any other Level 2 ADAS on the road (if not worse).

The NHTSA upgraded its probe from a preliminary analysis to a full engineering analysis in 2022 after it found reason "to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision."

Tesla has always maintained that drivers should keep their hands on the wheel at "all times" while using Autopilot, and its vehicles are typically configured to require drivers to regularly torque the wheel. In some models, internal cameras monitor driver attentiveness as well. Drivers with their hands off the wheel get what Tesla owners refer to as a "nag," and if they don't respond to it, Autopilot can disengage.

However, Musk said at the end of 2022 that he agreed with an assertion that Tesla owners with more than 10,000 miles (16,093km) of FSD beta driving should be given the option to turn off the steering wheel nag. In April, Musk said Tesla was "gradually reducing [the wheel nag], proportionate to improved safety."

This week, Musk expressed agreement with the idea that eliminating wheel nags would be a "game changer" for user satisfaction.

Autopilot-related accidents, including fatal ones, continue to be reported to the NHTSA by Tesla, and in February the NHTSA forced Musk's company to recall and update FSD software in as many as 362,758 vehicles.

According to the NHTSA, the recall was issued because FSD bugs "may allow the vehicle to act unsafe around intersections… or proceeding into an intersection during a steady yellow traffic signal without due caution."

"In addition, the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver's adjustment of the vehicle's speed to exceed posted speed limits," the NHTSA said [PDF] in its recall notice.

To complicate the matter, Tesla engineers earlier this month testified under oath, in a civil lawsuit over a 2019 crash that killed 50-year-old father Jeremy Banner, that the company knew Autopilot was unsafe yet did nothing to fix it.

It was alleged that in both this case and another accident in 2016, Tesla vehicles with Autopilot activated failed to respond to the presence of cross traffic. The lawsuit alleges this led to the cars passing under semi-truck trailers, shearing the tops off the vehicles and killing the drivers. Depositions of two Autopilot engineers suggest Tesla knew Autopilot had issues recognizing cross traffic, and that it could also be activated in unsafe situations, but "no changes were made to Autopilot's systems to account for cross traffic."

Tesla has said that it is facing other investigations from the US Department of Justice and the California Attorney General's office. As for the NHTSA probe, acting Administrator Ann Carlson told Reuters last week that the agency hoped to wrap up its Autopilot investigation soon; it's unclear whether the special order will delay the Administration's planned end date.

Tesla did not respond to questions for this story. ®
