Tesla admits Autopilot and FSD require constant driver monitoring and attention

In a letter to U.S. senators, Tesla defended the safety of the company's advanced driver assistance systems, Autopilot and FSD, but acknowledged that these systems require continuous driver monitoring and attention.

On February 8, U.S. Democratic Senators Richard Blumenthal and Ed Markey wrote to Tesla CEO Elon Musk, expressing "serious concerns" about advanced driver assistance systems such as the company's Autopilot and FSD. The systems have previously drawn scrutiny from safety regulators over their involvement in multiple crashes.

In a March 4 letter to the senators, Rohan Patel, Tesla's senior director of public policy and business development, said the systems enhance customers' driving, making them "more capable than ever before" and safer than the average American driver.

Both systems “require constant monitoring and attention from the driver,” Patel noted, adding that Tesla vehicles are only able to perform “some but not all of the dynamic driving tasks” compared to a human driver.

Tesla's official website states that Autopilot allows the vehicle to steer, accelerate and brake automatically, but that it "requires active supervision of the driver, and does not allow the vehicle to drive itself."

In a statement, Blumenthal and Markey said the letter "simply confirms Tesla's evasiveness and shifting of focus. Despite the company's worrying safety record and fatal crashes, Tesla appears to hope that business will continue as usual."

Tesla did not respond to a request for comment. In the letter, Patel said Tesla “recognizes the importance of educating vehicle owners on Autopilot and FSD capabilities.”

The Autopilot system sometimes allows drivers to take their hands off the steering wheel. But Patel said torque-based detection of hands on the steering wheel helps ensure drivers stay attentive while the vehicle is moving.

More than a year ago, Tesla launched a beta version of the FSD system, allowing its vehicles to test Autopilot-related functions on city streets. Tesla has deployed the FSD beta to more than 60,000 users, prompting a flood of criticism that the company is jeopardizing traffic safety by letting untrained drivers test its technology on public roads.

Tesla is currently facing multiple investigations. In their letter, the senators said, "The complaints and investigations paint a disturbing picture of Tesla repeatedly releasing software without adequate consideration of the risks and implications, creating serious danger for everyone on the road."

Under pressure from regulators, Tesla agreed in January to recall about 54,000 U.S. electric vehicles and modify their software to prevent the vehicles from rolling through stop signs.