Tesla FSD System Evaluation: Driving onto tram tracks, missing stop signs, and nearly hitting pedestrians

Tesla’s Full Self-Driving (FSD) system has been hit by a steady stream of bugs recently. The latest came during the Spring Festival: Tesla recalled 53,822 vehicles because FSD’s ‘rolling stop’ feature violates US traffic regulations.


At certain intersections marked with ‘all-way stop’ signs, the software allows the car to roll through at low speed rather than come to a complete stop.

At the time, Tesla said that as of Jan. 27 it had received no reports of warranty claims, crashes, injuries, or deaths related to the recall, and CEO Elon Musk tweeted that there was ‘no safety issue’ with the feature.

Tesla has since disabled the ‘rolling stop’ feature, but debate over its safety continues, and more questions about FSD have begun to surface.

In a February 10 report, The Washington Post said it had convened a panel of experts to analyze, frame by frame, driving videos uploaded by Tesla owners. The analysis showed that Tesla’s FSD has fundamental weaknesses. None of these issues are easy to fix, the experts say, and patching one could introduce new complexities. Some of the failures border on the absurd: in one, the Tesla turned right and drove straight onto tram tracks.

The videos, taken from YouTube, revealed further FSD problems, including:

  • Crashing into a bike lane bollard at 11 miles per hour;
  • Failing to stop and yield at a pedestrian crossing;
  • Stopping for pedestrians from an unusually long distance;
  • Fighting the driver for control of the steering wheel;
  • Failing to recognize certain traffic signs.

Hitting the bollard

On a clear day, a Tesla running FSD Beta turned right through a San Jose intersection at 15 mph. A bike lane ran along the inside of the turn, and the car struck one of the lane’s bollards at 11 mph.

‘This is both a mapping problem and a perception problem. These are permanent bollards rather than temporary cones, so they should be on the map,’ said Brad Templeton, a longtime self-driving car developer.

As for why FSD failed to perceive the bollards in time, Templeton, himself a Tesla owner and fan, suggested their shape and color may be uncommon enough that the system never saw anything like them during training.

Tesla’s ultrasonic sensors would be expected to detect such hazards, but their placement (in the front bumper, for example) could be a weak point. ‘They may not see sparse, thin things like these posts,’ Templeton said.

Almost hitting a pedestrian

The second example also occurred in San Jose. After turning right on a green light, the Tesla nearly hit a pedestrian who was about to step into a crosswalk. The pedestrian stopped short on seeing the car; the Tesla also slowed, but by the time it did, it had already covered most of the crosswalk.

After analyzing the video and others like it, the Washington Post’s panel said FSD did not appear to recognize the crosswalk signage or anticipate that a stationary pedestrian might step into the road. ‘It is unclear whether the car responded to the presence of the pedestrian, but the driver was clearly frightened,’ said Professor Andrew Maynard, director of Arizona State University’s Risk Innovation Lab.

Hod Finkelstein, chief R&D officer at lidar technology company AEye, said he does not believe cameras alone can detect pedestrian intent in all situations: they are poor at measuring distances to far-away objects and can be ‘blinded’ by oncoming headlights and the sun. Established self-driving car makers have therefore used a combination of cameras, lidar, conventional radar, and even ultrasonic sensors.
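
To make that multi-sensor idea concrete, here is a minimal sketch of how a fusion layer might combine distance estimates, taking the closest credible reading so that one blinded sensor cannot hide a pedestrian. The sensor names, confidence threshold, and fusion rule are illustrative assumptions, not any vendor’s actual design.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    sensor: str        # e.g. "camera", "lidar", "radar"
    distance_m: float  # estimated distance to the pedestrian, in meters
    confidence: float  # sensor-reported confidence, 0.0 to 1.0

def fused_pedestrian_distance(detections: List[Detection],
                              min_confidence: float = 0.3) -> Optional[float]:
    """Fuse per-sensor estimates conservatively: trust the closest credible
    reading, so a sun-blinded camera cannot mask a lidar or radar return."""
    credible = [d for d in detections if d.confidence >= min_confidence]
    if not credible:
        return None  # no sensor sees a pedestrian at all
    return min(d.distance_m for d in credible)

# Example: glare washes out the camera, but lidar and radar still see the person.
readings = [
    Detection("camera", distance_m=40.0, confidence=0.1),  # blinded, discarded
    Detection("lidar", distance_m=12.5, confidence=0.9),
    Detection("radar", distance_m=13.1, confidence=0.7),
]
print(fused_pedestrian_distance(readings))  # -> 12.5
```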

Tesla’s software combines machine learning with simpler hand-written ‘rules’ such as ‘stop at stop signs and red lights.’ But as one researcher pointed out, machine learning algorithms sometimes learn things they shouldn’t. For example, software trained never to hit a pedestrian may instead learn that pedestrians who are about to be hit will move out of the way.

Developers can add a rule that the car must slow down or stop for pedestrians, but such a rule could paralyze the software on city streets that are full of people. Ultimately, this is a long-tail problem.
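
As an illustration of the hybrid ‘learned policy plus hard rules’ architecture described above, here is a minimal sketch; the scene fields, speeds, and thresholds are invented for the example and are not Tesla’s implementation.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    at_stop_sign: bool
    pedestrian_ahead: bool
    pedestrian_distance_m: float

def learned_policy(scene: Scene) -> float:
    """Stand-in for a neural planner: returns a target speed in m/s."""
    return 6.0  # pretend the network wants to keep cruising

def apply_rules(scene: Scene, target_speed: float) -> float:
    """Hand-written safety rules that override the learned output."""
    if scene.at_stop_sign:
        return 0.0  # "stop at stop signs" as an explicit rule
    if scene.pedestrian_ahead and scene.pedestrian_distance_m < 20.0:
        # Safe, but as noted above, a blanket rule like this can paralyze
        # the car on a city street full of pedestrians.
        return min(target_speed, 2.0)
    return target_speed

scene = Scene(at_stop_sign=False, pedestrian_ahead=True, pedestrian_distance_m=8.0)
print(apply_rules(scene, learned_policy(scene)))  # -> 2.0
```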

Stopping from a long distance

In another video recorded by the same driver in early December, the Tesla stopped for pedestrians crossing the road outside a crosswalk, and it began braking while they were still far away. Most human drivers in that situation would simply keep going.

The video suggests the software may be programmed to slow down whenever pedestrians are heading toward the road. But one expert raised another possibility: the car may have stopped because of an optical illusion.

In the video, a red sign between the Tesla and the pedestrians overlaps with a tree on the sidewalk, creating an image that resembles a stop sign. A video uploaded in February captured the same phenomenon, suggesting that stop-sign illusions really do fool the cars.
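
One common mitigation for single-frame illusions like this sign-and-tree overlap is to require a detection to persist across several consecutive frames before acting on it. The sketch below illustrates that generic idea under assumed window sizes; it is not a description of Tesla’s perception stack.

```python
from collections import deque

class StopSignDebouncer:
    """Accept a stop sign only if it appears in most of the recent frames."""

    def __init__(self, window: int = 10, required: int = 7):
        self.history = deque(maxlen=window)
        self.required = required

    def update(self, detected_this_frame: bool) -> bool:
        self.history.append(detected_this_frame)
        return sum(self.history) >= self.required

debouncer = StopSignDebouncer()
# An illusory "sign" flickers as the viewing angle changes; a real sign
# stays detected frame after frame and would pass the threshold.
flicker = [True, False, True, False, False, True, False, False, True, False]
print(any(debouncer.update(frame) for frame in flicker))  # -> False
```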

Fighting for control of the steering wheel

Of course, the Teslas in these videos posed no real danger most of the time, because the owners took over promptly when the system made the wrong choice. Tesla’s official website states that when using Autopilot and FSD, drivers must ‘keep their hands on the wheel at all times’ and ‘maintain control and responsibility’ for the car.

But taking over can itself be troublesome. In another instance, the same Tesla was passing a parked truck on a narrow street with vehicles parked on both sides. Unsure what to do, the software prompted the driver to take over, but the driver struggled to control the car because the steering wheel was swinging violently from side to side.

Here, the computer and the human were both trying to steer the car through a tight spot with little margin for error. Normally a driver takes over by yanking the wheel against the direction the software is turning, but in a gap this narrow that maneuver is impossible, and it becomes unclear whether the car or the person is in control. With no sharp turn available to force a takeover, things get tricky.

A text-only stop sign the car can’t read

There is also a serious problem with Teslas failing to respond to ‘Stop Here on Red’ signs. A Michigan driver, Chris, posted a video in November in which he was forced to slam on the brakes when his car ignored one.

A self-driving researcher said such signs, which are ubiquitous on American roads, will keep troubling Tesla’s engineers. Unless the car’s cameras can read the letters on the sign, the computer must look for other clues, such as arrow signs or white lines painted on the road. But those clues create problems elsewhere, causing the car to stop incorrectly whenever it sees a line or an arrow-like marking.
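
The researcher’s point can be expressed as a simple decision function: read the sign’s text when recognition is confident, otherwise fall back to weaker road-marking cues and accept the false-positive risk. The field names and confidence threshold below are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    ocr_text: Optional[str]   # text recognized on a nearby sign, if any
    ocr_confidence: float     # 0.0 to 1.0
    stop_line_on_road: bool   # white line painted across the lane
    arrow_sign_visible: bool  # arrow-like sign near the intersection

def should_stop_here_on_red(obs: Observation) -> bool:
    # Preferred path: actually read the sign.
    if obs.ocr_text and obs.ocr_confidence > 0.8:
        return "STOP HERE ON RED" in obs.ocr_text.upper()
    # Fallback path: infer from road markings. As the article notes, this
    # risks stopping incorrectly at any line or arrow-like marking.
    return obs.stop_line_on_road or obs.arrow_sign_visible

obs = Observation(ocr_text=None, ocr_confidence=0.0,
                  stop_line_on_road=True, arrow_sign_visible=False)
print(should_stop_here_on_red(obs))  # -> True (possibly a false positive)
```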

Many of Tesla’s competitors use high-definition maps to tell the car where to stop and turn, but that strategy has its own problems, chief among them the risk that no map can keep up with a constantly changing road network.

After using FSD for about a year, Chris believes cars are still a decade away from reliable autonomous driving.

Apart from the Autopilot and FSD issues, Tesla recently recalled nearly 580,000 vehicles over a separate problem.

Nearly 580,000 vehicles recalled over muted pedestrian warning sounds

On February 10, U.S. regulators said Tesla was recalling 578,607 vehicles in the U.S. because the cars’ ‘Boombox’ feature can play music or other sounds loudly enough that pedestrians may not hear the required warning sound as the car approaches.

In the past four months, Tesla has issued 10 recalls in the United States, including four in the last two weeks, and the Texas-based company is under increasing scrutiny from the National Highway Traffic Safety Administration (NHTSA). Tesla said it has identified no crashes, injuries, or fatalities related to the alert issue behind the latest recall.

Tesla launched ‘Boombox’ in December 2020. The feature lets owners replace the standard horn with custom sounds and play music through the car’s external speakers, for example broadcasting a favorite song outside the car while driving.

‘Boombox’ allows sounds to be played through the external speakers while the vehicle is moving, which can mask the required pedestrian warning sound. NHTSA said this violates federal motor vehicle safety standards setting minimum sound levels for electric vehicles. As a result, Tesla is recalling 2020-2022 Model S, Model X, and Model Y vehicles and 2017-2022 Model 3 vehicles.

At low speeds, electric cars are harder to hear in time than gasoline-powered ones. Under U.S. law, automakers must make electric vehicles emit a sound when traveling at up to 18.6 miles per hour (30 km/h) to protect pedestrians, cyclists, and the blind. At higher speeds, NHTSA said, tire noise, wind resistance, and other factors drown out any added sound.

After the software update, Boombox will be disabled whenever the car is in Drive, Neutral, or Reverse. In fact, most of Tesla’s recent recalls have addressed software issues, and several came shortly after NHTSA raised questions about specific features or received complaints about them. Regulators are currently investigating Tesla’s Autopilot driver-assistance system and its in-car gaming features.
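
The recall fix implies two pieces of behavior: the pedestrian alert must sound at low speeds, and Boombox external playback must be blocked while the car is in Drive, Neutral, or Reverse. Here is a minimal sketch of that logic; the function names and structure are assumptions for illustration, not Tesla’s firmware.

```python
ALERT_MAX_SPEED_MPH = 18.6  # threshold of the federal minimum-sound rule

def pedestrian_alert_required(speed_mph: float) -> bool:
    """Electric vehicles must emit an audible alert below ~18.6 mph (30 km/h)."""
    return 0.0 < speed_mph <= ALERT_MAX_SPEED_MPH

def boombox_allowed(gear: str) -> bool:
    """Post-update behavior: external playback only while parked."""
    return gear not in ("D", "N", "R")

print(pedestrian_alert_required(12.0))  # -> True: the alert must be audible
print(boombox_allowed("D"))             # -> False: disabled while driving
print(boombox_allowed("P"))             # -> True: still usable when parked
```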

Under pressure from NHTSA, Tesla agreed in January 2021 to recall 135,000 vehicles with potentially faulty touchscreens; in that case, NHTSA took the unusual step of formally requesting the recall.

Tesla tried to fix the problem with over-the-air (OTA) updates, but NHTSA said in early 2021 that those updates might be ‘procedurally and materially insufficient’.

According to Tesla, NHTSA issued a request for information in January 2021, and the two sides held several online meetings on the issue in the following months.

Tesla said NHTSA escalated its investigation in September 2021. In October, Tesla defended the tests and rationale it had used to determine that Boombox was compliant, but after a meeting in December it finally agreed to recall the vehicles.
