The claim that Tesla Autopilot braking caused the chain collision in Norway is total nonsense
A chain collision happened on the E6 European route near Verdal Municipality in Norway, and the local police blamed a Tesla Model 3 that was running on Autopilot. According to the police, Autopilot brought the car to a full stop after detecting an obstacle in front of it.
The municipality’s Police Chief Anne Marie Dypdahl explained the situation to the local news outlet Innherred.no as follows (via Google Translate):
It looks like a truck was coming in the southbound direction while a Tesla in the northbound direction was running on Autopilot. The car apparently registered an obstacle in the road and braked fully. Shortly afterward, it was hit from behind.
It seems the Police Chief does not know much about Tesla Autopilot beyond the notion that it is self-driving software that drives the car while the driver pays no attention. That is when Bjørn Nyland, the Tesla and EV evangelist from Norway, stepped in with the following rebuttal video.
Tesla Phantom Braking
Bjørn Nyland says he has driven this route at least 30-50 times on his long journeys and was able to find the spot on Google Maps Street View. According to him, the sudden complete stop cannot have been Autopilot’s fault: he has experienced what he calls “phantom braking” in such situations, but the car only reduces its speed by 10-20 km/h (6-12 mph).
He further explains that phantom braking happens when Autopilot perceives an oncoming large truck or other vehicle as potentially dangerous and the car’s AI triggers the AEB (automatic emergency braking). Yet across hundreds of journeys totaling hundreds of thousands of kilometers in Tesla Models X, S, and 3, none of his cars has ever come to a sudden, complete stop. You can hear him describe the phenomenon in his video below.
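Purely as an illustration of the trigger logic Nyland describes, and emphatically not Tesla’s actual implementation, a toy AEB decision could be sketched like this. Every field name and threshold below is invented for the example:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    # Illustrative fields only; these do not reflect any real perception stack.
    lateral_offset_m: float   # distance of the object from our lane center
    closing_speed_kmh: float  # combined approach speed of the object
    width_m: float            # apparent width (large trucks look wide)

def aeb_reaction(obj: DetectedObject) -> str:
    """Toy decision rule: a wide, fast-closing object near our lane is
    treated as a threat. Per Nyland's account, a well-behaved system
    should only shave off roughly 10-20 km/h, not brake to a full stop."""
    looks_threatening = (
        obj.width_m > 2.5
        and obj.closing_speed_kmh > 100
        and abs(obj.lateral_offset_m) < 2.0
    )
    if looks_threatening:
        return "reduce speed by 10-20 km/h"  # expected phantom-braking behavior
    return "no action"

# An oncoming truck close to the lane edge trips the toy rule:
print(aeb_reaction(DetectedObject(lateral_offset_m=1.5,
                                  closing_speed_kmh=160,
                                  width_m=2.6)))
```

The point of the sketch is only that a misjudged oncoming vehicle can satisfy such a threat test even when no real obstacle exists, which is exactly the false-positive scenario Nyland calls phantom braking.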
Bjørn further says he has experienced and complained about phantom braking in earlier live-stream videos, but even so, if the drivers behind had kept a safe distance, this chain collision would have been avoided. Another thing to note: older AP1 cars are not affected by this bug; only Autopilot 2+ cars, which run Tesla’s native software rather than AP1’s Mobileye system, have the issue.
Tesla should fix this bug ASAP, as it can bring the company a lot of trouble and defamation, much like the Apple engineer’s sad demise in California a couple of years ago. He was using Autopilot while playing a video game on his iPhone, yet the Silicon Valley automaker still had to face a lawsuit and CEO Elon Musk had to explain it to the entire world.