9 Injured In 8-Car Pile-Up Caused By Tesla Autopilot
Tesla has come under scrutiny from the peak traffic safety authority in the US following an 8-car pile-up caused by a Tesla Model S on Thanksgiving last year (24 November).
The 76-year-old driver told police he was operating the car in full self-driving mode when it allegedly malfunctioned, causing the vehicle to change lanes and brake abruptly while travelling across the San Francisco Bay Bridge.
This self-driving car crash isn’t an isolated incident. The US National Highway Traffic Safety Administration (NHTSA) is also investigating separate reports of Tesla cars braking “without warning, at random, and often repeatedly in a single drive” while in autopilot and full self-driving modes.
In this article, we’ll look at the theories about what might have caused the autopilot car crash and explore the risks involved in using self-driving technology.
How did the Tesla car accident in San Francisco occur?
According to the Guardian, the driver of the Tesla Model S told authorities that he switched the car to full self-driving mode shortly before the crash. The police report states that the vehicle was travelling at 55mph when it shifted into the left lane and braked hard, slowing to around 20mph. Drivers behind the Tesla in the left lane were suddenly forced to brake, setting off a chain reaction of crashes that left two children hospitalised.
Security footage from the San Francisco Bay Bridge captured the moment of the Tesla autopilot crash: the white Tesla changes lanes and brakes, forcing the cars behind it in the left lane to collide.
Are Tesla cars safe?
As Forbes points out, Tesla offers two different systems that can take control of Tesla vehicles:
- The autopilot function: This comes as standard on all Tesla cars and can perform some driving tasks, such as guiding the car to highway exits and automatically engaging the indicator. Autopilot is designed to be used while the driver’s hands remain on the wheel, so they can take over if the system makes a mistake.
- The full self-driving function: This can be purchased as an add-on from Tesla, even though it’s still a prototype. The technology doesn’t currently comply with Australian road rules. Full self-driving mode does not work on highways: if the system detects that you’ve entered a highway, it automatically switches over to autopilot.
The Tesla Vehicle Safety Report states, “Tesla vehicles are engineered to be the safest cars in the world.” The autopilot function is one of the “Active Safety” features included in all Teslas, enabling Automatic Emergency Braking (AEB) when it detects vehicles, pedestrians or objects in front of the car.
While Tesla insists that its cars are the world’s safest, evidence is mounting that the autopilot function is far from flawless. There have been several reports of Teslas braking suddenly when there was nothing in front of the car. According to the Daily Mail, the NHTSA has investigated 35 crashes since 2016 in which Tesla’s autopilot or full self-driving systems were in use. Together, these accidents have resulted in the deaths of 19 people.
How many accidents have been caused by self-driving cars?
According to Drive, NHTSA data records 392 car accidents between July 2021 and June 2022 involving cars equipped with driver assistance systems. Of the 12 car brands in that sample, Tesla accounted for 273 of the crashes (roughly 70%), and for five of the six fatalities (83%) in which self-driving technology was a factor.
What are the risks of self-driving cars?
The autopilot Tesla crash is bringing greater public awareness to the potential risks of autopilot driving and AI-powered cars. When the autopilot system malfunctions, unexpected actions such as sudden automatic braking can occur, which can lead to whiplash and back and spinal injuries.
Some of the other risks associated with self-driving technology include:
- Hacking: Like all computers, self-driving technology is susceptible to being hacked. If a hacker were to gain access to the system that runs your car, they could potentially take complete control of the vehicle.
- Driver distraction: A major issue with self-driving technologies is that drivers tend to become less vigilant. Accidents and injuries can occur when the car does something unsafe and the driver isn’t paying attention to correct it.
- Technical malfunctions: The computers that run self-driving cars aren’t perfect. Bugs and errors can occur, which can lead to accidents.
- Lack of education: To gain a driver’s licence in Australia, you must pass a driving test. However, there is no equivalent test for operating a self-driving car, meaning there’s more room for error.
Who is liable when a self-driving car causes a crash?
From a legal standpoint, the Tesla autopilot crash is complicated. Under US law, when a car accident is caused by human error, the victim may make a claim based on the driver’s negligence. This is the basis for most car accident claims made in the US. However, if the car was operating on autopilot and essentially ‘driverless’ at the time of the collision, the cause of the crash might be a defect in the car. In this case, the liability would fall on the car manufacturer, not the driver.
As self-driving car technology is still new, the team at LHD Lawyers are watching the Tesla autopilot crash lawsuit to see how the American court system handles the case.
Do you have a motor vehicle accident claim?
LHD Lawyers help everyday Australians receive the benefits they are entitled to with motor vehicle accident claims, including injuries related to accidents with self-driving cars.
We are so sure of our ability to win your case that we stand firmly by our No Win No Fee Policy. Call 1800 455 725 to arrange a consultation.
Author: James Bodel
Original Publish Date: January 23, 2023
Last Updated: August 16, 2024