When it comes to cars, the way forward has always been automation. It started with manufacturing, where assembly lines filled with robotic arms connect parts with precision. Now the focus is on automated driving, with companies like Tesla and Waymo testing their technology on the road.
In an interview with ZDNet, the vice president of GM Cruise said that self-driving cars can provide transportation that’s safer, more convenient, and more accessible. After all, the goal of automation is to take human error out of the equation and ensure that every result is accurate and precise. However, the overall effectiveness of self-driving cars when it comes to safety has been a hot topic among companies, consumers, and policymakers alike.
Self-driving Car Accidents
While self-driving cars do filter out human error, the competition is quite tough: humans are actually doing well when it comes to road safety. In the United States, the rate of crash fatalities per 100,000 people (11.2) is half of what it was 40 years ago (22.5), a drop achieved even as the number of vehicles on the road has grown. The fatality rate per 100 million vehicle miles traveled is also a small 1.13. Those are tough figures for an AI to beat. Self-driving cars would have to log hundreds of thousands of miles over years of testing before they could be deemed comparably safe.
A few companies have already experienced issues with their self-driving platforms, whether they were already released or still in their testing phase.
The most significant self-driving car accidents to date involved Tesla and Uber. Tesla offers an "Autopilot" feature on its consumer vehicles, which can switch lanes, slow down, speed up, and navigate without the driver's intervention. Autopilot was implicated in a 2018 crash of a Tesla Model X that took the life of the driver. According to the company, Autopilot was enabled when the accident happened, and the driver's hands weren't detected on the wheel in the six seconds before the car struck a concrete barrier.
Uber's self-driving car accident also happened in 2018. The crash took the life of a 49-year-old woman who was wheeling her bike across the road when she was hit by the self-driving vehicle. The car's safety operator was found to have been streaming a TV show on their phone mere seconds before the crash, and National Transportation Safety Board (NTSB) investigators ultimately placed the blame on the operator's inattention.
Accountability in Self-driving Car Accidents
If you get hit by a self-driving car, it'll be difficult for you and your personal injury lawyer to figure out who's accountable for the accident, because there are so many possibilities to consider. Both Uber's and Tesla's cases were difficult for investigators to close conclusively.
As The Verge pointed out in its report on the Uber crash, the operator was certainly complacent: they weren't looking at the road when the crash occurred. However, the self-driving system also failed to do its job. It didn't identify the victim as a human being, nor did it predict her path across the road. The same can be said of the Tesla incident: the system didn't identify the concrete barrier fast enough, so the emergency brakes were never triggered.
Conclusion: Automation Complacency
In the same The Verge article, NTSB board member Bruce Landsberg stated that the phrase "automation complacency" should be in everyone's vocabulary when handling these types of cases. This is because today's self-driving cars are still considered low-level in terms of driving automation. There are six defined levels of driving automation, from Level 0, where the human does all the driving, to Level 5, which is fully automated with no human intervention.
Today's cars are still at around Levels 2 and 3. This means that humans are still an integral part of operating self-driving cars. And those who were involved in crashes, whether they were the owners themselves or independent contractors hired by companies to rack up test miles, likely placed too much trust in their vehicle's self-driving system, despite most manufacturers' guidelines recommending that drivers keep their eyes on the road at all times.
Self-driving automation is definitely getting better every year. However, it's far from the perfect Level 5 automation that some people may think it has reached. To keep themselves, their passengers, and their fellow drivers safe, users of these vehicles must stay fully aware of the road despite the convenience of cruise control, automatic emergency braking, automatic lane changing, and automatic navigation.