People have always been excited about new technology. When Apple released the first iPhone in 2007, the world was watching, eager to see what the new technology had in store and how it could make life more convenient. True enough, the iPhone’s launch spurred countless further features and advances in technology. Information has become readily available to anyone with a smartphone and an Internet connection; you might even be reading this article on a smartphone or tablet. People bring the same expectations to autonomous, or self-driving, cars: they expect the technology to be not only more convenient but also safer on the road.
The hope for self-driving cars is that they could replace human drivers in the future. Human drivers can become distracted, fall asleep, or drive under the influence of drugs or alcohol, problems that self-driving cars would ultimately eliminate. According to the National Highway Traffic Safety Administration (NHTSA), nearly 40,000 Americans died on the roads last year, and about 90 percent of those deaths were due to human error. However, like any other technology, self-driving cars still have a long way to go before they are perfected, and therein lies the problem.
Self-driving cars are expected to adapt to the environment they drive in so that they can “learn” and their software can be updated accordingly. They are tested on actual roads to learn real-life navigation, but because these cars are not yet perfect, accidents have happened. As of this writing, Tesla’s Autopilot system has been involved in two fatalities: one in 2016 and another on March 23, 2018. You can read Tesla’s statement about the 2018 crash here. In another incident, a self-driving Uber hit a woman pushing her bike in Arizona.
The question now is: who is liable when you get into an accident with a self-driving car? The answer depends on the automation level of the vehicle involved. There are five levels of automation for autonomous cars:
1. Level one – can handle one automated task at a time, such as automatic braking.
2. Level two – has at least two automated functions.
3. Level three – can handle dynamic driving tasks but may still need human intervention.
4. Level four – officially driverless in certain environments.
5. Level five – operates entirely on its own, without any driver present.
In the case of a level four or level five autonomous car, liability would most likely fall on the vehicle manufacturer, because the accident doesn’t involve any human input at all. If the vehicle crashed because of a failure in its system or a glitch in its sensors, the manufacturer would be responsible for the damages.
In cases like Tesla’s Autopilot system, which is classified as level two, the driver inside the car may still be held liable. For the first incident, in 2016, the NHTSA conducted a six-month investigation and found that Tesla was not liable, since the driver had seven seconds to step on the brakes. This shows that a driver can still be held liable and that auto insurance will remain relevant in a world where self-driving cars are the norm.