The prospect of seeing self-driving cars on the road has been a human fantasy for decades. Over the past several years, it has started to become a reality.
It should come as a shock to no one that something this groundbreaking comes with its fair share of growing pains. For all the hype self-driving cars are getting, there has been an equal amount (if not more) of skepticism about the impact they will have on everyday life. After all, the core concept of this innovation essentially puts human lives completely in the hands of computers.
Even the President of the United States has voiced his disapproval.
Let’s talk about a few of the major points of discontent surrounding autonomous vehicles.
No. 1: Autonomous Technology is Not 100% Flawless
In mid-March, President Donald Trump reportedly joined the 71% of Americans wary of autonomous vehicles. Trump allegedly told a fellow golf club member, who was raving about the self-driving capabilities of his new Tesla, “Yeah that’s cool, but I would never get in a self-driving car… I don’t trust some computer to drive me around.”
To many, this is a valid point. Uber and Tesla, two companies working to lead the charge in the self-driving car industry, have faced several major hurdles. Over the past few years, there have been a multitude of crashes stemming from faulty autopilot features, some of which have ended in fatalities.
To reiterate, the basic principle of self-driving cars is to put the passenger in the hands of autonomous technology. Any malfunction whatsoever may potentially be the difference between life and death. In other words, there can be absolutely no margin for error.
Putting their lives in the hands of a machine is an extremely tough reality for most people to accept. Even with all the testing, refinement, and promising statistics, there is always a chance that something could go wrong. Unpredictable weather, terrain, obstacles, and more all come into play when assessing trust in driverless technology. A single instance of malfunctioning technology can be enough to hold people back from embracing self-driving cars.
No. 2: Shifting Accountability
One of the most interesting byproducts of self-driving cars is that human accountability is essentially non-existent.
So who does the blame shift to in the event of an accident?
The most common answer would be the automaker. Any slight malfunction with the autonomous capabilities could result in a hefty lawsuit. If this is the case, why would auto manufacturers even want to produce self-driving cars when anything that goes wrong could automatically be their fault?
Tesla has been in hot water over car accidents stemming from its autopilot feature. Unfortunately, the company has not been quick to accept accountability. In March of 2018, a driver by the name of Wei Huang was using the autopilot feature in his Model X. The vehicle slammed into a divider barrier and went up in flames. Huang, unfortunately, died as a result of the crash. Below was Tesla's response:
“The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”
As self-driving cars take over the road, it's a logical assumption that the lemon law will become more influential in the aftermath of accidents, as human error will effectively be eliminated from the equation.
As a lemon law lawyer in San Diego, I have seen multiple cases related to faulty software. Currently, the lemon law applies to both hardware and software issues. As time goes on, defective software will certainly play a much larger role in how cases shake out. It appears that this trend will become even more prevalent in the progression of autonomous features.
No. 3: Lack of Adequate Regulations
In many ways, conventional regulations will not translate to driverless cars. These regulations are traditionally based on the Federal Motor Vehicle Safety Standards (FMVSS). Developed over many decades, these regulations spell out performance requirements for safety-related parts of a vehicle. These include components like brakes, airbags, mirrors, lamps, etc.
The manufacturer of the vehicle must meet all of these requirements before it is placed on the market. For conventional, human-operated vehicles, federal regulations don’t go too in-depth about how exactly manufacturers test vehicles. Then again, they don’t really have to. The development and testing of conventional cars happens in private facilities where the vehicles pose no threat to the public.
With driverless cars, the testing process is completely different. With humans out of the picture, the most important capability of driverless cars is their adaptation to real-world situations. These obstacles are nearly impossible to replicate on private tracks. As a result, manufacturers must place autonomous vehicles on public roads for safety testing to clearly demonstrate that the product is safe. Other drivers on the road thus become essentially involuntary participants in the testing.
In September of 2017, the House passed legislation called the SELF DRIVE Act, which would create broad exemptions from the FMVSS for driverless cars. This would require manufacturers to submit safety reports explaining the key safety features of self-driving cars.
While this may be a step in the right direction, effective regulation for driverless cars remains one of the biggest question marks surrounding this innovation.
On the surface, the idea of driverless cars is a dream come true for some. The prospect of sitting back and relaxing while the vehicle does all the driving is enough to excite anyone. However, looking beneath all the hype, there are a lot of kinks that need to be worked out.
The general public's skepticism of autonomous vehicles certainly raises some good points. Given how much has happened with this technology in the past five to ten years, it's going to be extremely interesting to see what the future holds.