The Wild Wild West of Arizona’s Self-Driving Cars

By Tessa Patterson.

Self-driving cars have long been a staple of visions of the future. In this Jetsons-esque paradise, cars would talk, deliver riders safely to their destinations, and, ideally, fly. Although cars have yet to hit the sky, Arizona continues to attract companies to its desert streets, positioning itself as one of the most automated-vehicle-friendly states in the nation.

This past October, Alphabet’s Waymo took to Phoenix to offer riders a fully driverless experience. Once available only to beta testers, the service is now open to Phoenix customers, who can step into the future and buckle up for an artificial-intelligence-controlled ride. Automated vehicles have sped into other industries too. With Arizona State University rolling out autonomous food-delivery robots in Tempe and Walmart’s recent partnership with self-driving car company Cruise to deliver groceries in Scottsdale, the future for automated vehicles is an endless winding road, albeit a potentially bumpy one.

Uber’s Deadly Collision and Its Swerve Around Liability

With drivers doing less and less behind the wheel of automated vehicles, what happens if there is an accident? In September 2020, that question was answered when the operator of a self-driving Uber was charged with negligent homicide for killing a Tempe resident.

As Elaine Herzberg, 49, wheeled her bicycle across the road, Uber operator Rafaela Vasquez sat behind the wheel of a vehicle traveling 39 miles per hour, streaming “The Voice” on Hulu on her phone. The vehicle struck Herzberg and killed her, marking the first pedestrian death involving a self-driving car.

Following the accident, the Yavapai County Attorney’s Office found “no basis for criminal liability for the Uber corporation” and no evidence with which to charge Uber. Although a subsequent investigation revealed that the vehicle’s emergency braking system had been disabled, the company was not found vicariously liable for the death.

This reasoning tracks the current method of assigning blame in a typical car accident. When two drivers collide, it would not make sense to hold the manufacturer of the vehicle liable for the actions of the driver. But what about an accident that is the fault of a fully driverless vehicle? When there is no distracted driver sitting in the driver’s seat, the answer may lie in traditional product liability law.

Applying Product Liability Law to Driverless Vehicles

If someone is injured by a self-driving vehicle, product liability appears to be the most appropriate route to a remedy. In a product liability claim, the plaintiff carries a heavy burden and must prove two essential elements: that the defendant sold a defective product and that the defect was the proximate cause of the plaintiff’s injury.

In building a case, a plaintiff can take multiple routes to show a product is defective. First, a plaintiff can prove a product is defective by design. Design defects are inherent flaws that exist before the product is manufactured, meaning that the product, while potentially serving a useful purpose, is unreasonably dangerous due to a specific design flaw. In a self-driving car, an example would be a sensor system that can detect other vehicles or pedestrians only within one foot. At that distance, a collision is all but inevitable; the product would be inherently flawed by design.

Second, a plaintiff could bring a successful product liability claim by proving the product has a manufacturing defect. A manufacturing defect is a flaw that arises solely from the production or construction of the product. Rather than being an issue of design, manufacturing defects typically appear in only a few end products rather than an entire product line. For example, an automated vehicle could have a sensor designed to register obstacles from 30 feet away, but because the sensor was installed incorrectly, it fails to alert and stop the car.

Finally, a product liability claim can also be brought for a defect in marketing. This type of claim arises when a manufacturer fails to place warning labels on a product or to instruct users about its reasonable risks. In a car, these labels are frequently found on sun visors, warning the driver of the risk airbags pose to children sitting in the front seat. Because a fully automated vehicle would have no driver to actually read the label, this type of claim is unlikely to apply.

After identifying a defect, the plaintiff must then prove that the defect was the proximate cause of the injury. In a case involving an automated vehicle, this means showing that the collision or injury would not have happened but for the defect.

Even if the plaintiff proves these essential elements, an automated vehicle company could still avoid liability under the risk-utility test: if the defendant can show that the benefits of the design outweigh its risk of danger, the defendant may escape liability. In the context of automated vehicles, where the majority of car accidents are caused by human error, courts may see a benefit in incentivizing future development and choose not to punish these companies.

What Is the Future for Fully-Automated Cars?

While most automated vehicles currently have people in the driver’s seat, and likely will for some time, electric vehicle manufacturers like Tesla are striving to make their cars even more autonomous. But if the human element is the most dangerous factor in driving, a street full of automated vehicles can be safe only if the majority of the population uses them. Moreover, because most self-driving cars currently give drivers the option to take control, that option would likely need to be removed for automated vehicles to achieve their optimum success.

With names like Ford and Chevy deeply ingrained in American culture, this transition may be a hard one to accomplish. Although the change will not be “automatic” and will certainly require a “shift” in thinking, it may be the only way to let automated vehicles fully kick into gear.

"Waymo self-driving van" by robpegoraro is licensed under CC BY-NC-SA 2.0


Tessa Patterson, J.D. Candidate, 2022

Tessa Patterson is a 2L Staff Writer from Tempe, Arizona. Before law school, she graduated from the University of Arizona in 2017 with a degree in Journalism and worked in business development. In law school, Tessa has enjoyed externships at the Maricopa County Attorney’s Office and the Governor’s Regulatory Review Council, and she is currently externing at the District Court. When she’s not studying, you can find her at the local CrossFit gym or rewatching The Office for the 100th time.

The opinions expressed herein are those of the individual contributors to the ASLJ Blog and should not be construed as the opinions of the Arizona State Law Journal or the Sandra Day O’Connor College of Law at Arizona State University.