The race to bring self-driving cars to market has turned roadways into testing labs and humans into guinea pigs—and the consequences have been deadly.
Earlier this year, an Uber self-driving vehicle struck and killed a pedestrian in Arizona, marking the second fatal crash involving an autonomous vehicle.
These fatal crashes raise critical questions about the safety of self-driving cars, and whether autonomous vehicles are ready to share our roads.
A few states have unleashed this technology on public streets without the consent or awareness of their residents. California began issuing autonomous vehicle testing permits in 2014, and testing has been under way in states like Arizona and Pennsylvania for the last few years.
The Cost of Rushing Unproven Technology to Market
The automotive industry has a long history of taking shortcuts on safety to gain a competitive edge in the marketplace, and the autonomous vehicle space is certainly no exception. Self-driving companies are under tremendous pressure to win the race to market.
In litigation earlier this year between Waymo, a self-driving technology company, and Uber, texts from Anthony Levandowski, a founder of Google’s self-driving car program (which became Waymo), to Uber’s then-CEO revealed the pressure to get autonomous vehicles on our roads regardless of safety. Levandowski stated, “we need to think through the strategy to take all the shortcuts we can find.” He continued, “I just see this as a race and we need to win, second place is the first looser [sic].” (See https://www.recode.net/2018/2/5/16974668/uber-alphabet-waymo-self-driving-trade-secrets.)
Many self-driving car companies have been testing their products on public roadways without federal regulations in place to govern that testing. In some states, these companies are not required to disclose how their vehicles are performing, and the data they do provide is released largely at their own discretion.
Unfortunately, we are left to rely on the “goodwill” of self-driving companies to share testing and performance data, and that information can be incomplete or misleading. As a result, we know very little about how these vehicles are designed, how they make decisions, and how they react in emergencies.
Legislation currently under consideration would ask autonomous vehicle developers to submit safety assessments, but those assessments would be entirely voluntary. Without accurate, reliable data from the testing stages, consumers are left guessing whether autonomous vehicles are ready for our communities. In the meantime, crashes and fatalities continue to occur.
Sources:
The Waymo v. Uber Trial has shaken my confidence in self-driving cars (The Verge)