Should Driverless Drivers Be Irresponsible?


Self-driving cars have become a trendy topic of discussion recently, and the main moral dilemma discussed is essentially a reiteration of "the trolley problem": in the event of an unavoidable accident, should the car hit a pedestrian in order to save the driver, or sacrifice the driver in order to save several pedestrians? Because it is ethically impossible to assign value to human lives, no answer can be proven correct. However, it should be clear that human lives are more valuable than the reputations of corporations. Because manufacturers would be held responsible for driverless-car-related accidents, they are withholding the cars until they are as safe as possible, especially since potential buyers remain doubtful.

Corporations fear that they could face an excess of lawsuits if driverless cars go commercial before they are as safe as possible. While it is good that they focus on improving safety before release, how safe do the cars need to be? Up to 90% of accidents are caused, or partially caused, by driver error (Smith). Schellekens' article points out that "The standard 'as safe as a human driven car' could be made more precise in the following ways: The automated car should statistically be safer than human drivers, or the automated car should be safer than the best human driver." Since driver error is so common, the first standard would already be an improvement and could save many lives just by preventing a few drunk- or distracted-driving accidents; statistically, some self-driving cars are already safer than the average driver.

According to Allstate, American drivers average one accident every 10 years, and according to the Federal Highway Administration, the average mileage per year is 16,550, meaning U.S. drivers average one accident every 165,500 miles. Between 2009 and May 2015, Google's self-driving car program reported 12 collisions in one million miles of autonomous driving on public roads, or roughly one collision every 83,000 miles. Of those 12 collisions, 5 reports include the words "was rear-ended by another vehicle," and 8 happened while the car was being driven manually or while the driver took control at the time of impact (Google Self-Driving Car Project Monthly Report). But designers are holding back until the cars are as safe as possible, for their own protection. While Google's driving software is not yet equipped to handle conditions such as bad weather and construction zones, drivers would still be able to take control when they see fit.
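The per-mile comparison above is simple arithmetic; the following sketch just recomputes it from the figures cited in the essay (variable names are illustrative, not from any source):

```python
# Back-of-envelope check of the accident-rate figures cited above.
# All input numbers come from the essay (Allstate, FHWA, Google reports).

YEARS_PER_ACCIDENT = 10    # Allstate: average of one accident every 10 years
MILES_PER_YEAR = 16_550    # FHWA: average annual mileage cited in the essay

# Average miles a U.S. driver travels between accidents
human_miles_per_accident = YEARS_PER_ACCIDENT * MILES_PER_YEAR

GOOGLE_MILES = 1_000_000   # autonomous miles driven, 2009 to May 2015
GOOGLE_COLLISIONS = 12     # collisions reported in that span

# Average autonomous miles between reported collisions
google_miles_per_collision = GOOGLE_MILES / GOOGLE_COLLISIONS

print(human_miles_per_accident)           # 165500
print(round(google_miles_per_collision))  # 83333
```

Note that the raw per-mile rate favors the human average; the essay's point rests on the follow-up detail that most of the 12 collisions were rear-endings by other vehicles or occurred under manual control.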
