In March, a woman in Arizona was struck and killed by a self-driving car operated by Uber, marking the first time a self-driving car had killed a pedestrian. The car was traveling at approximately 40 miles per hour and did not appear to slow down when the pedestrian entered the roadway. The collision reignited debate over the safety of self-driving cars.
Even before this accident, Americans had concerns about the safety of self-driving cars. Fully automated vehicles, cars that drive us rather than the other way around, are inching closer to becoming part of our daily lives. But the technology is not fully developed, and there are serious questions about whether this future mode of transportation will be safe.
Accidents Involving Self-Driving Cars
The March incident in Arizona was not the first collision involving a self-driving vehicle. In an earlier Arizona incident, a car turning left across three lanes of traffic was struck by a self-driving vehicle operated by Uber; no one was seriously injured, and the driver of the turning car was cited for failing to yield the right of way. In Florida, the operator of a self-driving Tesla died when the car failed to stop after a truck turned in front of it; investigators determined that the car did not apply the brakes because it could not distinguish the white side of the truck from the bright sky. And in November, a self-driving shuttle was struck by a delivery truck that backed into it when the shuttle did not move out of the way. In most of these cases, investigators determined that driver error caused the accident.
What are the Concerns?
Two leading safety concerns have emerged from accidents involving self-driving vehicles. The first is that the vehicles' sensors may not correctly detect what's happening around them. In the Tesla crash, the car was running on Tesla's Autopilot system and failed to recognize that a white truck was turning in front of it. Investigators determined that the Autopilot system may have been confused by the bright sky behind the truck.
The second concern is that a self-driving car may not know how to react when it encounters a situation the people who wrote its software didn't plan for, such as a truck backing up without seeing what's behind it, or a pedestrian stepping into the street where there is no crosswalk. Self-driving cars are programmed to obey traffic laws and follow the rules of the road, but they also need to be programmed to respond when pedestrians and other vehicles do something out of the ordinary.
While accidents involving self-driving cars are not currently a daily occurrence, regular car accidents are all too frequent. If you’ve been seriously injured in a car accident, you should consult with an attorney. At Bonina & Bonina, P.C., we have over 50 years of experience helping injured New Yorkers. Contact us online or call us at 1-888-MED-LAW1 to schedule your free consultation. Home and hospital visits available. Se habla español.