Self-driving vehicles are becoming increasingly common across the United States. Many cities now host fleets of self-driving cars, and companies continue to advance the technology to make autonomous vehicles more widely available. Although these advancements are exciting, self-driving cars are far from perfect.
Some errors that occur with autopilot cars can cause traffic accidents, which raises the question, “Who’s responsible in the event of a self-driving car accident?” The answer to that question is evolving as self-driving cars become more common, but previous cases offer helpful indicators of what you can expect if you experience a wreck that involves an autopilot car.
Currently, liability for a self-driving car accident generally falls on the company that manufactured the vehicle, provided the autonomous system, rather than human error, caused the wreck. For example, several incidents have involved Tesla vehicles operating on Autopilot causing traffic accidents because of flaws in the vehicles’ sensors, and multiple lawsuits have been brought against Tesla this year alone. Most lawsuits against developers of self-driving cars have settled rather than gone to trial.
Some states already have laws governing self-driving vehicles, and plaintiffs in these states have the option of suing different parties who are potentially at fault for a self-driving car accident. Plaintiffs can sue the company that created the autopilot technology, the vehicle manufacturer, the company that installed the technology in the vehicle, and even the person who was sitting behind the wheel of the self-driving car. Although plaintiffs have options when suing for damages and personal injuries, some states limit the number of parties a plaintiff can name in a single suit.
Statistically, self-driving cars are more likely to be involved in a wreck than standard vehicles, but most of those accidents are caused by human error rather than faulty technology. In one study of 38 traffic accidents involving self-driving cars, only one was caused by the vehicle’s technology.
When an accident involving a self-driving vehicle occurs, investigators look for human error through witness statements, police reports, and crash details. The person sitting behind the wheel of a self-driving car could be found at fault if they were distracted or impaired by drugs or alcohol. Driverless vehicles typically display error messages instructing the person in the driver’s seat to override autopilot in the event of an issue or an impending crash. That person is responsible for paying attention to these error messages and could be held liable for the accident if they ignore them.
Many self-driving car accidents occur because the vehicle’s owner wasn’t paying attention to error messages indicating an issue. If you own a self-driving car, you still need to exercise reasonable care, a legal standard meaning that you acted as a reasonable person would, and be prepared to take over from autopilot in the event of an issue.
Laws governing self-driving vehicles and liability are evolving. We’re only at the beginning of the autonomous vehicle era, and as self-driving cars become more commonplace, more issues and questions will arise about who’s liable for self-driving car wrecks. This is a complex area of the law, so proper legal support is essential if you were involved in a self-driving car accident.
For expert Indiana car accident attorneys, contact Poynter & Bucheri Attorneys at Law. Legal support is critical in recovering money for damages and personal injuries sustained in a car accident, and the lawyers at Poynter & Bucheri will work tirelessly to get you the compensation you deserve from your accident. Call us at 1-800-265-9881 or click here for a free case review.