Self-driving cars, once the stuff of science fiction, are rapidly becoming a reality. While they promise a future of safer roads and less traffic congestion, this exciting new technology also raises complex legal questions. As the number of accidents involving automated cars mounts, the debate around liability is heating up.
Not All Self-Driving Cars Are Created Equal
First, let’s sort out the concept of self-driving cars. For all the publicity that Tesla generates, even the company’s so-called Full Self-Driving (FSD) feature requires active driver supervision. The same goes for Tesla’s most recent headline-grabber, the futuristic-looking Cybertruck. Like its predecessors, it features some level of automation but has yet to be equipped with FSD.
In fact, even cars that operate with no driver behind the wheel have not reached Level 5, the highest level of automation as defined by SAE International. For instance, Waymo’s driverless robotaxis are a common sight in cities like San Francisco and Los Angeles. Although marketed as “fully autonomous,” Waymo’s vehicles are backed by a team on remote standby that provides high-level guidance if a vehicle finds itself in a novel or ambiguous situation it isn’t confident about handling on its own.
The relevant driving automation levels (Levels 2 through 5) break down as follows:
- Level 2: Partial Automation (like Tesla’s Autopilot): The driver remains responsible for monitoring the road and intervening when necessary. These systems can provide steering and braking assistance, but the driver must stay attentive.
- Level 3: Conditional Automation: The car can handle some driving tasks in specific conditions, but the driver must be prepared to take over control when prompted.
- Level 4: High Automation: Level 4 self-driving cars can handle most situations on their own and, unlike Level 3 vehicles, do not depend on a human to take over when something unexpected happens. While a driver or remote operator can still manually intervene, these vehicles are designed to operate without constant human supervision.
- Level 5: Full Automation: The car is completely self-driving, with no human intervention required. These are still in development and not yet commercially available.
Accidents Showcase Issues
The promise of driverless technology has been far from problem-free. On Oct. 2, 2023, a Cruise robotaxi struck a woman crossing a street in San Francisco, leaving her in serious condition. The woman had initially been hit by another, human-driven vehicle, which flung her into the path of the robotaxi. The Cruise vehicle came to a stop after the collision, but then attempted to pull over, dragging the pinned woman 20 feet. NPR reports that Cruise, a driverless car startup, is now facing fines, investigations, and an uncertain future.
Then, in January of 2024, a Waymo robotaxi had a close call with a moped. In that case, the error came from the remote human operator overseeing the vehicle: according to Waymo, the operator missed the red light and instructed the vehicle to proceed through the intersection. Fortunately, the car’s sensors detected the oncoming moped, and the automated braking system prevented a collision.
Promises vs. Reality
Cruise and Waymo point to millions of driverless miles driven without fatalities, suggesting their technology improves road safety. However, NPR reports that San Francisco residents have logged numerous close calls and malfunctions involving these robotaxis. Local news footage shows self-driving cars confused in residential areas, obstructing construction zones, and even disobeying traffic signals.
This growing disconnect between industry promises and real-world experiences has fueled public frustration. The activist group Safe Street Rebels has documented over 500 such incidents, highlighting concerns about the technology’s reliability.
Safety Concerns Extend Beyond Civilians
The challenges posed by self-driving cars go beyond public perception. San Francisco’s emergency responders have also expressed concerns. During government hearings, police and fire departments reported nearly 75 instances where self-driving cars impeded their work. These incidents included blocking fire engine access, disregarding emergency lights and tape, and even driving over fire hoses.
The Legal Fallout: Who’s Accountable?
The emergence of driverless vehicles presents new challenges for determining fault in accidents. Current laws hold human drivers responsible for accidents. But as automation increases, the question becomes: who’s the driver in a self-driving car? In California, where comparative negligence applies, accidents involving both a human driver and a self-driving system become complex.
For instance, in the Cruise accident, the human driver who caused the initial collision would likely be assigned a significant portion of the blame. However, the self-driving system’s behavior after the impact, potentially dragging the victim, could also hold Cruise liable.
A lawsuit might argue that a human driver would have acted differently, avoiding further harm to the pedestrian. This case differs from accidents with driver-assistance features, where human control remains primary, and manufacturers can deflect blame.
Here are the potential parties who could be liable:
- The Car Manufacturer: If a car’s design or manufacturing defect causes an accident, the manufacturer could be held responsible. Some legal experts propose holding manufacturers strictly liable regardless of fault, incentivizing them to create failsafe systems that prioritize safety above all else.
- The Software Company: Similar to the manufacturer, the company that develops the self-driving software could be liable for malfunctioning code leading to a crash.
- The Passenger: Depending on the level of automation and the specific circumstances, there could be situations where the human occupant shares some liability. For example, if the person in the driver’s seat of a Level 3 car ignores repeated prompts to take over control, they could be partially at fault.
The Legal Landscape is Evolving
There are currently no federal laws specifically addressing self-driving car liability. However, some states have passed legislation, and the debate at the national level is ongoing. As self-driving technology continues to develop, so does the legal framework surrounding it.
Need Counsel? Contact Penney & Associates
If you or a loved one has been involved in an accident with a self-driving car, don’t wait to seek legal counsel. At Penney & Associates, we specialize in personal injury law, including cases involving emerging technologies like autonomous vehicles.
Contact us today to schedule a free consultation and let our experienced team of trial attorneys help you understand your rights and pursue the compensation you deserve. Trust Penney & Associates to advocate for you every step of the way.
Keep Reading
Injured in a car accident? A guide to taking the right legal steps
Smart car privacy: A recent court ruling and what car owners should know
Personal injury statistics: What three studies reveal about California