Another Day, Another Reason Not to Use Self-Driving Technology

Thanksgiving is the busiest travel day of the year, so we understand the urge to take steps to avoid the hassle where you can. Perhaps that means traveling in the dead of night, or choosing to celebrate the holiday over the weekend. For one driver on the West Coast, it meant utilizing their Tesla’s “full self-driving” software to navigate the heavy traffic.

According to a recent CNN Business report, however, that self-driving software is now at the center of an eight-car pileup “that led to nine people being treated for minor injuries including one juvenile who was hospitalized.” Police have not yet confirmed whether the software was actually in use at the time of the collision, but the report says the driver claimed the car braked suddenly, its speed dropping from 55 mph to 20 mph, triggering the chain-reaction crash.

We are inclined to believe this driver. According to AP News, in 2022 “Automakers reported nearly 400 crashes over a 10-month period involving vehicles with partially automated driver-assist systems, including 273 with Teslas, according to statistics released Wednesday [June 15] by U.S. safety regulators.” Furthermore, the automaker is already under a “preliminary evaluation” by the National Highway Traffic Safety Administration (NHTSA) for sudden, unexpected braking while Autopilot is engaged. Per CNN Business:

The braking occurs “without warning, at random, and often repeatedly in a single drive,” according to NHTSA. It estimates that 416,000 vehicles may be affected. The agency did not say whether it was aware of any collisions, injuries, or fatalities associated with the issue.

The agency’s office of defects investigation has received 354 complaints alleging unexpected braking on 2021-22 Tesla Model 3s and Tesla Model Ys in the past nine months. The alleged braking occurs while using Autopilot, Tesla’s suite of driver-assist features, which allow the vehicle to brake and steer automatically. Tesla owners sometimes refer to the unexpected brake activation as “phantom braking.”

The news report notes that the vehicles involved all use cameras, rather than radar (like older models), to inform the software.

Self-driving software in vehicles is neither helpful nor safe

It has been well-documented that vehicle safety technology can make people less safe while driving (or being driven). Self-driving tech is no exception; the Insurance Institute for Highway Safety said back in 2020 that, at best, self-driving vehicles may prevent only about a third of all car accidents, because the vehicles cannot react to complex scenarios on the road. The only way for these vehicles to “learn” those scenarios is to continually test and update the software – and to do that, they need to be out on the roads, driving themselves. It’s a bit of a Catch-22.

We also have grave concerns about safety that have not been adequately addressed. The NHTSA’s report on collisions was a good first step, as the agency has been criticized for not being more aggressive in regulating these vehicles. More concerning, perhaps, is the slipshod way autonomous vehicle (AV) regulation is being handled. Automotive World explains:

Under current NHTSA regulations, automakers must equip vehicles with features designed for human operation, such as steering wheels and accelerator and brake pedals. However, these common features are superfluous in AVs with Level 4 capabilities. In these situations, NHTSA may annually exempt 2,500 vehicles per manufacturer from compliance with certain safety regulations. To date, NHTSA has not granted any AV-related exemptions and manufacturers are forced to comply with standards dating back to the 1960s. The proposed legislation would authorise NHTSA to exempt 15,000 AVs per manufacturer, for each of the first three years, from such regulations that are hindering the mass-adoption of AV technologies. Beyond three years, NHTSA would have the power to exempt 80,000 AVs annually per manufacturer if the technology is “at least as safe as human-driven vehicles.”

There are a few elements of concern here. One, the NHTSA and automakers are discussing Level 4 automation: “the vehicle performs all driving tasks under specific circumstances. Geofencing is required. Human override is still an option.” Tesla’s “full self-driving” software is only categorized as Level 2, a designation shared with driver-assist features like adaptive cruise control with lane centering. That makes us wonder what these agencies think Level 4 would actually look like.

Two, we’d like to know more about these exemptions. Tesla sold 169,507 cars in the United States in 2021. None of them came with “full self-driving” software as standard, but any buyer could request it as an option. So how many owners opted for the software? Was it up to 2,500? Were they all exempted? And what, exactly, does the exemption cover? These are things we should know before NHTSA starts authorizing 15,000 exemptions per automaker each year for three years.

Three, what does “at least as safe as human-driven vehicles” mean? According to Drive Safe Alabama, there were 134,039 traffic crashes in our state last year, which works out to roughly one crash every four minutes (525,600 minutes in a year divided by 134,039 crashes is about one every 3.9 minutes). Being “at least as safe” does not seem like the right goal.

Who is liable in a collision with an autonomous vehicle in Huntsville?

The short answer is, the law doesn’t say. Alabama’s laws regarding AVs apply only to commercial motor vehicles. As it stands, fully autonomous trucks are allowed on our roads even without an operator, as long as they carry at least $2 million in liability insurance. If an autonomous truck causes a multi-car crash, it is safe to say that this will not be enough insurance to cover the expenses of all injured parties.

As of right now, the assumption is that the operator – even if he or she is not actually operating the vehicle – would be liable in the event of an accident. There is an equally valid argument that the manufacturer could be liable, especially if a defect causes the crash. This is why you want a Huntsville car accident lawyer to help you if you are injured in a wreck with an AV. We have the resources to handle cases where multiple parties may be liable, and the experience it takes to build a case that seeks the best possible outcome for our clients.

Martin & Helms, P.C. is based in Huntsville, maintains an additional office in Decatur, and serves Madison, Athens, and all of North Alabama. If you were injured in a car accident with a human driver or a self-driving vehicle, we want to help. Please call us or fill out our contact form to schedule your free consultation.