
Driving Blind: An In-depth Look at Semi-Autonomous Vehicles and Product Liability

The Tesla brand is amazing. No car company has had the American public this excited in years. These cars are sleek, fast, and completely electric. I still recall the first time I saw one in person. It was on a trip to Columbus, Ohio, a few years back. I was at a large shopping center in search of the Lego Store. As I walked out of the main complex, I noticed a Tesla Gallery off to the left. (Tesla calls its dealerships Galleries because the experience is much different from going to a traditional car dealer.) There sat a shiny new Tesla Model S. I remember going into the Gallery to look at the display models. They were pristine. In that moment, I knew I wanted one of these cars.

Everything about Tesla is interesting. Even founder Elon Musk has been called a real-life Iron Man. In addition to the Model S, Tesla has also introduced its first SUV, and a new, smaller, more affordable model is due out next year. The most interesting thing about all of these cars is the fact that they can drive themselves…or can they?

Have you seen the YouTube videos in which Tesla owners demonstrate the car’s Autopilot? It’s a really neat feature. While on the interstate, Autopilot will steer the car for you and control the throttle. It’s like cruise control that’s been dialed up to 11. But every Tesla employee, from Elon Musk down to the person starting their first day in maintenance, will tell you that it’s not a self-driving feature. It’s a semi-autonomous driving feature. Self-driving cars don’t exist yet. The feature is called Autopilot because it mimics the systems found in planes that perform very similar tasks. Tesla states that there has to be an alert driver behind the wheel whenever Autopilot is engaged. Will there ever be self-driving cars? I think it’s safe to say yes. But that technology does not exist yet.

There are two things we need to discuss in more detail: first, what makes a feature semi-autonomous; and second, how people abuse these features and how that abuse impacts safety.

You may not be aware of this fact, but a lot of cars made today have some sort of semi-autonomous feature. If your car has automatic braking, adaptive cruise control, lane keep assist, hill descent control, or even a self-parking feature, then it has some form of semi-autonomous control. Most of these features use sensors, cameras, radar, or some combination of the three to take control of your car. These features are useful. I recently rented a car with adaptive cruise control, and I am seriously considering it for my next car. But it’s important to note that these features aren’t perfect: they don’t work all of the time, and they can be finicky.

In February of this year, I found out how tricky these systems can be. At the Cleveland Auto Show, I test drove the all-new 2016 Chevy Malibu, which had an entire suite of driver-assist features. The Chevrolet spokesperson who accompanied me on the test drive wanted to test the lane departure warning system. I was hesitant because I didn’t want to look like I was wrecking, but she wanted to see how well it worked. I tried three times to get it to engage, and each time it failed. She insisted that the sensors could read the lines on the road, and that if I departed from my lane, the seat would start vibrating. I really did not need to test this out, but I kept trying. When we eventually gave up and I simply stayed in my lane, the representative informed me that I was causing the feature not to work. She stated that the car could tell I wasn’t actually leaving my lane.

First of all, yes, I was leaving my lane; second, how could it possibly know I was playing a trick on it? My theory is much more believable. If you haven’t experienced winter in Cleveland, then you haven’t known true pain. The weather was awful. It had snowed earlier in the week, and the roads were covered in salt, which happens to be the same color as the white lines marking the lanes of the road we used. I am no expert, but it seems reasonable to think that the car’s sensors can’t make out the lines on the road if they are covered in salt. It turns out I may have been on to something.

No driver-assist or semi-autonomous feature is ever going to be perfect. This was made painfully clear when Tesla announced that a driver had been killed while using Autopilot earlier this year. On May 7, 2016, a Tesla Model S driven by Joshua Brown collided with a tractor-trailer that was crossing the highway. The car was in Autopilot mode. The Tesla didn’t recognize the trailer section of the truck and did not slow down or stop. The car struck the trailer with its windshield, slid underneath it, and continued down the highway until it came to a stop after hitting a fence. This is according to accident reports and an interview given by the driver of the tractor-trailer.

The National Highway Traffic Safety Administration (NHTSA) is currently investigating the accident. However, Tesla has released a statement, and there appear to be two main reasons Autopilot failed.

Tesla is very open about the limitations of Autopilot. It won’t always detect thin objects or objects suspended high overhead. Because the car sits much lower than the trailer it collided with, the sensors did not detect the trailer. Had the car been approaching the trailer’s tires, or the cab of the truck itself, the system would likely have detected the obstacle, and the automatic braking would likely have kicked in.

The second limitation is also pretty important. This accident occurred near sunset, and the car appeared to be driving into the sun. The bright light combined with the white paint of the trailer made the tractor-trailer effectively invisible: the windshield camera could not detect it whatsoever. This appears to be a limitation of almost all semi-autonomous driving aids. Even Subaru, which has the highest-rated accident avoidance system according to the Insurance Institute for Highway Safety, acknowledges limitations in its system. Subaru calls its adaptive cruise control and automatic braking feature EyeSight. It works much like the system found in the Tesla, but it does not steer the car. Subaru is clear about the limits of EyeSight. It even has an added safety feature that will shut EyeSight off if visibility is compromised in any way, and an icon will appear in the instrument cluster showing that it has been disabled.

So, we have established that these autonomous features have limits; they are merely meant to be driving aids. This brings us to our second main point: abuse. A lot of people aren’t using these features as intended, and they are ignoring safety warnings. Tesla advises drivers to stay alert behind the wheel and to keep their eyes on the road. In fact, several warnings are displayed before the features can be engaged. Once again, Autopilot is simply a driving aid. However, videos have emerged online of owners, especially Tesla owners, using these features improperly. One video even shows the owner of a Tesla asleep behind the wheel while Autopilot is engaged. So this does happen, and there is some speculation that Mr. Brown was distracted at the moment of impact. The driver of the semi-truck states that he could hear a Harry Potter movie playing when he approached the accident scene, and the accident report notes that a portable DVD player was found in the wreckage. So it’s possible that Mr. Brown was watching a DVD when his car struck the trailer.

So, who is liable here? We can’t speculate too much while the investigation is under way. Granted, the vehicle does display disclaimers about the system, and Tesla appears to be upfront about its limitations. However, a disclaimer or a warning is not a free pass for a company. If Tesla is found at fault in this case, the claim could proceed as a product liability case. We are not currently aware of any lawsuits, and it is not clear whether Mr. Brown’s family will pursue anything against Tesla. So, a lot will be determined by the NHTSA investigation. Tesla has been very cooperative with the investigation, and it is not hiding from this tragedy. Technology is changing the way we drive. Though this was the first death involving a system like this, it won’t be the last.

The bottom line with any driver-assist feature is to treat it like an assistant. You likely wouldn’t feel comfortable if a doctor let their surgical assistant take over during a procedure and then left the operating room. The same should be true of driving. Never allow yourself to become distracted. Even if you’re just using cruise control, keep your feet near the pedals in case you have to make an emergency stop. If you get a new car, make sure you are fully aware of all of its features, especially if you’re using a feature that is foreign to you.

Have you been involved in an accident in which your driving aid failed you? If so, call us today for a free consultation. Our toll-free number is 1-877-526-3457, or fill out this form and a specialist will call you at a time that works for you.

We Won't Take “NO” for an Answer®

To Schedule an Appointment, Call Us Toll Free at 1.877.873.8208 or Email Us for a Prompt Response.

Jan Dils, Attorneys at Law
