The evolving landscape of autonomous and semi-autonomous technology is set to fundamentally reshape the legal framework surrounding truck crash litigation.
The transition from human-operated vehicles to those utilizing Advanced Driver Assistance Systems (ADAS) is complicating the determination of liability. Traditionally, trucking accidents involved straightforward claims against the driver and the motor carrier. However, as software and hardware take a more prominent role in vehicle operation, the legal "ecosystem" is expanding to include a much broader range of defendants.
A shift toward multi-party, complex litigation that incorporates product liability claims alongside traditional negligence could lie ahead. This transition means that original equipment manufacturers (OEMs), as well as software and hardware developers, are increasingly likely to find themselves in the crosshairs of plaintiffs' attorneys.
These emerging legal theories will likely focus on defective design or failure to warn claims, shifting some of the burden away from the driver. While widespread adoption of fully automated commercial vehicles is still in its early stages, recent judicial precedents are already setting the stage for this new era. Specifically, experts point to a landmark Tesla Autopilot case in the Southern District of Florida as a critical bellwether. This case demonstrates that courts are now willing to explore OEM liability when automated technologies are involved, signaling a future where truck crash lawsuits will become significantly more intricate and expert-intensive.
Contents of this video
00:00 10-44 intro; Autonomous Truck Accidents and Liability
00:47 Driverless Trucks and More Complex Litigation
02:47 Lessons from the $200M Tesla Autopilot Case
05:04 Product Liability vs. Traditional Negligence
07:01 Trucking Regulations and Catching Up with Technology
07:57 The "Patchwork" of State vs. Federal Regulations
09:36 Future Outlook: The Next "Big Case" in Trucking
Speaker 1:
Liability in a truck crash is already difficult to pin down, but imagine how it's going to be when you remove the driver from the equation. Hey everybody. Welcome back. I'm Jason Cannon and my co-host is Matt Cole. Truck crashes seem like something that should be pretty straightforward. Something hit the truck or the truck hits something, so let's figure out whose fault it is so we know who to sue. But technology's muddied the water, because when the driver is using ADAS features, the driver isn't fully in control. And in the future, they may not be there at all.
Speaker 2:
As technology takes on a larger role in what happens in the cab and on the road, it could expose more parties to truck crash lawsuits, putting truck and engine OEMs and hardware and software developers in plaintiffs' attorneys' sights right beside the driver and the motor carrier.
Speaker 3:
I think the big theme comes with the caveat that obviously these are very much edge cases right now. In other words, we obviously don't have widespread adoption of these sorts of automated commercial vehicles yet on public roads in a way that would give us a sufficient sample size to know, so I'd like to frame it within that context. But what I would forecast is that we'll have a more multi-party, complex litigation environment. It's going to lead to an increase in the size of the legal ecosystem. What I mean by that is this: where you would traditionally have, as we do now, the quintessential trucking liability case, where the driver can have direct negligence claims levied against them and then the carrier or owner of the vehicle can be brought in as well through vicarious liability, you'll see an expansion of that ecosystem to include the original equipment manufacturers as well.
So all of a sudden you go from pure theories of direct negligence and vicarious liability to product liability claims as well. You're going to have plaintiffs' attorneys exploring theories of liability in a way that wouldn't have been applicable previously in this sort of trucking context, and in transportation generally, but obviously in the trucking sphere specifically. In my view, that could lead to much more commonplace multi-party cases and more complex expert litigation. It won't necessarily increase the volume of cases per se, but it will definitely increase the complexity of each and every one of these cases. So big picture, I think it's going to be a sort of hybrid case, as I'd call it, expanding from just direct or vicarious liability claims to include this product liability aspect as well. And you could see that manifest in failure to warn claims or defective design claims that are levied against the actual equipment manufacturers rather than the driver or the carrier.
I think that's something that I could foresee happening once this becomes more commonly adopted on our roads.
Speaker 1:
Now this idea really isn't all that farfetched and there's important precedent that establishes OEMs can be held liable in a crash involving use of their technologies.
Speaker 3:
We don't have a true case study in the trucking sphere yet, as far as I'm aware, but I think an important bellwether, and something that would be useful for our discussion to cover briefly, is a big case that occurred here where I'm at in South Florida just last year. This was a Tesla Autopilot case, the first real one of its kind, tried in federal court in the Southern District of Florida. Very briefly, the facts of the case involved the driver of a Level 2 Tesla with Autopilot, in standard usage of the vehicle, driving in the Florida Keys headed south toward a T intersection. Based on my understanding of the facts, he was distracted, dropped his phone, and in the process of retrieving it, drove through the T intersection and collided with a vehicle that was resting on the shoulder on the other side of the intersection, leading to one death and very severe injuries for the other individuals in that vehicle.
The reason the case was so significant is that Tesla was brought in as an OEM. The plaintiffs asserted against Tesla the sort of product liability theories we've been discussing. And really, the claims being advanced were that there was an expectation mismatch on the part of the user, in other words the driver, as to what a Level 2 Tesla could or could not do. What the plaintiffs asserted, and what ultimately the jury agreed with to an extent, was that there was a mismatch between what was marketed or claimed to be the capability of these Autopilot tools and what they could actually do, and that over-reliance contributed to the accident in that case. Ultimately the jury apportioned 33% of the fault to Tesla, with the remainder apportioned to the driver. That was a significant apportionment to Tesla, and it was a 200-plus million dollar verdict, with 200 million in punitive damages.
It's being appealed right now, but the trial court judge has already upheld the judgment at least, so it'll go up on appeal. But to the extent we can apply that to the trucking context, that's a perfect example of the OEM being brought into the suit, as we were just discussing, and expanding that landscape.
Speaker 2:
The Tesla crash was a car using level two ADAS, which still requires a lot of human driver input and requires the driver to remain fully engaged and supervise at all times. But what happens when the driver's full attention isn't needed or they aren't even there?
Speaker 3:
I think a very interesting question is on the product liability end. Traditionally, if we apply traditional principles of product liability to products that are non-automated vehicles, such as trucks, for example, the product is static at the point of sale. It's not meant to change anymore. Any product you buy at a Walmart or any other store should remain as it was sold once you utilize it. But here, with a Level 4 truck for example, you have a situation where even after the point of sale, that product is going to be continuously upgraded and new software is going to be adopted. So you have a product that's continuously changing. And when you have issues like that, it becomes a question of, one, if you, as the user, the driver, or the carrier, didn't make sure these software upgrades were installed in a timely manner, are you liable for that?
Or is it a question of notice on the OEM? There are all sorts of issues that arise with these continual changes to the software even after the point of sale, which obviously don't exist currently. Very briefly, the last point I'll make: one of the big theories we have in this field is obviously the dangerous instrumentality doctrine, under which the owner of a vehicle can still be held liable once someone is permissively using it, whether that's a carrier with a truck driver or a parent that lets their kid drive the vehicle. All of a sudden, with a Level 4 vehicle, you have what could ultimately be a question of first impression to an extent: who is a permissive user at that point, when you don't have a driver? Is the OEM now the permissive user? And can the owner be held vicariously liable?
There are all sorts of questions there, doctrinally, with the law that very much need to be figured out as it pertains to the legal principles we currently apply to these sorts of cases.
Speaker 1:
Regulations have a long way to go to catch up with technology, because sophisticated ADAS completely changes the game.
Speaker 3:
Currently, as they're structured, whether it be NHTSA guidelines or the FMCSRs, they obviously assume that there will be a driver in the vehicle. Hours of service, driver qualification, the entire regulations are formatted in a way that assumes there is a human driver in the vehicle. And if that changes, at least in some circumstances, it'll require a wholesale reimagining and reapplication of these guidelines for the circumstances where there is not a driver in the vehicle. You'll have to rethink what hours of service means when you have a Level 4 vehicle on the road, and what the qualifications for that are. As for the carriers, obviously anywhere between Level 2 and Level 3, where the driver maintains control, the key is ensuring there is adequate training on these tools so you don't have a situation like the Tesla case we discussed earlier.
Speaker 2:
Figuring out the rules of the autonomous road has largely been left to the states, which has created a patchwork of rules across most of the US where autonomous technologies are legal to varying degrees.
Speaker 3:
It does not surprise me. At the same time, I think it's a disservice to the industry not to have uniform parameters that the industry can abide by. I think you're limiting the development of certain of these tools and their safe application by confining them to certain states and creating this patchwork of regulations. Having an entirely different system, like you said, in Texas or Oklahoma or California is not conducive, in my opinion at least, to a more effective, safer implementation of these tools. You can take examples from other industries and sectors, such as the legalization of cannabis at the federal level, and how ultimately there was a patchwork of state regulations and it took a very long time for it to be more uniformly regulated federally. And even to this day, with my limited knowledge of that area, it still hasn't been uniformly regulated in a way that would assist the businesses working in that space.
So it doesn't surprise me in the same capacity, but obviously it's doing a disservice to the industry. And as with everything else happening right now with automation and artificial intelligence generally, I think the government is just having a really hard time keeping up with the radical pace of change, and that might be part of the explanation. But ultimately, right now the only real federal guideline we have is NHTSA's standing general order that any Level 2 through Level 4 accidents need to be reported so the agency can document all of those. Beyond that, there really isn't anything else. And I think, again, that can only hurt the industry.
Speaker 1:
So back to the question of who's responsible when an autonomous truck hits something or is hit by something? Well, the answer is we don't know. We're going to need more autonomous crashes to happen and see who sues whom and who wins, and then we can figure all this out.
Speaker 3:
I think the reason I refer to the Tesla case, or if you look at the Waymos, which are now ubiquitous where I am here in Miami and in many other cities, the reason I'm interested in those examples, and obviously it's not trucking, is that those will be the bellwether of what liability looks like moving forward for the trucking industry as well. Those are the tests; that's where we'll see how the legal principles are applied. So that's why I'm keeping an eye on developments in that space, because I think they'll forebode what's to come in trucking, at least. Obviously there are distinctions between a non-commercial vehicle and a commercial truck, but nonetheless I think that's important to watch moving forward. There's also a possibility that regulations are put in place before that hypothetical big accident occurs, and those could be challenged or questioned.
So that could be a potential avenue where the regulations themselves are what give rise to the conversation. But yeah, there is a likelihood that you and I have this conversation in a year's time, or less, or more, because there's been a Tesla-equivalent case, but in the trucking space. That's not unlikely at all.
Speaker 1:
That's it for this week's 10-44. You can read more on ccjdigital.com. While you're there, sign up for our newsletter and stay up to date on the latest in trucking industry news and trends. If you have any questions or feedback, please let us know in the comments below. Don't forget to subscribe and hit the bell for notifications so you can catch us again next week.