The freight world is full of inconsistencies. Consider this one: FMCSA regulations require a certified, licensed, qualified medical examiner to periodically attest to the health of a Class 7 or 8 commercial vehicle driver, yet many government groups are considering allowing autonomous vehicle manufacturers to self-certify.
Drivers have to follow a lot of regulations to be allowed to drive big rigs. FMCSA rules describe driver health requirements in § 391.41, Physical qualifications for drivers. In the last year there has been a lot of attention to making drivers follow the rules, get properly trained, speak English, file the right forms, and so on. They don’t get to self-certify; in fact, doing so is illegal, as the news has shown over the years when various schemes for bypassing health requirements were exposed and shut down.
So why are some groups thinking it is okay for AVs to be self-certified? Do airline pilots get to self-certify? Ship captains? Railroad engineers? What is special about a computer driving a truck that makes it so different?
The logic seems to be that only the manufacturer really knows what’s going on with its own technology. That technology is proprietary, so independent reviews are not viable. The theory offered is that only the manufacturer can say whether it’s safe.
I guess all things computer-based are then considered fault free. Computers don’t get illnesses. Don’t have sensory challenges. Don’t get old.
Evidence to the contrary exists in the millions of updates our electronic devices are constantly receiving. Further evidence is the very existence of, and need for, over-the-air update capability in vehicles. Still more evidence is the massive number of automotive and truck recalls that are software related.
But hey, it’s not the computer’s fault. Computers, their components and software, are practically perfect, no worries. It must be those pesky error-prone humans in the loop causing all the issues.
Human drivers have to have regular medical examinations by third parties qualified to do them. One of the rules states, the driver “has no mental, nervous, organic, or functional disease or psychiatric disorder likely to interfere with his/her ability to drive a commercial motor vehicle safely.”
Would a computer virus resident in an AV’s millions of lines of code constitute a functional disease if the computer were treated like a human driver? Would faulty software, you know, the kind that has all those recalls and required updates, constitute a mental or organic issue?
Another rule addresses visual acuity. Human drivers have to be able to recognize colors like those of traffic lights and signs. They have to have 20/40 vision in each eye, with or without corrective lenses. So a computer-based driver that can’t see the warning lights on a school bus or a parked safety vehicle might have its license pulled if it were human. But no autonomous car has ever run into a parked emergency vehicle, and none ever drives right by a school bus, right? So it’s not an issue, really.
The FMCSA rule also requires reasonably good hearing, where the driver “first perceives a forced whispered voice in the better ear at not less than 5 feet with or without the use of a hearing aid” or meets hearing test requirements. A few years ago, I attended a TMC meeting where representatives of AV advocacy groups surprised me by saying no AV could hear. At that time, in the rush to automate driving, no one had considered a need to listen to auditory warnings like, oh, I don’t know, maybe sirens? They also really had no game plan for recognizing flashing emergency lights. Like those are ever seen in traffic. I expect the situation has improved since then; technology, like a baby, needs to learn to crawl before it sprints. But these are health questions a human commercial truck driver has been asked, and has had to answer, for decades, so they are not new topics.
Automated vehicles represent significant hope and opportunity for reducing traffic accidents. There is no argument that human commercial truck drivers are involved in accidents. Data is pretty thin, however, on how many autonomous vehicles are involved in accidents, because much of that data is shielded from the public. There are also very few autonomous heavy-duty trucks actually operating commercially without human drivers, so historical statistics are not very helpful in measuring accident risk. We’ll have to wait for greater adoption and use over time to get data as rich as that available for human drivers.
Should manufacturers be required, or even allowed, to self-certify autonomous vehicles? Self-certification is a well-established feature of many regulations. Part of that is due to a push to keep government regulators from having to staff and operate testing facilities and maintain databases. There are clear cost ramifications for the government and for the manufacturers — costs that favor self-certification. Many of the labels stuck inside truck owners’ manuals and at various locations on the truck indicate a manufacturer has made the vehicle in compliance with various regulations, often through self-certification.
Sometimes that self-certification has to prove itself. Some highly visible mistakes in self-certification have been exposed among a variety of vehicle manufacturers, particularly in the automotive world; many were related to emissions rules. A more recent and visible challenge has been with electronic logging devices, and the government has wisely moved to require government-authorized third-party certification for future devices and has been paring down the list of previously self-certified devices.
When government permits self-certification, there generally is a requirement to validate it to the satisfaction of the governing agency: hard test data is submitted to government experts and their hired contractors, who review the procedures and results and decide whether they are acceptable. That review often is not revisited for years or decades, if at all; some regulations from the 1970s likely have not been reevaluated since.
If regulations allow AV manufacturers to self-certify their products, who outside those companies will know whether they really meet requirements? Will there be efforts to periodically review the processes and test data behind self-certification? A decade from now, will those much-improved vehicles still be self-certified on the basis of qualifications government officials accepted ten years earlier?
AVs are great technology. Perhaps manufacturers are the only ones capable of truly knowing their own products’ capabilities. But the real world has a habit of inserting itself and finding issues. Future accident investigations may reveal inadequacies in workmanship, quality, capabilities, and reliability. They may reveal omissions — questions not asked, or not answered. Self-certification also has liability aspects.
In the latest rush to minimize government regulation, if AVs are allowed to self-certify, will we see a push to let human commercial truck drivers self-certify as well?