Drivers in West Virginia who are following the developments in self-driving vehicle technology should know that many automakers are hiring individuals to be in-vehicle fallback test drivers (IFTDs) for AI-driven cars. What these people do sounds simple and perhaps even enviable: They sit for hours in a car that drives itself and intervene only if the car seems headed for danger. Yet the job itself is fraught with danger.
It starts with the fact that many firms, believing that being an IFTD takes little skill, hire people indiscriminately. These drivers, perhaps unfamiliar with AI technology, may come to overestimate it, especially during long stretches when nothing goes wrong. This overconfidence, combined with the boredom of sitting and watching one's surroundings, can cause IFTDs to fail to react in time when something does go wrong.
Another problem is that AI development teams may make coding changes to the on-board driving system without telling the IFTD. Because the vehicle's behavior can then change without warning, the IFTD may become under-reactive or over-reactive.
For these and other reasons, the Automated Vehicle Safety Consortium has released a document of best practices for IFTD selection and training. Under these guidelines, IFTDs should have proven driving skills, undergo a pre-trip readiness procedure and be briefed on acceptable practices in the car. That means, among other things, no phone use or other distracting activities.
Self-driving cars still have a long way to go. One may recall how a self-driving Uber vehicle struck and killed a pedestrian in Arizona in 2018. It is still unclear what the IFTD was or wasn't doing at the time of that crash. If an IFTD drives negligently, though, he or she may be held liable for a crash. Victims, for their part, may want a lawyer to determine how they can go about establishing fault and filing a claim.