January 21, 2017

Who’s Driving? Part two.

A few months ago, I wrote about the first fatal accident involving a “driverless” car–Joshua Brown’s Tesla Model S–which crashed underneath the trailer of an 18-wheeler, killing Brown in the process.  (Have a look at the original post or ask Mr. Google for a bit more background.)  Anyway, the National Highway Traffic Safety Administration (NHTSA) released its report today and concluded, in the most clinical of terms:  “The Automatic Emergency Braking (AEB) or Autopilot systems may not function as designed, increasing the risk of crash.”

As simple a statement as that is, it’s a loaded one.  It’s not saying that there was something wrong with Brown’s Tesla or that Tesla made a defective product.  (Tesla–and everyone else who deploys driverless technology–does, after all, counsel users against checking out of the process and encourage a human driver to be ready to take over from the machine, and the NHTSA notes that systems like Tesla’s Autopilot “require the continual and full attention of the driver to monitor the traffic environment.”)  The Office of Defects Evaluation, a division of the NHTSA, set out to find whether some kind of production error, faulty part, or other failure unique to Brown’s vehicle caused the accident that led to his death.  It didn’t find one.  No, the culprit of the piece is the technology itself–or, better stated, over-reliance upon it.  And that’s a real setback for a tech that some are describing as potentially life-saving.

Troglodyte that I am, I’ve had my doubts about self-driving tech since I first heard of it, and the potential legal ramifications are something quite extraordinary.  Even cities as progressive and tech-savvy as San Francisco are resisting the new technology, at least for now.  Today’s NHTSA report is good for Tesla in the short run, but not so much in the long.  It means Tesla is likely not responsible for Joshua Brown’s collision–the company warned against doing precisely what Brown did, after all–but it also means the technology just isn’t ready for prime time, at this point.  To say that’s a setback is an understatement.  Where I see risk for Tesla is in continuing to market a technology that NHTSA says can’t reliably do what it’s designed to do.  Human nature being what it is, we rely on conveniences and technology.  (Who would have thought, ten years ago, that you’d consider a smartphone indispensable?)  Elon Musk’s company is introducing robotic thinking into an environment dominated by humans and nature–one created by the messiness and disorganization that characterize human affairs.  (In parts of Texas, after all, cattle trails and water sources defined where roads would later run.)  Small wonder that it’s not all perfect, yet.  For Tesla, for now, it’s back to the drawing board.