A humorous exploration of a Canadian's life in Australia.

Friday, March 22, 2019

I will never own a self-driving car.

This isn't to say I will never ride in a self-driving car, but rather that I will never own one. Some might argue that self-driving cars won't be viable in my lifetime, or that I'm simply past the point of no return on my way to becoming a grumpy old fart. However, there is a very large elephant in the room when it comes to autonomous vehicles that I'll choose to avoid at all costs: liability.

My poor old Honda Jazz is coming due for an upgrade. Today I could head down to a Volkswagen dealership and buy a new Jetta or Golf... but I won't. These vehicles, and several others, are now being fitted with "active" systems that override steering and braking. The manufacturers downplay the negative perception by calling it a "steering recommendation" and assure us that it can be overridden. To put that in context: how well do you think active but overridable assist systems are going for Boeing right now?

I own a BMW, and the car is packed with sensors, cameras, and passive systems. If it senses I'm leaving my lane without indicating, it rumbles the steering wheel. If it thinks I'm approaching something too fast, it brings up a red car indicator in the heads-up overlay. These are all passive indicators to me, the responsible, liable control system for the vehicle. For as long as I am responsible for the control of the vehicle, or pilots are responsible for the control of an aircraft, systems like this should simply alert us to suspected problems. I can choose to act on the alert, or verify it against other inputs and ignore it. Pilots should not have to frantically flip through manuals to figure out how to turn off a system intent on planting them into the ground because it mistakenly thinks they're going to stall, and they should not be blamed for not "Reading the Fucking Manual". They shouldn't have had to. The malfunctioning sensors should have done nothing but warn the crew, who could have ignored them once they determined they were in error. They could then figure out or remember how to disable the sensor, which is a lot simpler to do when you're not frantically fighting the system for your life.

The elephant roaming around the room with autonomous vehicles is liability. Who is liable when a self-driving car, or even a car fitted with "recommendations", causes an accident? If I leave my lane to safely pass a cyclist, but my car detects that I'm leaving the lane and "recommends" me back, clipping the cyclist, how do I explain that to a court? Do I even want to risk ending up in that situation? Several owners of Toyota vehicles got dragged through the courts and public opinion over cases of "uncontrolled acceleration". It was blamed on the drivers, and then blamed on the floor mats.

I owned a Toyota many, many years ago, as they continued to press for more "drive by wire" technology, and that car suffered from a persistent winter problem with a single chip called the MAP sensor. The chip was apparently accumulating moisture while the car was warm; that moisture would then freeze around it while the car was parked, and later melt and short it out. The car would start, but as soon as you hit the accelerator, it would stall. (Thank goodness it didn't fault the other way and go into an uncontrollable burn-out!) I could start it easily again, and as soon as I touched the pedal, *stall*. In one case it played up while I was stopped at a red light; to get home I literally had to hammer the gas pedal. I had that damned chip replaced three times, and towards the end of my warranty I asked whether I would be expected to be out of pocket if it failed again the next winter. The response was less than encouraging, so I sold the car that spring. (I did run into the new owner a few years later, and thanks to warmer winters, or possibly because they did finally manage to fix that stupid sensor, he hadn't had any issues with it.)

Now, when it comes to the idea of a self-driving car, I will not ride in a vehicle that has the ability to make its own determinations about its surroundings while offering me the ability, or responsibility, to override those determinations. If a car and its autonomous system is going to be trusted to drive the streets, then it will have to do so entirely without me. A "Johnny Cab", but without the joystick for the ridiculous car chases, and preferably without the creepy mannequin as well.


Autonomous cars should not, and cannot, have manual overrides. You have to be able to trust them exactly the same way you trust a cab driver or a bus driver every time you step into one of these vehicles. If you have the ability to override the vehicle, then the situation becomes murky when the vehicle damages something, injures someone, or kills them. Did you fail to override, or did you override and cause the crash? If I'm in a cab, I'm not watching over the driver's shoulder, ready to overpower him and take the wheel, and I don't want that responsibility dumped on me. I don't want to pay for his speeding tickets or any damage he causes, or have my insurance premiums and ratings affected by his mistakes. I want to trust that he is capable of getting me where I want to go reasonably safely, for both me and the people around me.

I'm not skeptical that technology can become as reliable as a human being; I'm merely pointing out that once it can be trusted, there is no reason for me to "own" it. Companies like Uber have the right idea. Once self-driving cars are a reality, individuals won't need to own them, because they will effectively be cabs that can roam around town without needing a toilet break. You should be able to request a vehicle with criteria (such as the number of seats) on demand and expect a nearby vehicle to come and pick you up. People will still have the choice to own one for themselves (like a personal chauffeur), but they will likely have to pay for extremely frequent, routine inspections and servicing, and face those nagging questions if and when their personal "Johnny" misses something or simply "flips out". Something as simple as a splatter of bird shit could cause an autonomous controller to miss a critical piece of data and cause an accident, or prevent the vehicle from leaving until you get your human ass out to clean it up.

"The ideal flight crew is a computer, a pilot and a dog. The computer’s job is to fly the plane. The pilot is there to feed the dog. And the dog’s job is to bite the pilot if he tries to touch anything."

So if you're still keen to own a self-driving car, be prepared to keep that sucker polished, and remember to feed the dog.
