A humorous exploration of a Canadian's life in Australia.

Wednesday, April 3, 2019

Let your children forge their own path through the snow.

Life is a field full of snow. When we are young and new to the world, the snow fascinates us. We tromp wildly through it up to our knees, even our waists, cutting new paths through it, digging tunnels, and building things out of it. The deeper it gets, the happier we are.



As we grow older, we accept that there are places in life we need to travel between, and that moving through the snow is tiring and hides things that might cause us to fall or feel pain. So we walk down the same path each day, packing down the snow to make our journey less of a chore and less of an unknown. While we often need, or want, to reach new destinations, we habitually look for paths that other people have already packed down to ease our journey. Before long we have lost sight of the joy that can come from cutting a path into fresh snow; we see only a lot of work and buried hazards. It's safer to stick to the paths.

The trouble starts when we teach our children to stick to the same paths we forged, or have chosen for them, when they just want to set out on their own through the snow. We feel like we're helping them avoid our mistakes, our regrets, but we are denying them the experience of discovering something we didn't even know was out there, because we simply stopped looking. Go to this school, get good grades, become a professional in this field, because I didn't do it, but it's what I believe you need to be happier and more successful than I was.

Growing up, I always doubted faith, feeling that an afterlife wasn't a reward. How could it be? Our lives here are but a brief moment, and we truly don't comprehend what eternity means. Exploring and experiencing something new is the whole point of being anything, and I believe this is why we are born with absolutely no memory of what came before, and an instilled sense of curiosity about our surroundings. I'm sure most of us have dreamed of going back in time to re-live our lives with knowledge of what was to come, to correct past mistakes. But that dream is futile. Nothing would change, because we've already forged the network of paths we are comfortable following. We have a thousand reasons and excuses for why we can't do something or be something other than what and where we are right now. What would we accomplish starting over with that burden, and the burdens of a hundred previous lifetimes? Nothing would change unless you can justify shedding that burden today.

Childhood is life's way of giving each of us a fresh blanket of snow. Look away from the well-worn, safe path you've been walking; the untouched snow is still out there all around you. Let your children explore it for themselves, and if you're still not able to tromp off beside them through that fresh snow, then follow in their footsteps, resisting the urge to give them directions but ready to help them back up if they slip. Marvel at the new places they might lead you.

Friday, March 22, 2019

I will never own a self-driving car.

This isn't to say I will never ride in a self-driving car, but rather that I will never own one. Some might argue that self-driving cars won't be viable in my lifetime, or that I'm just past the point of no return to becoming a grumpy old fart. However, there is a very large elephant in the room when it comes to autonomous vehicles, one I'll choose to avoid at all costs: liability.

My poor old Honda Jazz is coming due for an upgrade. Today I could head down to a Volkswagen dealership and buy a new Jetta or Golf... but I won't. These vehicles, and several others, are now being fitted with "active" systems that override steering and braking. The makers downplay the negative perception by calling it a "steering recommendation" and assure us that it can be overridden. To put that in context: how well do you think the reaction to active but overridable assist systems is going for Boeing right now?

I own a BMW, and the car is packed with sensors, cameras, and passive systems. If it senses I'm leaving my lane without indicating, it rumbles the steering wheel. If it thinks I'm approaching something too fast, it flashes a red car indicator in the heads-up display. These are all passive alerts to me, the responsible, liable control system for the vehicle. For as long as I am responsible for the control of the vehicle, or pilots are responsible for the control of an aircraft, systems like this should simply alert us to suspected problems. I can choose to act on the alert, or verify it against other inputs and ignore it. Pilots should not have to frantically flip through manuals to figure out how to turn off a system intent on planting them into the ground because it mistakenly thinks they're going to stall, and they should not be blamed for not "Reading the Fucking Manual". They shouldn't have had to. The malfunctioning sensors should have done nothing but warn the crew, who could then ignore them once they determined the readings were in error. Figuring out, or remembering, how to disable a faulty sensor is a lot simpler when you're not frantically fighting the system for your life.

The elephant roaming around the room with autonomous vehicles is liability. Who is liable when a self-driving car, or even a car fitted with "recommendations", causes an accident? If I leave my lane to safely pass a cyclist, but my car detects I'm leaving the lane and "recommends" me back, clipping the cyclist, how do I explain that to a court? Do I even want to risk ending up in that situation? Several owners of Toyota vehicles were dragged through the courts and public opinion over incidents of "uncontrolled acceleration". It was blamed on the drivers, and then blamed on the floor mats.

I owned a Toyota many, many years ago, as they continued to press for more "drive-by-wire" technology, and that car suffered from a persistent winter problem with a single component: the MAP sensor. The sensor apparently accumulated moisture while the car was warm; that moisture would freeze around it while the car was parked, then melt and short it out. The car would start, but as soon as you hit the accelerator, it would stall. (Thank goodness it didn't fault the other way and go into an uncontrollable burn-out!) I could start it again easily, and as soon as I touched the pedal, *stall*. In one case it played up while I was stopped at a red light; to get home I literally had to hammer the gas pedal. I had that damned sensor replaced three times, and towards the end of my warranty I asked whether, if it failed again next winter, I would be expected to be out of pocket. The response was less than encouraging, so I sold the car that spring. (I did run into the new owner a few years later, and thanks to the warmer winters, or possibly because they finally managed to fix that stupid sensor, he hadn't had any issues with it.)

Now, when it comes to the idea of a self-driving car, I will not ride in a vehicle that can make its own determinations about its surroundings while offering me the ability, or the responsibility, to override those determinations. If a car and its autonomous system are going to be trusted to drive the streets, then it will have to do so entirely without me. A "Johnny Cab", but without the joystick for the ridiculous car chases, and preferably without the creepy mannequin as well.


Autonomous cars should not, and cannot, have manual overrides. You have to be able to trust them exactly the way you trust a cab driver or a bus driver every time you step into one of these vehicles. If you have the ability to override the vehicle, then the situation becomes murky when the vehicle damages something, injures someone, or kills them. Why didn't you override? Or did you override, and were you responsible for the crash? If I'm in a cab, I'm not watching over the driver's shoulder, ready to overpower him and take the wheel, and I don't want that responsibility dumped on me. I don't want to pay his speeding tickets, or for any damage he causes, or have my insurance premiums affected by his mistakes. I want to trust that he is capable of getting me where I want to go reasonably safely, for both me and the people around us.

I'm not skeptical that technology can be as reliable as a human being; I'm merely pointing out that once it can be trusted, there is no reason for me to "own" it. Companies like Uber have the right idea. Once self-driving cars are a reality, individuals won't need to own them, because they will effectively be cabs that can roam around town without needing a toilet break. You should be able to request a vehicle with criteria (such as the number of seats) on demand and expect a nearby vehicle to come and pick you up. People will still have the choice to own one for themselves (like a personal chauffeur), but they will likely have to pay for frequent, routine inspections and servicing, and face those nagging questions if and when their personal "Johnny" misses something or simply "flips out". Something as simple as a splatter of bird shit could cause an autonomous controller to miss a critical piece of data and cause an accident, or prevent the vehicle from leaving until you get your human ass out and clean it up.

"The ideal flight crew is a computer, a pilot and a dog. The computer’s job is to fly the plane. The pilot is there to feed the dog. And the dog’s job is to bite the pilot if he tries to touch anything."

So if you're still keen to own a self driving car, be prepared to keep that sucker polished, and remember to feed the dog.

About Me

I live in sunny Brisbane, working around the city and generally trying not to make too much of a nuisance of myself.