‘Tesla driver dies in first fatal crash while using auto-pilot mode’. This headline, and derivations of it, have been coming in at the speed of light. The reasons it has made such big news are manifold: it’s the first fatal crash involving an auto-pilot car, it’s a huge setback for the self-driving car industry, and it’s devastating news for the first-user geeks and nerds. But within it lies another story.
A story of pushing the limits of technology before it is ready, of trying to be different and defying auto industry conventions just to stay ahead, and, more importantly, of how reckless it is to open unproven tech like this to the public and call it a beta test.
Beta tests are what you do with a piece of software, an unreleased app or maybe even a prototype phone. But a beta feature that took control of a car and ended up taking a life?
Tesla Motors and its very famous founder Elon Musk have always portrayed themselves as unconventional, ahead of the curve and more innovative than any other car maker. And to make sure everyone understood that, an autopilot self-driving feature was introduced in the Tesla Model S some time back. This was at best an ‘in-testing-phase’ feature, and while it worked fairly well, it wasn’t perfect, as the crash has clearly demonstrated.
Pushing limits, extreme aggression to stay ahead of the rest, and showcasing features that may not be ready for use on actual roads – these are becoming the norms of the auto industry in its race to be first out with a self-driving car. A race that may well end in a major crash and burn in the future.
False Sense of Security
Think about it. You buy a car. Right on top of the brochure, as feature number one, it says ‘self-driving auto-pilot mode’. The salesperson gives you a demo, and the car actually does drive on its own: it keeps to its lane and even gently swerves right and left when another vehicle comes close. You are thrilled. This is it; all those futuristic movies, all those articles you read, all those promises about the future self-driving car – they are already here.
You slowly start to use the auto-pilot, grow more confident, and soon are leaving it completely to the car to drive you around while you read a newspaper (yes, lots of videos of people doing that are out), watch a movie (that’s apparently what the driver of the crashed Tesla was doing, watching a Harry Potter movie), work on your laptop or chat on your phone.
This easy seduction by cutting-edge technology, and the false sense of security it establishes, is when things get dangerous. Using cameras and other sensors, the Tesla auto-pilot system allows the car to keep itself in a lane, maintain speed and operate for a limited time without the driver doing the steering.
While more details will emerge, preliminary findings state that a white semi tractor-trailer turned in front of the car, and the cameras and sensors on the Tesla couldn’t distinguish the white trailer from the bright Florida sky. The car drove on and into the trailer – not knowing it was there at all!
There is a second story here: how untested technology that isn’t ready for prime time can become a serious roadblock for future innovation. What will be the fallout of this fatality? New rules, stringent laws, a ban on auto-pilot mode in all cars for a while, licences and regulations? And it’s not as if the technology really failed. If the trailer in front of the car had also had an auto-pilot mode and could sense all the other cars on that highway, it would have been aware of the car behind it and sped up or turned away. This tragedy could have been averted.
The Path Ahead
There is no getting away from the fact that we will all be in auto-pilot, self-driven cars in the future. It’s a predestined path for the automobile industry and for us humans. Every major car and tech company in the world is working towards it, and it’s actually good for us, the environment and the world we live in.
Parking woes, traffic jams, pollution, inefficient transportation and road safety – all of these get solved in a perfect self-driving car ecosystem. But this one crash has opened up a can of worms, and the whole self-driving car industry may suffer a five-year setback. It also brings back the classic question that still doesn’t have a perfect answer. Maybe you can give it a shot.
You’re sitting in a self-driven, completely automated car. Two children, aged 6 and 7, suddenly cross the road right in front of the car, and a crash is imminent in the next second. Your automated car has to take a decision based on the programming it has been fed for a situation like this. Should it drive you to certain death by crashing into a wall on the right, or should it save you and mow down the two children? Think about it and let me know what the right answer is. It is questions like these that need answering before we all take our hands off the steering wheel forever.
From HT Brunch, July 10, 2016