Tesla’s Autopilot system has recently hit the streets in beta form, included in the latest Model S over-the-air update. Autopilot is a semi-autonomous driving function similar to those offered in the Mercedes-Benz S Class and BMW 7 Series, but it isn’t just adaptive cruise control; it can actually steer around corners. Unlike most other autonomous functions, it also doesn’t require hands on the steering wheel, except when the car recognizes a situation it doesn’t know how to handle, at which point it will prompt the driver to retake control. This all sounds well and good, but are we ready to have machines like this driving around on our public roads with other drivers, especially when the software is in beta form?
It depends on whom you ask, honestly. If you were to ask Tesla’s CEO, Elon Musk, he’d tell you that it’s perfectly safe and necessary. If you ask any Tesla fans, they’d tell you that Elon Musk is the automotive messiah and that nothing that comes from his mouth can be anything other than the absolute truth. However, if you were to ask some realists about Autopilot and its beta testing, they’d tell you that it’s not the best of ideas and that we aren’t ready for it just yet.
In a recent interview, BMW’s CEO, Harald Krueger, spoke about how BMW wouldn’t be putting out this sort of technology yet, because it isn’t “100 percent reliable” just yet. Now, neither Krueger, nor I, nor anyone else is claiming that Tesla’s Autopilot system is poorly designed, just that the technology doesn’t yet exist to make fully autonomous driving perfectly safe. Tesla knows this and openly admits it, which is why its current Autopilot system is called a beta: it isn’t finished yet. But that’s the issue Krueger, and much of the motoring world, has with Tesla and its Autopilot.
Beta testing is generally a public test of software and the like. There are beta tests for apps, video games and many other software-based products. The point of beta testing is to give the program a large enough sample size that the developers can more easily find flaws in the system and correct them. The issue with beta testing an autonomous driving function is that if there are any flaws, people can die.
There are several reports of Tesla’s Autopilot system suddenly veering into oncoming traffic for unexplained reasons, or failing to recognize unclear lane markers and nearly changing lanes into a concrete median. So far there have been no recorded accidents, crashes or injuries. So far. And then there’s the fact that the Model S doesn’t require constant interaction with the driver to keep Autopilot on, like the autonomous functions in Audis, Mercedes and BMWs do. In those cars, the system requires the driver to put a hand on the steering wheel every ten seconds or so, to ensure that the driver is awake, alert and in the driver’s seat. Tesla’s Autopilot has no such requirement, which led a Norwegian Model S owner to film a video from the back seat, with no one in the driver’s or front passenger’s seat, while the Model S cruised down a motorway at around 50 mph. If the car had decided to veer off the road, as it has in other instances, while he wasn’t in the driver’s seat, someone could have been severely injured or even killed. Beta testing should not be done by the public if lives could be at stake. It’s inexcusable.
Autonomous driving functions are great and will benefit our automotive infrastructure immensely once they are ready for the public. But until then, they should be tested with very small sample sizes and in controlled environments, similar to what Audi is doing in Boston. As a whole, the public is simply not ready for autonomous driving: the technology isn’t there yet, and neither are we.