Self-driving cars could have a long road to travel, analysts warn
A technician sits in an Uber self-driving car (Angelo Merendino / AFP)
Published Thursday, September 7, 2017 9:57AM EDT
Industry trackers and analysts caution that technical challenges, along with legal and liability issues, will be speed bumps on the road to self-driving cars becoming commonplace.
The U.S. House of Representatives on Wednesday approved legislation aimed at clearing the path for introduction of self-driving vehicles by requiring consistent regulations across the 50 states.
The Self Drive Act "will help pave the way for self-driving cars nationwide and ensures America stays a global leader in innovation," said a tweet from Representative Greg Walden, who chairs the House panel that drafted the bill.
The bill, which needs Senate approval before being sent to the White House, would prevent states from imposing regulations on autonomous vehicles that would make it more difficult for manufacturers to deploy self-driving cars nationwide.
At the end of June, the Group of Seven (G7) countries expressed a commitment to remove regulatory obstacles and smooth the way for self-driving vehicles.
Auto behemoths are racing to get 'autonomous vehicles' in gear, and their competition includes Silicon Valley innovators such as Apple, Google, Tesla, and Uber.
Major car makers have promised to have self-driving models coming off assembly lines as early as 2020.
On the road
Even computer chip giant Intel announced plans for a fleet of self-driving cars, breaking the news after closing a $15 billion deal to buy Israeli autonomous technology firm Mobileye.
Waymo, a self-driving car company owned by Google-parent Alphabet, is testing self-driving cars with volunteers in Arizona.
In California alone, about forty companies have permits from the state to test cars without drivers on the roads. New York is open to similar testing with an eye toward reducing accidents.
Meanwhile, Tesla boasts that all of its models are equipped with sensors, cameras, and other technology to enable them to navigate routes without human involvement.
Car makers already put aspects of the technology to work with features such as self-parking and automatic braking to avoid collisions.
It is estimated that more than 90 percent of driving accidents result from human error, and advocates of autonomous vehicles argue they will save lives and avert injuries.
Tesla founder and chief Elon Musk has publicly contended that the foundation is laid for cars to navigate completely on their own twice as safely as vehicles controlled by humans.
Despite advances in sensors, software and machine smarts, some argue for companies and authorities to throttle back expectations.
"It's time to face some challenging realities when it comes to the world of autonomous cars," TECHnalysis Research analyst Bob O'Donnell said in a post at techpinions.com.
"For those predicting radical changes in how consumer-purchased cars and trucks are built, bought, and used over the next few years, it's time to stop the charade."
He listed concerns including security, design complexity, legal questions, and a lack of infrastructure for electric cars, suggesting a long timeline until reliable self-driving vehicles merge into the mainstream.
Electric cars are seen as leading the charge in self-driving, and those models are a scant portion of vehicle sales.
Regulations in countries around the world would have to catch up to, understand, and adapt to self-driving technology.
Insurance companies, and probably the courts, will need to work out who gets the blame when accidents happen.
According to U.S. press reports, more than a dozen engineers and Tesla executives working on autonomous capabilities in cars have internally expressed worries about whether the technology is safe enough to be out in the wild.
Musk has held firm that the company's cars need only a green light from regulators to start driving themselves.
Even if the technology left to its own devices proved trustworthy, some worry about the potential for hackers to remotely take control of vehicles in scenes seemingly fit for futuristic action films.
Tesla last year deployed a security patch for the Model S after Chinese researchers claimed to have hacked into one through a wireless connection.
A Tesla model with autopilot was involved in a fatal accident in the U.S. last year, and while the technology was cleared of culpability, several opinion polls indicate many people remain reluctant to take their hands from steering wheels.
The question also arises of what kind of ethics will be programmed into car-controlling software. For example, what should a self-driving car do if forced to choose between saving its occupant or a pedestrian?