The story appears on Page C1, July 18, 2016


The promise and angst of driving with no hands

The recent US crash of a highly automated Tesla, whose driver became the first known casualty of a new auto technology still in the testing phase, is a stark reminder of the ultimate price of being a guinea pig and the ultimate conundrum carmakers face when peddling the concept of self-driving cars to the public.

In the auto industry’s bold new experiment with artificial intelligence, boundaries are still being tested even as they are pushed to the extreme. The lines are easily blurred, as the recent fatal accident showed. While the Tesla’s camera failed to distinguish the white side of a turning tractor-trailer from the brightly lit sky and thus failed to activate automatic braking, the hands-off driver was suspected of watching a movie on a laptop.

The tragedy of the ensuing collision hurt but did not stop the undaunted march toward the merging of automotive and robotic technologies. The industry concept of Vision Zero envisions motoring with no casualties, no injuries and no accidents, with cars running purely on programmed rationality, free from human distraction, fatigue and emotional instability.

Earlier this month, sporty car maker BMW, chipmaker Intel and camera supplier Mobileye teamed up to establish an open autonomous-driving platform for the industry to align technologies, from door locks to data centers.

At the regulatory level, Japan and several European countries were recently reported to be working on common standards for autonomous driving in an effort to form a joint competitive front against the US, which is currently ahead of them and home to high-profile pioneers like Tesla.

California-based Tesla is one of the few carmakers aggressively deploying autonomous driving technologies in cars for sale. As such, the company chose to soft-pedal the fatal accident. It noted that the fatality was the first in 130 million miles driven since Tesla’s autopilot function was introduced seven months ago. That rate is lower than the one death every 94 million miles among all vehicles in the US, and the one every 60 million miles globally.

That statistical methodology has its critics, who note that Tesla’s sample size is much smaller and therefore perhaps less reliable.
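For readers who want to compare the figures directly, the numbers quoted above can be restated on a common per-100-million-mile footing. This is a minimal arithmetic sketch using only the mileage figures in the article; the variable names are ours:

```python
# Restating the article's cited figures as deaths per 100 million miles.
# These are the article's numbers, not official statistics.
tesla_miles_per_death = 130e6    # one fatality in ~130 million Autopilot miles
us_miles_per_death = 94e6        # US average: one death per 94 million miles
global_miles_per_death = 60e6    # global average: one death per 60 million miles

def deaths_per_100m_miles(miles_per_death: float) -> float:
    """Convert 'miles per death' into 'deaths per 100 million miles'."""
    return 1e8 / miles_per_death

print(f"Tesla Autopilot: {deaths_per_100m_miles(tesla_miles_per_death):.2f}")   # 0.77
print(f"US average:      {deaths_per_100m_miles(us_miles_per_death):.2f}")      # 1.06
print(f"Global average:  {deaths_per_100m_miles(global_miles_per_death):.2f}")  # 1.67
```

The critics’ sample-size point still stands: a single fatality makes the Tesla figure far noisier than the population-wide averages it is being compared against.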

Then, too, it’s anyone’s guess whether the driver who died in the Tesla crash could have prevented the accident if he had been aware of what was happening and taken back the controls. Would there have been enough time?

The Internet has no shortage of videos showing Tesla drivers performing “stunts,” such as taking a nap, crawling into the backseat or demonstrating “hands-free” driving. The driver killed in the accident was known to film his own autopilot adventures. Tesla salesmen in Shanghai once admitted proudly to Shanghai Daily that they felt so assured of the autopilot function that they dared play with their mobile phones while driving cars between Tesla stores.

Be it mischief or pure lunacy, this is not the sort of image Tesla wants to encourage. It has repeatedly stated that the autopilot function is not a substitute for the human driver, but merely an assistant. That message tends to get blurred amid the background noise of hype and hope.

All the pitches about hassle-free, hands-off journeys of the future, coupled with presentations of fancy concept cars, have raised expectations and confusion about the capabilities of the emerging technology.

Technically speaking, Tesla is still at Level 2 of the industry’s autonomous driving roadmap, relying on an advanced driver assistant system to perform conditional autonomous functions, like pacing a car at a safe distance from the car in front and auto-steering that reads clear lane markings.

To be fair, Tesla has taken a huge step forward from Level 1, which is a bundle of stand-alone features like electronic stability control and pre-charged braking. But it is still far from Level 3, which covers “eyes-off” motoring.

“In fact, all the systems on market-ready cars now are at most Level 2,” said Frank Jourdan, president of the Chassis and Safety Division at Continental, a major supplier of autonomous driving technologies. “Consumer education is very important. It can be difficult with advanced driver assistant systems. Everything is going so fast.”

Continental will start localizing the production of sensors for such systems next year in China, helping carmakers pursue more cost-effective autonomous driving solutions. Compared with the future vision of the driverless car, a car with an advanced driver assistant system is a real temptation in making mundane driving more relaxing and allowing motorists a bit of daydreaming time.

Snapping back to reality, there is still a long way to go to Level 4 autonomous driving (“mind-off”) and the ultimate Level 5 (“driver-off”). By then, there will be no more taking back of controls.
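The roadmap sketched over the past few paragraphs can be summarized in a short enumeration. The labels below are our shorthand for the article’s descriptions, not the formal SAE J3016 wording:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Industry autonomous-driving roadmap as described in the article."""
    L1_ASSISTED = 1     # stand-alone aids: electronic stability control, brake pre-charging
    L2_PARTIAL = 2      # combined driver-assistant functions (Tesla's current level)
    L3_EYES_OFF = 3     # "eyes-off" motoring in defined conditions
    L4_MIND_OFF = 4     # "mind-off": no takeover expected within the operating domain
    L5_DRIVER_OFF = 5   # "driver-off": no more taking back of controls

def driver_must_supervise(level: AutonomyLevel) -> bool:
    """At Level 2 and below, the human remains the responsible driver."""
    return level <= AutonomyLevel.L2_PARTIAL
```

Under this framing, every “hands-free” stunt described earlier happens at a level where the function above still returns True.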

BMW, on its open platform, is calling for shared solutions for Levels 3 to 5. Olaf Kastner, president of BMW China, said the company has completed 8,000 kilometers of highly automated test-driving in China, in enclosed areas as well as on public roads.

Earlier this month in the southwestern city of Chengdu, BMW became the first premium carmaker to demonstrate three key functions, including auto-lane changing, as a preview of the next level of advanced driver assistant systems.

The company is working on a car called BMW iNEXT, which is supposed to lay the foundation for series production of fully automated driving systems by 2021. This car will target not only open highway motoring but also complicated urban environments.

Cost remains a principal factor in development.

At the current stage, there are two strategies for dealing with that. One is bottom-up evolution, by which hardware and functions are added step-by-step to maintain reasonable costs and sell semi-automated cars first. The other is top-down evolution, which makes a car fully wired regardless of the cost, mainly for experimental purposes, and hopes profit will follow one day when full autonomous driving comes to commercial fruition.

The former strategy has been adopted by Tesla and other major carmakers, while the latter has come to be known as the “Google style.”

But can consumers be assured that the industry is keeping its feet on the ground while its head is in the clouds?

Sometimes, that takes a stretch of confidence.

The highly anticipated public beta versions of automated cars may contain bugs that cannot be fixed, because current software is not powerful enough to substitute for hardware that carmakers cannot afford to deploy.

A supercar must be a fusion of gadgets, each with its own specialty. Radar, strong at measuring velocity and able to work in darkness, enables adaptive cruise control at long range and parking assistance at short range.

Cameras specialize in object classification and lateral resolution, which means measuring the space between objects. Ultrasonic sensors can pick out obstacles and pedestrians in the vicinity of a car. The Tesla that crashed had all of these functions but couldn’t recognize a turning tractor-trailer with 18 wheels.

Mobileye, the supplier of Tesla’s front camera, released a statement trying to explain the accident.

“Today’s collision avoidance technology, or automatic emergency braking, is defined as rear-end collision avoidance, and is designed specifically for that,” the statement said. “This incident involved a laterally crossing vehicle, which current-generation systems are not designed to actuate upon.”
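Mobileye’s explanation can be made concrete with a small sketch. The data structure and fields below are hypothetical simplifications, meant only to show why a rear-end-only definition never fires on a laterally crossing, unclassified vehicle:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    # Hypothetical, simplified detection record for illustration only.
    classified: bool      # did the camera classify the object as a vehicle?
    in_lane_ahead: bool   # is it directly ahead in the ego lane?
    closing_speed: float  # m/s; positive means the gap ahead is shrinking

def rear_end_aeb_should_brake(d: Detection) -> bool:
    """Automatic emergency braking as Mobileye describes it: defined for
    rear-end scenarios only, i.e. a classified vehicle ahead that the car
    is closing on longitudinally."""
    return d.classified and d.in_lane_ahead and d.closing_speed > 0

# A tractor-trailer crossing laterally, unclassified against a bright sky,
# fails the first condition: no braking decision is ever reached.
crossing_trailer = Detection(classified=False, in_lane_ahead=True, closing_speed=12.0)
print(rear_end_aeb_should_brake(crossing_trailer))  # False
```

The point of the sketch is that the system behaved as specified; the specification simply did not cover this scenario.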

The company said it will introduce a lateral-turn-across-path detection function to its system in 2018. It is not clear which hardware it will use. The industry’s agreed-upon best insurance for precise sensing is to add LiDAR, a light detection and ranging system that can perform 360-degree scanning and 3D mapping of surroundings under any light conditions.

Ford has equipped its fleets of autonomous driving test cars with a light detection and ranging system to navigate in the snow and in the dark. Google has a similar system in the form of a scanner bolted on its self-driving robot. But it isn’t being used on any mass-produced cars yet because the technology has yet to be commercialized.

A highly accurate light detection and ranging system could cost US$70,000, almost the price of a new Tesla.

And that’s not even the best one in the pipeline. Solid-state light detection and ranging systems, which integrate separate mechanical parts into a single microchip, are said to be simpler and more reliable. That is the new frontier of 3D scanning, and auto suppliers have been rushing to embrace it since the latter part of 2015. In pursuit of low-cost solutions, Continental bought a business from ASC, Valeo partnered with LeddarTech and Delphi joined forces with Quanergy.


Copyright © 1999- Shanghai Daily. All rights reserved.