When Robots Take the Wheel

Jeff Gottlieb | March 23, 2018

In a world of driverless cars, what role will humans play? The answers may lie inside the cockpit of a 747.

For the past 25 years, Najmedin Meshkati has taught a course for pilots and aviation professionals called Human Factors in Aviation Safety at USC’s 66-year-old, world-renowned Aviation Safety and Security Program. Meshkati, a professor of civil and environmental engineering and industrial and systems engineering at the USC Viterbi School of Engineering, is one of many experts who say commercial aviation, where most of the actual flying is automated, offers a cautionary tale on the road to driverless cars.

When the technology functions properly, everything is fine, Meshkati said.

But when the unexpected occurs, or something happens that the robots aren’t ready for, a human pilot must take the controls, and he or she may not be prepared. “Yes, the aviation industry has benefited from automation significantly,” Meshkati said, “but at the same time this automation brought some other risks that need to be taken into account.”

One day, self-driving or autonomous cars may move from the experimental to the everyday, much as cell phones did. Cars already come with options such as assisted parking, adaptive cruise control, automatic braking, collision warning, and lane-keeping assist systems, a long way from the days when those tasks were entirely the driver’s business. The popular vision is of the self-driving car as a living room on wheels, where you can read, sleep, work or watch a movie without paying attention to the road.

But how realistic is that? Will humans really be able to sit back and enjoy a stress-free ride?

More than 30,000 people die in car crashes in the U.S. each year, and human error causes 94 percent of those crashes, according to the National Transportation Safety Board. Remove the humans, the theory goes, and the number of crashes drops dramatically.

Although you might find it hard to believe the next time someone cuts you off, humans are surprisingly safe behind the wheel. Steven Shladover, program manager for mobility at UC Berkeley’s Partners for Advanced Transportation Technology program, has calculated that there is one fatal crash for every 3.3 million hours of driving in the U.S., the equivalent of someone driving 24 hours a day for 375 years.
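
A quick back-of-the-envelope check bears out that equivalence: 3.3 million hours ÷ (24 hours/day × 365 days/year) works out to about 377 years of nonstop driving, consistent with Shladover’s roughly 375-year figure.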

“That sets a very high bar that the new systems have to meet just to be acceptable,” Shladover said. “It’s a big job to get it to what is equal to today.”

As a result, many experts believe it will take many years before fully autonomous cars hit the street. “It’s a crawl, walk, run adventure to be sure,” said Christopher Hart, former chairman of the National Transportation Safety Board, who last November gave a talk about the challenges and opportunities of autonomous vehicles in Meshkati’s ISE 370 class. “I’m not saying we can’t get there, but we’re a long way from getting there. The reality is, what if automation fails? We learned over 100 years ago not to say that this ship can’t sink.”

Shladover is so insistent about the importance of humans in the operation of automobiles that he believes, “We should ban the word driverless. It’s going to be a very long time before the driver goes away.”

One problem that has cropped up with automated airplanes, Meshkati said, is “de-skilling,” where pilots spend so little time at the controls that their skills atrophy. He said that younger pilots who learn to fly on automated aircraft may not even acquire the skills needed in emergencies.

Added the NTSB’s Hart: “That problem gets worse as automation gets more reliable.”

Meshkati pointed to Air France Flight 447, traveling from Rio de Janeiro to Paris, which crashed in June 2009, killing all 228 people on board. Because of bad weather and a complex sequence of events, the autopilot disconnected, sending the plane into a stall that the pilots were unable to control. “When something goes wrong and the whole system fails, at the end of the day, it is the human operators who are the last level of defense,” Meshkati said.

Air France (Courtesy/Wikimedia)

In a research article he published with Stephanie Chow and Stephen Yortsos, two of his former USC Viterbi students, Meshkati examined Asiana Flight 214, whose fuselage hit a seawall near the end of the runway at San Francisco International Airport in July 2013, injuring 187 people and killing three. They concluded that the captain’s lack of understanding of the automation system contributed to the crash.

Meshkati and the NTSB’s Hart point to Capt. Sully Sullenberger’s famous landing of US Airways Flight 1549 in the Hudson River, when both engines were taken out by birds, as an example of a pilot saving the day. All 155 people aboard survived.

A big difference between airplane automation and self-driving cars emerges when the pilot or driver has to take control. A plane usually flies in open sky with no other aircraft nearby. If the pilot needs to take over, he or she may have thousands of feet of altitude and more than a few seconds to respond, with room for several attempts to maneuver out of danger.

US Airways Flight 1549 rescue efforts (Courtesy/Wikimedia)

But the driver of a car, Meshkati said, may have only a split second and a single chance to avoid a crash in “tightly coupled” traffic on a busy street or freeway. Even if the manufacturer insists the driver must stay focused on the road while the automatic system is in control, after a couple of days with no problems, most drivers will probably become complacent and stop paying attention.

In 2012, Google employees tested the company’s self-driving cars on the freeway portion of their commutes. They were told to pay attention to the road and that they would be filmed during their drives.

“We saw some silly behavior, including someone who turned around and searched the back seat for his laptop to charge his phone — while traveling 65 mph down the freeway!” Google wrote in a blog post.

A study by the Virginia Tech Transportation Institute found that it took drivers five to eight seconds to retake control of their cars; another, by the National Highway Traffic Safety Administration, determined it could take as long as 17 seconds, enough time for a car to travel more than a quarter of a mile.
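
A rough check supports that quarter-mile figure, assuming a typical freeway speed of 65 mph (the speed in the Google anecdote above): 65 mph is about 95 feet per second, so 17 seconds covers roughly 95 × 17 ≈ 1,620 feet, a bit more than 0.3 miles.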

Google was not alone in wondering how long it would take drivers to get oriented once they had to take control. What if the person was sleeping? “Do you have enough understanding of what’s going on around the vehicle to make the right decision?” Google wrote.

Meshkati also worries about the tech-company practice of releasing products and then relying on customers to find the bugs, combined with automakers’ endless recalls of everything from air bags to faulty ignition switches to problems with gas tanks. “This mentality for this technology is very serious and dangerous,” he said.

Meshkati, who said he “treasured” his recent first experience in a self-driving car, insists he isn’t a Luddite, just cautious. He thinks the driverless car is a good fit for stop-and-go freeway traffic, when cars are traveling 8 mph and any accident is of “low consequence.” But when speeds pick up, he said, an accident endangers the well-being of “innocent” people in surrounding vehicles and bystanders, not just those in the self-driving car.

Driverless car (Courtesy/Wikimedia)

“Why should we put our lives and our loved ones at the risk of imperfect technology, this fast?” Meshkati asked. “It seems that we, the innocent defenseless people, are being subjected to an emerging vicious process: a technology that is being rushed to the marketplace by a global network of overzealous software, sensor, and car manufacturers all trying to cash in on this frenzy while being monitored by a patchwork of unprepared, feckless, or captured regulators.

“This is a serious public policy issue,” added Meshkati, who expressed such concerns in a short piece in The New York Times in the wake of a controversial Tesla Model S crash that killed its driver in July 2016. “We need to make sure we’ve really tested the technology before unleashing it on the public. It’s the job of government agencies to scrutinize it, and they need to be chronically skeptical, have a chronic unease when it comes to this new safety-sensitive technology, and be over-vigilant.”
