It was still dark on a Friday morning in November when a California Highway Patrol officer trailed a Tesla Model S down Route 101 between San Francisco and Palo Alto. The gray sedan was traveling at about 70 miles per hour, its turn signal blinking, as it passed exit after exit. The officer pulled up alongside and saw the driver slumped in his seat, head drooping. Lights and sirens didn't wake him. The car, the officer suspected, was driving itself, under the control of Tesla's Autopilot feature.
Every Tesla is equipped with hardware that, according to the automaker, will eventually enable its vehicles to drive themselves on entire journeys, from parking lot to parking lot, without any input from the driver. For now, the company limits its cars to a system that can take them from on-ramp to exit on the highway. The system was smart enough to keep the Tesla safe, even with a seemingly incapacitated driver, but not yet smart enough to understand and yield to police sirens.
This appears to be the first time police have stopped a vehicle traveling on an open road under the control of an automated system. There was no way for the officers to commandeer the driving software, so they improvised a way to exploit Tesla's safety programming. One patrol car blocked traffic from behind while the officer pulled in front of the Tesla and gradually slowed down until both cars came to a stop.
The incident encapsulates both the high hopes and the deep anxieties surrounding the driverless future. The Tesla's driver, a 45-year-old man from Los Altos, failed a field sobriety test, according to police, and has been charged with driving under the influence; a hearing is scheduled for May. The car, which appears to have sailed about 10 miles down the nighttime highway without human help, may have protected the drunk driver, and everyone around him, from harm. But neither Tesla nor the police are ready for people to rely on the technology this way.
Tesla's disclaimers tell drivers to remain "alert and active" while using Autopilot and to be ready to take control when, say, the police approach. If the car doesn't detect hands on the steering wheel, it's supposed to slow to a stop and turn on its hazard lights. Two days after the incident, Tesla Inc. Chief Executive Officer Elon Musk tweeted that he was "looking into what happened here." A company spokesman pointed to Musk's tweet and declined to share anything the company had learned from the car's data log.
"My guess as to when we would think it's safe for somebody to fall asleep and wake up at their destination: probably toward the end of next year," Musk said on the podcast For Your Innovation, in an episode released that Monday.
The officers who stopped the Tesla that night had never used the maneuver before. It isn't part of their training. They simply happened to know enough about Teslas to improvise a response. "That's a huge shift," says Lieutenant Saul Jaeger of the police department in nearby Mountain View. Such familiarity is perhaps to be expected in the heart of Silicon Valley (the car was stopped roughly halfway between the headquarters of Facebook and Google), but relying on the quick thinking of law enforcement is not a scalable plan.
Robots can't take over the roads until carmakers, engineers, regulators, and police work through a series of thorny problems: How can an officer take control of an autonomous car? What should robot drivers do after a collision? How do you program a vehicle to recognize human authority?
Five years ago, a roboticist at the Massachusetts Institute of Technology named John Leonard began making dashcam videos of his drives around Boston. He was looking to catalog moments that would be difficult for artificial intelligence to navigate. One evening he watched a police officer step into a busy intersection to block oncoming traffic and wave pedestrians across against the light. He added it to his list.
Of all the challenges facing self-driving technology, examples like this are among the most vexing, and they're a big part of the reason truly driverless cars will arrive "slower than many in the industry have predicted," Leonard says. He should know: Leonard took leave from MIT in 2016 to join the Toyota Research Institute and help lead the automaker's AV efforts.
Waymo LLC, the autonomous driving startup spun out of Google's parent company and now carrying passengers in the Phoenix area, has already encountered almost exactly the scenario Leonard worried about. In January, a sensor-laden Chrysler Pacifica minivan in Waymo's automated ride-hailing fleet rolled up to a darkened traffic signal in Tempe, Arizona. The power was out, and a police officer was in the road directing traffic. In dashcam footage, shown alongside a view of what Waymo's computer vision perceived, the minivan stops at the intersection, waits as cross traffic and a left-turning car go the other way, and then proceeds when waved through by the officer.
A Waymo spokeswoman, Alexis Georgeson, says the company's fleet can distinguish between civilians and police officers in the road and can follow hand gestures. "They will yield and respond based on the recognition that it's a police officer," she says. "Our cars perform very well in construction zones and respond to uniformed officers."
Waymo has opted for a territorial approach to autonomous vehicles, focusing on fleets that can serve as taxis within restricted areas rather than on full, go-anywhere autonomy, a still-unreached milestone known in the industry as Level 5. Working within a limited area makes it easier both to build detailed maps and to coordinate with government and law enforcement. Instead of trying to operate across many jurisdictions, Waymo chose Chandler, a Phoenix suburb with wide avenues, sunny weather, and hospitable state and local authorities, as its first living laboratory. Many of its competitors are operating similarly, focusing on fleets that would stay within set boundaries. Ford Motor Co. is testing in Miami and Washington, D.C.; General Motors Co.'s Cruise, Zoox Inc., and Toyota Motor Corp. are among the dozens of companies testing autonomous cars on roads in California.
In the summer of 2017, about a year and a half before its ride service debuted in Chandler, Waymo invited local police, firefighters, and ambulance personnel to a test day in which fire trucks and patrol cars, sirens roaring and lights flashing, approached the driverless minivans from every angle on a closed track. "We've had a lot of interaction with their staff on the research and development of their technology," says Chandler spokesperson Matt Burdick.
Last year, Waymo became the first AV maker to publish a law enforcement interaction protocol. If one of its self-driving cars detects flashing police lights behind it, the document says, "it is designed to pull over and stop when it finds a safe place to do so."
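The pull-over behavior the protocol describes can be pictured as a small state machine: keep driving, start looking for a safe spot once police lights are detected, then stop. The sketch below is purely illustrative, built on assumed perception inputs (`police_lights_behind`, `safe_spot_here`); it is not Waymo's actual software.

```python
# Illustrative state machine for a pull-over behavior like the one the
# published protocol describes. States and inputs are assumptions made
# for this sketch, not Waymo's implementation.
from enum import Enum, auto

class State(Enum):
    DRIVING = auto()
    SEEKING_STOP = auto()   # police detected; searching for a safe place
    STOPPED = auto()

def step(state: State, police_lights_behind: bool, safe_spot_here: bool) -> State:
    """Advance the behavior by one decision tick."""
    if state is State.DRIVING and police_lights_behind:
        return State.SEEKING_STOP
    if state is State.SEEKING_STOP and safe_spot_here:
        return State.STOPPED
    return state

# A short drive: lights appear, and the car keeps moving until it finds
# a safe shoulder, then stops.
state = State.DRIVING
for lights, safe in [(False, False), (True, False), (True, False), (True, True)]:
    state = step(state, lights, safe)
print(state)  # State.STOPPED
```

The key design point the protocol implies is that detection alone doesn't trigger an immediate stop; the car continues until a safe stopping place is found.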
Jeff West, a battalion chief with Chandler's fire department, says the Waymo vehicles he has seen on the road yield faster than many human-driven cars. "Once it recognizes us, it pulls right over," he says, "while a person might be listening to the radio or have the air conditioner turned up."
For now, though, most Waymo taxis have a safety driver at the wheel to take over in any situation that might stump the car. There have been no run-ins between local police and a fully driverless, human-free car, Burdick says. When that day comes, says Matthew Schwall, Waymo's head of field safety, police can reach the company's support team by calling a 24-hour hotline or by pressing the minivan's help button, mounted above the second row of seats. Waymo's remote staff can't take direct control of the vehicle, but they can reroute it, if, say, the police want it moved to the side of the road after a collision.
Michigan state trooper Ken Monroe took Ford engineers for a ride around Flint last summer. The engineers were especially curious about what he wanted drivers to do when he came up behind them with lights flashing, and how those responses differed depending on whether he was pulling a car over or just trying to get past it.
"While we were responding to an emergency, they'd say, 'OK, you're approaching this vehicle here. What's the best-case scenario for what that vehicle should do?'" They talked at length, Monroe says, about how an autonomous vehicle could recognize that it was the one being pulled over. "The biggest cue we came up with was the amount of time the police vehicle was behind the AV."
In addition to its testing in Miami and Washington, Ford has been working with Michigan police for nearly two years as it prepares for the rollout of autonomous ride-hailing and delivery services planned for 2021. Early on, several dozen Michigan State Police troopers came to the company's offices in Dearborn to talk through its plans. "We emphasized that this won't be a personally owned vehicle," says Colm Boran, head of autonomous vehicle systems at Ford. "That immediately helped ease some of their concerns."
Teaching autonomous cars to pull over is a relatively straightforward task. The whole point of lights and sirens is to be noticeable from far away. "If it's salient to a human, it's probably salient to a machine," says Zachary Doerzaph, director of the Center for Advanced Automotive Research at the Virginia Tech Transportation Institute. The bigger challenges come when police and other first responders are outside their vehicles: "It's all those other cases, the last 10 percent of the development that could take most of the time." Doerzaph's team is studying such scenarios for a group of automakers, but he can't discuss his findings yet.
The common jargon for these atypical moments is "edge cases," but the term understates the scale of the challenge, Doerzaph says. At any given moment there are thousands of construction zones, crash scenes, and traffic-directing officers across the country. The signals people use to recognize them are subtle and varied. Humans also understand basic hand signals and, perhaps most important for police, can acknowledge instructions with eye contact or a nod.
Because these subtle interactions are so hard to replicate, self-driving researchers may need to create entirely new ways for cars and police to communicate. In theory, when Trooper Monroe leaves his patrol car on a Michigan freeway, he could instruct all the AVs in the area to steer clear with a few taps on a handheld device. Such solutions, though technologically elegant, present a range of logistical and legal hurdles.
Inrix Inc., a Washington-based startup specializing in digital traffic and parking information, has begun offering software that lets cities enter traffic rules and roadway markers into the high-definition maps used by AV developers. City officials can mark the locations of stop signs, crosswalks, bike lanes, and so on, and when an AV's navigation software plots a route, it receives the rules and restrictions for its journey. Boston, Las Vegas, Austin, and four other cities currently use the service, called AV Road Rules.
The maps can be updated continuously. If roadwork closes a lane, a city can flag the change. Inrix is also working to let police update the map directly from their cars. "That's something we've heard there's interest in, and we're investigating how to turn this hypothetical functionality into a real tool," says Avery Ash, head of autonomous mobility at Inrix.
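A service like this amounts to a shared, city-maintained rule layer keyed to map segments, which an AV's planner queries along its route. The sketch below is a minimal illustration of that idea under assumed names and data structures; nothing here reflects Inrix's actual AV Road Rules API.

```python
# Hypothetical sketch of a city-maintained road-rules layer for AVs.
# All class and field names are illustrative assumptions, not Inrix's API.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RoadRule:
    segment_id: str          # map segment the rule applies to
    rule_type: str           # e.g. "speed_limit", "construction", "bike_lane"
    value: Optional[str] = None

@dataclass
class CityRuleMap:
    rules: dict = field(default_factory=dict)  # segment_id -> [RoadRule]

    def publish(self, rule: RoadRule) -> None:
        """A city official (or, eventually, an officer in the field)
        adds a rule for a map segment."""
        self.rules.setdefault(rule.segment_id, []).append(rule)

    def rules_for_route(self, segment_ids: list) -> list:
        """An AV planner fetches every rule along its planned route."""
        return [r for sid in segment_ids for r in self.rules.get(sid, [])]

# A city marks a speed limit and a lane closure; an AV plotting a route
# through those segments receives both rules up front.
city = CityRuleMap()
city.publish(RoadRule("main-st-001", "speed_limit", "25"))
city.publish(RoadRule("main-st-002", "construction", "left_lane_closed"))

route = ["main-st-001", "main-st-002", "main-st-003"]
for rule in city.rules_for_route(route):
    print(rule.segment_id, rule.rule_type, rule.value)
```

The appeal of the design is that updates flow one way, from the city into the map, so an officer flagging a lane closure would not need to communicate with each passing vehicle individually.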
Even after the AV industry solves everyday traffic disruptions, accident scenes, and roadwork, a long list of true "edge cases" awaits. "What if it's a terrorism suspect? What if somebody ordered up a car, threw in a backpack, told the car where to go, and then it blows up?" asks Lieutenant Jaeger in Mountain View, who has worked with Waymo engineers since the company was Google's self-driving car project.
The good news for the industry is that cities, police, and carmakers are all motivated to find answers, because they all agree the status quo is unacceptable. More than 37,000 people die in car crashes every year, and the vast majority of those crashes are caused by human error. Police officers are among the most frequent witnesses to this carnage, and sometimes among its victims. Cars that could detect their sirens from miles away and reliably follow the rules would be a welcome change.
"The human driver is just not predictable," says Monroe, the state trooper. "It's very, very difficult."
Schwall, of Waymo, says that when he runs training sessions with police, demonstrating how the company's vans work and letting officers ride inside, he often hears the same question: they want to know when they can have a self-driving police car.
Copyright 2019 Bloomberg.