Tesla’s self-driving technology can be “easily tricked” to turn on when nobody is sitting in the driving seat, an issue that could cause havoc if used on roads, according to a consumers’ group.
Engineers from Consumer Reports found that on a half-mile closed test track, the Tesla Model Y automatically steered along painted lane lines without any warning or indication that there was not a driver in the seat.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” says Jake Fisher, CR’s senior director of auto testing, who conducted the experiment.
“Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.”
Tesla, which has disbanded its press office, did not respond to a request for comment from The Independent before the time of publication.
Tesla’s partially automated Autopilot system can keep a car centred in its lane, keep a distance from cars in front of it, and can even change lanes on its own with a driver’s consent. But Tesla has said the driver must be ready to intervene at all times.
Consumer Reports said that during several trips on its closed tracks with an empty driver’s seat, its Tesla Model Y automatically steered along painted lane lines without acknowledging that nobody was at the controls.
In the Consumer Reports test, Fisher said he engaged Autopilot while the car was in motion on the track, then set the speed dial to zero to stop it. Fisher then affixed a small, weighted chain on the steering wheel to simulate the weight of a driver’s hand. He then slid over into the front passenger seat where he was able to accelerate and decelerate the vehicle.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Fisher said. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
Consumer Reports noted the test was performed on its closed track and that “under no circumstances should anyone try” to duplicate it.
“Let me be clear: Anyone who uses Autopilot on the road without someone in the driver seat is putting themselves and others in imminent danger,” Fisher said.
Consumer Reports’ test came just days after a Tesla crashed in Texas, killing the two men in the car. Authorities say neither of the men was in the driver’s seat at the time of the crash.
The Tesla that crashed outside of Houston over the weekend was a Model S, but it also had an Autopilot function.
The National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) are in the early stages of an investigation into the Texas crash. Local authorities said one man was found in the passenger seat, while another was in the back. The car veered off the road, crashed into a tree and burst into flames, authorities said.
Investigators should be able to determine whether the Tesla’s Autopilot system was in use.
Tesla CEO Elon Musk said on Twitter Monday that data logs “recovered so far” show Autopilot wasn’t turned on in the Texas crash, and “Full Self-Driving” was not purchased for the vehicle. He did not respond to reporters’ questions posted on Twitter.
In the past, NHTSA, which has authority to regulate automakers and seek recalls for defective vehicles, has taken a hands-off approach to regulating partial and fully automated systems for fear of hindering development of promising new features.
But since March, the agency has stepped up inquiries into Tesla, dispatching teams to three crashes. It has investigated 28 Tesla crashes in the past few years, but thus far has relied on voluntary safety compliance from auto and tech companies.
Additional reporting by Associated Press