LOS ANGELES: The driver of a Tesla involved in a fatal crash that California highway authorities said may have been operating on Autopilot posted social media videos of himself riding in the vehicle without his hands on the wheel or foot on the pedal.
The May 5 crash in Fontana, a city 50 miles (80 kilometers) east of Los Angeles, is also under investigation by the National Highway Traffic Safety Administration. It is the 29th case involving a Tesla that the federal agency has probed.
In the Fontana crash, a 35-year-old man identified as Steven Michael Hendrickson was killed when his Tesla Model 3 struck an overturned semi on a freeway about 2:30 a.m.
Hendrickson was a member of the Southern California chapter of a Tesla club who posted numerous photos and videos on social media of his white Model 3. One video on his Instagram account showed him riding in the driver’s seat without his hands on the wheel or foot on the pedal as the Tesla navigated freeway traffic. The video included the comment: “Best carpool buddy possible even takes the boring traffic for me.”
A GoFundMe page set up to raise money for his funeral and memorial service says Hendrickson is survived by his wife and two children. A message seeking comment from his wife has not been returned.
“Every time we spoke to him, he would light up talking about his kids and loved his Tesla,” Tesla Club-SoCal posted on Instagram. “He was truly an amazing human being and will be missed!”
Another man was seriously injured when the electric vehicle hit him as he was helping the semi’s driver out of the wreck.
The CHP announced Thursday that its preliminary investigation had determined that the Tesla’s partially automated driving system, called Autopilot, “was engaged” prior to the crash. The agency said it was commenting on the Fontana crash because of the “high level of interest” in Tesla crashes and because it was “an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”
On Friday, however, the agency walked back its previous statement.
“To clarify,” a new CHP statement said, “there has not been a final determination made as to what driving mode the Tesla was in or if it was a contributing factor to the crash.”
At least three people have died in previous U.S. crashes involving Autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it. Tesla is allowing a limited number of owners to test its self-driving system.
Tesla, which has disbanded its public relations department, did not respond Friday to an email seeking comment. The company says in owner’s manuals and on its website that both Autopilot and “Full Self-Driving” are not fully autonomous and that drivers must pay attention and be ready to intervene at any time.
Autopilot at times has had trouble dealing with stationary objects and traffic crossing in front of Teslas.
In two Florida crashes, from 2016 and 2019, cars with Autopilot in use drove beneath crossing tractor-trailers, killing the men driving the Teslas. In a 2018 crash in Mountain View, California, an Apple engineer driving on Autopilot was killed when his Tesla struck a highway barrier.
Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several firetrucks and police vehicles that were stopped on freeways with their flashing emergency lights on.
After the Florida and California fatal crashes, the National Transportation Safety Board recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit use of Autopilot to highways where it can work effectively. Neither Tesla nor the safety agency took action.
In a Feb. 1 letter to the U.S. Department of Transportation, NTSB Chairman Robert Sumwalt urged the department to enact regulations governing driver-assist systems such as Autopilot, as well as testing of autonomous vehicles. NHTSA has relied mainly on voluntary guidelines for the vehicles, taking a hands-off approach so it won’t hinder development of new safety technology.
Sumwalt said that Tesla is using people who have bought the cars to test “Full Self-Driving” software on public roads with limited oversight or reporting requirements.
“Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV (autonomous vehicle) control system’s limitations,” Sumwalt wrote.
He added: “Although Tesla includes a disclaimer that ‘currently enabled features require active driver supervision and do not make the vehicle autonomous,’ NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users.”
NHTSA, which has authority to regulate automated driving systems and seek recalls if necessary, seems to have developed a renewed interest in the systems since President Joe Biden took office.