
NHTSA investigates Tesla that didn’t stop for school bus, struck teen



DETROIT — U.S. road safety regulators have sent a team to investigate a crash involving a Tesla that may have been operating on a partially automated driving system when it struck a student who had just exited a school bus.

The National Highway Traffic Safety Administration said Friday that it will probe the March 15 crash in Halifax County, North Carolina, that injured a 17-year-old student. The State Highway Patrol said the driver of the 2022 Tesla Model Y, a 51-year-old male, failed to stop for the bus, which was displaying all of its activated warning devices.

Sending special investigation teams to crashes means that the agency suspects the Teslas were operating on systems that can handle some aspects of driving, including Autopilot and “Full Self-Driving.” Despite the names, Tesla says these are driver-assist systems and that drivers must be ready to intervene at all times.

A message was left Friday seeking comment from Tesla.

Tillman Mitchell, a student at the Haliwa-Saponi Tribal School in Hollister, had just exited the bus and was walking across the street to his house when he was hit, according to the Highway Patrol.

He was flown to a hospital with life-threatening injuries but was listed in good condition two days after the crash.


Messages left with the North Carolina State Highway Patrol were not immediately returned. A spokesperson for WakeMed hospital in Raleigh did not immediately provide an update on the student’s condition or indicate whether he had been discharged.

NHTSA has sent investigative teams to more than 30 crashes since 2016 in which Teslas suspected of operating on Autopilot or “Full Self-Driving” have struck pedestrians, motorcyclists, semi trailers and parked emergency vehicles. At least 14 people were killed in the crashes.

In March the agency sent a team to a Feb. 18 crash in which a Tesla Model S hit a fire department ladder truck in Contra Costa County, California. The Tesla driver was killed, a passenger was seriously hurt, and four firefighters suffered minor injuries.

Authorities said the California firetruck had its lights on and was parked diagonally on a highway to protect responders to an earlier accident that did not result in injuries.

The probes are part of a larger investigation by NHTSA into multiple instances of Teslas using Autopilot crashing into parked emergency vehicles that are tending to other crashes. NHTSA has become more aggressive in pursuing safety problems with Teslas in the past year, announcing multiple recalls and investigations.

NHTSA is investigating how the Autopilot system detects and responds to emergency vehicles parked on highways.

The agency wouldn’t comment on open investigations.

Tesla and NHTSA need to determine why the vehicles don’t seem to see flashing lights on school buses and emergency vehicles and make sure the problem is fixed, said Michael Brooks, executive director of the nonprofit Center for Auto Safety in Washington.

“I’ve been saying probably for a couple of years now, they need to figure out why these vehicles aren’t recognizing flashing lights for a big starter,” Brooks said. “NHTSA needs to step in and get them to do a recall because that’s a serious safety issue.”

Earlier this month the agency revealed an investigation of steering wheels that can detach from the steering column on as many as 120,000 Model Y SUVs. It’s also investigating seat belts that may not be anchored securely in some Teslas.

NHTSA also has opened investigations during the past three years into Teslas braking suddenly for no reason, suspension problems and other issues.

In February, NHTSA pressured Tesla into recalling nearly 363,000 vehicles with “Full Self-Driving” software because the system can break traffic laws. The problem was to be fixed with an online software update.

The system is being tested on public roads by as many as 400,000 Tesla owners. But NHTSA said in documents that it can take unsafe actions such as traveling straight through an intersection from a turn-only lane, going through a yellow traffic light without proper caution or failing to respond to speed limit changes.

The U.S. Justice Department also has asked Tesla for documents about “Full Self-Driving” and Autopilot.
