
No One Knows How Safe New Driver-Assistance Systems Really Are


This week, a US Department of Transportation report detailed the crashes that advanced driver-assistance systems have been involved in over the past year or so. Tesla’s advanced features, including Autopilot and Full Self-Driving, accounted for 70 percent of the nearly 400 incidents—many more than previously known. But the report may raise more questions about this safety tech than it answers, researchers say, because of blind spots in the data.

The report examined systems that promise to take some of the tedious or dangerous bits out of driving by automatically changing lanes, staying within lane lines, braking before collisions, slowing down before big curves in the road, and, in some cases, operating on highways without driver intervention. The systems include Autopilot, Ford’s BlueCruise, General Motors’ Super Cruise, and Nissan’s ProPilot Assist. While the report does show that these systems aren’t perfect, there’s still plenty to learn about how this new breed of safety features actually works on the road.

That’s largely because automakers have wildly different ways of submitting their crash data to the federal government. Some, like Tesla, BMW, and GM, can pull detailed data from their cars wirelessly after a crash has occurred. That allows them to quickly comply with the government’s 24-hour reporting requirement.

But others, like Toyota and Honda, don’t have these capabilities. Chris Martin, a spokesperson for American Honda, said in a statement that the carmaker’s reports to the DOT are based on “unverified customer statements” about whether their advanced driver-assistance systems were on when the crash occurred. The carmaker can later pull “black box” data from its vehicles, but only with customer permission or at the request of law enforcement, and only with specialized wired equipment.

Of the 426 crash reports detailed in the government report’s data, just 60 percent came through cars’ telematics systems. The other 40 percent were through customer reports and claims—sometimes trickled up through diffuse dealership networks—media reports, and law enforcement. As a result, the report doesn’t allow anyone to make “apples-to-apples” comparisons between safety features, says Bryan Reimer, who studies automation and vehicle safety at MIT’s AgeLab.

Even the data the government does collect isn’t placed in full context. The government, for example, doesn’t know how often a car using an advanced assistance feature crashes per mile driven. The National Highway Traffic Safety Administration, which released the report, warned that some incidents could appear more than once in the data set.

And automakers with high market share and good reporting systems in place—especially Tesla—are likely overrepresented in crash reports simply because they have more cars on the road. It’s important that the NHTSA report doesn’t disincentivize automakers from providing more comprehensive data, says Jennifer Homendy, chair of the federal watchdog National Transportation Safety Board. “The last thing we want is to penalize manufacturers that collect robust safety data,” she said in a statement. “What we do want is data that tells us what safety improvements need to be made.”

Without that transparency, it can be hard for drivers to make sense of, compare, and even use the features that come with their car—and for regulators to keep track of who’s doing what. “As we gather more data, NHTSA will be able to better identify any emerging risks or trends and learn more about how these technologies are performing in the real world,” Steven Cliff, the agency’s administrator, said in a statement.

Outside of the NHTSA, that information is vanishingly hard to come by. Police reports and insurance claims can highlight issues with advanced safety features, says David Kidd, a senior researcher at the nonprofit Insurance Institute for Highway Safety. But accurate police reports depend on law enforcement identifying and understanding many different systems across many different automakers.

And insurance claims can only indicate whether a vehicle involved in a crash was equipped with a safety system—not whether it was on at the time of the crash. Tesla offers some degree of self-reporting, but for years it relied on a statistic that the NHTSA said in 2018 was misleading. The company’s quarterly Autopilot safety reports leave out important context, such as how often cars with the system enabled crash off the highway, and how much safer drivers using the feature are than drivers of other luxury vehicles.

Tesla didn’t respond to a request for comment about the new DOT report. The fear, says Kidd, the IIHS researcher, is that the new safety systems “can produce different types of crashes and potentially new failures that create different types of safety problems.” The DOT, for example, is investigating incidents in which Teslas have crashed into stopped emergency vehicles, killing at least one person and injuring 15.

It is also looking into reports of vehicles on Autopilot suddenly braking without warning and for no apparent reason. Humans “can handle a lot of oddball road situations in stride,” says Kidd. But some car systems “are not flexible enough, not innovative enough, to deal with what’s on the road today.”

Beyond specific tech, safety researchers question whether driver-assistance systems are fundamentally flawed. Carmakers warn that drivers must keep their hands on their steering wheels and eyes on the road even while the systems are engaged, but decades of research suggest that it’s hard for humans to keep paying attention to the task at hand when a machine is doing most of the work. Consumer Reports ranked GM’s Super Cruise and Ford’s BlueCruise as the safest driver-assistance systems because both automakers use in-car cameras to verify that drivers are looking ahead.

A study by Reimer’s team at MIT found that drivers using Autopilot were more likely to look away from the road once the system was on. Reimer sees the DOT report and data set as a call to action. “With automation comes an inherent new level of complexities,” Reimer says. “There are lots of risks and lots of rewards.” The trick will be to minimize those risks—and doing that will require much better data.


From: wired
URL: https://www.wired.com/story/advanced-driver-assistance-system-safety-tesla-autopilot/
