Tesla Autopilot Controversy Continues After Software Update

I thought I had written my last CleanTechnica article for 2023, but when I logged on to the Washington Post, I found a story by Geoffrey Fowler with this headline — “Testing Tesla’s Autopilot recall, I don’t feel much safer and neither should you.” What recall is he talking about? Three weeks ago, Tesla said it would update the software for its Autopilot system in some 2 million cars in the US to fix some issues that NHTSA claimed were unsafe. Right away, Tesla aficionados were upset that the headline of our story used the word “recall.”

They insisted anything that could be addressed by an over-the-air update was not a recall. They complained that using that word in the title was a cheap “clickbait” trick designed to suck people into reading the story. When I posted a link to the form Tesla filed with NHTSA that clearly has the word “Recall” prominently displayed in its title, they exploded in rage.

They were last seen muttering darkly and plotting their revenge. A few days later, many of those same Tesla stalwarts erupted in a new outpouring of venom when the Washington Post dared to ask why Autopilot can be activated in situations where Tesla says it should not be used. Many comments suggested Jeff Bezos called down to the Post newsroom to demand an attack on Tesla, the theory being that Bezos and Musk don’t like each other, so naturally Bezos is ordering up hit pieces on his rival.

Well, OK, if you want to believe that codswallop, go ahead. Have yourself a pity party. It’s OK, you can’t help yourself.

We understand. The recall or over-the-air update — whatever you want to call it — is now complete. So Fowler took his Tesla Model Y — one of the best-selling cars in the world — out for a spin to see if the software update solved the issues the prior Post story complained about.

The subtitle to his article pretty much says it all — “On the streets of San Francisco, the updated version of Tesla’s driver-assistance software still took the wheel in places it wasn’t designed to handle, including blowing through stop signs.” Here’s more from his report: After testing my Tesla update, I don’t feel much safer — and neither should you, knowing that this technology is on the same roads you use.

During my drive, the updated Tesla steered itself on urban San Francisco streets Autopilot wasn’t designed for. (I was careful to let the tech do its thing only when my hands were hovering by the wheel and I was paying attention.) The recall was supposed to force drivers to pay more attention while using Autopilot by sensing hands on the steering wheel and checking for eyes on the road.

Yet my car drove through the city with my hands off the wheel for stretches of a minute or more. I could even activate Autopilot after I placed a sticker over the car’s interior camera used to track my attention. The underlying issue is that while a government investigation prompted the recall, Tesla got to drive what went into the software update — and it appears not to want to alienate some customers by imposing new limits on its tech.

It’s a warning about how unprepared we are for an era where vehicles can seem a lot more like smartphones, but are still 4,000-pound speed machines that require a different level of scrutiny and transparency. Fowler said that after the update, the warnings the software posts on the touchscreen are in a larger font but his car still allowed him to activate Autosteer on roads where the owner’s manual warns it should not be used. Before the update, he could drive without his hands on the wheel for 75 seconds on one secondary road.

After the update, that time was reduced to 60 seconds. The car also clunked over speed bumps without slowing and blew through stop signs without stopping, even though the signs were clearly displayed on his car’s touchscreen. After reviewing the owner’s manual, Fowler determined that drivers need to purchase the Full Self-Driving software suite if they want their cars to obey stop signs.

And that’s the problem. Tesla assumes a level of understanding about what features are included in which software package that may be beyond the skill level of many. Confusion is not something that promotes safe driving.

“More worrisome,” Fowler says, “is how the recall handled my car’s interior camera. It’s used along with pressure on the steering wheel to check whether the driver is paying attention and not looking at their phone. When I covered the lens with a smiley-face sticker — a trick I read about on social media from other Tesla owners — the car would still activate Autosteer.

The system did send more warnings about keeping my hands on the wheel while the camera was covered. But I don’t understand why Tesla would allow you to activate Autosteer at all when the camera is either malfunctioning or being monkeyed with.” That seems like a legitimate question.

Fowler made his concerns known to NHTSA. Veronica Morales, the agency’s director of communications, said the “investigation remains open” and that NHTSA will “continue to examine the performance of recalled vehicles.” She declined to comment on the specifics of Fowler’s experience, but she said that the law, known as the Vehicle Safety Act, “puts the burden on the manufacturer” to develop safety fixes.

“NHTSA does not preapprove remedies,” she said. Instead, “the agency will monitor field and other data to determine its adequacy, including field monitoring of the effects of the remedy in addressing the safety problem and testing any software or hardware changes in recalled vehicles.” She added that the agency has several Tesla vehicles that it will use for testing at its Vehicle Research and Test Center in Ohio.

“Consumers should never attempt to create their own vehicle test scenarios, or use real people or public roadways to test the performance of vehicle technology,” Morales chided. “Intentional unsafe use of a vehicle is dangerous and may be in violation of State and local laws.” Yet Fowler says, “Every Tesla driver who is using Autopilot with the update is testing the performance of the technology while we wait for NHTSA to do its own.

It’s hard to see how post-release review serves public safety in an era where software, and especially driver assistance capabilities, introduces very new kinds of risk. Compare a current Tesla to your phone. Apps are subjected to pre-release review by Apple and Google before they’re made available to download.

They must meet transparency requirements. Why should a car get less scrutiny than a phone?” Fowler asks a valid question, yet the partisan nattering between Tesla defenders and Tesla antagonists continues hot and heavy. The Fowler article has gotten more than 2,500 comments as of this moment, so it’s fair to say people have strong opinions on this subject.

Our own take around the CleanTechnica FroYo counter is that Autopilot has always been a potentially misleading term that is writing checks it cannot cash. Ultimately it is not the opinion of one multi-billionaire who is a computer genius that counts. It is the understanding of people of ordinary intelligence who may be gullible or simply confused about what their Tesla automobiles can and cannot do.

An additional consideration is whether other drivers are even aware they are participating in some beta software trial without their consent. This is ground that has been plowed many times and we don’t expect there to be any agreement among readers on this subject. We will simply say the final chapter of this story has yet to be written.

So make your voice heard, and Happy New Year!


From: cleantechnica
URL: https://cleantechnica.com/2023/12/31/tesla-autopilot-controversy-continues-after-software-update/
