
Former Head Of Tesla AI Explains Why They’ve Removed Sensors; Others Differ

Transportation

By Brad Templeton, Senior Contributor. Opinions expressed by Forbes Contributors are their own. I cover robocar technology & previously worked on Google's car team.

Oct 31, 2022, 08:00am EDT

[Photo: Andrej Karpathy, former director of AI at Tesla. Michael Macor/The San Francisco Chronicle via Getty Images]

In a recent interview, Andrej Karpathy, who was formerly head of AI for Tesla's Autopilot and FSD products, outlined the reasoning behind removing both the radar and the ultrasonics from Tesla cars, as well as never using LIDAR or maps. While Elon Musk is best known for making statements on this, Karpathy was his go-to guy for backing up that reasoning. Karpathy raised eyebrows, however, when earlier this year he took a sabbatical from the job and eventually announced he would leave it.

Karpathy's main points:

- Extra sensors add cost to the system and, more importantly, complexity. They make the software task harder and increase the cost of all the data pipelines. They add risk and complexity to the supply chain and manufacturing.
- Elon Musk pushes a philosophy of "the best part is no part," which can be seen throughout the car in things like doing everything through the touchscreen. Dropping sensors is an expression of that philosophy.
- Vision is necessary to the task (which almost all agree on) and should be sufficient as well. If it is sufficient, the cost of extra sensors and tools outweighs their benefit.
- Sensors change as parts change or become available and unavailable. They must be maintained, and the software adapted to those changes. They must also be calibrated to make fusion work properly.
- Having a fleet gathering more data is more important than having more sensors.
- Having to process LIDAR and radar produces a lot of bloat in the code and data pipelines. He predicts other companies will also drop these sensors in time.
- Mapping the world and keeping the maps up to date is much too expensive. You won't change the world with that limitation; you need to focus on vision, which is the most important sensor anyway. The roads are designed to be interpreted with vision.

Complexity of sensor fusion

In my recent interview with Jesse Levinson, CEO and co-founder of Zoox, I asked him the same question. While he concurred that more sensors definitely mean more work and more noise, he argued those problems are not intractable and are worth the benefit.

He believes that if you are smart and do your sensor fusion right, you can ensure that new sensor data and contradictory data are not a downside. While every input has noise, if you are good you can pull the true signal from it and win, as the simple sketch below illustrates.
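Levinson's claim can be made concrete with the simplest possible case of fusion. The Python sketch below is a toy, not Zoox's (or anyone's) actual pipeline: the sensor names, noise figures and gate threshold are invented for illustration. It fuses two noisy measurements of the same range by inverse-variance weighting (the static form of a Kalman update), with a consistency gate so that a wildly contradictory reading falls back to the more trusted sensor rather than corrupting the estimate.

```python
import math
import random

def fuse(z1, var1, z2, var2, gate=3.0):
    """Inverse-variance-weighted fusion of two independent noisy
    measurements of the same quantity (the static 1-D Kalman update).
    Returns (estimate, variance)."""
    # Consistency gate: if the sensors disagree by far more than their
    # combined noise allows, treat the pair as contradictory and fall
    # back to the more trusted (lower-variance) sensor.
    if abs(z1 - z2) > gate * math.sqrt(var1 + var2):
        return (z1, var1) if var1 <= var2 else (z2, var2)
    w1, w2 = 1.0 / var1, 1.0 / var2
    return (w1 * z1 + w2 * z2) / (w1 + w2), 1.0 / (w1 + w2)

# Demo: a camera-like sensor (noisy) and a lidar-like sensor (precise)
# both measure the range to the same obstacle, 50 m away.
random.seed(0)
true_range = 50.0
cam_var, lidar_var = 4.0, 0.25  # assumed measurement variances, m^2
sq_err = {"camera": 0.0, "lidar": 0.0, "fused": 0.0}
n = 10_000
for _ in range(n):
    z_cam = random.gauss(true_range, math.sqrt(cam_var))
    z_lid = random.gauss(true_range, math.sqrt(lidar_var))
    est, _ = fuse(z_cam, cam_var, z_lid, lidar_var)
    sq_err["camera"] += (z_cam - true_range) ** 2
    sq_err["lidar"] += (z_lid - true_range) ** 2
    sq_err["fused"] += (est - true_range) ** 2
for name, sq in sq_err.items():
    print(f"{name:6s} RMS error: {math.sqrt(sq / n):.3f} m")
```

Run it and the fused estimate's RMS error comes out below that of either sensor alone, which is the sense in which, done right, an extra sensor and even contradictory data are not a downside.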

In general, other teams will not necessarily disagree with too many of Karpathy's points. Having multiple sensors and fusion does add lots of complexity and cost. Many will even agree that some day down the road, vision may be sufficient and those other sensors can be dropped. However, all (including probably Karpathy and Musk) would agree that vision is not sufficient today.

Further, the others would say that it's not at all clear when vision will be sufficient. Karpathy and many others make the point that humans drive primarily with vision, so it's clearly possible, but the reality is that computers don't have nearly the power human brains bring to this task. Very few technologies work just like human minds: that birds fly by flapping their wings does not mean aircraft designers follow that route.

It is more common to make use of the different, in some cases superhuman, capabilities of machines to make up for their lack of brain power. Tesla's approach would be quite rare in the AI world: deliberately constraining a system to just the abilities of human sensors, and hoping to match the human brain well enough to work with those constrained sensors.

Cost as the driver, or time to market?

This difference of view stems in part from the fact that Tesla is a carmaker, and further from its goal to make its system work on already-shipped cars, or at worst with a minor retrofit of them.

(This retrofit is already underway: owners of older cars have seen one upgrade of the main processor, with a second one pending, as well as, in some cases, a replacement of the cameras, with a new camera system rumored.) Automakers are very, very cost-conscious. Everything they add to a vehicle adds 2 to 5 times its cost to the list price of the vehicle.

Everything they can take out adds to their bottom line. The philosophy of removing parts makes sense here and has done well for Tesla, though many drivers complain the company has gone a bit overboard in some cases. But the case is less clear when the part being removed is one the system doesn't yet function without.

After Tesla removed radar support, it downgraded a number of features in Tesla Autopilot, and even a year later Autopilot has not returned to the speeds it was once capable of. Many Tesla owners complain that the radar-free system has much more frequent "phantom braking" events, where the vehicle brakes, sometimes hard, for obstacles that are not there or are not a problem. Tesla's new cars shipping without ultrasonics have lost almost all the functions the ultrasonics provided, such as parking assist, auto-parking, Summon and more.

Tesla says these features will return in the near future. Most self-driving teams believe that the shortest path to deployable self-driving uses LIDAR, radar and, in some cases, other sensors. They view it as the shortest and safest path, not the easiest and lowest-cost one.

Since they are not selling vehicles, those constraints are not priorities for them. Jesse Levinson of Zoox says that because their custom robotaxi will see very heavy use and charge good fees, the extra cost of special sensors is not the barrier it would be on a car sold to consumers. But while cost is a factor, speed of development is the biggest one.

LIDAR does fully reliable detection of a large class of obstacles today, at a level of reliability one can bet one's life on. Cameras don't, and while they probably will some day, that date is unknown to both Tesla and the other teams. The date at which LIDARs will reach low cost is much better understood.

That question of when affects the software complexity. Today, it is more work to get cameras to deliver the needed reliability — so much more work that nobody can yet do it. That it might allow a simpler system in the future is not considered by most teams today.

The leading teams are all investing billions of dollars, and accept the cost of the additional complexity. A theoretically simpler solution that doesn’t yet work is not simpler than a more complex but operational one. Naturally, it should be noted that none of the other self-driving teams have a production deployment, though several have pilot projects operating in complex cities with no safety driver aboard.

Earlier, I published a series of articles and videos on what the big remaining challenges teams see, and by and large, getting reliable perception is not one of the big blockers for the LIDAR and map-using teams. The challenge is instead more in the immense detail work required to be confident the vehicle can handle all the unusual cases, particularly cases never before seen.

Mapping and fleet

The question of the virtue of maps is another one where Tesla/Karpathy and other teams differ.

While Karpathy hoped to make a car that could fully understand the road and where it needed to go without a detailed map, such a car is also a car that can remember what it learned and use that to build a map to help the next version of that car travel that road. Ironically, Karpathy's own statement about the tremendous value of a large fleet applies well here: if one has a large fleet, it is possible to build complete, detailed maps of the whole world, and to keep them fresh, and it's foolish to discard the useful information learned by that fleet. These issues were discussed in more detail in my article and video on Tesla's mapping decisions.
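The mechanism behind that irony is simple enough to sketch. The toy Python below is not any company's real mapping pipeline, and the tile ids, attributes and staleness window are invented assumptions; it only shows the principle that many cars submitting noisy observations can yield one map that stays fresh, with stale tiles pruned when no car has confirmed them recently.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Observation:
    tile_id: str        # hypothetical road-segment key (e.g., a geohash)
    lane_count: int     # lanes this car believed it saw
    speed_limit: float  # posted limit this car read, in m/s
    timestamp: float    # seconds since epoch

@dataclass
class TileRecord:
    lane_votes: dict = field(default_factory=lambda: defaultdict(int))
    speed_sum: float = 0.0
    n: int = 0
    last_seen: float = 0.0

def build_map(observations, max_age=90 * 86400.0, now=0.0):
    """Aggregate fleet observations into a map: majority-vote the lane
    count, average the speed limit, and drop tiles no car has confirmed
    within max_age (which is how the fleet keeps the map fresh)."""
    tiles = defaultdict(TileRecord)
    for ob in observations:
        rec = tiles[ob.tile_id]
        rec.lane_votes[ob.lane_count] += 1
        rec.speed_sum += ob.speed_limit
        rec.n += 1
        rec.last_seen = max(rec.last_seen, ob.timestamp)
    return {
        tid: {
            "lanes": max(rec.lane_votes, key=rec.lane_votes.get),
            "speed_limit": rec.speed_sum / rec.n,
            "last_seen": rec.last_seen,
        }
        for tid, rec in tiles.items()
        if now - rec.last_seen <= max_age  # prune stale tiles
    }

now = 1_700_000_000.0
obs = [
    Observation("tile_9q8yy", 3, 29.1, now),
    Observation("tile_9q8yy", 3, 29.0, now - 3600),
    Observation("tile_9q8yy", 2, 29.2, now - 7200),  # one car miscounted
]
print(build_map(obs, now=now))  # lanes vote resolves to 3
```

Per-car errors wash out in the vote, and the timestamps mean freshness comes essentially for free from fleet scale, which is exactly the asset Karpathy argued a big fleet provides.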

The path to the future

Karpathy is correct that at some point a breakthrough will probably come that allows computer vision to perform the driving task with high safety. Most other teams don't disagree with him on that. He may be right in his prediction that they will eventually get rid of their LIDARs to reduce cost. But they think they will do that after they are in production, after they have taken the lead in the robotaxi business while Tesla is still just doing driver assist.

They might be wrong — that breakthrough might come earlier, in which case Tesla will see great success. But they don’t think it’s the way to bet. It’s also the case that as time progresses and all the tools get better, the extra sensors may not cost that much more, or add that much more complexity.

LIDAR, radar and thermal cameras provide superhuman sensing. They can detect things that cameras can't. Even should this advantage dwindle, it won't drop to zero; the debate will be over whether their cost is justified.

But when it comes to digital technology, that cost is historically known to plummet. The immense complexity of a modern mobile phone would stagger the mind of anybody from not that long ago, and its cost would shock them even more. People who have bet on technology staying expensive have rarely won the technology race.

Tesla is actually a great example of a company that won by betting that technologies would get better and cheaper. Karpathy’s view of that future is hard to discern. His position at Tesla was a highly coveted and lucrative one within his field.

For somebody who believes in Tesla's path, it is a particularly important place to be to change the world. However, he didn't leave Tesla to start another project, at least as far as public announcements are concerned. His departure suggests (though doesn't guarantee) that he had problems of some sort, possibly with the project or with his boss, who is notoriously difficult to work for.

Possibly it was something else, or something personal, of course; this is only speculation. What is true is that the bet Tesla has made on these principles is a big one, with a big payoff or a big risk of falling behind. Fortunately for Tesla, though, it has so much in the way of resources that even if its internal research fails, it can afford to switch directions.

In fact, had it wanted, it could probably have purchased Argo AI last week, but Argo's assets don't fit with Tesla's current plan. Perhaps if the plan changes, another player will be available for acquisition.

Follow me on Twitter or LinkedIn. Check out my website.


From: forbes
URL: https://www.forbes.com/sites/bradtempleton/2022/10/31/former-head-of-tesla-ai-explains-why-theyve-removed-sensors-others-differ/
