Tesla has released a new software update activating its vision-based park assist feature, and videos are starting to roll in showing it in action.
In October of last year, Tesla abruptly decided to stop including ultrasonic sensors on Model 3 and Model Y vehicles. These ultrasonic sensors were used for short-range object detection, particularly during low-speed maneuvers like parking, to help drivers know how far they are from objects outside the car.
Tesla said at the time that it planned to move to a fully vision-based parking system, using the cameras already arrayed around its cars to estimate distances and provide park assist functions without the added complexity of separate ultrasonic sensors.
Since then, these vehicles have been delivered without the sensors and, consequently, without any driver aids to help with parking. For these cars, Park Assist, Autopark, Summon, and Smart Summon would remain unavailable until a software update arrived to enable them.
Now, just under six months later, these software efforts have finally borne fruit as Tesla has started rolling out vision-based park assist in its 2023.6.9 update. It should be available on cars now or soon, so check for software updates if you’ve been waiting for this feature.
The update notes state:
Tesla Vision Park Assist provides visual and audio alerts of surrounding objects. This feature uses the occupancy network to predict high-definition outlines of objects 360 degrees around the car.
Note: Tesla Vision Park Assist is for guidance purposes only and is not a substitute for an aware driver. Please be attentive and avoid obstacles as required.
The update does not appear to activate Autopark, Summon, or Smart Summon; it merely brings back the lost functionality of showing drivers how far they are from surrounding objects while parking.
Videos have started to surface on social media showing drivers testing out the new functions in their garages and driveways, and results so far seem… a little inconsistent.
It seems to work reasonably well in some situations, showing graphics roughly similar to those on vehicles with sensors, but with the added benefit of detecting objects all around the vehicle instead of just in front of or behind it. One driver found the measurements to be quite accurate in a well-lit and straightforward parking lot:
The lines are quite wiggly, though, significantly more so than they are when using ultrasonics.
In other situations, the system still seems like it needs work. Here, a driver pulls between two cars and toward a trash can, before the system deactivates and states “park assist unavailable” when he gets close enough to actually need it. Then he gets out to compare the car’s 26-inch approximation with reality and, eyeballing the distance, thinks it’s closer to “three and a half, four feet”:
And here, another driver tries to use it with a bike rack attached to the rear of his Tesla, and the system continually detects the rack as an obstruction, repeatedly telling him to stop even though there’s plenty of room behind the car:
Electrek’s Take
Well, it’s clear that the system still needs some work, which, frankly, is not unexpected given Tesla’s history with similar transitions.
A couple of years ago, the company abruptly removed radar from its cars, moving to a fully camera-based system for its driver-assist features (a decision it’s now reversing). At the time, this led to temporary limitations for new owners of radar-less cars, who had to wait for software updates to restore those features.
The same has happened here with ultrasonics, which caught several customers by surprise. Tesla has sold a lot of cars in the last six months, and I know of at least one buyer who hadn’t heard the news about the missing ultrasonic sensors and was quite annoyed to realize he had just bought a vehicle without a relatively standard modern feature that he had expected his brand-new, high-tech $53,000 car to have.
Tesla owners have gotten used to similar things happening, and often give the company slack because actions like these are balanced out by the benefit of over-the-air updates, which improve cars and add features over time.
But this is such a basic and expected feature on modern vehicles, and the sensors have been estimated to cost about $114 per car. That’s a meaningful cost at Tesla’s volumes, but certainly not a massive one. Yet we’re six months in, and so far we’ve only seen one of the four missing features reactivated for the cars in question.
Further, the feature just doesn’t look ready for prime time yet. A feature like this doesn’t need to work 50% of the time, or even 99% of the time – it needs to work 100% of the time, because any dings or scratches don’t just go away the next time you park; they stay there for good. If drivers are going to rely on it and use it in place of their eyes, it needs to be reliable. And if drivers aren’t going to use it in place of their eyes – as Tesla currently recommends they don’t – then why don’t they just use… their eyes? What’s the point of the system if it’s just replicating what your eyes already see?
One benefit of ultrasonics is that they provide additional confirmation of distance through something other than vision. As in the first embedded video above, the driver could already estimate distances with his eyes, but the ultrasonics would give him additional information that he doesn’t have visually. If the car is just estimating distance visually, the same way the driver does, then it’s not providing any new information.
This doesn’t mean the system can’t improve. Surely it can: the cameras have access to more advantageous angles than the driver’s eyes, and the system can look all around the car at once instead of only in one direction at a time, as it already does. And in certain situations, it already seems to do a good job. But for now, the visualization doesn’t seem much better than eyeballing, which is disappointing six months after the feature was unceremoniously eliminated. Let’s hope we don’t have to wait another six months only to get similarly underwhelming results from Autopark, Summon, and Smart Summon.