Tesla will build its own maps using driving data from the fleet, confirms Elon Musk
Tesla CEO Elon Musk just confirmed on Twitter that Smart Summon will in the future use GPS points from the map/navigation data generated by Tesla fleet vehicles that have previously driven at a given location, in order to further refine the feature. He was responding to the following tweet directed at him by the @thirdrowtesla podcast.
What this means is that the automaker is already gathering enough data to create its own maps, or a routing engine, that will make navigating parking lots and private spaces much easier for Tesla vehicles, resulting in a smoother Smart Summon experience.
This will happen— Elon Musk (@elonmusk) April 13, 2020
Tesla currently uses Google Maps as the base layer and to pinpoint points of interest, but the navigation data and routing engine are provided by another software company, Mapbox, which has been working on these problems for about a decade now.
In 2018, Mapbox acquired the routing engine named ‘Valhalla‘, which is now part of the core API Mapbox provides to users and enterprise clients such as Tesla, Facebook, and many more. Mapbox claims it gathers data from the millions of devices running applications that embed its maps, reaching at least 600 million people a month.
Our maps learn from every application they’re embedded in. We use real-time data from 600 million MAUs to ship hundreds of thousands of map updates per day so developers can build precise maps that perform across platforms.
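To make the partnership concrete, here is a minimal sketch of the kind of request a navigation client can send to Mapbox's public Directions API (v5), which is backed by routing technology like Valhalla. The coordinates and access token below are placeholder values for illustration; nothing here reflects how Tesla's in-car software actually talks to Mapbox.

```python
def directions_url(coords, profile="driving", token="YOUR_MAPBOX_TOKEN"):
    """Build a Mapbox Directions v5 request URL from a list of
    (longitude, latitude) waypoints. The token is a placeholder."""
    # Waypoints are joined as "lon,lat" pairs separated by semicolons.
    path = ";".join(f"{lon},{lat}" for lon, lat in coords)
    return (
        "https://api.mapbox.com/directions/v5/mapbox/"
        f"{profile}/{path}?access_token={token}"
    )

# Two made-up waypoints across a parking lot:
url = directions_url([(-122.1430, 37.4419), (-122.1410, 37.4425)])
print(url)
```

Issuing an HTTP GET against this URL (with a real token) returns a JSON route with geometry and turn-by-turn steps, which is the kind of output a routing engine like Valhalla produces.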
Ira Ehrenpreis, the original owner of the first-ever production Tesla Model 3 (he later gifted his spot to Elon Musk), sits on the boards of both Tesla Inc. (TSLA) and Mapbox, so the bond between the companies is strong. But with billions of miles of Autopilot data, Tesla will likely want to go its own way, just as it developed its own self-driving software and ended its partnership with Mobileye.
Seeing some people respond “Why do you need maps? I thought you only needed vision!”— Third Row Tesla Podcast (@thirdrowtesla) April 13, 2020
Vision & memory.
You can exit the mall parking lot with vision only, reading signs.
But it might take you longer than if you *remember* how to get out.
Autopilot will have shared memories https://t.co/kYyd1tmKDx
Although we have seen a Tesla Model 3 perform Smart Summon in light snowfall that limited the vehicle's vision, if the car could have accessed information from Tesla's neural net or fleet GPS data, it would have been able to navigate to its owner with much more precision.
Smart Summon, along with Parallel and Perpendicular Autopark, is part of the Full Self-Driving (FSD) suite. Although Tesla has already deployed these features, the Silicon Valley automaker keeps polishing them for its 1 million+ vehicles around the world at no extra cost, via free over-the-air (OTA) updates.
Related: Watch Tesla Autopilot / Smart Summon handle construction zone and traffic cones