As with all things Google of late, AI capabilities are coming to Maps. The company announced a slew of machine-learning updates for the popular app on Thursday, including an “Immersive View” for route planning, deeper Lens integration for local navigation and more accurate real-time information.
Back in May at its I/O developer conference, Google executives debuted Immersive View for routes, which provides a street-level visual preview of your planned route. Whether you’re walking, biking, taking public transportation or driving, this will allow you to scrub back and forth through street-level, turn-by-turn visuals of the path you’re taking. The feature arrives on iOS and Android this week for Amsterdam, Barcelona, Dublin, Florence, Las Vegas, London, Los Angeles, Miami, New York, Paris, San Francisco, San Jose, Seattle, Tokyo and Venice.
Just because you can see the route to get where you’re going doesn’t guarantee you’ll be able to read the signage along the way. Google is revamping its existing AI-based Search with Live View feature in Maps. Simply tap the Lens icon in Maps and wave your phone around; the system will determine your precise street-level location and direct you to nearby resources like ATMs, transit stations, restaurants, coffee shops and stores.
The map itself is set to receive a significant upgrade. Buildings along your route will be more accurately depicted within the app to help you better orient yourself in unfamiliar cities, and lane details along tricky highway interchanges will be more clearly defined in-app as well. Those updates will arrive for users in a dozen countries including the US, Canada, France and Germany over the next few months. US users will also start to see better in-app HOV lane designations, and European customers should expect a significant expansion of Google’s AI speed-limit-sign-reading technology to 20 nations in total.
Google Maps also runs natively in a growing number of electric vehicles as part of the Android Automotive OS ecosystem. That version of Maps is getting an update too, via the new Places API. Starting this week, drivers will see more information about nearby charging stations, including whether the plugs work with their EV, the power throughput of the charger, and whether the plug has been used recently — an indirect means of inferring whether or not the station is out of service, which, Google helpfully points out, is the case for around 25 percent of them.
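For developers consuming that same charging data, here is a minimal sketch of how the plug-compatibility and "recently used" checks described above might look against the `evChargeOptions` field that Places API (New) returns for charging stations. The field and enum names follow Google's documented response shape, but the sample payload, the helper name, and the 30-day staleness heuristic are illustrative assumptions, not anything Google has published as its method.

```python
from datetime import datetime, timedelta, timezone

# Heuristic assumption: if availability data hasn't been refreshed in a
# month, treat the station as possibly out of service.
STALE_AFTER = timedelta(days=30)

def usable_connectors(place, connector_type, min_kw, now=None):
    """Filter a Places API (New) place dict for connector groups that
    match the car's plug type and a minimum charge rate, flagging
    entries whose availability data looks stale."""
    now = now or datetime.now(timezone.utc)
    results = []
    aggregation = place.get("evChargeOptions", {}).get("connectorAggregation", [])
    for agg in aggregation:
        if agg.get("type") != connector_type:
            continue  # wrong plug for this EV
        if agg.get("maxChargeRateKw", 0) < min_kw:
            continue  # too slow
        updated = agg.get("availabilityLastUpdateTime")
        stale = True
        if updated:
            ts = datetime.fromisoformat(updated.replace("Z", "+00:00"))
            stale = now - ts > STALE_AFTER
        results.append({
            "count": agg.get("count", 0),
            "available": agg.get("availableCount"),
            "possibly_out_of_service": stale,
        })
    return results

# Illustrative payload mimicking the documented response shape.
sample_place = {
    "evChargeOptions": {
        "connectorCount": 4,
        "connectorAggregation": [
            {"type": "EV_CONNECTOR_TYPE_CCS_COMBO_2", "maxChargeRateKw": 150,
             "count": 2, "availableCount": 1,
             "availabilityLastUpdateTime": "2024-01-01T00:00:00Z"},
            {"type": "EV_CONNECTOR_TYPE_TYPE_2", "maxChargeRateKw": 22, "count": 2},
        ],
    }
}

fast_ccs = usable_connectors(
    sample_place, "EV_CONNECTOR_TYPE_CCS_COMBO_2", 100,
    now=datetime(2024, 1, 2, tzinfo=timezone.utc),
)
```

In a real integration, the place dict would come from a `places:searchNearby` request with `places.evChargeOptions` in the field mask; only the post-processing is sketched here.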
Even search is improving with the new update. Users will soon be able to look for nearby destinations that meet more esoteric criteria, such as “animal latte art” or “pumpkin patch with my dog,” with results gleaned from the analysis of “billions of photos shared by the Google Maps community,” per a Google blog post Thursday.