Google Maps offers AR indoor navigation, new eco-friendly route options

Google Maps will soon be getting a batch of new features, ranging from on-map weather updates to AR-powered indoor navigation. There’s a lot more on the way, too: the company says it’s aiming to deliver “more than 100 AI-powered improvements” to Google Maps this year.

First, there is a new UI for directions. Today, the directions UI uses tabs for each mode of transportation: one for driving, then mass transit, walking, ride shares, and biking. In the redesign, everything appears in a single flat list, but now you can hit an “Options” button and set your preferred modes of transportation. You can choose routing options for driving, walking, trains, buses, motorcycles, bikes, ride shares, “bike and scooter shares,” and ferries. You can also select multiple items, so all your top choices will appear first in the list.
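As a rough illustration of how a preference-ordered flat list like this could work (the mode names and data structure below are hypothetical stand-ins, not Google’s actual API), sorting route results so that preferred modes come first might look like:

```python
# Hypothetical sketch: order a flat list of route results so the user's
# preferred transport modes appear first, in the user's order. The Route
# structure and mode names are illustrative, not Google's actual API.
from dataclasses import dataclass

@dataclass
class Route:
    mode: str        # e.g. "driving", "walking", "train"
    minutes: int     # estimated travel time

def order_routes(routes, preferred_modes):
    """Preferred modes first (in the user's order), then everything else;
    within each group, fastest route first."""
    def sort_key(route):
        try:
            rank = preferred_modes.index(route.mode)
        except ValueError:
            rank = len(preferred_modes)  # non-preferred modes sort last
        return (rank, route.minutes)
    return sorted(routes, key=sort_key)

routes = [
    Route("driving", 25),
    Route("train", 40),
    Route("bike", 55),
    Route("walking", 110),
]
ordered = order_routes(routes, preferred_modes=["train", "bike"])
print([r.mode for r in ordered])  # → ['train', 'bike', 'driving', 'walking']
```

Here the user’s chosen modes float to the top even when a non-preferred mode is faster overall, which matches the behavior the redesign describes.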

Some route options will be getting a bit of greenery, as part of Google Maps’ new focus on promoting cleaner methods of transportation. For driving, Google Maps’ routing will soon take fuel efficiency into account, and you’ll start to see a green leaf icon next to fuel-efficient routes. In many cases, the shortest route is also the most fuel-efficient, so not much will change. But to come up with a CO2 rating for each trip, Google Maps will factor in things like traffic, stop-and-go driving, and road incline (a big concern in Google’s hilly California backyard). If it finds a route that is more fuel-efficient but longer, it will tell you about it, and if two routes take the same amount of time, fuel efficiency will be used as a tiebreaker for the default route.
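The selection rules described above can be sketched as follows. This is a minimal illustration of the stated behavior only, not Google’s actual algorithm; the CO2 estimate is treated as a given input rather than computed from traffic and incline:

```python
# Hypothetical sketch of the default-route rules described above:
# fastest route wins; equal times fall through to the lower CO2 estimate;
# a slower-but-greener route is surfaced as an alternative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DrivingRoute:
    name: str
    minutes: int     # estimated travel time
    co2_grams: int   # estimated CO2 emissions for the trip

def pick_default(routes):
    # Fastest route is the default; CO2 is only the tiebreaker.
    return min(routes, key=lambda r: (r.minutes, r.co2_grams))

def eco_alternative(routes, default) -> Optional[DrivingRoute]:
    # A route worth telling the user about: cleaner than the default.
    greener = [r for r in routes if r.co2_grams < default.co2_grams]
    return min(greener, key=lambda r: r.co2_grams) if greener else None

routes = [
    DrivingRoute("highway", minutes=30, co2_grams=2400),
    DrivingRoute("surface streets", minutes=34, co2_grams=1900),
]
default = pick_default(routes)
alt = eco_alternative(routes, default)
print(default.name)  # → highway (fastest, so it stays the default)
print(alt.name)      # → surface streets (greener but slower, suggested)
```

If the two routes had equal travel times, the tuple key in `pick_default` would make the lower-CO2 route the default, matching the tiebreaker behavior the article describes.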

Google has said that both of these features will be released sometime this year.

Google Maps does the weather

Two new layers are coming to Google Maps that put the service in a bit of competition with your weather app of choice: a weather layer and an air quality layer. The weather should always be a consideration before traveling somewhere, and soon you’ll be able to find that information right in Google Maps. For allergy sufferers, or for people in places where air quality is a regular problem, the air quality information will come in handy too.

Google says data from partners such as The Weather Company and the Central Pollution Control Board will power these layers, which will begin rolling out on Android and iOS in the coming months. The weather layer will be available globally, while the air quality layer will launch in Australia, India, and the US, with more countries to come.

For now, the presentation of this information looks very limited. The most obvious way to display weather and air quality data would be as a color-coded intensity overlay, like a radar view showing rain and air quality across the map. Instead, Google Maps displays this data only as small, individual dots on the map, similar to how points of interest are shown. That makes it difficult to determine where rain starts and stops, whether conditions will get better or worse over the next few hours, or how bad the weather will be while you’re driving somewhere. Adding this information directly to Google Maps could take a big chunk out of the weather app industry, since an overlay on Google Maps would be a killer feature, and Google seems more interested in first-party weather solutions now that Apple has invested in the area with its acquisition of Dark Sky.

Indoor AR navigation

Google Maps’ AR navigation is moving indoors. The feature started rolling out to iPhones and Android devices in select cities in 2019, and it uses Google’s trove of Street View imagery, together with your phone’s camera and ARCore-based 3D sensing, to figure out exactly where you are and what you’re pointing at. Outdoors, the feature ended up being the world’s most complicated compass replacement, but phone compasses (especially on Android phones) are just not that accurate and are prone to interference, so even getting an initial walking direction from a compass can be a challenge. AR navigation, in addition to the cool 3D visualizations on the camera feed, is genuinely a big help there.

It looks like AR navigation will really shine when it comes to indoor navigation, though. Google demoed the feature in an airport terminal, where it can do things like figure out your location (GPS doesn’t work well indoors) and identify which floor you’re on. In the demo, it shows someone where the escalators are and which level to go down to reach their terminal. Google says it wants to roll out the technology to “airports, transit stations and malls,” where it will help you “find the nearest elevators and escalators, your gate, platforms, baggage claim, check-in counters, ticket offices, restrooms, ATMs and more.”

Indoor navigation is something Google has continually tried to get the industry to adopt, with solutions like Wi-Fi RTT, a Wi-Fi-based positioning scheme that was built into Android 9. I think the company has realized that any approach that requires individual businesses to install and maintain some kind of technical infrastructure just isn’t going to work. AR navigation seems like a more scalable option because Google can do all the work itself. It’s powered by nothing but your phone’s camera and a set of pictures stored in Google Maps; Google calls this “VPS,” or Visual Positioning System, and it’s basically AI-powered landmark navigation.
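As a very rough sketch of the landmark-matching idea behind VPS (the feature sets and location names below are invented; a real system matches learned image descriptors and uses 3D geometry, not string sets):

```python
# Toy illustration of visual positioning: compare "features" extracted
# from the current camera frame against a database of reference images
# with known locations, and report the best match. Everything here is
# a made-up stand-in for learned image descriptors.
reference_images = {
    "terminal_b_escalators": {"blue_sign", "escalator", "column_grid"},
    "food_court": {"neon_sign", "tables", "skylight"},
}

def localize(frame_features, references):
    """Return the reference location whose feature set overlaps the
    camera frame's the most (Jaccard similarity)."""
    def similarity(ref_features):
        union = frame_features | ref_features
        return len(frame_features & ref_features) / len(union) if union else 0.0
    return max(references, key=lambda name: similarity(references[name]))

place = localize({"escalator", "blue_sign", "crowd"}, reference_images)
print(place)  # → terminal_b_escalators
```

The appeal of this approach, as the article notes, is that the database side scales with Google’s own photography effort rather than with per-building hardware installations.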

VPS data is a lot like Street View data, so it can scale the same way Street View does: by sending a bunch of contractors around the world to photograph everything with specialized equipment. You could argue that photographing every large indoor public space sounds like a lot of work, but Google has already proven it can do this with Street View. The company now mass-produces its own Street View backpacks, so sending a contractor on a quick walk through your local airport, train station, or mall should be enough to gather VPS data. “Just go photograph the entire world” is well within Google’s capabilities.

Google calls the feature “Indoor Live View” and says it’s live now on Android and iOS in a number of malls in Chicago, Long Island, Los Angeles, Newark, San Francisco, San Jose, and Seattle, with airports and transit stations, along with more cities, on the way.

Now, if we could only get this in an AR version of Google Glass, that would be great.