View airport and shopping mall interiors with Live View

  • Live View and Indoor Live View use augmented reality to guide you in and out of airports, shopping malls, and stations.
  • The technology combines camera, AI, Street View and indoor maps to provide precise directions by floor and location.
  • Features such as immersive view, multi-search with Lens, and advanced image translation expand the way you explore places.
  • Google Maps incorporates eco-friendly routes, low-emission zone alerts, and air quality data for more sustainable mobility.


For some years now, Google Maps has been more than just a map: it has become a kind of intelligent guide that accompanies us practically everywhere. Now it goes a step further: it not only guides you on the street, it also helps you find your way inside airports, stations, and shopping centers thanks to augmented reality.

This change comes hand in hand with Live View and Indoor Live View. These features combine the phone's camera, artificial intelligence, and indoor maps to display virtual arrows and signs superimposed on the real environment. The result is that you can see, on your phone's screen, exactly where to go to find a boarding gate, a shop, or an ATM without puzzling over floor plans.

What is Live View and how has it changed Google Maps?

Live View is the Google Maps feature that uses augmented reality to guide you as you walk. Instead of showing only a top-down map with a blue arrow, it turns on your phone's camera and draws giant arrows, signs, and street names over the actual street view to guide you along the way.


This system is based on a key concept: VPS, or Visual Positioning System. Instead of relying solely on GPS, Live View compares what the camera sees with millions of Street View images to determine exactly where you are, what's in front of you, and which direction you're facing.

The feature was originally announced at Google I/O 2018, and since then Google has been refining it to make it more accurate, faster, and, above all, more useful in real-life situations, such as when you come out of the subway and don't know whether to turn left or right.

How does Live View work step by step?

Using Live View is quite simple, even though it hides a good cocktail of artificial intelligence and computer vision under the hood. The basic idea is that, when you activate the option in Google Maps, the phone switches between the classic map and the augmented reality interface depending on how you hold the device.

When you hold the phone upright, pointing toward the street, the application activates the camera and displays a real-time image of your surroundings. On top of that image appear arrows, labels, and text boxes indicating where to go next, how many meters away the next turn is, or which street lies ahead.

If you lower the phone and hold it horizontally, facing the ground, the interface automatically reverts to the traditional Google Maps view. In this way Google tries to reduce battery consumption, since the camera and image processing use more resources than the classic GPS-only map.
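The orientation-based switching described above can be sketched as a simple rule on the device's pitch angle. This is an illustrative simplification only: the threshold value and function name are invented here, and the real app relies on full sensor fusion rather than a single angle.

```python
def choose_interface(pitch_degrees: float) -> str:
    """Pick the navigation interface from the phone's pitch.

    pitch_degrees: 0 = phone flat (screen facing up),
                   90 = phone upright (camera facing the street).
    The threshold is an illustrative guess, not Google's actual value.
    """
    AR_THRESHOLD = 60  # degrees from horizontal
    if pitch_degrees >= AR_THRESHOLD:
        return "live_view_ar"   # camera on, AR arrows overlaid
    return "classic_map"        # camera off to save battery
```

For example, `choose_interface(75)` would select the AR overlay, while `choose_interface(10)` (phone facing the ground) would fall back to the battery-friendly classic map.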

Since Live View is primarily designed for walking, Google requires a device compatible with ARCore (on Android) or ARKit (on iPhone). These augmented reality development kits allow the phone to understand the space around it, detect surfaces, and keep arrows and signs anchored to the environment as you move.

Indoor Live View: Augmented reality enters buildings

The really interesting leap comes with Indoor Live View, the evolution of Live View designed for indoor spaces. Until now, Google Maps showed floor plans of many shopping centers, airports, and stations, but you had to interpret them yourself, like the classic plans stuck to the walls.

Indoor Live View combines those indoor maps with AR navigation. Thanks to new advances that make it possible to determine the altitude and exact position of objects inside a building, Google can place virtual arrows precisely enough to tell you not only whether to go straight or turn, but also which floor the thing you're looking for is on.

The objective is clear: in places where it's very easy to get disoriented, such as a huge airport or a sprawling shopping mall, the phone becomes a kind of visual compass. You just pick it up and follow the signs that appear around you, without having to check physical signs or directories every two minutes.

What can you do inside airports and shopping malls with Indoor Live View?


The practical applications of Indoor Live View are enormous. In an airport, for example, you can use it to reach your boarding gate without wasting time wandering around the terminal or asking everyone you meet.

Among the points Indoor Live View can help you locate are elevators, escalators, restrooms, baggage carousels, check-in counters, ticket offices, public transport stops within the building itself, ATMs, and any other essential service that has been mapped.

In the case of shopping centers, the feature tells you which floor each store is on and how to get there by the most direct route. It's especially useful when you're in a hurry and want to go in, buy something specific, and leave without wandering the entire building blindly.

Indoor Live View has already been deployed in several parts of the United States, in large venues such as airports, transport interchanges, and shopping centers. Google has confirmed that the service will reach more locations, including airports, shopping centers, and transport hubs in Tokyo and Zurich, and that the international rollout will continue progressively.

Madrid and Barcelona: Live View arrives indoors in Spain

Google has officially confirmed the arrival of indoor Live View directions in Madrid and Barcelona. This means you can use augmented reality not only to find your way around the street, but also inside the airports and large shopping centers of these cities.

Furthermore, outdoor Live View itself is enhanced in these cities and in Dublin with a feature that lets you search for nearby places directly from the camera. This way, simply by raising your phone, you can locate ATMs, restaurants, parks, or bus stops near your location without typing a search into the map.

This expansion is part of a broader package of improvements announced by Google, which includes not only Maps but also advances in the search engine, Google Lens, and Google Translate, all of them closely linked to artificial intelligence and visual recognition.

The technology behind it: AI, visual recognition, and global localization

The entire Live View and Indoor Live View experience rests on one fundamental pillar: artificial intelligence applied to computer vision. Google uses advanced neural networks to analyze what the camera sees, compare it against its databases, and calculate, in a matter of milliseconds, your position and the direction you're pointing.

The company calls this positioning technology "global localization"; it uses AI to scan tens of billions of Street View images. With this massive processing, the system can identify buildings, shapes, signs, and other elements of the environment to locate you more accurately than GPS alone allows.

Indoors, the challenge is even greater because GPS becomes less reliable. That's why Google is photographing thousands of buildings from top to bottom, just as it did with streets for Street View. By combining those images with floor plans and the phone's sensors (camera, gyroscope, accelerometer), Indoor Live View can calculate the exact floor and approximate position inside the building.
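At its core, this kind of visual positioning can be thought of as nearest-neighbor matching between what the camera currently sees and a database of geotagged reference images. The following toy sketch illustrates the idea only: the descriptors, locations, and database are entirely made up, and the real system uses learned features over billions of Street View images.

```python
import math

# Hypothetical database: image feature descriptor -> known location.
# All values here are invented for illustration.
reference_db = [
    ([0.9, 0.1, 0.3], ("Terminal A, floor 1", "gate side")),
    ([0.2, 0.8, 0.5], ("Terminal A, floor 2", "food court")),
    ([0.4, 0.4, 0.9], ("Terminal B, floor 1", "baggage claim")),
]

def locate(camera_descriptor):
    """Return the location whose reference descriptor is closest
    (in Euclidean distance) to what the camera currently sees."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(reference_db, key=lambda entry: dist(entry[0], camera_descriptor))
    return best[1]
```

A camera frame whose descriptor is close to the first entry, say `locate([0.85, 0.15, 0.25])`, would resolve to `("Terminal A, floor 1", "gate side")`, including the floor, which is exactly the extra dimension GPS cannot provide indoors.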


For more advanced features, such as the so-called immersive view, Google uses AI techniques like neural radiance fields (NeRF), which can turn ordinary photos into very realistic 3D recreations, with lighting, textures, and depth much closer to reality.

Immersive view: explore cities and venues before you go

In addition to Live View directions, Google is rolling out a feature called immersive view, which offers a different way to explore cities and points of interest. Instead of being limited to a photo or a 2D map, it generates three-dimensional scenes of the environment by fusing aerial imagery with Street View.

With this immersive view you can, for example, fly over the surroundings of a museum like the Rijksmuseum in Amsterdam, see where to buy tickets, virtually move through the nearby streets, and check, with a time slider, how the light, weather, and level of foot traffic change at different times of day.

If you go down to street level, you can explore the area's restaurants and shops, and even peek inside some of them to get an idea of the atmosphere: whether the lighting is intimate, whether there are pleasant views, whether it seems quiet or lively.

In this highly detailed recreation, the AI reproduces the texture of materials, realistic lighting, and backgrounds, so the experience feels like a preview visit before deciding where to go. Immersive view has already begun to roll out in cities such as London, Los Angeles, New York, San Francisco, and Tokyo, and Google has announced it will also come to Amsterdam, Dublin, Florence, and Venice.

Google Lens, multi-search and “Search what appears on your screen”

Alongside the Maps improvements, Google is enhancing its search engine with features that combine image and text. This is where Google Lens comes in: the tool that lets you search for information simply by pointing the camera or using an image you've already saved.

Lens has become a central part of Google's search ecosystem: the company says it is already used more than 10 billion times a month. Its role is reinforced by what Google calls multisearch, a feature that lets you combine an image with additional text to refine what you're looking for.

For example, you can take a picture of a dish and add the text "near me" so the system returns restaurants in your area that serve similar cuisine or specialize in that type of food. Multisearch is currently available in the United States in English, but Google has announced it will roll out globally and will also apply to local results.

Another important step will be the ability to use multisearch with any image that appears in search results on your phone. If you're browsing, see a photo that catches your eye, and want to know more, you can launch Lens directly on that image and combine it with text to refine your search.

In this same vein, Google is preparing a "search your screen" feature, which will let you search for anything visible on your phone (a photo, a video, an app, a website) without leaving what you're viewing. The system takes what's on screen, analyzes it with Lens, and instantly shows relevant results.

Improvements to Google Translate and image translation

Another facet of this AI-driven evolution is Google Translate, which now offers translations with more context, examples, and nuance. The idea is that when you translate a word with multiple meanings, the app shows you phrases, idioms, and common uses adapted to the target language.

This is especially useful if, for example, you're dealing with comics, manga, or cultural terms, because Translate helps you choose the expression that best fits the context, local idioms, or the register you need. These improvements are rolling out in languages such as English, Spanish, French, German, and Japanese.

Google has also redesigned the way results are displayed: it now uses a dynamic font size that adjusts as you type, and adds alternative translations and dictionary definitions when available, so you can quickly compare nuances.

In the visual realm, the company has applied AI advances, such as generative networks similar to those behind the Google Camera "magic eraser," to text-in-image translation. The translated text now blends much more naturally into complex backgrounds, eliminating the patches or blocks of color that used to remain around the letters.

In addition, the option to translate entire web images has been added, which greatly expands the possibilities for understanding visual content however you encounter it: a photo, a screenshot, or a page full of images and labels in another language.

Other Google Maps features: eco-friendly routes and air quality

Beyond augmented reality navigation, Google Maps is adding improvements to make getting around the city more sustainable and better informed. One of the new features is the option to calculate eco-friendly routes, prioritizing those that generate a smaller carbon footprint.

If the estimated arrival times of the fastest route and the most fuel-efficient route are similar, the application can choose the lowest-emission option by default. When the eco-friendly route is clearly longer, Maps shows both alternatives so you can decide, marking which is faster and which is more environmentally friendly.
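The route-selection rule described above can be sketched in a few lines. The tolerance, function name, and route data below are purely illustrative assumptions, not Google's actual thresholds or data model:

```python
def pick_default_route(fast, eco, tolerance_min=3):
    """Return which route is the default and which routes are shown.

    Routes are (eta_minutes, grams_co2) tuples; values are illustrative.
    The eco-friendly route becomes the default when it costs no more
    than `tolerance_min` extra minutes; otherwise both routes are shown
    with the fastest as the default.
    """
    eta_fast, _ = fast
    eta_eco, _ = eco
    if eta_eco - eta_fast <= tolerance_min:
        return "eco", [eco]        # similar ETA: lowest emissions wins
    return "fast", [fast, eco]     # clearly longer: show both options
```

For instance, a 20-minute fast route versus a 22-minute eco route would default to the eco route, while a 20-minute versus 30-minute pairing would display both with the fast one selected.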

Another important function is the air quality layer, which shows air quality information for specific areas, as well as the expected weather conditions when you arrive at your destination. This is especially relevant for people with respiratory problems or anyone planning outdoor activities.

Final considerations

Finally, Google Maps is starting to warn about low-emission zones in cities like Amsterdam, London, and Madrid. When you plot a route through an area restricted to certain vehicles, the app shows a warning and, where possible, offers alternatives to avoid it or tells you whether your car can enter.


All these innovations make Google Maps and its associated services part of an increasingly comprehensive system for exploring, finding your way, and making smart decisions, both inside and outside buildings. From arrows floating in the air at airports and shopping malls with Indoor Live View, to immersive 3D city views, visual search with Lens, and eco-friendly routes, the traditional map has become an interactive platform that blends augmented reality, artificial intelligence, and real-time data to make getting around the world much less complicated.