Google I/O 2018: Visual Positioning System and Google Lens

Our teams have been working really hard to combine the power of the camera and computer vision with Street View and Maps to reimagine walking navigation. Here's how it could look in Google Maps. Let's take a look. You open the camera and you instantly know where you are. No futzing with the phone: all the information on the map, the street names, the directions, is right there in front of you. Notice that you also see the map, so that way you stay oriented. You can start to see nearby places, so you see what's around you. And just for fun, our team has been playing with the idea of adding a helpful guide, like that one there, so that it can show you the way. Oh, there she goes. Pretty cool.

Now, for enabling these kinds of experiences, GPS alone doesn't cut it. That's why we've been working on what we call VPS, a visual positioning system that can estimate precise position and orientation. One way to think about the key insight here: just like you and I, when we're in an unfamiliar place, look for visual landmarks, the storefronts, the building facades, and so on, VPS uses the visual features in the environment to do the same. That way we help you figure out exactly where you are and get you exactly where you need to go.

With smart text selection, you can now connect the words you see with the answers and actions you need. You can do things like copy and paste from the real world directly into your phone, just like that. Or you can turn a page of words into a page of answers. For example, you're looking at a restaurant menu: you can quickly tap around and figure out, for every dish, what it looks like, what the ingredients are, and so on.

The next feature I want to talk about is called Style Match, and the idea is this: sometimes your question is not "what's that exact thing?" Instead, your question is "what are things like it?" You're at your friend's place, you check out this trendy-looking lamp, and you want to know things that match that style. Now Lens can help you. Or if you see an outfit that catches your eye, you can simply open the camera, tap on any item, and of course find specific information like reviews for that item, but you can also browse around and see all the things that match that style.

The last thing I want to tell you about today is how we're making Lens work in real time. As you saw in the Style Match example, you open the camera and you start to see Lens proactively surface all the information instantly, and it even anchors that information to the things that you see. Now, this kind of thing, where it's sifting through billions of words, phrases, places, and things in real time to give you what you need, is not possible without machine learning. So we're using both on-device intelligence and tapping into the power of Cloud TPUs, which we announced last year at I/O, to get this done. Really exciting. And over time, what we want to do is overlay the live results directly on top of things like storefronts, street signs, or a concert poster, so you can simply point your phone at a concert poster of Charlie Puth and the music video just starts to play, just like that. This is an example of how the camera is not just answering questions, but putting the answers right where the questions are.
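The talk stops at the intuition, but the core technique it gestures at for VPS, matching visual features in a camera frame against stored imagery of a known place, can be sketched with off-the-shelf tools. Below is a minimal, illustrative Python sketch using OpenCV's ORB features and RANSAC; the file paths, feature counts, and thresholds are assumptions, and Google's actual VPS pipeline is not public.

```python
import cv2
import numpy as np

def match_against_landmark(frame_path: str, landmark_path: str,
                           min_matches: int = 10):
    """Match visual features in a camera frame against a stored image of a
    known place. Returns a homography mapping frame points onto the landmark
    image, or None if there is no confident match."""
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    landmark = cv2.imread(landmark_path, cv2.IMREAD_GRAYSCALE)
    if frame is None or landmark is None:
        raise FileNotFoundError("could not read one of the input images")

    # ORB finds distinctive corners (storefront edges, signage, facades)
    # and describes each with a binary descriptor.
    orb = cv2.ORB_create(nfeatures=1000)
    kp_f, des_f = orb.detectAndCompute(frame, None)
    kp_l, des_l = orb.detectAndCompute(landmark, None)
    if des_f is None or des_l is None:
        return None

    # Hamming distance suits ORB's binary descriptors; cross-checking keeps
    # only mutually best matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_f, des_l)
    if len(matches) < min_matches:
        return None

    src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_l[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC discards outlier correspondences before fitting the transform.
    homography, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return homography
```

A production system would search a large index of localized imagery and recover a full six-degree-of-freedom camera pose rather than a single homography, but the landmark-matching idea is the same.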
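Smart text selection boils down to running OCR on the camera frame and exposing the result as selectable text. A rough sketch of that idea, using pytesseract as a stand-in OCR engine; the production Lens pipeline is not public and certainly more involved:

```python
import pytesseract
from PIL import Image

def select_text_from_frame(image_path: str) -> str:
    """Run OCR on a camera frame so the words in it can be copied and
    pasted like ordinary text."""
    frame = Image.open(image_path)
    return pytesseract.image_to_string(frame)

# Example: copy the text of a menu photographed with the phone camera.
# print(select_text_from_frame("menu.jpg"))
```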
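Style Match is, at heart, a nearest-neighbor search over visual embeddings: a vision model maps each item to a vector, and "things like it" are the catalog items whose vectors sit closest to the query's. A small sketch of the lookup step, assuming the embeddings already exist; the shapes, names, and random data here are purely illustrative:

```python
import numpy as np

def style_matches(query: np.ndarray, catalog: np.ndarray,
                  top_k: int = 5) -> np.ndarray:
    """Return the indices of the catalog items whose style embeddings are
    closest to the query embedding, ranked by cosine similarity."""
    q = query / np.linalg.norm(query)
    c = catalog / np.linalg.norm(catalog, axis=1, keepdims=True)
    scores = c @ q                      # cosine similarity per catalog item
    return np.argsort(scores)[::-1][:top_k]

# Toy usage: 1,000 catalog items with 128-dimensional embeddings.
rng = np.random.default_rng(0)
catalog = rng.normal(size=(1000, 128))
lamp = rng.normal(size=128)             # embedding of the lamp you tapped on
print(style_matches(lamp, catalog))
```

At Lens scale the brute-force dot product would be replaced by an approximate nearest-neighbor index, but the ranking logic is the same.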
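The split the talk describes, lightweight on-device inference backed by heavier models in the cloud, is a common pattern. A hedged sketch of the on-device half using TensorFlow Lite; the model file name is hypothetical, since the real Lens models are not published:

```python
import numpy as np
import tensorflow as tf

# "lens_detector.tflite" is a hypothetical model file standing in for
# whatever on-device model Lens actually ships.
interpreter = tf.lite.Interpreter(model_path="lens_detector.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def run_on_device(frame: np.ndarray) -> np.ndarray:
    """Run one preprocessed camera frame (already shaped to the model's
    expected input) through the on-device model and return its raw output."""
    interpreter.set_tensor(inp["index"], frame.astype(np.float32))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])
```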
