Pokémon Go Players’ Data Used to Train “Visual Positioning” AI

Niantic spinoff Niantic Spatial used over 30 billion images taken by Pokémon Go players to train its “visual positioning” system, which helps robots navigate the world.

Pokémon Go, released back in 2016, is a wildly popular game that uses augmented reality to let players catch Pokémon in the real world.

While the game has received plenty of praise for encouraging people to get outside and meet others, any app that requires location data and camera access raises privacy concerns.

It turns out that Niantic was collecting all that location and camera data to train a model that helps robots navigate the world.

The data collection was centered on “hot spots”: places Niantic wanted players to visit so it could gather detailed images under different weather conditions, at different times of day, and so on.

“We had a million-plus locations around the world where we can locate you precisely,” Brian McClendon, CTO of Niantic Spatial, told MIT Technology Review. “We know where you’re standing within several centimeters of accuracy and, most importantly, where you’re looking.”

Not only is this far more precise than GPS-based location, it works from images alone: given only visual data, the model can locate you to within centimeters. The article goes on:

Each of those images comes with detailed metadata that pinpoints where in space the phone was at the time it captured the image, including which way the phone was facing, which way up it was, whether or not it was moving, how fast and in which direction, and more.
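To make that concrete, the per-image record described above can be sketched as a small data structure. This is purely illustrative; the field names and units are assumptions, not Niantic's actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of the per-image metadata the article describes.
# All names and units are assumed for illustration.
@dataclass
class ImagePose:
    lat: float                   # where in space the phone was (degrees)
    lon: float
    altitude_m: float            # height above ground, in meters
    heading_deg: float           # which way the phone was facing (compass bearing)
    pitch_deg: float             # tilt: which way up the phone was
    roll_deg: float
    speed_mps: float             # whether and how fast it was moving
    velocity_heading_deg: float  # direction of movement
    timestamp_unix: float        # when the image was captured
```

A training pipeline pairing each photo with a record like this is what lets a model learn to infer pose from pixels alone.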

Niantic Spatial is now teaming up with Coco Robotics, a service that provides delivery robots that can “carry up to eight extra-large pizzas or four grocery bags” in a limited number of cities, currently just Los Angeles, Chicago, Jersey City, Miami, and Helsinki.

The robots can’t rely on GPS alone: in dense urban areas, GPS signals bounce off buildings and interfere with each other (so-called multipath error), reducing accuracy.

Questions of ethics seem inevitable in a case like this: players likely had no idea their data was being harvested to train AI. Most people won’t read a game’s privacy policy, assuming there’s little to worry about and that the data collected will only be used to make the game function properly.

This case is a stark example of why it’s important to be careful about the permissions you grant the apps you install. Location, camera, and microphone access should be granted sparingly, and only when absolutely necessary.

With human players of games such as GeoGuessr able to pinpoint locations on the planet surprisingly quickly, it’s only a matter of time before models like Niantic’s are used for less innocent purposes than delivery robots.

Erasing metadata from images may soon no longer be enough to hide your location, if it still is.
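For readers who still want to strip metadata as a baseline precaution: in a JPEG file, EXIF data (including GPS tags) lives in APP1 segments near the start of the file, and those segments can be dropped without touching the image data. Below is a minimal stdlib-only sketch; it handles the common well-formed case and is not a substitute for a hardened metadata-scrubbing tool.

```python
import struct

def strip_exif(jpeg: bytes) -> bytes:
    """Remove EXIF (APP1) segments from JPEG bytes, keeping everything else.

    Minimal sketch: assumes a well-formed JPEG and only drops APP1 segments
    whose payload starts with the "Exif\\0\\0" signature.
    """
    if jpeg[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG (missing SOI marker)")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            raise ValueError("corrupt segment marker")
        marker = jpeg[i + 1]
        if marker == 0xDA:  # start-of-scan: compressed image data follows
            out += jpeg[i:]  # copy the rest verbatim
            break
        # Segment length is big-endian and includes its own two bytes
        (length,) = struct.unpack(">H", jpeg[i + 2 : i + 4])
        segment = jpeg[i : i + 2 + length]
        # APP1 (0xE1) segments carrying "Exif\0\0" hold the EXIF block; skip them
        if not (marker == 0xE1 and segment[4:10] == b"Exif\x00\x00"):
            out += segment
        i += 2 + length
    return bytes(out)
```

Tools like `exiftool` do this (and much more) robustly; the point of the sketch is just that location metadata sits in a discardable container, separate from the pixels, whereas a visual positioning model infers location from the pixels themselves.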
