Pokémon Go has been a phenomenal success. Almost overnight, the game has captured more active users than Twitter.

At the same time the game's maker, Niantic, is increasingly coming under fire over privacy concerns about the user data the game is hoarding – including location. Niantic's privacy policy notes that it may share aggregated and non-identifying information with third parties for research and analysis, demographic profiling and other similar purposes.

US Senator Al Franken has asked Niantic to explain what it does with the user data. He's also unhappy that parents are not informed enough to give meaningful consent. More recently, acclaimed film director Oliver Stone weighed in, claiming the game had hit “a new level of invasion of privacy that could lead to totalitarianism.”

We asked MEF members and the wider mobile community what they think of the privacy issues and how the industry should respond.


Andy Patel, Security Advisor, F-Secure Labs

There’s absolutely no security risk involved in the game. The game client queries GPS and the backend for location information in the same way that a map application or a location-enabled social media service would. Players do not see the location of other players. As for the augmented reality portion of the game (which is only enabled when you go to catch a Pokémon), you can turn it off; a lot of people do that to save battery on their phone. When you catch a Pokémon, the game doesn’t save the photo (you have to do that manually).

There was a mild initial security concern when the game launched, based on the fact that the iOS version made a deprecated API call to Google’s authentication service that gave the game full access to a Google account. Since the game’s authentication servers were being hammered, Google authentication was the only way to sign up to the game. This issue is being addressed by Niantic.


Mark Maglaras, VP Sales & Marketing, Novatti

Mobile applications such as Pokémon Go should build consumer trust by being more transparent about how users’ data is used. This is ideally intertwined with an enhanced user experience and more tailored in-app functionality. For example, a user who receives an email newsletter from their favourite shoe brand might suddenly be able (via an in-app purchase) to customise their character to wear that brand. This builds future consumer trust and loyalty with both the brand and the application provider. Privacy and transparency need to go hand in hand to maintain the consumer’s trust.


Gillian Hughes, VP Corporate Sales, Veoo

There has been a lot in the news recently about the privacy of personal data and quite rightly it’s a very sensitive issue. Users of mobile apps, games and services need to know and trust that their data isn’t going to be misused and that they won’t end up receiving a whole lot of spam after signing up to a loyalty scheme.

However, users needn’t view the collection of data as negative; there are many upsides, such as better and more personalised marketing with more relevant offers. But to build trust, companies must be totally transparent about how they are going to use data, and should only request it when there is a legitimate reason, such as improving a service or providing customers with more relevant or personalised communications.

Gone are the days of blasting out uniform advertising to consumers; today’s users want personalised marketing, and companies need to respect this by building relationships with their customers before requesting such data. Being a household name alone doesn’t afford brands that level of trust; they should first work on safe levels of engagement by asking for innocuous details, such as name and email address, before asking for more personal data. Sending mobile vouchers via SMS can be a great way to do this, as it lets companies reach out to customers with personalised content without requiring them to download a full mobile app or sign up to a loyalty scheme.


Bob Gallagher, Managing Director, Appsynth

It’s quite possible that monetizing data is part of Niantic’s revenue model. Many apps have similarly vague privacy policies and sell this data to ad networks for location-based targeting. Several companies specialize in collecting and brokering this data. Device IDs are normally hashed so they’re not directly identifiable, but users can still be targeted based on where they’ve been.

This might shock some people, but it’s not so different to Facebook using background location data to power its Local Awareness ads. This doesn’t mean Pokémon Go will ever show ads, but advertisers could potentially buy its data to target users across other mobile and social channels.

For hyper-local targeting, location is often combined with Bluetooth data, allowing stores with beacons to track and retarget visitors even without the retailer’s app installed. The forthcoming Pokémon Go Plus wearable will encourage players to enable Bluetooth, which is perhaps one reason Nintendo is developing the device, knowing the value this data could provide.

The expectation is that big companies like Facebook have suitably secure access to all this data. This is less of a guarantee with smaller companies and it remains to be seen how Niantic is tackling this concern.
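Gallagher’s point about hashed device IDs is worth unpacking: hashing replaces the raw ID with a fixed pseudonym, but the same device always produces the same pseudonym, so location events can still be linked into a profile. A minimal Python sketch (the advertising ID and the use of SHA-256 below are illustrative assumptions, not any ad network’s actual pipeline):

```python
import hashlib

def hash_device_id(ad_id: str) -> str:
    """Return a hex digest used as a stable pseudonym for a device ID."""
    return hashlib.sha256(ad_id.encode("utf-8")).hexdigest()

# The same advertising ID always hashes to the same value, so two
# location events from different data sets can still be joined on it.
event_a = {"device": hash_device_id("38400000-8cf0-11bd-b23e-10b96e40000d"),
           "place": "coffee shop"}
event_b = {"device": hash_device_id("38400000-8cf0-11bd-b23e-10b96e40000d"),
           "place": "gym"}

assert event_a["device"] == event_b["device"]  # linkable despite hashing
```

This is why hashing is pseudonymisation rather than anonymisation: the person’s name is hidden, but their movement trail is not.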


Richard Gomer, Scientist, University of Southampton

Last week I didn’t know a Pidgeotto from a Charizard, but yesterday I proudly caught my first Magikarp. Game designers are brilliant at helping us to understand the fictional worlds they create – so why won’t anybody help us to understand what they’re doing in the real world?

The work we’re doing at the University of Southampton focusses on “meaningful consent” – the idea that, basically, we should be building a digital world in which consumers actually know what’s going on. The concern around Pokémon Go should not be whether Niantic is right to process the data (I suspect that for most people it probably isn’t a material risk); rather, we should be concerned about a world where we can’t make informed decisions.

Consumers are losing out because values like privacy don’t exert meaningful pressure in the absence of knowledge, and services lose out because consumers rightly wonder what they’re hiding.


Ken Munro, Partner, Pen Test Partners

Niantic is not unique in its use of blanket permissions. From a marketing perspective, it makes perfect sense to request more permissions to access data that can be monetised in the future. But more data means more responsibility, and mishandling it could see app developers compromise user privacy.

There’s also been an erosion of user control. It’s often difficult or impossible to alter these permissions retrospectively, but it hasn’t always been this way. The facility to control individual permissions was in the Android OS as far back as Jelly Bean, but was removed in KitKat and Lollipop. Google really dropped the ball here, effectively paving the way for mass permissions. However, with Marshmallow, Android now has a great granular permissions feature: users can select the permissions an app is allowed, so Calendar, Camera, Contacts, Location and so on can all be customised individually. That user control has to be preserved.


Serge Huber, CTO, Jahia

Typically, users do not want to have to think or care about data permissions. Most don’t understand them, nor do they understand the consequences of these data collection systems. But really, they should care, as the data is at risk of being misused for commercial purposes or sold to a third party.

Of course, in the case of data breaches, which are known to happen, users should care.

The incoming GDPR should give users a new reason to care. The new legislation protects consumers against organisations moving their data without their knowledge and also protects their ‘right to be forgotten’. For example, Pokemon Go players are having data collected about their behaviour. Give it 50 years and they may not agree to that anymore. What can one do about data that was previously collected without one’s knowledge?
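The ‘right to be forgotten’ Huber mentions boils down, at an engineering level, to being able to locate and delete every record tied to a person on request. A toy Python sketch (the in-memory store and player IDs are purely illustrative, not Niantic’s actual data model):

```python
# Toy in-memory store of behavioural data keyed by user ID. An erasure
# request must remove every record held for that user.
behaviour_log = {
    "player42": [{"place": "park", "action": "caught Magikarp"}],
    "player77": [{"place": "mall", "action": "visited PokeStop"}],
}

def erase_user(store: dict, user_id: str) -> None:
    """Honour a right-to-be-forgotten request for one user."""
    store.pop(user_id, None)  # idempotent: no error if already erased

erase_user(behaviour_log, "player42")
assert "player42" not in behaviour_log
```

The hard part in practice is not the deletion itself but knowing everywhere the data has propagated – backups, analytics pipelines and third parties – which is exactly the gap Huber’s “collected without one’s knowledge” question points at.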


Jill Brittlebank, Senior Director, Zeta Interactive

Brand trust is paramount: customers must be reassured that their data is safe. But businesses must pay more than lip service to this guarantee. If brands are to earn their customers’ trust, they must show they are using the data sparingly and thoughtfully, rather than deluging the customer.

The challenge for businesses now is to be clear about what data they are using, at the same time as showing how it adds value to the customer. Offering some form of opt-in will ensure consumers know what they are signing up to. Meanwhile, using data smartly and sensitively proves to customers that the more they let a brand know about them, the better their experience. Once brands can demonstrate the benefits outweigh the risks, customers will become ever more willing to divulge their data.


Richard Stiennon, Chief Strategy Officer, Blancco Technology Group

When users sign up to the Pokémon Go app via their Google or Facebook account, Niantic reserves the right to gain access to all prior information that has been shared with those two platforms. The app then uses third-party companies to analyse this data.

Having such a large collection of users’ personal mobile data makes it quite possible for the data to be accessed and even stolen at a later date. The app’s data sharing policy also leaves Facebook and Google at risk of violating EU GDPR, which takes a hard-line stance on ensuring users’ right to be forgotten.

Given how obsessed everyone is with the Pokémon Go app, it’s only a matter of time until the app’s millions of users are at risk of data theft. Niantic, Facebook and Google have a responsibility not to let this happen.

If you would like to contribute a comment to this article, contact us or leave your thoughts in the replies below.

One Comment

  • Geoff Revill says:

    Wow – not a single commentator has expressed an understanding of location data and its especially sensitive nature in a privacy context, with only one vaguely connecting to the GDPR, which calls out this data as a special class needing special consideration for security, privacy and ‘fully informed’ consent (a nod towards Richard Gomer).
    Anyone would think people were scared to address the elephant in the room!
    There is simply NO REASON for Niantic to retain a history of a person’s movements when using the app, except as a way of profiling people and commercialising their location.
    They could keep a record of how many people using the app visited locations, but no personal information needs to be stored for any period of time to achieve this. Sure, access location data in real time (securely) to deliver the service’s functions; maybe even store a few relevant ‘been here before’ type markers to make the app efficient (which could be kept on the local device rather than in the cloud to make it even more secure). But historic movement patterns stored anywhere, for any reason, are a major risk to the individual’s privacy, even if never sold – which they are…. And don’t talk about anonymity: case after case has shown it’s impossible to anonymise individual movement-pattern histories.
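The aggregate-counting approach the commenter describes can be sketched in a few lines of Python (the location-cell naming below is a hypothetical illustration, not how Niantic’s servers actually work): the server keeps a running total per location and never stores who was there or when.

```python
from collections import Counter

# Aggregate visit counts per location cell. No user identifier and no
# timestamped trail is ever stored -- only a running total per place.
visit_counts = Counter()

def record_visit(location_cell: str) -> None:
    """Increment the anonymous visit count for one location cell."""
    visit_counts[location_cell] += 1

# Three different players check in at the same PokeStop: the server
# learns the stop is popular, but not who visited or in what order.
for _ in range(3):
    record_visit("pokestop:town_square")

assert visit_counts["pokestop:town_square"] == 3
```

Because only the counter survives, there is no per-person movement history to breach, sell, or de-anonymise – which is the commenter’s point.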