MEF’s Senior Policy and Initiatives Advisor Simon Bates reflects on CES, the huge array of new technologies on display, and how they could impact our privacy and the future of consumer trust.
January may be the coldest month of the year, but you can always rely on CES to warm the hearts of mobile gadget freaks around the world.
There were ultra-thin laptops, gesture sensors, remote controls and football-fields full of gleaming auto-tech. Oh, and smart bras and Bluetooth pregnancy kits are now a thing, apparently.
In the first week of January, a deluge of press releases announces new technologies and devices that will take the world by storm. They run the gamut of ‘smart’, from computers to cars, lightbulbs to lavatories.
Every year, despite the epic diversity of the technology on display, a few key themes emerge. In 2016, commentators zeroed in on drones, robots, virtual reality headgear and – that perennial favourite – wearables.
But there was a story behind the story – a red thread running through each of these disparate technologies – that didn’t get the pick-up it deserved. As Gary Kovacs – CEO of AVG Technologies – points out in MEF’s video below, there won’t be many consumer tech devices launched in 2016 which aren’t connected to the internet and each other.
So for me, the real question to come out of CES was what constitutes privacy in this brave new world of connected devices.
Think about it. What do drones, robots, VR and wearables have in common? If anything, it’s that they all have the ability to hoover up data – and super-sensitive personal information at that. Let’s imagine for a moment how things might look in a not-too-distant future:
- A squadron of drones – kitted out with long-lens video cameras and microphones – buzzes over every town and city, pausing over gardens and street corners.
- Robots are introduced into care homes and hospitals, monitoring the behaviour and condition of their human charges.
- Smartwatches and activity trackers monitor heart rate, blood pressure and the number of words typed per hour – then send a report to insurance companies and employers.
- VR headsets are used in job interviews and performance reviews to simulate ‘what-if’ scenarios with the footage logged, recorded and analysed.
Paranoid? Maybe. Possible? Definitely.
MEF launched its annual Global Consumer Trust Report at CES in association with AVG. The results should serve as both a check on industry over-exuberance and a shot in the arm for those enlightened companies that are prepared to take consumers with them on this exciting journey into the future.
As Gary from AVG observes, mobile apps and services are moving up the ‘anxiety index’ as more negative news stories and anecdotal evidence emerges of security breaches and privacy abuses.
The bottom line is that yes, mobile users share their data, but 41% do so only reluctantly. They know that if they don’t share personal information they won’t get to use the app or service, but that doesn’t mean they’re happy about it.
If an app upsets them or gives them an unpleasant surprise in terms of the information it’s collecting, we now know they don’t take it lying down. 85% said they took action as a result of a privacy or security concern. More than half (52%) delete the app from their device, 21% post a negative review and 15% download a competitor’s app instead.
We are living through a period where consumers are driven by fear of security and privacy breaches. That’s the bad news. The good news is that this fear is giving way to opportunity. Mobile users around the world have signalled they will reward companies that understand their concerns. Almost half (47%) would pay extra for an app that didn’t collect unnecessary information, and 17% would pay a premium of more than 10%.
Two things are clear. One: everyone is now in the data business. Two: in 2016, trust is the new currency. Companies will succeed or fail based on the extent to which they protect their users’ security and privacy.
Our industry can – indeed must – learn from these insights. The scenarios I outlined above could have been lifted straight from 1950s sci-fi paranoia, but consumers will start to treat them as real if they’re not convinced otherwise.
It’s in all of our best interests to reassure consumers – to let them know we understand their concerns – rather than throw new products at them and assume they will embrace them regardless of the privacy and security consequences.
The future may well be bright…but only if we make it so.