Last week, four experts reflected on the profound difficulty of ‘getting digital consent right’. And the huge upside when you do. Here are the highlights of the first MEF Connects Digital live video webinar…
Can a machine give consent? Why do consumers hate privacy policies so much? How can you present consent forms on a small mobile screen? Or no screen at all?
Yes, digital consent is a thorny topic.
But it’s also a pressing one. On the ‘stick’ side, regulators all over the world are mandating new practices around personal data.
On the ‘carrot’ side, enterprises are realising the commercial upside of transparency – which actually encourages customers to share their data.
The MEF is deeply committed to highlighting consent issues.
And last week we hosted our first MEF Connects Digital event on the topic. The live video webinar convened the following four experts to share their thoughts.
- Fabien Venries, Head of Privacy Marketing, Orange Group
- Emily Hancock, Vice President, Legal, Evernote
- Stuart Lacey, Founder & CEO, Trunomi
- Michael Becker, Head of Identity and PIMS Development & Strategy, Assurant
The hour-long conversation revealed several fascinating insights. Here are 13.
Not all consent involves human beings
It’s easy to see consent from an end user point of view. But there’s more to it. Lacey said: “Not all consent involves consumers. There are B2B scenarios, with large conglomerates that might have 500 brands all sharing data across different envelopes. There may not be a human being in these scenarios, but just data controllers or automated processes.”
There are ways to automate consent
Lacey’s company Trunomi is one of a small group developing consent platforms. Here’s how he describes his mission. “We’re developing the ability to capture – using our certification process – the following questions: what am I sharing; who with; for what reason; for how long; in what context?
“We can then create rights that can be shared across a multitude of enterprise requirements using business logic. And those consents can be granted to end consumers but also across large organisations.”
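The five questions Lacey lists map naturally onto a structured consent record. The sketch below is purely illustrative (the class, field names and validity check are assumptions, not Trunomi's actual schema), but it shows how capturing what, who, why, how long and in what context makes consent machine-checkable:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical consent record covering Lacey's five questions:
# what am I sharing; who with; for what reason; for how long; in what context.
@dataclass
class ConsentRecord:
    data_categories: list   # what am I sharing
    recipient: str          # who with
    purpose: str            # for what reason
    granted_at: datetime
    duration: timedelta     # for how long
    context: str            # in what context

    def is_valid(self, now: datetime) -> bool:
        """A consent is only actionable inside its agreed time window."""
        return self.granted_at <= now < self.granted_at + self.duration

record = ConsentRecord(
    data_categories=["email", "location"],
    recipient="brand-42",
    purpose="personalised offers",
    granted_at=datetime(2018, 1, 1),
    duration=timedelta(days=365),
    context="mobile app sign-up",
)
print(record.is_valid(datetime(2018, 6, 1)))  # True: inside the window
print(record.is_valid(datetime(2019, 6, 1)))  # False: consent has lapsed
```

Because the record is explicit data rather than legal prose, the same business logic can evaluate it whether the requester is an end consumer or another data controller inside a large organisation.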
Regulators are facing an ‘astrophysics’ level task
In astrophysics, when three celestial bodies orbit each other, the maths of plotting their orbits is insanely hard. Becker sees a parallel in the regulatory challenge around consent.
He says there are three models available. The open data model says all data should be free and available. The sovereign data model says only the individual should own it. The social engineering model says data should be used to shape society. They are all in conflict with each other. That gives regulators a hellishly difficult task.
Whatever the region, people always react to consent forms the same way: they ignore them
Working for Orange across multiple regions, Fabien Venries has a good view of national approaches to consent policy. But he says that, whatever the geography, people nearly always disregard consent forms.
“The behaviour is always the same: find the close button. They feel there is less value in protecting data than accessing the service.”
Transparency is good for business
Regulation is pushing enterprises to re-examine consent. But so is the commercial incentive. Simply, customers will reward transparency.
Lacey said: “When you expose the data you have it changes the relationships. We have results that prove you get more engaged customers, better personalised service, higher rates of conversion, better retention and higher NPS scores.”
Algorithms are making it hard to know exactly what personal data is
What is personal data? Historically this was easy to answer: name, address, bank details etc. But all that changed less than a decade ago. Now, a range of non-personal data – location maps, browsing history etc – can be combined to reveal a lot about a person.
That makes ‘informed consent’ difficult to get. How can a person give informed consent when the algorithms have not yet drawn their conclusions?
Becker said: “There was an assumption of explicit consent until about six years ago. Then we switched to a correlation model. And things got murky. When you combine data sets, you need to ask to what extent you can be informed.”
We can assume too much knowledge from end users
Becker said: “A vast number of people don’t understand that companies are amassing multiple datasets to create insights and do predictive scoring. They’re doing it so they can better serve us… But done wrong, it can create risks.”
You can scare people with the wrong language
“You have to get out of your own echo chamber,” said Hancock. “You can think something is a really great idea. But is it? You need to test your assumptions. When we re-launched our privacy policy we spent six months doing that… and we’re now in a much better space.”
It helps to look at consent and personal data issues through four lenses
Becker says a good way to think about consent issues is through four frames: technology; legal and regulatory; economic; and ethical/moral. Each lens shapes how we define what is appropriate within an individual’s situational experience.
GDPR could be the template for most regions
While GDPR only affects companies trading in the EU, it could yet influence other regions. Stuart Lacey observed that “GDPR appears to be something Canada, Mexico, Singapore and Brazil are aspiring towards.”
But he reiterated that companies should look to good business practice rather than regulation as the trigger to act.
Consent could be pegged to payments – and that’s worrying
Emily Hancock observed that it’s possible some services could let customers pay in return for sharing less data. She thinks that’s concerning. “That could end up being a social stratification issue. I worry about that.”
Machine to machine consent can be liberating for humans
Imagine having a machine (it could be an app or a Siri-style avatar) that represents you. You could assign it your default consents. It would reduce the burden on you to constantly review policies.
Lacey is excited by such a scenario. He said: “Age gating is a great example. I don’t need to give companies my date of birth to show I’m over 18. I just need a pass fail. And I might choose to have my age persona active at all times for the purpose of consent. This would be an elegant way to give consent enabled by machines. And it’s completely doable.”
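Lacey's age-gating example can be sketched in a few lines. The class and method names below are hypothetical, but they capture the key design choice: the user's agent holds the sensitive attribute (date of birth) and discloses only a pass/fail answer, never the underlying data:

```python
from datetime import date

# Hypothetical personal agent for the pass/fail age gate Lacey describes.
# The date of birth stays with the agent; services receive only a boolean.
class PersonalAgent:
    def __init__(self, date_of_birth: date):
        self._dob = date_of_birth  # private: never shared with services

    def is_over(self, years: int, today: date) -> bool:
        """Answer pass/fail without revealing the date of birth itself."""
        cutoff = today.replace(year=today.year - years)
        return self._dob <= cutoff

agent = PersonalAgent(date(2000, 5, 10))
print(agent.is_over(18, date(2020, 1, 1)))  # True: already over 18
print(agent.is_over(21, date(2020, 1, 1)))  # False: not yet 21
```

A service integrating with such an agent would never see `_dob`; it only learns whether the threshold it cares about is met, which is exactly the data-minimising consent Lacey has in mind.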
Commerce based on data at the individual level will change the world
If people share their data with trust, and do so at an individual level, the results could be astonishing. Becker believes this. He says: “I believe that tenet of needing to serve the individual will form the basis of a new industrial competition by 2020.”
MEF has also published a whitepaper on the topic. A comprehensive guide to digital consent in the personal data economy, it looks at the regulatory requirements as well as the innovative models and technologies being trialled to help businesses implement best practice in data permissions for collecting, sharing and storing data.
MEF’s Consumer Trust Working Group has explored the big questions around digital consent, including: How does the law define it? What is best practice for designing consent forms and receipts? How will new regulation such as GDPR affect it? What new technologies might make managing consent easier?