Last week the FBI requested that Apple unlock an iPhone belonging to one of the killers in the December mass shooting in San Bernardino, California.

Apple has refused, publishing the following letter to its customers:

“The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand. This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.”

Clearly there are privacy and security principles at stake.  But how far should the mobile industry go in balancing personal privacy against national security? We asked MEF members and the wider mobile community for their thoughts.

Andrew Bud, MEF Global Chairman and founder / CEO, iProov

The Apple case opens some serious issues for the mobile industry, which must be explored in a fundamentally political context.

On the one hand, to many people the idea that a liberal, democratic state is prevented from exercising a legally issued warrant to access private property seems extraordinary.  If a bank were to prevent the police from accessing safe deposit boxes, the community would be unlikely to accept that.  A citizen’s mobile device is closely analogous to a personal safe deposit box – to be protected fiercely, but also accessible in the course of a legitimate criminal enquiry.

But once this unlocking code is invented, it cannot be uninvented or denied.  Illiberal regimes will have every opportunity to demand its use.  Yet is it for Apple, or any other major US vendor, to decide what the citizens of another state may hide from their own government?  Is it the role of the mobile eco-system to create greater levels of freedom and liberty than a citizen’s own laws may permit?  Is this benign colonialism, or a moral obligation?  This is a big political and legal question, and one the industry must take care not to arrogate to itself.

And once invented, can it be safe-guarded?  Should the risk of it falling into the wrong hands not be a clinching argument against creating it in the first place?  This argument has raged about many forms of technological progress, and usually society has chosen to solve the problems caused by new technology rather than avoid them. It must be a social choice.

How far would concession affect consumer trust?  In many societies the police right to intrude into personal spaces to investigate serious crime is accepted and even comforting.  But to some, whose outlook is one of limited trust in the integrity of either public or private forces, concession would merely reinforce their doubts about the sanctity of their privacy.

Chris Eng, VP research, Veracode

The issue here is not one of creating a backdoor; nor is the FBI asking Apple to decrypt the data on the phone. It is asking for a software update – one that could be designed to work only on that particular phone – which would then allow the FBI to attempt to crack the passcode and decrypt the data. Such a solution would be useless if applied to any other phone.
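
A rough back-of-the-envelope sketch in Python helps show what is actually at stake in such an update. The per-guess cost below is an assumption for illustration: Apple’s key derivation is reportedly tuned to take roughly 80ms per passcode attempt on the device itself.

```python
# Back-of-the-envelope estimate of passcode brute-force time once the
# retry protections (auto-erase, escalating delays) are removed.
# GUESS_COST_S is an assumed figure: on-device key derivation is
# reportedly tuned to take roughly 80 ms per attempt.

GUESS_COST_S = 0.080  # assumed seconds of key-derivation work per guess

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * GUESS_COST_S / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: ~{worst_case_hours(digits):.1f} hours worst case")
```

On those assumptions a 4-digit passcode falls in under an hour and a 6-digit one within a day – which is why the retry protections, rather than the encryption itself, are what the FBI wants switched off.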

In the past Apple has complied with requests to, for example, bypass lock screens in aid of criminal investigations. It’s only in recent years that they’ve taken an ideological stance on consumer privacy. I believe Apple is taking this position less as a moral high ground and more as a competitive differentiator, betting that Google won’t do the same.

The broader discussion around whether generic back doors should be provided by technology providers to law enforcement is completely different, and the continued backlash against this is fully warranted. There is no way to do this safely without endangering users. Put a different way, if a back door exists for law enforcement, it also exists for criminals. However, this isn’t what’s being proposed in the Apple case right now and it’s important to make the distinction.

Leo Taddeo, Chief Security Officer, Cryptzone [former Special Agent in Charge of the Special Operations/Cyber Division of the FBI’s New York Office]

In its stance against the judge’s order, I think Apple is right in saying there needs to be public discussion.  They are also right in stating that creating a back door would expose their customers to security risks.

But the risks Apple’s encryption will guard against are small and outweighed by the greater public interest in preventing and investigating serious crimes and national security threats.  In the end, there is a balancing of security and privacy.  The encryption that Apple deploys as a default on its newer phones has shifted that balance.  People, as Apple points out, need to decide whether to shift it back by forcing Apple to comply with the judge’s order.

As the deployment of powerful encryption on cell phones becomes more commonplace, more crimes could go unsolved, meaning more criminals will get away and more people will become victims.  The question is whether people continue to view the threat to their privacy as greater than the threat from serious crime and terrorism.

Apple and other companies are in a tough spot. The San Bernardino attacks, unfortunately, won’t be the last act of terrorism or extreme violence.  For every child kidnapping, mass shooting, or terrorist act that involves the use of an iPhone, Apple may have to write another letter to explain why they won’t help the FBI.


Gunter Ollmann, CSO, Vectra Networks

There appears to be some confusion connecting this request from the FBI with the bigger government debate about providing backdoors for encryption. Apple has positioned the FBI’s request as a demand to “backdoor” its product. This is not correct. The FBI’s request is quite specific and does not ask for a universal key or backdoor to Apple products.

Apple likely fears that complying with this request – creating a custom patch for a vulnerable phone – will open the door to subsequent law enforcement requests for support in investigations of similarly vulnerable (older) iPhones. Given the nature of the vulnerability, there is no universal-key approach, and each legal request would likely require substantial involvement from Apple. This would not scale well and could be financially demanding.
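
To illustrate the “no universal key” point, here is a hypothetical sketch of how a custom build could be tied to a single handset. The identifier and function names are invented for illustration; Apple’s actual enforcement mechanism is not public.

```python
# Illustrative only: the kind of per-device gate a custom unlock build
# might contain. TARGET_UDID and read_device_udid() are hypothetical;
# Apple's real mechanism is not public.

TARGET_UDID = "example-device-id-0001"  # the single authorized handset

def read_device_udid() -> str:
    """Placeholder for reading a fused, device-unique hardware ID."""
    return "example-device-id-0001"  # simulated: pretend we are on-device

def disable_retry_limits() -> None:
    """Disable auto-erase and delays, but only on the authorized device."""
    if read_device_udid() != TARGET_UDID:
        raise SystemExit("wrong device: refusing to run")
    print("retry limits disabled on the authorized device only")

disable_retry_limits()
```

Because any such gate would live inside a signed, device-specific image, each new court order would mean another build, test and signing cycle – the scaling and cost burden described above.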

I’m concerned that, because Apple has chosen to deny the FBI’s request by invoking backdoors, the repercussions of losing this legal argument could extend to the entire security industry.

If the denial had been framed in terms of exploiting a (previously fixed) vulnerability, the fallout from a lost appeal would be far more limited – and the arguments against government backdoors and law enforcement keys could be made in their proper context, instead of being folded into appeals about anti-terrorism and a specific instance of a horrific crime.

Brian Spector, CEO, MIRACL

Any process that weakens the mathematical models used to encrypt data will make the whole system less secure, because it will also weaken the protection offered to everyone else. The same vulnerabilities used by intelligence agencies to spy on global citizens can also be used by criminals to steal your passwords. We either enable spying – by governments or by hackers – or we defend against it.
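
A little arithmetic shows why any deliberate weakening is global rather than targeted: brute-force cost is exponential in effective key length, so every bit conceded for lawful access is conceded to every attacker at once. The guess rate below is an assumed figure, chosen only for illustration.

```python
# Brute-force work factor versus effective key length. Each bit shaved
# off halves the attacker's cost for every user of the system, not
# just for the one target an agency has in mind.

ATTEMPTS_PER_SECOND = 1e12  # assumed rate for a well-resourced attacker
YEAR_S = 3600 * 24 * 365

for bits in (128, 80, 64, 40):
    years = 2 ** bits / ATTEMPTS_PER_SECOND / YEAR_S
    print(f"{bits}-bit keyspace: ~{years:.2e} years to exhaust")
```

At 128 bits, exhaustive search is hopeless; at 40 bits it takes about a second. There is no intermediate setting that admits governments but excludes criminals.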

These demands also force tech companies to make some difficult ethical decisions. How can you tell your customers that your products are secure, but also knowingly compromise that security by building backdoors and weakening encryption? For customers to trust your products, their security must come first.

Philip Lieberman, President and CEO, Lieberman Software

It is well known that both the carriers and the manufacturers of locked cell phones maintain their own sets of keys to the devices they sell, within their publicly declared walled gardens.

This barrier to competition – the ability to pick winners and losers in their app stores, and to patch or improve their operating systems at any time – is also the back door that lets them get into any phone and do as they wish, at any time, irrespective of a customer’s wish to maintain privacy or security.
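
A minimal sketch of that argument, assuming Python’s third-party cryptography package (the key and function names here are hypothetical, not any vendor’s real code): a locked device verifies only that an update carries the vendor’s signature, so whoever controls the signing key can push whatever code they choose.

```python
# Minimal sketch of vendor-signed updates, assuming the `cryptography`
# package (pip install cryptography). The device checks only that an
# update was signed with the vendor key - whoever controls that key
# can ship any code at all.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()    # signing key held by the vendor
device_trust_root = vendor_key.public_key()  # public half baked into the handset

def device_install(update: bytes, signature: bytes) -> None:
    """Install an update only if it verifies against the vendor key."""
    try:
        device_trust_root.verify(signature, update)  # raises if invalid
    except InvalidSignature:
        raise SystemExit("update rejected: not signed by the vendor")
    print("update accepted and installed")

# Whoever holds vendor_key can sign anything - benign patch or unlock tool.
update_image = b"any firmware image the key holder chooses"
device_install(update_image, vendor_key.sign(update_image))
```

In that sense the update-signing key already functions as a master key; the open question is whether a court can compel its use.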

It will be interesting to see how all parties respond to a lawful federal order designed to counter terrorism.

Pravin Kothari, CEO and founder, CipherCloud

The Apple-FBI battle is a watershed moment in the larger post-Snowden encryption debate. The requested work-around firmware, which would bypass the iPhone’s authentication process, essentially grants backdoor access that could be leaked and exploited to hack millions of users.

Under Tim Cook, Apple has differentiated its business model on security, as his speech at the President’s Cyber Summit highlighted.  If Apple reverses its stance now and builds the court-ordered firmware, it will undermine the privacy credentials the company has worked so diligently to earn. A reversal would also put Apple in a terrible bind on the global stage by setting a precedent for granting technology backdoors to government agencies. This kind of bypass access to data is exactly what the hotly debated UK Investigatory Powers Bill (IPB) attempts to enshrine in law.

If you would like to contribute a comment to this article, contact us at editorial@mefmobile.org or leave your thoughts in the replies below.
