Kelly Robinson, a former software engineer and engineering manager at Apple Inc. who now runs his own software company, shared with the Collegian what people should understand about the FBI’s request for Apple to help decrypt the phone of San Bernardino terrorist Syed Rizwan Farook. Drawing on his background in software, Robinson said he believes that most of the public misunderstands what the FBI is asking Apple to do, and that without understanding it, people cannot form educated opinions on this hot-button issue.
Using your understanding of software engineering, can you describe what the court order is commanding Apple to do that’s led to this national controversy?
The court order is very straightforward. It says that Apple will be compelled to assist the FBI. The interesting bit is that it follows by defining what that assistance looks like in very specific technical detail.
They are asking Apple to create a specific, new thing, and understanding the design of that thing is the critical point. I think the whole case and public opinion about it hinges on understanding the thing Apple is being asked to create. That’s not commonly understood, and the details aren’t easily accessible.
What is Apple’s reasoning for not wanting to comply with the FBI’s request?
Apple is making two points. The first is that the sort of backdoor they’re being asked to build is something that, by the sheer act of creating it, inevitably compromises every user’s data (that’s hundreds of millions of people): their iPhones, iPads, everything.
The second point is that they are not being asked to supply information they have; those requests were already granted. Instead, what they’re being asked to do is to use their staff, their tools, anything at their command to create something completely new at the behest of the government that can then be used to go find, potentially, and maybe not at all, some additional information.
The court-provided way to do that is to create something that’s never been created before. Imagine Apple being asked to build a new device of the government’s design on the hypothesis that, if Apple could make that design real, it would potentially help the government get information that aids the cause of national security, and the government simply mandating that it be created. That isn’t the same as asking a company to participate in an investigation at some reasonable expense and burden.
I don’t think it’s ever quite been done before. Industry participation in wartime military production might be a closer model than court cases compelling help with investigations.
Can the executive simply command any company or a private individual because they have trade knowledge or market position to go and work, at great expense, to create a tool that might or might not help in a cause? Can it simply command a private industry? On what grounds and to what degree can it or can it not?
Terrorism brings conditions of war into private life, so consumer devices are now entangled. The question is: How do private companies make communications devices? Are they not allowed, as a matter of policy, to implement strong end-to-end encryption?
Stronger or weaker security for the public is likewise stronger or weaker security for the enemy.
The reality is that intelligence agencies have been spying on private life for a long time, essentially without permission. It’s a serious overreach, but it’s hard to eliminate. Private companies have continually increased security until, in the case of Apple and the last few major software releases, the security was good enough to prevent these big surveillance systems from being able to see private user data.
There are some very powerful law enforcement officials who don’t want that kind of unbridled access to go away. So they won’t let it disappear without a fight. That general cause has almost nothing to do with the San Bernardino phone. It’s about a change in policy and an ongoing turf war.
The bottom line on the issue is that end-to-end encryption makes the American people safer. In the final balance, that’s it. And our spies can continue to be spies; they just have to do more work and be more focused. It’s obviously easier to spy when private data is openly available to you, but the tradeoff is exposing that data to enemies, too. Privacy is a matter of principle, but even if it weren’t, the security risk is too great to weaken encryption across the entire system.
By the way, just to speak for the other side: someone like [Director of the FBI] James Comey is sincere. He wants what he wants because he wants to do his job and protect the American people. That doesn’t mean he’s right about the policy. Other strong voices, peers of his, disagree.
Those are public servants in the difficult position of running our intelligence agencies. They are biased toward their work. To admit that is no cheap insult against them or the good work they do. It’s just that the position they are in weighs too heavily on them for them to be writing the policy. The legislative and the executive are separate for a reason.
Is Apple refusing to help the FBI at all in this case, or is it just refusing to create this backdoor into decrypting information?
It’s not that the FBI isn’t getting cooperation. Apple long ago formed a dedicated team whose full-time focus is fielding and reviewing requests from the FBI and other government agencies; such requests are now common. Even in this case, Apple had already fulfilled the government’s requests for the information it holds, a fact that is being overlooked.
Why would it be so hazardous for Apple to assist the FBI by creating this backdoor to decrypting information on its smartphones?
To understand how digital security works is to know that there is no way to build a backdoor for one person to use that can’t be used by everyone else — terrorists, criminals, and malicious regimes. And these days, there’s more information about you on your phone than there is anywhere else. That’s great leverage, if you’re a bad guy.
The iPhone has been quite successful; something like a billion of them have been made and sold all over the world, so the value of being able to read all of the information kept on them is incalculable. The incentive to compromise the system is enormous.
Imagine a law being passed and, all of a sudden, everyone’s homes and offices having invisible cameras and microphones magically installed, with all of that information available not only to the government but to who knows who else: malicious regimes, criminals, or terrorists.
Everything about your private life, medical records, bank accounts, your location and the locations of your family and friends, your media, your personal thoughts, all would be potentially exposed to all of them all the time.
Another point about the way digital security works: you cannot control the use of something like that. Nothing in software is totally secure. It’s more like an economics situation, or an arms race, where the goal is finding a homeostasis: is the cost of getting at something significantly greater than what that thing is worth? If you can maintain that situation, then what you’re protecting is reasonably secure.
So what is the value of all the information on hundreds of millions of people’s devices? Put another way, what would you be willing to pay, if you were an enemy of freedom, to compromise every private citizen in the free world in some way? The value of that is so high as to be incalculable.
The lengths one would need to go to secure it would never be great enough. Its compromise is an inevitability.
That’s why, when Apple talks about this backdoor, you have probably heard them say they feel it is the software equivalent of cancer.
If there were a world where cancer didn’t exist and then you had the ability to create it, if only to somehow contain it, you would never create it at all. You would be inventing cancer — creating a great evil. But unlike a substance in a laboratory, this virtual thing would be impossible to contain.