Thumbthing Writings by Bob Spryn and Roderic Campbell, iOS engineers and founders at Thumbworks

On the "FBI vs Apple" iPhone Case, Encryption, and Healthcare

In the healthcare space we deal with sensitive information; in fact, we deal with arguably the most sensitive personal information there is. That level of sensitivity obliges anyone handling such information to rigorously defend its security. In the US we consider this so critical that we have acts of Congress outlining the lengths to which we must go to protect this private information, and the penalties we will face if we fail to do so.

Today’s mobile phones are the most personal devices ever created. They know extraordinary amounts about those of us who carry them, much of it gathered (for our benefit) without our direct involvement. Our phones hold our banking information and our photos and videos of our children, complete with GPS data. They know our current location, and potentially that of our family and friends. They hold our communications with family, friends, and the professionals we trust to take care of us. They also often know a great deal about our health, including data collected in real time and, potentially, care plans and treatments stored in an app.

As patients, practitioners, and medical software developers, we should all be very concerned about the FBI winning its battle with Apple over the encrypted iPhone and weakening our ability to secure the very information we deem most private. We may not have anything to hide, but we have everything to lose, both personally and professionally.

To simplify the current debate, let us draw a parallel to the physical world. Imagine a security system that completely protected your home. This isn’t an alarm system; it is a system that prevents, without exception, any unwanted guest or intruder from entering your home. The only people who gain entrance are you and those you explicitly give access to. The windows can’t be broken, the lock can’t be picked, the door can’t be forced. Your children are utterly and completely safe from physical harm from outsiders.

Now imagine a house that the FBI, or any law enforcement agency, wanted to enter, and whose owner wouldn’t provide access. The only way in is to ask the manufacturer to weaken or disable the security system. But there is no way to disable or weaken the security system on that one house alone. The only way to create a backdoor into that house is to create a backdoor into every house that uses the same security system, including yours. And once a backdoor exists, it won’t be used only by good individuals for completely just and good reasons.

To be clear: the FBI isn't asking Apple for the passcode to one iPhone; it is asking Apple to create a key that works on every iPhone. The FBI argues that it could keep such a key safe, and that Apple could destroy it afterwards. If you believe the government could keep it safe, ask the 21.5 million federal workers whose Social Security numbers, and in some cases fingerprint data, were leaked in the 2015 Office of Personnel Management breach. Nor is there any world in which Apple would be allowed to delete the software afterwards, as Apple (and others) have pointed out:

“Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case.”

This is the situation we face in the world of software and encryption. There is simply no way to weaken the protection encryption gives us only for the bad individuals, and only when the good individuals’ cause is just.

While no one sympathizes with terrorists and psychopaths, we can’t react by creating a key that grants its holder access to anything and everything they desire, especially when that means putting our security and privacy at serious risk. (Never mind the paradox of weakening security in the name of protecting ourselves: greatly increased individual risk for the “greater good.”)

To steer this back into the world of healthcare, let us consider this from two angles.

As for the professional repercussions of the FBI winning this battle, they’re actually quite simple: if the FBI forces Apple to weaken the security of iPhones, all iPhones will become far more susceptible to hacking, and any data stored in your app more likely to be compromised. If that happens, you will have to deal with the headaches, and the potential penalties and fines, of a HIPAA breach (or a breach under another regulation). An FBI win means the government is forcing us, as health professionals and software developers, to bear the burden of an increased security risk to our reputations, our pocketbooks, and our users’ and patients’ trust.

Second, consider the massive loss of efficiency in medical care, the additional burdens on practitioners and patients, and even the lives lost if, because none of us trust the security of our software anymore, we forfeit the huge benefits software can provide. The health industry is demonstrably behind other industries when it comes to great software, most notably in the mobile world. Consider if even the modest advances made so far were rolled back because we simply did not trust our devices with our most sensitive data. This is a real scenario, especially in healthcare, where patient information breaches carry such risk. Practitioners and insurance companies will decide that the gains, which in some cases might save lives, are simply not worth the risk. Patients may decide the same, to protect the identities and health information of their families.

There are certainly additional ways we can secure the data inside the apps we develop, such that even if Apple’s encryption were compromised, our data would still be protected. But the cost of developing this additional security is not insignificant. And if we continue down this path as a country, this sort of strong encryption may itself end up illegal.
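As an illustration of what we mean, one such approach is to encrypt records inside the app with a key derived from a secret only the user knows, so the data stays protected even if the device’s own encryption is bypassed. The sketch below uses Apple’s CryptoKit framework; the function names are hypothetical, and a production app would want a proper key-derivation function (such as PBKDF2 with a per-user salt) rather than a bare hash.

```swift
import CryptoKit
import Foundation

// A minimal sketch of app-level encryption that does not depend on the
// device passcode. Hypothetical names; not a drop-in HIPAA solution.

// Derive a symmetric key from a user-supplied passphrase.
// NOTE: a real app should use a slow KDF (e.g. PBKDF2) and a per-user salt.
func deriveKey(from passphrase: String) -> SymmetricKey {
    SymmetricKey(data: SHA256.hash(data: Data(passphrase.utf8)))
}

// Encrypt a health record before writing it to disk or syncing it.
func sealRecord(_ plaintext: Data, passphrase: String) throws -> Data {
    let box = try AES.GCM.seal(plaintext, using: deriveKey(from: passphrase))
    return box.combined! // nonce + ciphertext + auth tag in one blob
}

// Decrypt a previously sealed record.
func openRecord(_ sealed: Data, passphrase: String) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: sealed)
    return try AES.GCM.open(box, using: deriveKey(from: passphrase))
}
```

The trade-off is exactly the cost we mention above: key management, passphrase recovery, and the engineering to get all of this right land on the app developer instead of the platform.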

The battle against malevolent individuals, groups, and even foreign nations is real. As John Oliver comically pointed out in a recent segment (audio NSFW), Apple and others are just barely one step ahead of those who wish to compromise our security, constantly and feverishly working to keep prying eyes away from our information. Even Apple has been susceptible to some data breaches (luckily limited in scope so far, unlike the government’s numerous breaches). And now we are going to let the FBI force them to intentionally introduce a gaping hole in the very systems they have made every effort to protect?

There are so many other angles to consider here, such as what happens when other governments (like China’s) demand the same backdoor from Apple, that this already long article could go on much longer. Please read more here, here, here, here, here, here, and here.

There are very real drawbacks to true security and privacy, like potentially not being able to trace certain communications, or to pull evidence off certain devices belonging to bad individuals. But how much security are we willing to give up in our effort to eliminate every dark corner the bad individuals might hide in?

As for Thumbworks, just like countless security experts, computer scientists, and software companies, we stand with Apple.