A small earthquake happened at the end of May – a well-regarded, widely known encryption program called TrueCrypt shut its doors. For those who care about surveillance, encryption, and open-source methodologies, the change was abrupt and disturbing. It’s the type of thing that goes unnoticed by the broader public, but has quiet effects that should not go unremarked. Here’s why:
First, a bit of background. As many have noted, sound and effective encryption of data is one of the most attractive answers to concerns about surveillance (whether your concern is with respect to the US government, a private corporation, or the Russian mob). Well-encrypted data is, using today’s systems, effectively beyond decryption – with a couple of notable exceptions that highlight why TrueCrypt was important.
TrueCrypt was an encryption program for data at rest (that is, data stored, for example, on your hard drive – as distinct from data in motion, like the contents of the email you send to your friend). For data-at-rest encryption programs there are, conceptually, three points of vulnerability. The first is technological – if the program implements its encryption algorithms poorly, or if it uses encryption schemes that are mathematically weak, the encryption may be crackable from the outside. With today’s well-studied algorithms that is generally not a problem, and flaws of this kind are comparatively easy to detect.
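To make the “data at rest” idea concrete, here is a minimal sketch in Python of encrypting a stored file with a vetted, authenticated, off-the-shelf scheme rather than hand-rolled primitives. The library choice (the cryptography package’s Fernet recipe) and the file names are my own illustrative assumptions – TrueCrypt itself encrypted whole volumes, not individual files:

```python
# Sketch: encrypting a file "at rest" with an off-the-shelf authenticated scheme
# (Fernet = AES-128-CBC plus an HMAC integrity tag). Illustration only --
# this is not TrueCrypt's volume-encryption design.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # fresh random key (base64-encoded)
cipher = Fernet(key)

with open("notes.txt", "rb") as f:
    plaintext = f.read()

ciphertext = cipher.encrypt(plaintext)     # IV and integrity tag are bundled in
with open("notes.txt.enc", "wb") as f:
    f.write(ciphertext)

# Anyone holding `key` (and only them) can recover the plaintext later:
assert Fernet(key).decrypt(ciphertext) == plaintext
```

When the primitives come from a well-reviewed library like this, the mathematics is rarely the weak link – which is why the first vulnerability is, in practice, the least worrisome.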
The second vulnerability is at the end-point. Nothing can protect my encrypted data if someone steals (or tricks me into disclosing) the passphrase that decrypts the file. Many of the tools we fear most, like keyloggers (which record every keystroke I type), mount attacks of exactly this sort – aimed not at breaking the program but at looking over my shoulder as I decrypt my files. These vulnerabilities are real, but obviously extrinsic to the encryption program itself. [And, under developing law, compelled disclosure of the passphrase is beyond the power of the US government.]
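To see why the passphrase is the real secret, note that the encryption key is typically derived deterministically from it, so anyone who captures the passphrase (say, with a keylogger) can re-derive the key and decrypt everything. Here is a minimal sketch in Python using PBKDF2 from the cryptography package; the iteration count, salt handling, and passphrase are illustrative assumptions, not TrueCrypt’s actual key-derivation settings:

```python
# Sketch: the encryption key is a deterministic function of the passphrase,
# so stealing the passphrase is as good as stealing the key.
# Illustrative parameters only -- not TrueCrypt's actual key-derivation settings.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

salt = os.urandom(16)  # stored alongside the ciphertext; it is not a secret

def key_from_passphrase(passphrase: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return kdf.derive(passphrase)

owner_key = key_from_passphrase(b"correct horse battery staple")
# An attacker who keylogged the same passphrase derives the identical key:
attacker_key = key_from_passphrase(b"correct horse battery staple")
assert owner_key == attacker_key
```

No amount of mathematical strength in the cipher helps once the passphrase – and with it the key – has walked out the door.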
The third vulnerability is a problem of trust – do you trust the person who built your encryption program to NOT have added a universal backdoor that allows him/her/it to decrypt your files without your knowledge? In a world of limited trust, this question is often the most vital – at least to some in society.
So, for example, Microsoft has a very robust encryption program known as BitLocker. But the implementing code for that program is proprietary, and so the only assurance you have that Microsoft has NOT put a backdoor in the code is that it says it hasn’t. Personally, I trust that assurance – I think it would be too devastating to Microsoft’s business for such a backdoor to be discovered. But I also understand and acknowledge that many people don’t accept the assurance at face value and worry that they can’t test the proposition.
And that is where TrueCrypt came in. It was what I would call a “semi-open-source” program. Developed more than 10 years ago by several anonymous programmers, it was given away for free and became the gold standard for many users – journalists, police, criminals, and protesters alike used it widely. When the program became well known and highly regarded, some who used it began to ask whether the unnamed programmers could themselves be trusted not to have put a backdoor into the program.
In response, the programmers declined to identify themselves (which might have eased concerns). But they did make the underlying code available for an audit. In happy Internet fashion, funds for the code audit were raised through crowdfunding. The first phase of that audit was completed earlier this year with, essentially, a clean bill of health: the auditors found a few minor coding errors, but nothing they could identify as compromising the integrity of the program. Plans were underway (and still are) to complete the second and final phase of the audit later this year.
Which brings us to the recent events. Those interested in cryptography were surprised to wake up one morning and find that the free site where TrueCrypt could be downloaded had shut down. Visitors were redirected to another site and cryptically (yes, the pun is intended!) told that “The development of TrueCrypt was ended in 5/2014 after Microsoft terminated support of Windows XP.”
Even more puzzling, the site offered instructions on how to migrate from TrueCrypt to Microsoft’s BitLocker system. As I said, that system looks quite robust – but its market among those who used TrueCrypt in the first place is very limited. They were, as a group, unlikely to accept the Microsoft substitute because they didn’t trust Microsoft – the very reason they used TrueCrypt. So it is odd indeed to point people to a substitute they are probably not going to want.
Dark theories for the shutdown abound. Some say that TrueCrypt knew it would fail the second part of the audit. Others think that TrueCrypt was penetrated by the NSA and knew it (or, worse yet, that it was an NSA operation from the beginning). Some think that, like Lavabit, TrueCrypt shut down rather than narc its customers out to the Feds.
Most observers are a bit more charitable and suspect that the developers who invested 10 years in keeping the code current simply got tired of doing it. Some users who hold that view are trying to keep TrueCrypt alive on a new site based in Switzerland. Others, of course, are trying to take advantage of its demise by offering a for-pay replacement. Still others, like Amazon, continue to use TrueCrypt because they don’t have a good alternative.
What are we to make of all this? Why, after all, spend 1000 words describing an event that seems of little significance?
Well, for starters, I like a mystery as much as anyone and you have to admit this is a good one. Famous product all of a sudden pulled from the market. Why? Nobody knows. So it’s just a good yarn …
But beyond that, the episode re-emphasizes the challenge of an “open-source” method of security. [Lawyerly footnote: There are some who dispute that TrueCrypt was truly open-source because its license did not allow modification at will. I use the term here to mean that the code could be seen, reviewed, prodded, tested, and deconstructed.] As with Heartbleed, open-source methods work only for as long as volunteers are willing to work on the project. I tend to think that you get what you pay for – and if nobody was paying the TrueCrypt developers, it is not at all surprising that they eventually decided to find a better way to spend their time (heck, maybe one of them got married …).
Third, the incident serves to reemphasize how much the whole Snowden affair has disrupted settled expectations. Pre-Snowden, concerns over encryption were limited to a much smaller minority of folks, and the demise of TrueCrypt would not have been accompanied by grim views of government enforcement. Today, those are commonplace.
Finally, the episode also serves to illuminate how broken our system of security is. We can’t trust the government to provide it; we can’t trust private corporations to provide it; and we can’t rely on the kindness of strangers to provide it either. Unless you are one of the rare individuals who can build and install their own encryption code (I am not!), you are inevitably reliant on somebody else for your security. Yet that somebody is nobody you can trust. And that leaves us hopelessly vulnerable – not just to mistrusted governments but to malevolent actors across the globe. The Russian cyber gangs must rejoice at the demise of TrueCrypt.
Me? I’m still using TrueCrypt until the audit is complete, and I’ll make a decision about a transition then. Others whom I consider reliable, like Bruce Schneier, are shifting to commercial products despite the risk of unknown backdoors. You pays your money and takes your chances. And therein hangs the tale …. We are in a maze of twisty passages ….