There are two major problems with any backdoored cryptosystem, even if the back door is carefully designed to work only for the U.S. government after due process has been applied.
There is no technical difference between a back door and a “front door,” and in this post I will not pretend there is. Anyone who tells you otherwise is lying.
This is the first argument you’ll usually hear when talking with people who actually design and implement cryptosystems. I’ll skim over it here, because the second argument, discussed below, is much more interesting.
Barring serious fundamental crypto breakthroughs, it’s not technically possible to introduce a back door for law enforcement without putting the system’s users at risk. Any back door represents an additional attack vector.
(For example, a hypothetical key store for an encrypted messaging service, even if designed to be used only after due process is applied, represents a huge risk to all users of the system.)
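To make the escrow risk concrete, here is a toy sketch (deliberately NOT real cryptography — it uses a SHA-256 keystream stand-in for a proper cipher, and all names like `ESCROW_KEY` and `escrow_db` are hypothetical). It shows why a central escrow key is a single point of failure: every message wraps its session key for the authority as well as the recipient, so whoever obtains the escrow key can decrypt the entire archive.

```python
# Toy illustration only: do not use this construction for real encryption.
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Derive a keystream from the key via SHA-256 in counter mode (toy),
    # then XOR it with the data. Applying it twice with the same key
    # recovers the original bytes.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

ESCROW_KEY = os.urandom(32)   # held by the hypothetical "key store"
escrow_db = []                # ciphertexts + escrowed keys, archived forever

def send(message: bytes, recipient_key: bytes) -> None:
    session_key = os.urandom(32)
    ciphertext = keystream_xor(session_key, message)
    # Normal path: wrap the session key for the recipient.
    wrapped_for_recipient = keystream_xor(recipient_key, session_key)
    # Back-door path: ALSO wrap it for the escrow authority.
    wrapped_for_escrow = keystream_xor(ESCROW_KEY, session_key)
    escrow_db.append((ciphertext, wrapped_for_escrow))

# Years later, anyone who obtains ESCROW_KEY — lawfully or not —
# can decrypt everything ever archived:
alice_key = os.urandom(32)
send(b"meet at noon", alice_key)
for ciphertext, wrapped in escrow_db:
    recovered_key = keystream_xor(ESCROW_KEY, wrapped)
    print(keystream_xor(recovered_key, ciphertext))
```

The design problem is structural, not an implementation bug: the compromise of one key retroactively exposes every user's traffic, which is exactly the "additional attack vector" described above.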
This is, in my opinion, the more interesting and important argument.
The Economist wrote this weekend:
The place where liberals should fight—and the spooks should concede—is over supervision and due process. Surveillance of individuals should require approval by independent judges, not by politicians.
Thanks to the nature of the Internet, the encrypted messages we send today will be archived forever. But “due process” and independent supervision are malleable over time. As governments and societal attitudes change, strong cryptography with no back door is the only thing that can guarantee the messages we send today remain private.
In the early-2000s United States, one could imagine a suspension of due process in the aftermath of a major terrorist attack or other act of war. In countries across the world, due process may be redefined or stripped away after a regime change, or a previously legal act may become punishable by death.
When that happens, every message ever sent by anyone who used a backdoored system becomes vulnerable.
Over a sufficiently long timetable, we can only trust cryptography, not due process.