I’ve said it once, I’ll say it again: the FBI doesn’t get encryption

James Baker, Alvaro Bedoya, and David Garrow

At last Friday’s Color of Surveillance conference, FBI General Counsel James A. Baker called for checks on government power to prevent the kinds of abuses discussed during the conference’s historically focused panels, notably COINTELPRO, surveillance of the civil rights and anti-war movements, and the intimidation of activist leaders — including Martin Luther King, Jr. But he also asked if we, as a society, are okay with the public security implications of unbreakable encryption. That question, and the assumptions behind it, exemplify why the FBI doesn’t get encryption.

The question assumes that there are objective, measurable costs to unbreakable encryption. More fundamentally, it also assumes that untraceable communication is new, when it is anything but: for most of human history, communication was either oral or written on artifacts that had to be physically transported. Encryption is better understood as a return to the pre-digital status quo, when records were the exception rather than the rule, and the secrecy of correspondence was a sacrosanct principle. Far from “going dark,” we are living in a golden age of surveillance, and we don’t need to go all the way back to the Church Committee’s report to find evidence that institutional checks on the government’s power are insufficient. Yet last week Baker reiterated the bewildering claim that the FISA court is a meaningful check on NSA surveillance, when it serves as a rubber stamp at best.

The FBI argues certain investigations would be easier to conduct if the public communicated in the clear. The recent brouhaha over the San Bernardino shooter’s iPhone 5c provides one such example. But the argument falls apart on close examination.

The security community has said from the beginning that the FBI never needed Apple’s help unlocking the Farook phone. It was always about setting a legal precedent that they can compel companies to break their own products. The Farook phone presented a unique opportunity as a test case: it was very unlikely to contain new useful information (it was a work phone, and Farook had physically destroyed his two personal phones; neither the 30-day iCloud backup nor the call/SMS metadata revealed any non-work activity associated with that device), but it was connected to an ongoing investigation, and the FBI could plausibly argue that the phone might contain crucial information, thus playing on the media and the public’s fear of terrorism and poor understanding of the underlying tech issues. The case’s conclusion supports this interpretation, as does the fanciful nature of some of the FBI’s arguments (I’m still waiting for an explanation of what a “dormant cyber pathogen” might be).

What seems likely is that the FBI’s leadership, which has been fighting a losing battle against encryption for decades, had been waiting for the right opportunity to push for a court ruling requiring a private company to hack its own product, and thought the Farook phone would get them what they wanted. They miscalculated, and found an out that would allow them to save face with respect to the general public. The experts of course know that the FBI has egg all over its face, but the FBI doesn’t care what experts think. It’s going for a political win, and we are clearly living in a post-empirical political environment.

The U.S. is currently governed by fear and emotion, rather than facts. Terrorism and child abuse imagery are probably the most frequently invoked horsemen of the infocalypse these days. Senator Dianne Feinstein — whose anti-encryption draft bill was leaked on Friday as well — has raised the bogeyman of a pedophile communicating with her grandchildren through their gaming consoles to justify banning encryption. But it is just that, a bogeyman: there are many more effective ways to protect children from predators (including appropriate parental controls on devices used by children, and talking to kids about risks both online and in the real world) that neither sacrifice the security and privacy of millions of individuals, nor jeopardize society’s ability to evolve and change. She has similarly claimed that the terrorist cell behind the Paris and Brussels attacks used encryption, when in fact it seems that their op-sec relied on burner phones, insecure tools like Facebook, and face-to-face communication. It seems that European law enforcement’s tragic failure to prevent the recent attacks should be attributed to poor coordination between agencies (much as 9/11 was), not encryption.

So these “costs of unbreakable encryption” are neither proven, nor unavoidable through other means, nor different from the world humanity was stuck with until the 1990s. They certainly aren’t worth the costs of mass surveillance, which are well known. Without going into the abuses by the Stasi, the KGB, or Sisi’s Egypt — parallels that are often rejected in the name of American exceptionalism — the mass surveillance revealed by Snowden in 2013 comes at significant, measurable costs to the American economy. Studies also support the notion that the chilling effect — self-censorship due to the perception of surveillance — is real. As Alvaro Bedoya pointed out in his conversation with Baker last Friday, being a black civil rights activist was effectively against the law during the COINTELPRO era.

As Yochai Benkler laid out in a recent article, “the fact of the matter is that institutional systems are highly imperfect, no less so than technological systems, and only a combination of the two is likely to address the vulnerability of individuals to the diverse sources of power and coercion they face.” We already know that institutional checks on surveillance powers are insufficient, even in democracies, and yes, even in the United States. Unbreakable, ubiquitous encryption is the technical check on surveillance power. The FBI doesn’t need, and should not have, backdoor access to communications, warrant or not. It should find another way to do its job.
