Last week, Deputy Attorney General Rod Rosenstein gave a speech about encryption that prompted a considerable amount of well-deserved blowback. His speech rehashed a number of long-discredited technical proposals for “solving” the “going dark” problem, and it also misstated the law. I won’t address those issues with the speech; they’ve been ably dissected elsewhere, for example by EFF, Techdirt, Robert Graham, and, on this site, Robyn Greene.

I want to focus on the rhetorical framing Rosenstein used. Much of it is transparently hyperbolic. Yet its confrontational tone also signals that the Justice Department believes it may yet be able to seize the upper hand in the current round of the crypto wars.

As in any war, propaganda is an indispensable component here. Branding is key. As cryptography professor Phil Rogaway pointed out in an award-winning paper, even the label “going dark” has a Lakoffian aspect to it, evoking our ancient fear of the dark. When we call this the “going dark” debate (or a “war”), we’re giving more power to that framing. Whoever dictates the labels we use has already begun to channel the discussion in their preferred direction, as Rogaway observed.

What I would brand “strong encryption,” the DOJ likes to call “warrant-proof” encryption. That’s the term Rosenstein uses in this speech. We’re both referring to the same thing: encryption that does not provide a mechanism for law enforcement, or the provider of the encryption, to gain access to plaintext (with or without a warrant). Yet Rosenstein and I use different rhetorical frames, because we have different answers to this question: Should there exist spaces in human society that cannot be policed?

It’s clear what the DOJ’s answer is. There “has never been a right to absolute privacy,” Rosenstein said in his speech. This is an attempt to normalize in Americans’ minds a cramped understanding of how much privacy we have, and a correspondingly expansive view of government power. But it does not reflect reality, even if the DOJ hopes that enough repetition will make us believe it. It intentionally obscures the historical and legal limits on the government’s power to gather evidence.

We can have face-to-face conversations, out of earshot and unrecorded; we can burn letters and documents; we cannot be compelled to incriminate ourselves or to testify against our spouses. “Warrant-proof” is a cute term, but warrants are not magical talismans. The government is not and has never been entitled to absolute surveillance. “Surveillance-friendly encryption” is what law enforcement wants—but that sure isn’t what they’re going to call it.

Instead, Rosenstein calls it “responsible encryption.” He’s reviving a term used in 1996 by then-FBI Director Louis Freeh: “socially-responsible encryption.” “Responsible” here means “capable of granting law enforcement access to plaintext.” By definition, then, end-to-end encryption of communications is irresponsible. Building a smartphone that’s encrypted by default, from which not even its manufacturer can extract plaintext data, is irresponsible. If, as Rosenstein repeatedly tells us, encryption is a dangerous weapon used almost exclusively by wrongdoers, then any tech company providing strong encryption meant to protect its customers from wrongdoers is derelict. He portrays those companies as scofflaws recklessly enabling violent wrongdoers behind a fig leaf of “absolute privacy.”

But “absolute privacy” isn’t those companies’ term; it’s the DOJ’s. It is a straw-man argument. Militantly espousing “absolute privacy” is a stance few entities offering encrypted devices and services actually take. That’s evident from any large tech company’s transparency report on its compliance with government demands for user data. However, by putting words in their mouths, Rosenstein puts the onus on them to deny the accusation. (He’s also turning the focus conveniently away from the fact that the ideas he’s rehashing, such as key escrow, have long been discredited.)

The phrase “absolute privacy” serves another function: it forms one pole of a false dichotomy, opposite this peculiar notion of “responsibility.” Here’s the thing: Responsibility is transitive, not reflexive. It does not exist in a vacuum; it is owed to someone. The phrase “responsible encryption” prompts the question, responsible to whom? The answer lies in Rosenstein’s use of “responsible encryption” rather than Freeh’s term, “socially-responsible encryption.” That truncation is not accidental. To Rosenstein, tech companies must be answerable to law enforcement above all other masters. Any other choice is irresponsible.

We have seen what “responsible” encryption products look like: the Clipper chip, whose notorious security flaws helped to decide the crypto wars of the 1990s. The Clipper chip was responsible to the U.S. government. It was not responsible to its would-be users, who wanted to secure their phone conversations.

There is no room in Rosenstein’s worldview for the notion of tech companies being responsible, answerable, to their legions of everyday users. Rosenstein willfully ignores encryption’s use by millions of ordinary people for completely valid purposes. According to the DOJ, if tech companies aren’t designing to serve law enforcement, then they’re designing to protect pedophiles and terrorists.

Rosenstein does not acknowledge that the DOJ’s vision of “responsible” encryption necessarily requires weakening security for all of those everyday users, though he pays the requisite lip service to “strong cybersecurity.” He instead falls back on the “nerd harder” trope: Silicon Valley is full of wizards who can do anything if they want to, so their naysaying of “responsible” encryption must be simple recalcitrance. Right now, he notes, they’re hard at work on “drones and fleets of driverless cars, a future of artificial intelligence and augmented reality.” “Surely,” he says, “such companies could design consumer products that provide data security while permitting lawful access with court approval.”

That “surely” elides decades of computer security research saying otherwise: no one has shown how to build “backdoored” crypto that remains secure. Rosenstein instead wants us to think there are only two possible motivations for espousing strong encryption: “sincere concern” for privacy and “profit.” Like the “we have never had absolute privacy” line, that “surely” is intended to change what we believe to be true. It’s easier to pretend that the objections to backdoors (a loaded term Rosenstein vehemently disavows) are not about technical realities, but purely about policy choices.

Those policy decisions are driven, Rosenstein insists, by greed. This is where his anti-crypto rhetoric is at its most cartoonish. Tech companies don’t keep improving their encryption designs because they want to provide better security to their many law-abiding users. No, they are interested only in “selling products and making money.” Law enforcement, by contrast, is “in the business of preventing crime and saving lives.” This line would be laughable were it not so insulting. Strong encryption does prevent crime, such as identity theft. Strong encryption does save lives, such as by helping to stymie stalkers. But a speech that portrays encryption only as a weapon can never admit that it functions as a shield too.

The companies providing encrypted products are no more craven money-grubbers than the purveyors of locks, wall safes, self-defense lessons, or pepper spray. Why single them out? Because Rosenstein believes the American public is particularly receptive right now to the message that tech companies’ dedication to their products’ security is insincere and financially motivated.

Two years ago, then-FBI Director James Comey forswore a “legislative remedy” in favor of “continu[ing] the conversations with industry”—that is, pressuring tech companies to “voluntarily” change their encryption designs. Today, Rosenstein says those “efforts to engage” did not “bear fruit.” The time for talking is over. Legislation is the only thing that will get tech companies to, as he euphemistically puts it, become “willing to make accommodations.”

What makes Rosenstein think backdoor legislation is feasible now, when it’s never panned out before? Last year’s backdoor bill died before even being formally introduced in the Senate. One of its many problems was that legislation can’t keep criminals and terrorists from having access to strong encryption. Rosenstein admits this. Yet he reasons that even if “less-used platforms” didn’t comply with an exceptional-access mandate, it “would still be a major step forward” if “only major providers” did so.

The focus on “major providers” is the key to Rosenstein’s confidence that legislation is coming. It’s true that defaults matter, as I’ve explained. But “major providers” of encrypted smartphones and apps don’t just account for a larger slice of the universe of potential criminal evidence. They also have a bigger target painted on them right now.

We are at a different cultural moment than we were two years ago. Popular attitudes toward Silicon Valley’s giants are cooling off. The whiff of regulation is in the air, and Rosenstein is cannily fanning it in the direction of the encryption debate. Post-election, those big tech companies are seen as having sold out Americans, and even American democracy itself. Never mind that political ad buys are hardly the same thing as deliberately weakening encryption for all users. Their profit motive is a vulnerability for Silicon Valley companies right now in a way it hasn’t been before in this round of the crypto wars. Rosenstein is happy to capitalize on that.

Rosenstein’s agency badly miscalculated whom the public would side with when investigators chose to take the “Apple vs. FBI” fight public last year. Now, with “negotiations” with tech companies at an impasse and public opinion of Silicon Valley becoming more hostile, the government sees its chance—one it’s long anticipated. So far, the public and Congress haven’t bought what the DOJ has been pushing. With tech giants falling into disfavor, that might finally change.
