There is a misconception that building a lawful intercept solution into a system requires a so-called "back door," one that foreign adversaries and hackers may try to exploit.
But that isn't true. We aren't seeking a back-door approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process -- front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks.
Cyber adversaries will exploit any vulnerability they find. But it makes more sense to address any security risks by developing intercept solutions during the design phase, rather than resorting to a patchwork solution when law enforcement comes knocking after the fact. And with sophisticated encryption, there might be no solution, leaving the government at a dead end -- all in the name of privacy and network security.
I'm not sure why he believes he can have a technological means of access that somehow only works for people of the correct morality with the proper legal documents, but he seems to believe that's possible. As Jeffrey Vagle and Matt Blaze point out, there's no technical difference between Comey's "front door" and a "back door."
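Vagle and Blaze's point can be illustrated with a toy sketch (everything here is invented for illustration, and a trivial XOR stream stands in for real cryptography): an escrowed decryption capability sees only a ciphertext and a key. Nothing in the mechanism can check a court order or the caller's intentions, so the "front door" and the "back door" are the same door.

```python
def escrow_decrypt(ciphertext: bytes, escrow_key: bytes) -> bytes:
    # Toy XOR "cipher" for illustration only -- not real cryptography.
    # Note what the function takes: bytes in, bytes out. There is no
    # parameter for a warrant, a badge, or the caller's morality.
    return bytes(c ^ k for c, k in zip(ciphertext, escrow_key))


escrowed_key = b"escrowed-key"          # held by the vendor "for law enforcement"
plaintext = b"private data"
ciphertext = bytes(p ^ k for p, k in zip(plaintext, escrowed_key))

# A warrant-bearing agent and an adversary who steals the escrowed key
# invoke the identical code path and get the identical result.
agent_result = escrow_decrypt(ciphertext, escrowed_key)
attacker_result = escrow_decrypt(ciphertext, escrowed_key)
assert agent_result == attacker_result == plaintext
```

The design choice being criticized is exactly this: once the capability exists, access is governed by possession of the key material, not by the legal process that was supposed to gate it.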
As in all of these sorts of speeches, Comey gave examples of crimes that could have been solved had only the police been able to decrypt the defendant's phone. Unfortunately, none of the three stories is true. The Intercept tracked down each story, and none of them is actually a case where encryption foiled an investigation, arrest, or conviction:
In the most dramatic case that Comey invoked -- the death of a 2-year-old Los Angeles girl -- not only was cellphone data a non-issue, but records show the girl's death could actually have been avoided had government agencies involved in overseeing her and her parents acted on the extensive record they already had before them.
In another case, of a Louisiana sex offender who enticed and then killed a 12-year-old boy, the big break had nothing to do with a phone: The murderer left behind his keys and a trail of muddy footprints, and was stopped nearby after his car ran out of gas.
And in the case of a Sacramento hit-and-run that killed a man and his girlfriend's four dogs, the driver was arrested in a traffic stop because his car was smashed up, and immediately confessed to involvement in the incident.
His poor examples, however, were reminiscent of one cited by Ronald T. Hosko, a former assistant director of the FBI's Criminal Investigative Division, in a widely cited -- and thoroughly debunked -- Washington Post opinion piece last month.
In that case, the Post was eventually forced to have Hosko rewrite the piece, with the following caveat appended:
Editor's note: This story incorrectly stated that Apple and Google's new encryption rules would have hindered law enforcement's ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.
Hadn't Comey found anything better since then? In a question-and-answer session after his speech, Comey both denied trying to use scare stories to make his point -- and admitted that he had launched a nationwide search for better ones, to no avail.
This is important. All the FBI talk about "going dark" and losing the ability to solve crimes is absolute bullshit. There is absolutely no evidence, either statistically or even anecdotally, that criminals are going free because of encryption.
So why are we even discussing the possibility of forcing companies to provide insecure encryption to their users and customers?
The EFF points out that companies are protected by law from being required to weaken their security to make the FBI happy.
Sadly, I don't think this is going to go away anytime soon.
My first post on these new Crypto Wars is here.