Apple’s iPhone Encryption

It took the upheaval of the Edward Snowden revelations to make clear
to everyone that we need protection from snooping, governmental and
otherwise. Snowden illustrated the capabilities of determined spies and
confirmed what security experts have preached for years: strong encryption
of our data is a basic necessity, not a luxury.
And now Apple, that quintessential mass-market supplier of technology,
seems to have gotten the message. With an eye to market demand, the
company has taken a bold step to the side of privacy, making strong
crypto the default for the wealth of personal information stored on the
iPhone. And the backlash has been as swift and fevered as it is
wrongheaded.
At issue is the improved iPhone encryption built into iOS 8. For the first
time, all the important data on your phone—photos, messages, contacts,
reminders, call history—are encrypted by default. Nobody but you can
access the iPhone’s contents, unless your passcode is compromised,
something you can make nearly impossible by changing your settings to
replace your four-digit PIN with an alphanumeric password.
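To get a feel for why that settings change matters, here's a back-of-the-envelope
sketch in Swift. It is purely illustrative: the roughly 80-millisecond cost per
guess is an assumed per-attempt delay enforced on the device, not a figure from
Apple, but the arithmetic makes the point.

    import Foundation

    // Back-of-the-envelope arithmetic, not Apple's implementation.
    // Assumption: each passcode guess costs about 80 ms of on-device work.
    let secondsPerGuess = 0.08

    func worstCaseSeconds(alphabetSize: Double, length: Double) -> Double {
        // The keyspace is alphabetSize^length; the worst case tries every combination.
        return pow(alphabetSize, length) * secondsPerGuess
    }

    let pinSeconds = worstCaseSeconds(alphabetSize: 10, length: 4)       // 10^4 PINs
    let passwordSeconds = worstCaseSeconds(alphabetSize: 62, length: 8)  // 62^8 passwords

    print("4-digit PIN, worst case:         \(pinSeconds / 60) minutes")
    print("8-char alphanumeric, worst case: \(passwordSeconds / 31_536_000) years")

Under those assumptions a four-digit PIN can be exhausted in minutes, while an
eight-character alphanumeric password pushes the same brute-force attack into
the hundreds of thousands of years.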
Rather than welcome this sea change, which makes consumers more
secure, top law enforcement officials, including US Attorney General Eric
Holder and FBI director James Comey, are leading a charge to maintain
the insecure status quo. They warn that without the ability to crack the
security on seized smartphones, police will be hamstrung in critical
investigations. John Escalante, chief of detectives for Chicago’s police
department, predicts the iPhone will become “the phone of choice for
the pedophile.”
The issue for law enforcement is that, as with all strong crypto, the
encryption on the iPhone is secure even from the maker of the device.
Apple itself can't access your files, which means that, unlike in the past,
the company can't unlock them for law enforcement, even when presented
with a valid search warrant.
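The principle at work is easy to sketch. Here's a toy illustration in Swift
using Apple's CryptoKit framework (illustrative only, not how iOS file
encryption is actually built): data sealed with a key that never leaves the
device is opaque to anyone else, manufacturer included.

    import CryptoKit
    import Foundation

    // Toy illustration of the general principle, not Apple's file encryption.
    let deviceOnlyKey = SymmetricKey(size: .bits256)        // exists only on the device
    let secret = Data("photos, messages, call history".utf8)

    // Seal the data with the device-only key.
    let sealed = try! AES.GCM.seal(secret, using: deviceOnlyKey)

    // The key holder can decrypt.
    let opened = try! AES.GCM.open(sealed, using: deviceOnlyKey)
    print(String(data: opened, encoding: .utf8)!)

    // Anyone without that exact key, Apple included, gets nothing back.
    let someoneElsesKey = SymmetricKey(size: .bits256)
    if (try? AES.GCM.open(sealed, using: someoneElsesKey)) == nil {
        print("ciphertext stays opaque without the original key")
    }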
That has led to a revival of a debate many of us thought resolved long
ago, in the crypto wars of the 1990s. Back then, the Clinton
administration fought hard to include trapdoor keys in consumer
encryption products, so law enforcement and intelligence officials—NSA
being a chief proponent—could access your data with proper legal
authority. Critics argued such backdoors are inherently insecure.
Trapdoor keys would be an irresistible target for corrupt insiders or
third-party hackers, and would thus make Americans more vulnerable to
criminals, foreign intelligence services, corrupt government officials, and
other threats. Additionally, foreign technology companies would gain a
competitive advantage over their US counterparts, since they'd be under no
obligation to weaken their crypto.
The feds lost the crypto wars, but without serious consumer demand,
strong encryption has crept onto our gadgets only for narrow purposes,
like protecting Internet transactions. Until iOS 8, the iPhone encrypted
email and calendar entries, but little else. Now that Snowden's revelations have
reinforced just how vulnerable our data is, companies like Apple and
Google, both of which were painted as NSA collaborators in the earliest Snowden
leaks, are newly motivated to demonstrate their independence and to
compete with each other on privacy.
However it got there, Apple has come to the right place. It’s a basic
axiom of information security that “data at rest” should be encrypted.
Apple should be lauded for reaching that state with the iPhone. Google
should be praised for announcing it will follow suit in a future Android
release.
And yet, the argument for encryption backdoors has risen like the
undead. In a much-discussed editorial that ran Friday, The Washington
Post sided with law enforcement. Bizarrely, the Post acknowledges
backdoors are a bad idea—“a back door can and will be exploited by bad
guys, too”—and then proposes one in the very next sentence: Apple and
Google, the paper says, should invent a “secure golden key” that would
let police decrypt a smartphone with a warrant.
The paper doesn’t explain why this “golden key” would be less
vulnerable to abuse than any other backdoor. Maybe it’s the name,
which seems a product of the same branding workshop that led the
Chinese government to name its Internet censorship system the “golden
shield.” What’s not to like? Everyone loves gold!
Implicit in the Post’s argument is the notion that the existence of the
search warrant as a legal instrument obliges Americans to make their
data accessible: that weakening your crypto is a civic responsibility akin
to jury duty or paying taxes. “Smartphone users must accept that they
cannot be above the law if there is a valid search warrant,” writes the
Post.
This talking point, adapted from Comey’s press conference, is an insult
to anyone savvy enough to use encryption. Both Windows and OS X
already support strong full-disk crypto, and using it is a de facto
regulatory requirement for anyone handling sensitive consumer or
medical data. For the rest of us, it’s common sense, not an unpatriotic
slap to the face of law and order.
This argument also misunderstands the role of the search warrant. A
search warrant allows police, with a judge’s approval, to do something
they’re not normally allowed to do. It’s an instrument of permission, not
compulsion. If the cops get a warrant to search your house, you’re
obliged to do nothing except stay out of their way. You’re not compelled
to dump your underwear drawers onto your dining room table and slash
open your mattress for them. And you’re not placing yourself “above
the law” if you have a steel-reinforced door that doesn’t yield to a
battering ram.
You have to feel for Apple. The company’s slovenly security on iCloud
made it the butt of jokes for weeks after the leak of celebrity nude
photos. Before that, the “goto fail” bug drew guffaws from computer
security experts and inspired mocking T-shirt designs. With the release
of iOS 8, Apple made a privacy improvement so dramatic that it should
rightly wipe out the taint of these security failures. Instead, the company
is bashed by the nation’s top law enforcement official and the editorial
board of one of the country’s most prestigious newspapers.
Yes, some investigations will be frustrated by strong crypto on iPhone
and Android devices. Some criminals who otherwise would be convicted
will get away. But cops will still seize plenty of phones in an unlocked
state. Of the others, many crooks will choose insecure passcodes, or
jot them down on a Post-it note. Still more will hand over their passcodes
or unlock their phones voluntarily in the hope of buying leniency;
experienced cops are adept at convincing suspects to cooperate against
their best interests.
There’s even a growing body of case law saying suspects can be
compelled by the court to surrender their crypto keys under certain
circumstances, despite the protections of the Fifth Amendment. That has
its own issues, but at least the suspect gets a chance to be heard in court
before a search is conducted, instead of after, as with the current search
warrant regime.
On balance, smartphones have been a gold mine for police, and the mild
correction imposed by serious crypto will still leave the cops leaps and
bounds ahead of where they were seven years ago, while making
everyone more secure from the overreach of the authorities and the
depredations of criminal hackers. The law enforcement officials
criticizing Apple should put aside the sense of entitlement they’ve
developed in those seven years and spend some time thanking Apple and
Google for making things so easy for them for so long.
