It would be useful if you could really trust code running in your browser. It’s not obvious that this is possible; but it’s not obvious that it isn’t, either.

Google just announced End-to-End, an OpenPGP implementation destined for life as a Chrome extension that might add Strong Privacy to your Gmail life. Similarly, Keybase offers a slick in-the-browser encrypt/decrypt/sign experience. Both of these do encryption in JavaScript. If this can be made useful and safe, it’ll be amazingly valuable in extending privacy to everyone. But maybe it can’t.

There are two classes of reasons people don’t want to trust browser-based crypto: political problems and technical problems.

The political problem · This is when a government employee shows up at Google or Keybase with a legal document that says “You have to modify your code so that every time a user types the password to unlock their secret key, you send us a copy.” There goes your privacy. Don’t think something like that could happen? It already has.

This is tough territory; in many apparently-civilized countries, law enforcers can get a warrant that not only requires someone to spill the beans, it requires them to keep the spilling a secret. Which is reasonable on the face of it: If I’m mounting an intercept on someone I think is smuggling slave laborers or AK-47s, I don’t want the bad guy to find out. It becomes unreasonable when the spooks want to know everyone’s secrets; recently, it seems that that’s exactly what they want.

Countermeasure: Canary doc · Some people have thought out loud about the notion of “Canary documents”. The first example I encountered was a public librarian somewhere in the US who posted a sign saying something along the lines of “So far, no FBI agents have requested information on the library use of any customers.” The idea is, when the sign comes down you should worry.

rsync.net, a secure-offline-backup provider, has a more sophisticated warrant canary: a document, signed with their key, asserting that no warrants have been served on them, and containing headlines-of-the-day scraped from a well-known newspaper to prove it was produced recently.

Which is actually pretty clever. Would it work? I doubt it; I am not a lawyer and have no idea whether, in any particular jurisdiction, law enforcers can get a warrant requiring you to lie. But I’m sure that in lots of jurisdictions it’s no problem, and that in even more, they can require rsync.net to turn over the private key, so they can go on publishing the statements whether rsync likes it or not.
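The legal question is open, but the mechanics are simple. Here’s a minimal sketch, in Node.js, of what an automated canary checker might look like; the file names, the Date: header, and the thirty-day freshness window are all invented for illustration, and I’m assuming an Ed25519 signing key.

```js
// Hypothetical canary checker. It fails safe: a bad signature, a
// missing date, or a stale statement all count as reasons to worry.
const crypto = require("crypto");
const fs = require("fs");

// The provider’s public key, fetched once out-of-band and pinned locally.
const publicKey = crypto.createPublicKey(fs.readFileSync("canary-pubkey.pem"));

const canaryText = fs.readFileSync("canary.txt"); // the signed statement
const signature = fs.readFileSync("canary.sig");  // detached signature

// 1. The statement must verify against the pinned key.
//    (A null algorithm selects the key’s native one, e.g. Ed25519.)
if (!crypto.verify(null, canaryText, publicKey, signature)) {
  throw new Error("canary signature invalid; assume the worst");
}

// 2. The statement must be fresh: a stale canary is a dead canary.
const match = /^Date: (.+)$/m.exec(canaryText.toString());
if (!match) throw new Error("no date in canary; assume the worst");
const ageDays = (Date.now() - Date.parse(match[1])) / 86400000;
if (ageDays > 30) console.warn("canary is stale; start worrying");
```

The headlines-of-the-day serve the same purpose as the date line but can’t be forged in advance; checking them automatically is harder, which is presumably why a human is meant to eyeball them.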

Countermeasure: Safer hosting · This is the notion that you locate your business in a jurisdiction where you think it’s less likely that overenthusiastic public-safety officials will decide that their duties require you coughing up everyone’s secrets. Germany and Iceland spring to mind — the USA no longer does, sadly — but I’m no expert in this stuff. Note that it doesn’t really matter much where your servers are; it’s where you live and work and get paid.

The “get paid” part matters and is why this may not be a good solution; if you are offering privacy services as part of a business, it’s hard to make a living while ignoring the United States market. And if officials there don’t like your privacy policies, they can probably just shut down your biz there.

Countermeasure: Verifiable transparency · Take a few minutes and read Keybase’s Server Security document. The cryptographic details are hard to understand (for me too) but the goal is obvious: They’re trying to make it possible to use Keybase without actually trusting it.

Keybase has always done this, to some extent: They offer assertions that the holder of some private key also controls a particular Twitter account and/or GitHub account and/or domain name. You don’t have to believe their assertions because the workings of public-key cryptography mean you can double-check for yourself.

But suppose, once again, that they get a threat (criminal or legal) that forces them to compromise their code with the effect of misleading people or spilling secrets? The Keybase server offers a signature chain that allows an external site to verify whether or not any of the assertions have been changed.

Of course, for this to work, the external site has to download and remember the existing signatures in the chain. If even a small number of external sites started remembering Keybase signature chains as a service, my feeling is that this would make subverting the service essentially impractical.
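To make the remembering concrete, here’s a toy sketch; this is not Keybase’s actual wire format, and the link shape is invented, but it shows why an append-only chain of hashes is hard to rewrite quietly.

```js
// Each link commits to the hash of the previous one, so history can
// only be extended, never silently edited, without breaking the chain.
const crypto = require("crypto");

const sha256 = (s) => crypto.createHash("sha256").update(s).digest("hex");

// Verify that `chain` (oldest link first) is internally consistent and
// still contains `rememberedHash`, the last link hash we stored on a
// previous visit. A real verifier would also check each link’s
// signature against the server’s signing key.
function checkChain(chain, rememberedHash) {
  let prev = null;
  let seen = false;
  for (const link of chain) {
    if (link.prev !== prev) return false; // history was rewritten
    prev = sha256(JSON.stringify(link));
    if (prev === rememberedHash) seen = true;
  }
  return seen; // our old snapshot must still be in there
}
```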

But, while this probably protects the integrity of the cryptographic assertions you can fetch from Keybase, it doesn’t actually protect the JavaScript code that you have to type your password into if you want to do in-the-browser encryption or signing.

The technical problem · Modern JavaScript executing in a modern browser is an incredibly flexible, malleable, open-ended system. It’s reasonable to worry that when you use your secret key, even if it doesn’t get sent off to the government for legal reasons, a software bug or a malicious hack might end up sending it somewhere you really don’t want it going.

For a good expression of this opinion, read Matasano Security’s Javascript Cryptography Considered Harmful. It’s sort of depressing.

Now, I lack the expertise to address the issues Matasano raises point by point; but there are a few that seem pretty bogus, and as a whole I have to say that I’m not entirely convinced. Obviously, a secure JavaScript crypto implementation is difficult... but impossible? Just in the last few years, I’ve seen JavaScript doing all sorts of stuff that I would have sworn was impossible, and some of the most brilliant people in the software biz are systematically engaged every day in pushing its boundaries back.

So yeah, such an implementation would have to worry a lot about issues of secure delivery and context-driven behavior and runtime malleability and entropy sources and so on. It might not even be possible without some browser modifications, and Matasano makes hay over this one, talking about the crushing weight of people running 2008-vintage browsers.

But even in that worst case, a whole lot of people are running modern browsers that keep themselves up to date reasonably well; and if they’re the only ones that can have in-browser-crypto, that’s way better than nothing.

So, for the sake of this argument, let’s assume Matasano is wrong and good strong in-the-browser crypto is technically possible.

By the way, if (and quite likely when) there is news relevant to whether or not browser crypto is technically feasible, I’ll update this post.

But anyhow, it doesn’t help much without a remediation for the political problem: Someone points a gun at Google’s head and says “Wonderful that you made that end-to-end code safe and reliable. Now make it unsafe and give us the goodies, or it’s jail time for Larry and Sergey.”

Countermeasure: Verifiable code · JavaScript, by its nature, requires that the source code of an app be downloaded before it’s executed. Suppose that we had a particular version of an app that had been carefully audited and everyone was convinced was acceptably secure and back-door-free. Suppose we had a way to apply a digital signature to that version and then, whenever someone wanted to run it, a way to check the signature to make sure we were about to run the right version. This smells like a pretty interesting solution to me.
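Here’s a rough sketch of that idea using the browser’s built-in WebCrypto API. The URLs are invented, and pinnedKey is assumed to be a CryptoKey imported out-of-band, since the whole point is not to fetch the trust anchor from the party you’re verifying.

```js
// Fetch a script plus a detached signature, verify against a pinned
// public key, and only then let the bytes anywhere near the JS engine.
async function loadVerifiedScript(scriptUrl, sigUrl, pinnedKey) {
  const [code, sig] = await Promise.all([
    fetch(scriptUrl).then((r) => r.arrayBuffer()),
    fetch(sigUrl).then((r) => r.arrayBuffer()),
  ]);

  const ok = await crypto.subtle.verify(
    { name: "ECDSA", hash: "SHA-256" },
    pinnedKey, // imported earlier via crypto.subtle.importKey
    sig,
    code
  );
  if (!ok) throw new Error("signature check failed; refusing to run");

  // Execute exactly the audited, signed bytes and nothing else.
  const script = document.createElement("script");
  script.src = URL.createObjectURL(new Blob([code], { type: "text/javascript" }));
  document.head.appendChild(script);
}
```

There’s an obvious bootstrapping problem, of course: the loader itself is code you had to get from somewhere. But a small, rarely-changing loader baked into a browser or an extension is a much easier thing to audit than a whole application.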

So I looked around to see if anyone was doing this. Well, sort of. I include the first thing I discovered mostly for amusement value: Signed Scripts in Mozilla, prominently labeled as probably not updated since 1998, and with references to “Netscape Object Signing” and something called SignTool. All it proves is that at some previous point in history, somebody thought this was a good idea.

People still do, in some contexts: Over at Firefox Marketplace there’s a writeup on Packaged apps, where all the resources come out of a zipfile. It says: “privileged and certified apps are digitally signed to enable the use of privileged and certified APIs. Privileged apps are signed as part of the Marketplace review process, while certified apps are signed by device manufacturers or operators.”

I wondered if there were something similar on the Chrome side. I dug around in the docs and, sure enough, a Chrome extension is signed with the developer’s private key. (I couldn’t find anything similar for Chrome apps as opposed to extensions, but maybe I was just looking in the wrong place).

So this notion of code signing is not radical in the slightest; is anyone working on making it accessible for arbitrary chunks of application JavaScript?

Versioning · Of course, software doesn’t stand still, if only because we keep finding bugs in it that we have to fix. So I think that if we were going to dive into signed-JS, we’d probably want to use a Merkle tree or hash list or something, so that when someone releases version 1.4 of a library, we only need to review the diffs to maintain our confidence.
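A hash-list version of that might be as simple as signing a manifest of per-file hashes for each release; to trust 1.4 after auditing 1.3, you re-review only the files whose hashes changed. The manifest shape here is invented for illustration.

```js
// Build a manifest of per-file hashes for a release. In practice the
// releaser would sign the manifest as a whole, and verifiers would
// check that signature before trusting any of the entries.
const crypto = require("crypto");
const fs = require("fs");

const fileHash = (path) =>
  crypto.createHash("sha256").update(fs.readFileSync(path)).digest("hex");

function manifest(files) {
  const entries = {};
  for (const f of files) entries[f] = fileHash(f);
  return entries;
}

// Which files actually need a fresh audit between two releases?
function changedFiles(oldManifest, newManifest) {
  return Object.keys(newManifest).filter(
    (f) => oldManifest[f] !== newManifest[f]
  );
}
```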

I don’t see anything here that’s really technically impossible. I hope I’m not missing anything, because the task of making Strong Privacy available to everyone is going to be way easier with decent browser crypto.



Contributions


From: John (Jun 10 2014, at 03:05)

My reading of the Lavabit situation is that they were asked for private keys, not to modify code. While I wouldn't put anything past the government, I don't actually think they have tried to pull this off yet. I don't think courts would approve it.

Also, I think all mail readers will allow you to use this heightened security. So if you don't trust Google JavaScript, use a different mail reader.

Furthermore, it's pretty easy for hacker types to check JavaScript for holes. If the govt were to ever start this kind of hacking, they'd be found out pretty quick.


From: Keith Wansbrough (Jun 10 2014, at 04:59)

"...so that when someone releases version 1.4 of a library, we on­ly need to re­view the diffs to main­tain our con­fi­dence."

Historically that hasn't always worked out so well. Remember this one: http://anonscm.debian.org/viewvc/pkg-openssl/openssl/trunk/rand/md_rand.c?r1=140&r2=141&pathrev=141 where the second change is just fine, but the first change was disastrous for SSH keys.

