Thursday, July 24, 2008

"Encryption Chip" Will Not End Piracy

Nolan Bushnell is full of it. There, I finally got that off my chest. It's arguably acerbic and rather rude, but it needed to be said. You have no idea how hard I've tried to avoid saying it. After all, he's the founder of Atari and a historical figure in his own right. He deserves a certain respect for that.

The first time Nolan Bushnell claimed that the "encryption chip" will end piracy, I exercised due restraint. His statement reverberated all over the Internet, causing reactions that ranged from mild skepticism on one end of the spectrum to derision and disgust on the other.

So why am I writing now, more than two months later, if nobody believed him in the first place? In other words, why am I beating a dead horse? Partly, it's because he did it again and it pisses me off. Mostly, though, it's because I'm rather interested in copy protections and security; it's sort of a hobby of mine.

The most important lessons you learn in those two fields are that no protection is perfect and that every solution spawns a new class of problems. This means there will never be one final (technical) solution to the issue of piracy; there is no silver bullet. The experts in both fields are locked in an arms race with their adversaries. Once you've learned that, you'll have no problem recognizing that Nolan Bushnell is really just flogging his merch.

However, the issue runs deeper than that.

Copy Protection and Security

When I referred to copy protections and security, I said "two fields", even though one can be considered a subset of the other; after all, copy protections are supposed to prevent the unauthorized use of software. Even though this is technically true, there are some drastic differences between the two.

An important difference is the level of cooperation from the users. When it comes to information security, the users actively cooperate with the protection systems, because it's in their best interest. You don't give access to your bank account to all your friends, do you?

On the other hand, copy protections often clash with the users' interests. Some of these interests are illegal, such as downloading a commercial game for free. But other interests are quite legal and legitimate. You added more memory to your computer? Odds are you might have to reactivate your Windows.

Another important difference is that a copy protection has to protect an application that lives on the user's computer. Unless we're talking about an MMOG, there's no server counterpart that executes a critical piece of code, without which the game can't work.

When you put those two things together, it becomes obvious why you can't make a perfect copy protection: you're relying on cooperation from a user who has complete control over his copy of your content or software. If that user doesn't want to cooperate, the best you can do is delay him. Even unbreakable ciphers won't help you, because sooner or later you'll have to decrypt the content and, when you do, the user will nab it.
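To make the point concrete, here's a minimal sketch of why client-side encryption alone can't protect content. All names and the toy cipher are my own invention for illustration; real schemes are far more elaborate, but they share the same fatal structure: the key and the ciphertext both end up on the user's machine.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key (toy stream cipher)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR with the keystream; encryption and decryption are symmetric."""
    return bytes(c ^ k for c, k in zip(ciphertext, keystream(key, len(ciphertext))))

# The vendor ships BOTH of these to the user's machine:
SHIPPED_KEY = b"hidden-somewhere-in-the-binary"
plaintext = b"the actual game assets"
shipped_ciphertext = bytes(
    c ^ k for c, k in zip(plaintext, keystream(SHIPPED_KEY, len(plaintext)))
)

# The "protected" player must decrypt before use -- and so can the user,
# because everything needed to do it already lives on their disk:
nabbed = decrypt(shipped_ciphertext, SHIPPED_KEY)
print(nabbed)  # → b'the actual game assets'
```

No matter how strong the cipher, hiding the key better only delays the moment someone digs it out of the binary or dumps the decrypted data from memory.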

But what if you could alter these conditions? You could make sure that there's a critical part of an application that executes somewhere where the user doesn't have control over it: that's what MMOGs do. The other option is to take away the control from the user.

Trust Controversy

Enter "trusted computing". The first time I heard of it was back when Microsoft was touting Palladium. Back then, it sounded like a bad pun: a company found guilty in an antitrust lawsuit is proposing to build a "trusted computing platform" for its users. The irony was not lost on anyone and it provoked some enlightening responses from security experts.

Then, since nothing really seemed to happen and we didn't all suddenly wake up in some digital equivalent of 1984, I lost track of this topic for a while. I forgot about it until Nolan Bushnell started his TPM hype. A quick search engine query revealed that TPM stands for "Trusted Platform Module" and that it's the central component of "trusted computing".

What, then, is the so-called "trusted computing"? It's a technology that encompasses the following concepts:
  1. The endorsement key is a cryptographic key pair unique to one computer. Its chief use is to prove the computer's identity.
  2. Secure I/O makes sure that the communication between the user and their software is secure and cannot be intercepted or altered.
  3. Memory curtaining protects those parts of memory that contain sensitive data (such as cryptographic keys) from unauthorized access, even by the operating system itself.
  4. Sealed storage binds data to the specific platform -- both hardware and software -- so that you cannot access it from any other platform.
  5. Remote attestation allows authorized parties to detect changes to the platform configuration in order to make sure that they meet the expected parameters; in other words, to prove that nobody tampered with the platform.
That's just a brief summary, to give you an idea of what we're talking about here. If you want more information, I recommend that you start at Wikipedia and then go on directly to the Trusted Computing Group site.
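Remote attestation, the last concept above, rests on a simple mechanism worth sketching: each boot component is hashed into a register before it runs, so the final register value summarizes the whole chain. This is a toy model in plain Python (real TPMs use dedicated hardware registers, and TPM 1.2 used SHA-1 rather than the SHA-256 I use here), but the extend-and-compare logic is the essence of it.

```python
import hashlib

def extend(register: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new value = hash(old value || hash(measurement)).
    Order matters, and no sequence of extends can be undone or reordered."""
    return hashlib.sha256(register + hashlib.sha256(measurement).digest()).digest()

BOOT_CHAIN = [b"bootloader v1.0", b"kernel v2.6", b"os loader"]

# Measure each boot component in order, starting from a zeroed register.
register = bytes(32)
for component in BOOT_CHAIN:
    register = extend(register, component)

# A remote verifier who knows the approved components computes the same value...
expected = bytes(32)
for component in BOOT_CHAIN:
    expected = extend(expected, component)
assert register == expected  # platform configuration checks out

# ...while any tampering anywhere in the chain yields a different value.
tampered = bytes(32)
for component in [b"bootloader v1.0", b"patched kernel", b"os loader"]:
    tampered = extend(tampered, component)
assert tampered != expected  # verifier detects the modification
```

The register value is then signed (ultimately chaining back to the endorsement key) and sent to whoever asked, which is how a remote party "knows" what software you're running.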

So, the core idea is to make computers more secure, by ensuring that no "untrusted" code has access to your stuff. At least, that's supposed to be the core idea. Unfortunately, there has been a great deal of confusion about the word "trust" in "trusted computing". Specifically, who is supposed to trust whom?

If you read Bruce Schneier's essay on "trusted computing", you'll notice that there's a good deal of controversy and confusion surrounding the issue. As one commenter so aptly put it, the only one not trusted seems to be the owner of the computer.

All Your Base

Each of the five concepts of "trusted computing" addresses a real security problem:
  1. Endorsement keys would be used to mitigate spoofing concerns in secure transactions by establishing the identity of each party involved.
  2. Secure I/O is supposed to avoid security breaches through techniques such as keylogging.
  3. Memory curtaining would make sure that sensitive information, such as cryptographic keys, is not allowed to "leak" somewhere where it could be extracted by malicious parties.
  4. Sealed storage would do a similar thing for sensitive information in non-volatile storage.
  5. Remote attestation could help network administrators easily detect intrusions and attacks on their machines.
Yet, after a closer look at them, it becomes evident that there's plenty of room for abuse. Imagine, for example, a system that enforces specific usage policies on your data:
  • It would use sealed storage to bind that data to a particular application or set of applications that you're allowed to use on that data.
  • It would employ memory curtaining to make sure you cannot extract that data directly from memory.
  • It would use secure I/O to make sure you cannot intercept it on its way somewhere else.
  • It would use remote attestation to report if you tamper with any part of the system.
  • And it would clearly identify you as a "culprit" to whoever is interested in enforcing those policies, if it possessed both your personal information and your endorsement key.
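The sealed-storage piece of that scenario is easy to sketch. Here's a toy model (my own simplified construction, not the actual TPM sealing API): the data key is derived from the measured platform state, so the data only decrypts on the exact hardware-and-software configuration the policy approves of.

```python
import hashlib

def seal_key(platform_state: bytes, chip_secret: bytes = b"tpm-internal-secret") -> bytes:
    """Derive a data key from a secret that never leaves the chip
    plus the current measured platform state (toy model)."""
    return hashlib.sha256(chip_secret + platform_state).digest()

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: XOR against a key-derived stream."""
    blocks = (hashlib.sha256(key + i.to_bytes(4, "big")).digest()
              for i in range(len(data) // 32 + 1))
    stream = b"".join(blocks)[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

# The policy blesses exactly one configuration:
approved_state = hashlib.sha256(b"approved OS + approved player").digest()
sealed = xor_crypt(b"protected media", seal_key(approved_state))

# On the approved platform, the data unseals...
unsealed = xor_crypt(sealed, seal_key(approved_state))
assert unsealed == b"protected media"

# ...but add a hex editor or a debugger, the measurement changes,
# the derived key changes, and the data stays opaque.
modified_state = hashlib.sha256(b"approved OS + debugger").digest()
assert xor_crypt(sealed, seal_key(modified_state)) != b"protected media"
```

Note that the same mechanism that "protects your data from malware" also lets a vendor decide which applications, on which platform, are allowed to open your own files.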
Is there any kind of usage policy that springs immediately to mind? There are two, actually: DRM and vendor lock-in. Ross Anderson describes several ways to abuse TC in his FAQ. Richard Stallman dedicates a whole chapter to this topic in his book "Free Software, Free Society"; although slightly reminiscent of the Book of Revelation in tone, it offers some very interesting insights.

Another interesting aspect of "trusted computing" is that it actually raises the stakes in information security: imagine a worm that successfully exploits a bug in the supposedly secure OS code to install a "trusted" rootkit. Talk about irony.

Pirates vs. Ninjas

Getting back to the original topic, does this mean that Nolan Bushnell is right? Is his "stealth encryption chip" really going to send all the pirates to Davy Jones's Locker? Not by a long shot! Remember: if the software in question doesn't have some critical code running on a computer under the control of some "authority", you can eventually break its copy protection.

When it comes to policy enforcement, the most important part of "trusted computing" is remote attestation. It is what ensures you won't tamper with the policy enforcement code. Incidentally, it requires you to be online. Now back up a couple of months and remember what happened when BioWare tried to pull that trick on its players.

If you believe that pirates can't hide behind this argument forever, think again. There are numerous valid reasons to resist attempts to introduce an artificial dependency on an Internet connection into software, and they all boil down to one: the connection is not always available, yet the artificial nature of the dependency means the software doesn't actually need it to work properly.

Besides, people are not yet convinced that "trusted computing" will actually make things better, and there are all sorts of concerns about privacy and about the practicality of the whole approach. Still, the Trusted Computing Group has been formed, the commercial motivation is there, and "trusted computing" will keep rolling until all doubts and concerns have been dealt with, one way or another.

What, then, is the worst blow "trusted computing" could deal to pirates? Indulge me a bit, as I let my imagination run wild and explore a "what-if" future.

Crack Dealers

Back when I was a little kid, learning what makes the cute, little Spectrum 48 tick, pirates were selling games on audio tapes. Today, pirated games are free. They are cracked for free, by enthusiasts; they are uploaded for free, on sites that survive on advertising or donations; and they are downloaded for free. I can still see some pirates in the streets, selling CDs and DVDs, but I'm sure they won't be buying any Ferraris with that money.

Fast forward to the time when "trusted computing" is in its full swing. To crack protections, pirates need very specialized software, maybe even some hardware, and a lot more effort than before. More than ever, piracy is something that only the select few can do.

However, it is also more lucrative than ever. As the usage policies are enforced more rigorously, the multitudes who used to obtain their entertainment for free now have to go and buy it. The big companies take advantage of that and the prices are even higher than before. You can buy an overpriced game directly from its publisher; or you can take a chance and go buy yourself a pirated copy from a local "software crack dealer". It's illegal, sure, but it's a lot cheaper and you can afford to buy a lot more.

Suddenly, pirates are not your everyday enthusiasts anymore; instead, they're rich criminals. They have bodyguards with guns. They have shady lawyers. They have money laundering enterprises and fake fronts and lots of connections. They know powerful people. In your efforts to eradicate a problem, you managed to make it mutate into something worse.

If this seems improbable and exaggerated, that's okay: I don't believe it's likely to happen. My point is that you should always be on the lookout for unintended consequences. It would be nice if, just for once, we asked ourselves where we're going before we get there.

Sunday, July 20, 2008

Update: Firefox 3 Works After All

Consider this a deep breath before I plunge into some more serious blogging. It turns out that Firefox 3 works rather well, after all. Not that I doubted it, but I couldn't really try it out until all my favorite add-ons were updated and relatively stable.

The biggest difference I've noticed so far is that it doesn't do its famous hog-the-CPU trick on pages such as Google Spreadsheets. Now that is a big relief for me; I absolutely hated that particular bug.

Make no mistake: I didn't change my mind about my first experience with Firefox 3. I still believe the developers should have allowed their users to do one of the following:
  1. check the add-on compatibility during the installation
  2. revert to Firefox 2
  3. run Firefox 2 and Firefox 3 side by side
But, hey, at least it works now. And I've only confirmed that I'm a late adopter by nature. As if I really needed to confirm that: I'd lived in Chile for nine years before I decided to try Carménère.