18 March, 2004: I, for one, welcome our new Trusted masters


Yesterday I went to a talk, rather improbably hosted at Microsoft Research, on the subject of `TCPA-Enabled Open Source Platforms' by Demetrios Lambrou from Royal Holloway. (At this stage I should, as usual, apologise to those of my half-dozen readers who are non-technical. This article is pretty long. That said, I -- egotistically -- encourage you to plough your way through it; there's some stuff of general interest in here.)

For those who don't follow IT industry newspeak, `platform' means `software that isn't finished yet', `open source' means `Free software', `enabled' means `compromised', and TCPA, the `Trusted Computing Platform Architecture', is an industry initiative to make PCs less useful for consumers and more useful to certain companies -- in particular, media publishers like the music industry. (Note the slightly peculiar security engineering definition of the word `trust', which is that a thing is `trusted' if, by misbehaving, it could really screw up your life. In these terms, you probably `trust' your bank, car, doctor and significant other, but that doesn't necessarily mean that you'd tell them your deepest secrets. Bear this in mind the next time you're invited to place trust in some inanimate object.)

The basic gag with TCPA (and the Microsoft technology, which is called NGSCB, for `Next Generation Secure Computing Base', and which will presumably win out in the marketplace, partly because Microsoft is a big evil monopoly and partly because the IT industry is likely unable to resist the allure of a five-letter acronym) is that each new computer is equipped with a little sealed cryptographic gadget. A program running on the computer can ask this gadget (which in TCPA is called the `Trusted Platform Module', which is particularly confusing, since the acronym `TPM' already means something completely different in this field) to produce a signed statement about the `identity' -- meaning a secure checksum, a large number which uniquely identifies a particular thing -- of the program.
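
To make that a bit more concrete, here's a rough sketch of what `producing a signed statement about a program's identity' amounts to -- in Python, with the gadget's signing operation left as a stand-in. The function names and the format of the statement are my own invention for illustration, not anything out of the TCPA specification (real TCPA hardware accumulates measurements in `platform configuration registers' and signs `quotes', none of which detail appears here):

    # Hypothetical illustration only: compute a program's `identity' (a
    # secure checksum of its binary) and have the sealed gadget sign a
    # statement containing it.
    import hashlib

    def program_identity(path):
        """Return the secure checksum which `uniquely identifies' the binary."""
        with open(path, 'rb') as f:
            return hashlib.sha1(f.read()).hexdigest()

    def attestation_statement(identity, tpm_sign):
        """Ask the (imaginary) sealed gadget to sign a statement naming the identity."""
        statement = 'running-program:' + identity
        return statement, tpm_sign(statement)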

This process is called `remote attestation', and it is designed to allow people to build services which can only be used with the software intended by their designers. Most web services aren't like that, of course -- you can use any web browser you want, and they rely on standard protocols that anyone can implement. But if you want to restrict the use people make of your service, that sort of openness is bad news; and this kind of restriction is exactly what remote attestation is supposed to let you implement.
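
The other half of the trick happens at the service's end, where it amounts to something like the following sketch. Again, the names, the statement format and the checksum shown are made up purely for illustration:

    # Sketch of a service doing `remote attestation': check that the signed
    # statement really came from a gadget it recognises, then check the
    # reported checksum against the short list of client programs the
    # service's designers are prepared to deal with. Entirely hypothetical.
    APPROVED_CLIENTS = set([
        'a94a8fe5ccb19ba61c4c0873d391e987982fbbd3',   # stand-in for the official client's checksum
    ])

    def accept_request(statement, signature, verify_tpm_signature):
        if not verify_tpm_signature(statement, signature):
            return False                   # not a genuine statement from the gadget
        prefix = 'running-program:'
        if not statement.startswith(prefix):
            return False
        return statement[len(prefix):] in APPROVED_CLIENTS   # everything else gets turned away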

As an example, an online music shop might use remote attestation to make sure that you only download and play music from its servers using their own music player, which could make sure you listen to adverts in between songs, prevent you from playing the songs you've bought for more than a month after you've purchased them unless you pay further protection money, and stop you from recording them onto a CD. (It could also stop you from doing Bad and Wrong things like transferring them onto your iPod or -- horror of horrors -- giving copies to your friends. If you think the other examples are ridiculous, well, witness the DVDs which force you to watch the copyright warning and trailers in sequence, without fast-forwarding, before they allow you to watch the film.) For a slightly less offensive example, your bank might require you to use its own special client to access your bank account -- rather than your normal web browser -- as a condition of using their service; ostensibly this might make the service more secure against fraud, but it's hard to see why your bank should be making choices about what software you run on your own PC.

Now, this talk was about implementing an interface to the TCPA hardware in Linux. It's been known for quite a while that this is (a) feasible, and (b) more-or-less a complete waste of time. So now we know that (a) is true for sure; as for (b), well, Lambrou's implementation allowed you to define a big list of programs which would be allowed to run (by specifying a list of their checksums); the system would use the TCPA hardware to prevent any program from running unless it's on the approved list. You can sort-of see the value of this; for instance, it might stop certain classes of security attacks on your Linux computer. But observe that to be useful, this requires you to maintain an enormous list of every piece of software which you regard as `safe to run' (you can't do it the other way round, obviously, because it would be easy to make a trivial change to a blacklisted program which gave it a different checksum without changing what it did). This has a particularly bad consequence for users of Free software, who expect to be able to modify the programs they use -- after all, that's the whole point of Free software. (As the GPL says, it's about freedom, not price. In the future, expect proprietary software like Microsoft Windows to fall in price to its marginal cost -- which is zero, just the same as Free software -- in order to compete in the marketplace; but you still won't easily be able to alter it, because Microsoft won't give you the source and wouldn't allow you to share your modified versions with others anyway. They can't, of course, stop you from modifying the programs you've bought, though -- that's a right guaranteed by the Software Directive, whatever their `licence agreement' might say.)
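
For what it's worth, the policy that implementation enforces boils down to roughly the sketch below; the file name and the shape of the check are my guesses at the outline of the thing, not details from the talk (the real system hooks program execution inside the kernel and involves the TCPA hardware; this shows only the policy decision):

    # Rough outline of the whitelisting scheme: keep a big list of approved
    # checksums and refuse to run anything that isn't on it.
    import hashlib

    def load_approved(path='approved-checksums.txt'):
        """Read the (potentially enormous) list of checksums regarded as safe to run."""
        with open(path) as f:
            return set(line.strip() for line in f if line.strip())

    def may_execute(binary_path, approved):
        """Refuse to run anything whose checksum isn't on the approved list."""
        with open(binary_path, 'rb') as f:
            checksum = hashlib.sha1(f.read()).hexdigest()
        return checksum in approved    # change one byte of the program and this fails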

This means that `remote attestation' is pretty useless for Free software, but I remain convinced that it will be close-to-useless for the other kind, too. Imagine that music playing program again. As I said, it's designed to prove to the music shop that it's a kosher version of the player before it will play any music. So to start with, the record company has to have a database of the checksums of all the versions of the player program it's released. But now observe that the nasty vicious user might have installed some other software which is designed to capture the output of the program and save it in a file. Clearly we can't put up with that, so the software will have to ask the TPM to verify that the sound driver is kosher too. So now the record company needs a list of the checksums of every sound card driver in the world. Except that's quite a lot of work, so they'll probably only pick the five most popular ones, leaving users of less-common hardware high and dry.

But now they realise that the user could modify the operating system itself to intercept data coming out of the music player. So now the record company needs a list of the checksums of every version of the Microsoft Windows kernel which they regard as kosher. But a device driver -- unrelated to the sound card -- can modify the kernel after it's running, so now they need a list of all the other device drivers they regard as OK. The same goes for more fundamental parts of your PC, such as the BIOS itself (which TCPA is also designed to check up on).
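
Put in sketch form, the record company's check ends up looking something like this, with every layer of the machine having to be individually recognised (the component names are purely illustrative):

    COMPONENTS = ['BIOS', 'kernel', 'device drivers', 'sound driver', 'music player']

    def platform_is_kosher(measured, approved):
        # `measured': component -> checksum reported for this particular machine
        # `approved': component -> set of checksums the record company has vetted
        # One unrecognised component -- an odd sound card, a patched kernel, a
        # newly released driver -- and the whole platform is rejected.
        return all(measured.get(c) in approved.get(c, set()) for c in COMPONENTS)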

By this stage the record company finds itself in the position of vetting every Windows program in the world to check that it's happy with releasing its music to be played on a system with that program running, which is obviously going to be pretty expensive and hard to keep up with; or defining a short, restrictive list of software it regards as safe, which will probably be OK if you only run common programs on your PC, but infuriating otherwise; or giving up on the whole idea as basically a waste of time. Even the middle option has a serious technical problem, which is that the record company still has to keep up with new versions of any part of the system which might be upgraded, for instance if a security hole is discovered in them. And if they're lax about doing that, users who upgrade as soon as bugs are fixed will find that none of their music will play any more, which is hardly conducive to good security.

There are ways to ameliorate some of these problems -- one of them is, rather than relying on checksums of programs, to accept any program which is cryptographically signed by a `trusted' (there's that word again) manufacturer -- but at some point you have to face up to the problem that someone will have to maintain this big database of stuff. And as soon as a single program which can be used to subvert the `security' of the system creeps into the database -- that is, the program becomes `trusted' -- the whole thing is blown apart in an embarrassing, expensive mess. That is to say, any security which is based on `remote attestation' and a catalogue of permitted software will be extremely brittle.
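
In the signature-based variant the database shrinks from a huge catalogue of checksums to a short list of manufacturers' public keys, roughly as in the sketch below; the brittleness is exactly the same, though. (The particular signature scheme isn't the point, so the verification function is simply passed in; everything here is illustrative.)

    def may_accept(program_bytes, vendor_signature, trusted_keys, verify_sig):
        """Accept any program carrying a valid signature from a `trusted' manufacturer."""
        # trusted_keys: the manufacturers' public keys; verify_sig: whatever
        # signature check is in use. Note that one carelessly signed program
        # is enough to open the gate for good.
        return any(verify_sig(key, program_bytes, vendor_signature)
                   for key in trusted_keys)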

But... this is all that `remote attestation' can be used to do, so far as I can see. That makes it kind-of useless. So why is so much effort being expended on it?

The simple answer is that I don't know. However, I do have a theory, which is speculative and could only really apply to the Microsoft effort, NGSCB. It starts from another `why on earth?' question:

Why is Microsoft developing a virtual machine -- the `Common Language Runtime' -- and, indeed, the rest of the `.NET' effort about which we have heard so much lately?

After all, they don't really care about portability to non-Windows platforms, since their entire strategy is based on forcing users to buy Windows if they want to make use of Microsoft software (they even claim -- ludicrously -- in their `licence agreement' that you can't run their software on another operating system); and they don't really need hardware independence, since Windows only runs on one kind of hardware anyway.

(Note that this discounts Windows CE and possible ports of Windows to new processors like ia64. Note also that Windows NT ports to all the non-x86 processors were swiftly abandoned, with only the Alpha port lingering long enough to get rotten and smelly. This is a big hole in my argument, but ignore it for now.)

One of the things that a VM (virtual machine) lets you do is to impose a security model on programs which run inside it. For instance, the strangely popular Java VM has a security model that is supposed to prevent `applets' which run inside a web browser from doing things which they shouldn't (like deleting all your files or sending your bank details to Nigeria). It turns out that the Java VM's security model had all sorts of problems, but with enough work this kind of thing could be fleshed out and made to work properly.

Once you have a VM and a security model inside it, remote attestation suddenly becomes a workable tool. If the virtual machine really can impose effective restrictions on the software which runs inside it -- for instance, allowing a music player to write only to an encrypted data store, and never reveal the key; an online banking program to be protected from all the other programs on the system; or a file viewer or web browser to write only to a temporary filesystem for downloaded files -- then it's enough for a remote service to be able to prove that the virtual machine is a particular, trusted, version using `remote attestation'. Once that's done, the service provider can rely on the VM itself to do its dirty work. No huge catalogue of kosher software is required, but rather a much smaller catalogue of approved versions of the virtual machine. Even better, there's only one manufacturer of the virtual machine -- Microsoft -- and, just to be helpful, they can control the database too. Isn't that convenient?
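
In sketch form, the division of labour looks something like this; the policy format, and everything else here, is my own illustration rather than NGSCB's actual interface:

    # Hypothetical illustration of attesting the VM rather than every program:
    # the remote service keeps a short list of approved VM versions, and the
    # VM itself enforces what each program inside it may do.
    APPROVED_VM_CHECKSUMS = set([
        'checksum-of-vm-version-1.0',    # stand-ins for real checksums, maintained
        'checksum-of-vm-version-1.1',    # by the VM's one manufacturer
    ])

    POLICY = {
        'music-player':   ['write:encrypted-store'],
        'banking-client': [],                          # isolated from everything else
        'web-browser':    ['write:temporary-files'],
    }

    def service_trusts_platform(vm_identity):
        # The remote service need only recognise the virtual machine itself...
        return vm_identity in APPROVED_VM_CHECKSUMS

    def vm_allows(program, action):
        # ...and the VM does the dirty work of restricting each program inside it.
        return action in POLICY.get(program, [])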

Expect more bad news on this front in the future.


Copyright (c) 2004 Chris Lightfoot; available under a Creative Commons License.