Comment Re:You still have to show me how to get my keys (Score 1) 290

Oops, I meant to include the following link in my other post:
Here's the latest TPM Main Specification Level 2 Version 1.2 from the Trusted Computing Group

I dunno if you actually want to dig through that; it's a pretty dense, jargon-heavy specification for the microchip. I just wanted to include it as an official source for the specification quotes in my post, and to generally back up my other claims and explanations.

-

Comment Re:You still have to show me how to get my keys (Score 1) 290

In the end, it doesn't really matter who agrees with whom where. I want my keys. How do I get them?

Oh sorry, maybe I should have answered that sooner :)
Get yourself a sophisticated science laboratory and crack / acid-etch the chip open. Then use microscopic probes to extract the key directly out of the silicon circuitry.

Oh, and by the way, the chips are explicitly designed to be attack-resistant, meaning you have to be really careful the keys don't get damaged or wiped during the process.

Oh, and if you *do* manage to get your keys, you've got to be really careful that no one ever detects you doing anything the Trust system prohibits, like not obeying DRM. The PubEK is the public "name" for your PrivEK, and they can track you by it. They specified a revocation system they can use to effectively kill that key. You then need to go out and buy a new chip (perhaps even buy an entire new computer with a new key and new set of certificates), and crack that chip open to get a new working key. And of course they'll revoke that key too if/when they detect your computer isn't securely locked down.

The entire point of the Trust system is for you to be able to "trust" that my computer will do what it says it will do, and only what it says it will do, and that my computer is secure against me meddling in that. And vice-versa, that I can trust that your computer is secure against you, and that it will do what I want it to do. For example you could agree to share personal information with some company. Under the Trust system you know that they don't know the Master Key to their own computer, so if their computer says that it will keep your personal information encrypted, then you can Trust that. If their computer says they will only use your personal information in an anonymous way to generate overall statistical data of all their customers, then you can Trust that their computer will enforce that. In theory.

Of course things will virtually always go in the exact opposite direction. A music service will sell you music files, and they will use the Trust system to ensure your computer strictly enforces that DRM against you. You don't have your master key, so when your computer says it will never allow you to read or copy the file (except through the approved DRM-enforcing music player), then they can Trust that your computer will never allow you to read or copy your music files. Some company can "rent" software to you, and they can Trust that your computer will never permit you to run that software except during the paid rental time-span (and the computer would use a secure online date verification to enforce it). And my favorite example, websites using the Trust system to ensure you're not running any ad-blockers and that you can't right-click-save images or other content from the webpage.

The entire point of the Trust system falls apart if owners know or truly control their own computer's master keys. I could no longer Trust your computer, and you could no longer Trust my computer. That's why they set up an elaborate key-tracking and key-revocation system. If you manage any sort of hardware hack to obtain control over your computer, they can kill that key and declare that your computer is no longer Trusted.

To clarify: The aspect on which BIOS4breakfast and Alsee disagree is that the former feels that there is not a restriction on obtaining keys as long as they are not obtained from the TPM module

You could simply "make up" a completely random key and there are some limited things you can do with it, but in general it isn't going to work. It's not a "valid" or "real" key. It will fail in critical chip operations such as Remote Attestation.

The best comparison is like buying a cellphone without a SIM card. Sure, you can make up your own phone number, and you can program phone numbers into the speed-dial memory and stuff, but in general a cell phone is designed for calling other cell phones, and none of the main phone functions work without a genuine SIM card and genuine phone number for that phone.

The chip comes with a manufacturer's certificate which certifies that the specific PrivEK in the chip actually is a PrivEK. The certificate is like the SIM card, and the genuine PrivEK is like a genuine phone number assigned to that SIM card. The certificate turns that pre-installed key into a genuine and fully functional PrivEK, like a SIM card makes a phone-number into a genuine working number.

Here's the latest TPM Main Specification Level 2 Version 1.2 from the Trusted Computing Group.

5. Endorsement Key Creation
Start of informative comment
The TPM contains a 2048-bit RSA key pair called the endorsement key (EK). The public portion of the key is the PUBEK and the private portion the PRIVEK. Due to the nature of this key pair, both the PUBEK and the PRIVEK have privacy and security concerns.
The TPM has the EK generated before the end customer receives the platform.

Later it says:

The PRIVEK SHALL exist only in a TPM-shielded location.
2. Access to the PRIVEK and PUBEK MUST only be via TPM protected capabilities

and later:

5.1 Controlling Access to PRIVEK
Start of informative comment
Exposure of the PRIVEK is a security concern. The TPM must ensure that the PRIVEK is not exposed outside of the TPM.
End of informative comment
1. The PRIVEK MUST never be out of the control of a TPM shielded location

So a real (working) PrivEK comes pre-installed in the chip, and the chip is forbidden to give it to you. It is forbidden to exist anywhere outside the chip, so obviously you can't obtain it from anywhere else. You have to crack into the microchip to extract it.

-

Comment Re:Why? (Score 1) 290

I'll be quoting from this, the latest version from the Trusted Computing Group: TPM Main Specification Level 2 Version 1.2, Revision 116 Part 2 - Structures of the TPM

I'll paste quotes here in italics, key points in bold, and non-italics comments from myself in between.

An Endorsement Key (EK) has two parts, the public part and the private part. The private part is the part in control; the public part allows anyone to verify signatures. The PrivEK is the highest-level master key of a TPM. Its primary function is to sign messages sent out of the TPM to other people over the internet. PrivEKs are forbidden to ever exist outside a TPM. Anyone receiving a properly PrivEK-signed message therefore knows that the message could only have been generated inside a TPM, secure under the controls and limitations of the TPM, and secure against tampering by anyone (including the owner).
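
To make the signing relationship concrete, here's a minimal Python sketch using the cryptography package, with a locally generated RSA-2048 key pair standing in for the PrivEK/PubEK. This only illustrates the public-key math behind "a valid signature proves the message came from the holder of that private key"; the real PrivEK never leaves the chip and the actual TPM command interface is different.

    # Minimal sketch: a 2048-bit RSA pair standing in for the PrivEK/PubEK.
    # In a real TPM the private half never leaves the chip; this only shows
    # why a valid signature proves "this came from the holder of that key".
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    privek = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pubek = privek.public_key()

    report = b"attestation report: PCR values, software measurements, nonce"

    # Only the holder of the private half can produce this signature...
    signature = privek.sign(report, padding.PKCS1v15(), hashes.SHA256())

    # ...but anyone holding the public half can check it.
    pubek.verify(signature, report, padding.PKCS1v15(), hashes.SHA256())  # raises if forged
    print("signature verifies: report was signed by the matching private key")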

Note that the PrivEK gets signed by a manufacturer key, securely identifying it as a genuine PrivEK securely locked inside a TPM. The manufacturer key is itself signed by the Trusted Computing Group's master key, authenticating the manufacturer key as belonging to a valid and compliant manufacturer. If the Trusted Computing Group ever revokes a manufacturer's key then all TPMs made by that manufacturer are no longer Trusted... for practical purposes those chips can be considered "dead". If some manufacturer's chips are found to be insecure, the Trusted Computing Group can "close the security hole" by effectively killing all of those chips in one shot. And this is exactly how the Trusted Computing Group prohibits any manufacturer from making a non-compliant chip that would allow the owner to obtain control of his system.

5. Endorsement Key Creation
Start of informative comment
The TPM contains a 2048-bit RSA key pair called the endorsement key (EK). The public portion of the key is the PUBEK and the private portion the PRIVEK. Due to the nature of this key pair, both the PUBEK and the PRIVEK have privacy and security concerns.
The TPM has the EK generated before the end customer receives the platform. The Trusted Platform Module Entity (TPME) that causes EK generation is also the entity that will create and sign the EK credential attesting to the validity of the TPM and the EK. The TPME is typically the TPM manufacturer.

So the chip's top key, the PrivEK, is inside the chip before the customer buys the computer or other device. This is generally done by the manufacturer.
You can skip/skim over this next section, I'm just including it to preserve continuity in copy/pasting from the source document.

The TPM can generate the EK internally using the TPM_CreateEndorsementKey or by using an outside key generator. The EK needs to indicate the genealogy of the EK generation. Subsequent attempts to either generate an EK or insert an EK must fail.
If the data structure TPM_ENDORSEMENT_CREDENTIAL is stored on a platform after an Owner has taken ownership of that platform, it SHALL exist only in storage to which access is controlled and is available to authorized entities.
End of informative comment
1. The EK MUST be a 2048-bit RSA key
a. The public portion of the key is the PUBEK
b. The private portion of the key is the PRIVEK

Here's where we start getting to the critical point you wanted, whether the owner is allowed to get his key:

c. The PRIVEK SHALL exist only in a TPM-shielded location.
2. Access to the PRIVEK and PUBEK MUST only be via TPM protected capabilities
a. The protected capabilities MUST require TPM Owner authentication or operator physical presence
3. The generation of the EK may use a process external to the TPM and TPM_CreateEndorsementKeyPair
a. The external generation MUST result in an EK that has the same properties as an internally generated EK
b. The external generation process MUST protect the EK from exposure during the generation and insertion of the EK
c. After insertion of the EK the TPM state MUST be the same as the result of the TPM_CreateEndorsementKeyPair execution
d. The process MUST guarantee correct generation, cryptographic strength, uniqueness, privacy, and installation into a genuine TPM, of the EK
e. The entity that signs the EK credential MUST be satisfied that the generation process properly generated the EK and inserted it into the TPM
f. The process MUST be defined in the target of evaluation (TOE) of the security target in use to evaluate the TPM

5.1 Controlling Access to PRIVEK
Start of informative comment
Exposure of the PRIVEK is a security concern. The TPM must ensure that the PRIVEK is not exposed outside of the TPM.
End of informative comment
1. The PRIVEK MUST never be out of the control of a TPM shielded location

The PrivEK may never be exposed outside the TPM. YOU may never see your PrivEK. Also "Access to the PRIVEK and PUBEK MUST only be via TPM protected capabilities", which means that YOU may never make use of the PrivEK except by the restricted "protected capabilities" allowed by the TPM: "protected capabilities" that basically allow the chip to tell other people that the chip and the computer are secure against you. They don't trust you, so they want to know that they can Trust your computer, and trust that it's secure against you. They can Trust that your computer will only do what they want it to do.

You might have noticed the part about "The protected capabilities MUST require TPM Owner authentication or operator physical presence". That's there for two reasons: privacy and "opt-in". The Trust system represents an unbelievable level of privacy threat, so they bolted on some aspects that kinda-sorta-sometimes help reduce it. The PubEK is a unique identifier that can be used to perfectly track you and your computer, so it represents the biggest privacy threat. Direct use of this key is therefore limited to the most restricted situations, and only with "Owner authentication or operator physical presence", so software can't secretly grab the PubEK to identify you.

This direct part of the system is also only used when you first "opt-in" to the surveillance-and-control system. You sign up with a Privacy Certificate Authority, you approve the activation of these top-level functions, and the Privacy Certificate Authority uses them to scan what operating system and software you're running, ensuring that you haven't "tampered" with any of the software on your system and that it's all "secure" (in a DRM sense of "secure"). Then the Privacy Certificate Authority gives your computer a supposedly Anonymous Trusted Identity, and they ensure that the Trust Chip will securely prohibit you from using that Trusted Identity if you've altered any of the software on your computer. And the chip can associate encrypted files on your harddrive with that Trusted Identity, and prohibit you from reading or modifying the contents of those files if you ever modify any of the software on your computer. These are effectively DRM files: they can only be accessed using the exact unaltered secure operating system, and only using the exact unmodified (DRM-enforcing) software approved to access them.

-

Comment Re:No kidding (Score 1) 290

store and assist with generating crypto keys and perform platform validation so that you can, e.g., validate that your boot loader is not tampered with before it will release those keys. Hardware support for protecting against evil maid and transparent full disk encryption. That's such a bummer! Why would anyone want that?

That's all swell, and I'll be more than happy to jump on board when they offer a system that does that without being designed to secure the computer against the owner in the process. There are lots of ways to do that, but the simplest example is that I'd be satisfied if they allowed owners to get a printed copy of their chip master keys, the Private Endorsement Key and Storage Root Key. That would preserve 100% of the functionality you just listed, while ensuring owners had the final say to fix/override any threat of the computer being secured against the owner. Simply drop the printed keys in a safety deposit box at your local bank vault.

There have been a number of other proposals to fix the problem, such as the EFF's Owner Override system, but the Trusted Computing Group has categorically REFUSED to address any of the anti-owner aspects of the system. Enforcing the anti-owner design aspects is their first priority.

-

Comment Re:What? (Score 1) 290

A lot of computers? Name one. Go ahead. I'll wait.

I had several systems on screen about an hour ago with TPMs prominently listed in the system specs....

So please, point me to these computers that are forcing TPM's on us, i'll buy 10 tomorrow..

I'm doing all I can to avert sales of these systems. I can't stop you from buying them, but if you are truly incapable of locating them on your own.... ummm.... well... okaaaay..... I'm cool with that. I also would have declined to point George W. Bush voters to the local voting booths if they were incapable of locating them on their own. They certainly have the right to vote, but declining to actively aid them to vote would have been a public service. Lol.

-

Comment Re:Why? (Score 1) 290

You're right that the chip will do nothing if it's switched off. But I think it's worth pointing out other potentially relevant factors.
(1) If you buy a system with this chip and leave it off, part of your purchase price is a payment supporting the companies pushing this crap.
(2) If you buy a system with this chip and leave it off, you are contributing to their install-base figures, advancing them toward the point where they can start deploying the really nasty tactics.
(3) If we don't aggressively get out the message not to buy these PCs, people who are buying Windows regardless are more likely to buy ones with these chips, and more likely to turn them on, doubly advancing them toward the point where they can start deploying the really nasty tactics.
And (4), if they do get to the point where they can push the really nasty tactics, leaving the chip switched off isn't going to save you, having a computer without a chip isn't going to save you, and running Linux or anything else isn't going to save you. Because in.... I dunno... two or three years it's possible you'll start running into an increasing percentage of websites that you can't view at all unless you have a Trust chip certifying that you're not running an ad-blocker, and that the browser is DRM-compliant and won't download copies of pictures and other page content. And if the deployment does proceed smoothly, then in a decade or somesuch a large majority of home PCs could have the chip installed, and ISPs could start deploying Trusted Network Connect to do a "health check" before permitting you any internet access at all. The "health check" uses the Trust chip to identify exactly what OS you're running, to ensure that your operating system is up to date on all of the latest security patches, and to check that your computer isn't infected with a virus or something. Because obviously it's a "good thing" for an ISP to ensure that you're not connecting an infected or vulnerable computer to their network. And, of course, you fail the health check if you don't have a Trust chip, you fail the health check if you don't have the chip activated, and you fail the health check if your operating system doesn't appear on their list of known, approved, secure operating systems. And then you're effectively banned from the internet until you comply.

So just merely saying "don't turn the chip on" doesn't seem like the best idea. And while I'm all for more people using Linux, blowing the TPM issue off with a largely ineffectual "don't run Windows" attitude doesn't seem like the best idea either.

If TPM deployment proceeds smoothly to a high percentage adoption, we're all going to be seriously screwed eventually.

-

Comment Re:Why? (Score 1) 290

We agree on the nature of the system, but I wanted to address this:

Malicious software can't read paper.
it wouldn't have to if you were to actually use those keys.

I'm countering people who argue that there are legitimate security benefits to Trusted Computing in home PCs. As long as the paper is locked away, they can't claim my proposal diminishes any legitimate security benefits. And if I do want to use the key, then we're starting with the computer in the "maximally Trust-secured" state, in which case the Trust system is maximally secured to protect and validate a small Trusted application into which I could securely type the PrivEK/SRK, make any security modifications I wished to make, and then have the Trusted application securely wipe the keys from its protected RAM.

So anyone who doesn't opt to get a printed key gets 100% of any security benefits they want to claim, anyone who gets a printed key and keeps it in a bank vault gets 100% of any security benefits they want to claim, and even if I do use the key I'm doing so with essentially zero vulnerability to anything, unless it's something that already had the power to beat the Trust system anyway. And, of course, the point that they have no right to object if I decide that hypothetical level of risk is worth it.... I'm doing nothing and I'm asking for nothing that would diminish any security they claim they want for themselves.

Not that the Trusted Computing Group would ever permit any such thing, but it's pretty powerful for shooting down Trust-is-good proponents, and for making things crystal clear for bystanders trying to figure out which side they should be on :) At least that's the hope.

-

Comment Re:Why? (Score 1) 290

The point of the PrivEK key is that the chip uses it to send a "spy report" on exactly what operating system and software you're running, and without that key you cannot control the contents of that report. For example a website can check exactly what browser you're running, and whether you have an ad-blocker running. If you're not using an approved browser, or if you have an ad-blocker, then the website can refuse to display. It would just toss up a "helpful" error message telling you to fix your system, as in telling you to run an approved OS / run an approved browser / disable the ad-blocker.
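
As a toy illustration of the verifier's side of that check, here's a hedged Python sketch. The "approved browser" list and hash inputs are invented purely for the example, and a real verifier would first check the signature on the attestation report; this only shows the allowlist comparison a website could make afterwards.

    import hashlib

    # Hypothetical allowlist of "approved" browser builds, keyed by the
    # measurement hash the attestation report would carry. Values invented.
    approved_browsers = {
        hashlib.sha256(b"ApprovedBrowser 1.0, no extensions").hexdigest(),
        hashlib.sha256(b"ApprovedBrowser 1.1, no extensions").hexdigest(),
    }

    def site_admits(reported_hash: str) -> bool:
        # Real attestation would verify the report's signature first; here we
        # only show the allowlist comparison made on the reported measurement.
        return reported_hash in approved_browsers

    my_build = hashlib.sha256(b"ApprovedBrowser 1.1 + ad-blocker").hexdigest()
    print(site_admits(my_build))  # False: the measurement matches no approved build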

The point of the SRK is that the chip can lock your files such that YOU can't read them or modify them, except under the strict control of the Trust chip. Think Uber-DRM system. You can't play a music file at all except with the exact approved music player, and you can't play the file at all without updating the pay-per-play playcount and reporting it to the music company. Or you can't run software unless the date is securely verified to be within the approved software-rental time window. The range of DRM-style Trust enforcement is virtually unending.

If you don't have your keys, then the Trust system secures your computer against you. If you did have access to your keys, then you would have final control of the system, and it would be a legitimate security system securing your computer for you.

That's the overly-short overly-simplified answer. Let me know if you want to address anything more detailed.

-

Comment Re:Math challenged (Score 1) 189

You Sire, are math-challeged:
2^48 = 281.474.976.710.656

Nope, you missed. The formula you posted is, in itself, correct. However, you missed the Birthday Problem. The chance of a collision in a group becomes large when you get up around the square root of the number you posted... meaning in the ballpark of 2^24 = 16,777,216. Basically it's because in a group of N individuals there are roughly N^2/2 possible pairings that could collide.
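
A quick back-of-the-envelope check, using the standard birthday-bound approximation p ≈ 1 - e^(-n(n-1)/2M) for n random draws from a space of M values (a sketch, not part of the original argument):

    import math

    bits = 48
    space = 2 ** bits                      # 281,474,976,710,656 possible values

    def collision_probability(n: int) -> float:
        # Birthday-bound approximation: p ~ 1 - exp(-n*(n-1) / (2*space))
        return 1.0 - math.exp(-n * (n - 1) / (2 * space))

    print(collision_probability(2 ** 24))  # ~0.39 already at n = 16,777,216
    # 50% collision odds arrive near 1.18 * sqrt(space), nowhere near 2^48:
    print(1.177 * math.sqrt(space))        # ~19.7 million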

-

Comment Re:Why? (Score 1) 290

Specifically, it is designed to be SECURE AGAINST THE OWNER. The Trusted Platform Module Technical Specification explicitly refers to the owner of the chip as an attack-threat which the chip MUST be secure against.

Citation needed ;) I'm sure you're misinterpreting some physical tamper-resistence line.

Unfortunately, being sure is all too often completely unrelated to being right.

It's in some text explaining design intent, explaining why they require certain internal data be handled in a particular way. They specifically state they are doing it this way to prohibit a "rogue Owner" from being able to register an Identity with more than one Privacy Certificate Authority.

TCPA_Main_TCG_Architecture_v1_1b.pdf
According to internal document page numbering it's on page 267, but the PDF viewer software calls it page 277. The exact sentence is:
This feature prevents a rogue Owner from assembling identity_binding data structures outside the TPM and hence obtaining attestation to the same TPM identity from multiple Privacy CAs.

They explicitly named the Owner as the primary focus of their threat model. They explicitly took steps to secure the chip against an owner attempting to manage his privacy identities. And they did it because the underlying "security threat" was that an Owner could attempt to use the duplicate anonymous identity to gain local control to modify a "security property" that was demanded by someone else via Remote Attestation of the first anonymous identity. And in this case a "security property" being demanded by someone else via anonymous remote attestation is basically a generalized way of saying a DRM-style enforcement commitment, and using a duplicate anonymous identity to modify that "security" setting basically means being able to break/escape the DRM.

Remember - they explicitly stated the security threat here was the OWNER. Furthermore, note that these are anonymous identities used for remote attestation.... this has nothing to do with securely checking the state of the system for yourself. This is securing the state of the computer against the owner for the benefit of a remote party - specifically a remote party the owner doesn't trust - someone to whom the owner specifically wants to remain anonymous. That pretty much means some random corporation or random website he doesn't want tracking him, and which wants something like DRM enforcement in place on his computer. And again, this is all in the context of them declaring the OWNER to be the threat they are securing against.

I don't doubt you've looked at it. But clearly you've looked at it from the perspective of how you think it impinges on your liberty

I've considered it from all angles. I would fully support a similar chip which was designed as a legitimate pro-owner security system. However that's not this chip.

rather than from the perspective of a security engineer trying to achieve simple properties such as executing code that isn't manipulated by an attacker.

I fully understand that issue, and that can easily be achieved with a legitimate security system, one securing the system for the owner rather than securing it against the owner, one where the owner has the final say in control and security settings.
(Note that an owner "opt-in" for something like a DRM scheme is an owner having an initial say on security settings, but the owner having the final say on security settings means he has full control to modify the security settings later.)

Let's play this game. I'll propose an alternative system, one where the owner can have that final say if he wants it, thereby having the power to avoid or solve 100% of the objections to the system, and you go ahead and try to name even one legitimate security protection that my solution fails to preserve just as well as the current system. (Note: Securing the computer against the owner obviously doesn't qualify under any legitimate definition of security. By definition it's impossible for someone to "attack" themselves or to violate their own security.)

There are tons of possible pro-security solutions and variants, the EFF made an interesting Owner Override proposal, but I'll go with a minimalist solution that preserves 100% of any legitimate security you want to claim. I propose a system with essentially identical hardware and identical capabilities. The only hardware change I add is an option to export the Storage Root Key encrypted to the PrivEK. This in itself has zero impact on security because the exported value is completely useless and harmless without the PrivEK. The second thing I add is an option when buying a chip or any device containing a chip. The option is that you can buy a chip/system exactly as they are sold today, preserving 100% of the security and capabilities of the current offerings, or you can, if you wish, buy an identical version (with identical certificates!!), which comes with a printed copy of the PrivEK.

Note that it's impossible for malicious software to access a PrivEK printed on paper, especially when I place that paper in my safety deposit box in my local bank vault, and the Storage Root Key is securely protected under the PrivEK.
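
For what it's worth, here's a rough Python sketch of what that "export the Storage Root Key encrypted to the EK" option could look like, using ordinary hybrid encryption. The key sizes, blob contents, and wrapping format are purely illustrative assumptions, not the actual TPM export format; the point is only that the exported blob is useless without the PrivEK.

    # Sketch of the "export the SRK encrypted to the EK" idea: wrap the SRK
    # blob with a fresh AES key, then encrypt that AES key to the PubEK.
    # Without the PrivEK the exported blob is just noise. Illustrative only.
    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives import hashes

    privek = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pubek = privek.public_key()

    srk_blob = os.urandom(1024)            # stand-in for Storage Root Key material

    wrap_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    wrapped_srk = AESGCM(wrap_key).encrypt(nonce, srk_blob, None)
    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = pubek.encrypt(wrap_key, oaep)   # only the PrivEK can unwrap this

    # Recovery requires the PrivEK (the printed copy from the bank vault):
    recovered = AESGCM(privek.decrypt(wrapped_key, oaep)).decrypt(nonce, wrapped_srk, None)
    assert recovered == srk_blob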

So you are perfectly free to buy an effectively identical chip without the printed PrivEK if you don't want your key, and you obtain 100% of all legitimate security offered by the current system, while I can buy an identical system with my printed PrivEK and lock it in a bank vault. You get 100% of the genuine security you have today, I get 100% of any genuine security you want to claim, and if I wish, if for some reason I feel it's necessary, I can make use of the PrivEK in whatever manner I deem appropriately secure, and I can use it to regain final control of my system if there's some problem that I want to fix. And if the PrivEK is locked safely away in a bank vault, and the computer is starting out in a secure state with the PrivEK unavailable to malware, then I'm personally inclined to consider it sufficiently secure to preserve and authenticate a minimal Trusted application into which I could type my PrivEK, recover my Storage Root Key, and make any security changes I needed. The Trusted application would then securely wipe the PrivEK out of its RAM before returning to normal operation. And if you disagree with my judgement on the security of that process, well, it's my security to decide. If you think that is somehow risky, fine: you do what you want and I'll do what I want. I'm not compromising your security.

An amazingly hyperbolic statement for someone who claims to have read the specs.
1) "The chip" tracks your hardware does it? You understand that the TPM is a completely passive chip waiting for people to come along and send it data, don't you?

Yes, of course I do. And you could equally have argued that a bullet is a "passive component". Or you might as well have said a TPM is a passive chip waiting for someone to send it electricity. It's a silly argument. We're discussing a chip functioning as designed, which means a chip in conjunction with the intended electricity and the intended CPU and the intended RAM and the intended software in a working computer.

The Platform Configuration Registers in a TPM only work in conjunction with software... and a CPU.... and electricity. Just like the gunpowder in a bullet only works in conjunction with a gun.

The Platform Configuration Registers in a TPM are designed to track the hardware and software configuration on a system. The fact that they don't work when the power is off, or when the software isn't installed, is a silly point.
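
For anyone unfamiliar with how that tracking works, the registers accumulate measurements of each boot component with a simple "extend" operation, roughly as in this sketch. (TPM 1.2 PCRs actually use SHA-1 and fixed-width registers; SHA-256 and the component names here are just for illustration.)

    # Sketch of the PCR "extend" operation the chip uses to accumulate a
    # record of what booted: new_pcr = hash(old_pcr || hash(measurement)).
    import hashlib

    def extend(pcr: bytes, measurement: bytes) -> bytes:
        return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

    pcr = bytes(32)                                   # register starts at all zeros
    for component in (b"BIOS image", b"boot loader", b"OS kernel"):
        pcr = extend(pcr, component)

    # Changing any component changes the final value, so a signed copy of the
    # PCR lets a remote party detect whether the reported stack was altered.
    print(pcr.hex())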

2) Same point, again. If you export the EK into the OS, any malware anywhere can forge the attestation state

I covered this above, but I would like to point out that you really shouldn't make up dumb ideas and try to blame me for them, thankyouverymuch.

3) Only a few large companies are actually using TPMs and remote attestation for things like trusted network connect

TPMs are still in limited deployment. There's a lot of designed functionality that can't be put to much use until the chips are widely deployed. And that's doubly true of anti-owner aspects. ISPs obviously couldn't deploy Trusted Network Connect until the large majority of customers already have the chips. We've got Microsoft declaring the chips will be mandatory in all new Windows PCs starting in 16 months. (Microsoft had actually hoped to do this back with Vista, but they pretty well obliterated every schedule of everything on Vista.)

And the ISP-enforced Trusted Network Connect isn't something I dreamed up. There was a major world tech conference where the keynote speaker laid out that exact scenario, saying the government should even force ISPs to make it mandatory. The talk went on and on about the need to defend the National Information Infrastructure. And not just the usual stuff about defending against viruses... oh no, he didn't stop there.... the keynote speaker actually called out the need to defend our National Information Infrastructure against Terrorist Attack. Chuckle. A variety of other Trusted Computing proponents have laid out the same ISP-Trusted-Connect plan.

And in any case, a TrustedNetworkConnect possibility doesn't much matter if we dump the current chip for my printed-key-option proposal, or for any of the variety of other options for a legitimate security chip.

-

Comment Re:Prove you're right: Show me how to get my keys (Score 3, Informative) 290

Help me judge which of you is right.
Alsee says I can't have the keys to the TPM which comes with the computer I buy. You disagree with Alsee.

No, he explicitly agreed with me on that point:

I said: "The TPM technical specification is quite explicit that the owner of the computer is FORBIDDEN to ever get his keys"
He said: "Forbidden from getting them out of the TPM"

That's agreement.

He merely followed up with a lame explanation: "not forbidden from using them in ways that allow for guaranteeing security properties". The Trusted Computing definition of "security properties" explicitly includes security against the owner. "Guaranteeing security properties" means you are unable to read or alter your own files in Sealed Storage. An example "security property" would be that you can't read (and run) a Sealed-Storage program without securely verifying that the date is within the approved software-rental period. Or think of a DRM music file: the "security property" is that the chip won't let you play the music except with the approved DRM music player, and only if it decrements the number of plays remaining in the pay-per-play count.

It also means enforcing the security of Remote Attestation, which in plain English means a cryptographically secure "spy report" sent out to other people over the internet telling them exactly what software you are running. For example, if you had your master keys you could tell a website that you aren't running an ad-blocker when you actually are. That would violate the anti-owner "security properties".

That's why you're forbidden to have your keys.... otherwise other people could not Trust that your computer would enforce anti-owner "security properties" against you.

The standard argument is that it's all A-ok because it's all "opt-in". If you don't "opt-in", all "security properties" are still enforced against you, enforced in the sense that nothing works (you can't violate security if nothing works and you can't do anything). If you don't "opt-in" you're denied any ability to read or modify Trusted-secured files, if you don't "opt-in" you're denied the ability to run Trusted-secured programs at all, and if you don't "opt-in" you won't be able to access websites at all if they use the Trust system to ensure you don't copy pictures or to check that you're not running an ad-blocker.

And if you don't "opt-in", then in a few years you might be denied internet access. The Trusted Computing Group has created something called Trusted Network Connect, and Microsoft has an equivalent version called Network Access Protection. That's a system where a network (or your ISP) can ask for a Trusted Health Check. A "Health Check" is that spy report I mentioned before; it reports the exact software running on your computer. The "Health Check" ensures that you're not infected by a virus(*), ensures that you're running an approved operating system with ALL of the mandatory patches, enforces that you're running any mandatory "security software" they want you to run, and checks that you're not running anything they don't want you to run. And if you don't "opt-in" then you can't pass the "Health Check", and your computer is "quarantined".... no network access. Obviously no ISP could ever deploy something like that.... not unless most customers already had Trust Chips in their computers.... oh yeah, Microsoft is making Trust Chips mandatory in all new PCs 16 months from now. But even then it would obviously be several more years before most people had Trusted PCs and ISPs could require that sort of "Trusted Health Check" for internet access. But don't worry, this is all a good thing.... it's just a Health Check.... to ensure you're not infected and spreading viruses.

As he explained, there's nothing evil about the system.... they do allow you to use the keys in ways that you're permitted to use them, and everything is "opt-in". It's for your own protection.

(*)Footnote: Of course viruses can use the Trust system too, making it impossible for a Health Check to scan and identify a virus. All the Health Check would be able to see is that some generic encrypted Trusted program is running. Oh yeah, and a virus could use the Trust system to impose additional "security properties".... such as a security property stating that ALL of your Trust-secured files and paid software rentals are irrevocably destroyed if the virus is ever removed. Yeah, the Trust system is perfectly designed for that sort of stupidity. It would be impossible to remove the virus without losing everything tied to the Trust system. The virus would merely need to drop some critical file like the DiskKeyStore or SystemCertificate into SecureStorage owned by itself. Remove the virus and everything dies with it. Securely enforced by the Trust chip.

-
