Army to Require Trusted Platform Module in PCs

Overtone writes "Federal Computer Week is reporting that the U.S. Army will require hardware-based security via the Trusted Platform Module standard in all new PCs. They are a large enough volume buyer that this might kick start an adoption loop."
This discussion has been archived. No new comments can be posted.

  • by hxnwix ( 652290 ) on Friday July 28, 2006 @12:48AM (#15796800) Journal
    Army requires TPM so that it can circumvent the single-vendor prohibition and be Intel(R) only.
  • Oooh great... (Score:5, Insightful)

    by masklinn ( 823351 ) <slashdot,org&masklinn,net> on Friday July 28, 2006 @12:49AM (#15796804)

    The question still remains whether the user himself can trust the trusted computing platform.

    If your government or seller or whatever doesn't trust you, doesn't even try in the least, how the hell are you supposed to trust him? The most logical path would be to fully distrust him. And therefore to distrust and refuse trusted computing platform.

    • If your government or seller or whatever doesn't trust you, doesn't even try in the least, how the hell are you supposed to trust him? The most logical path would be to fully distrust him. And therefore to distrust and refuse trusted computing platform.

      Careful, we ARE talking about the Army here. I follow what you're saying, but this circular logic might cause someone in the Army to have an aneurysm from having more than a minimal amount of neurons firing!



      (BTW, I have a lot of respect for the Army as

    • Trusted (Score:5, Insightful)

      by Descalzo ( 898339 ) on Friday July 28, 2006 @01:12AM (#15796867) Journal
      From what I understand, Trusted in this context is used as in "I entrust it with my security" rather than "I find it worthy of my trust."

      If I am hanging from a rope over a cliff, I Trust the rope. I "Entrust it with my security" whether or not I find it worthy of that trust.

      • Re:Trusted (Score:5, Insightful)

        by interiot ( 50685 ) on Friday July 28, 2006 @02:01AM (#15796981) Homepage
        The point is: if the computer trusts someone else more than the end-user, in a security sense, then the end-user is not in control of the security of their machine. In a corporate IT context, this is (generally) a good thing. In an individually-owned computer, this is not really a good thing [schneier.com].
        • Re:Trusted (Score:3, Insightful)

          by Descalzo ( 898339 )
          That's my understanding of it. The Army can do what it feels it must do to protect its own security. My fear is, as the submitter wrote, "They are a large-enough volume buyer that this might kickstart an adoption loop."
          • Re:Trusted (Score:3, Interesting)

            by hany ( 3601 )

            IIRC (and if the army is not completely crazy), the army does not plan to use TCP as a way to give the RIAA and MPAA control of army PCs.

            If that assumption is correct, the army will be supplying the encryption keys into TCP, not the PC manufacturer, not the RIAA, not the MPAA, not Sony, etc.

            It also means that TCP, as deployed in the army, will be able to be "owned" (meaning "0wn3d", controlled, etc.) by the owner of the PC (in this case the army), not by the media cartels.

            And that finally means that even you or I may find such a TCP useful.

            • Re:Trusted (Score:4, Insightful)

              by Fred_A ( 10934 ) <[fred] [at] [fredshome.org]> on Friday July 28, 2006 @04:31AM (#15797294) Homepage
              TCP and the whole concept of having trusted binaries running on your machine can indeed be a real boon in a security conscious environment provided that you have the tools to make use of that platform.

              In itself TCP isn't inherently evil; the idea makes sense and appears to be reasonably well conceived. What is feared is lock-in from proprietary software makers coercing the hardware vendors into not releasing the tools to anyone but them.

              There might be a glimmer of hope if the trend continues with actions such as the EU vs. Microsoft anti-monopoly suit. This kind of thing, focusing on interoperability, could well be used so that FOSS (and through that possibly casual Windows and other commercial users) gets access to all the tools required to fully use the system (i.e. keys, etc.).
          • I don't have a problem with adoption so long as its use is not mandatory. I don't believe I've seen a single proposal which would make the use of this technology in a way that could undermine the end-user mandatory. Sure, it might be used to tighten up existing DRM systems. But I don't use DRM, and have no intention of doing so in the future. So why should this bother me?

        • Re: corporate IT sense

          If IT is in control, ok
          If MS is in control, not ok

          Same applies for end-user (or his friendly admin) in place of IT.

          If some people decide to trust MS/Apple with their security, fine, but I wont.
        • Thank you for that link. That was interesting. I had not read it before, or thought about Sony's rootkit in that light.
      • Re:Trusted (Score:5, Informative)

        by SiliconEntity ( 448450 ) on Friday July 28, 2006 @03:07AM (#15797105)
        From what I understand, Trusted in this context is used as in "I entrust it with my security" rather than "I find it worthy of my trust."

        No, that's a common fallacy; in fact, it's an intentionally constructed fallacy. Trusted in this context means that you have evidence to trust that the computer will behave in a specified way, particularly from the point of view of remote access. Normally when you connect to a computer remotely you have no way of knowing what it's doing. It could be essentially running any software at all. But if you connect to a Trusted Computer, it provides cryptographic evidence about its software configuration. Knowing what software it is running gives you grounds to know how it will behave; and to trust that behavior. That is the real meaning of Trusted Computing.
        • Re:Trusted (Score:3, Interesting)

          And its real use is Digital Rights Management: this doesn't just mean preventing people from playing MP3's, but ensuring that only the software that the document author or the software vendor authorizes to open a document can open that document. There are actually good security uses for such authentication. Unfortunately, it also means that documents become much more traceable, and that the encryption keys for almost all such software, especially purchased software keys, are sitting in a database somewhere
      • Re:Trusted (Score:3, Informative)

        by mrchaotica ( 681592 ) *

        Actually, Trusted in this context means "the people in control can trust my computer to be secure against me," where "the people in control" refers to those who hold the private key to the TPM. In the case of the general public, this is the Trusted Computing Group (which includes such bastions of personal freedom as Microsoft); in the case of the Army it should be the Army, but I fear it will still be the Trusted Computing Group.

        See, that's what's so bad about Trusted Computing: if the owner of the PC had

    • If your government or seller or whatever doesn't trust you, doesn't even try in the least, how the hell are you supposed to trust him? The most logical path would be to fully distrust him.

      Given how often and severely government suppliers and contractors like Halliburton, Bechtels-Parsons, etc. engage in all manner of willful, obvious fraud, anyone in the government that trusts their supplier is most likely benefiting in some way from the fraud. I think the challenge wouldn't be to name all the suppliers

  • by DrJimbo ( 594231 ) on Friday July 28, 2006 @12:55AM (#15796820)
    TFA says:
    Is TCG creating specifications for just one operating system or type of platform?
    No. Specifications are operating system agnostic. Several members have Linux-based software stacks available. In addition to our work on the PC platform, we have a specification for Trusted Servers and are working to finalize specifications for other computing devices, including peripherals, mobile devices, storage and infrastructure.

    • No. Specifications are operating system agnostic. Several members have Linux-based software stacks available. In addition to our work on the PC platform, we have a specification for Trusted Servers and are working to finalize specifications for other computing devices, including peripherals, mobile devices, storage and infrastructure.

      This doesn't answer the question at all.

      It all depends on who controls the root certificates that are used by the trusted computing hardware to verify the signatures of the BIOS and of the boot image.

      • It all depends on who controls the root certificates that are used by the trusted computing hardware to verify the signatures of the BIOS and of the boot image.

        If the FOSS fraternity are left out in the cold by the certificate authority, this will lead to some almighty class-action type litigation. It would be utterly anti-competitive to lock out a huge potential competitor, and Europe in particular would have a field day with Microsoft. Look at the trouble MS got into merely by locking people to their br

      • by SiliconEntity ( 448450 ) on Friday July 28, 2006 @03:18AM (#15797127)
        It all depends on who controls the root certificates that are used by the trusted computing hardware to verify the signatures of the BIOS and of the boot image.

        I'm sorry, but you don't know how Trusted Computing works. Almost everything you have been told about it is a lie.

        There are no root certificates used by TC hardware to verify the signatures of the BIOS and the boot image.

        What happens is that the BIOS, OS loader and potentially the OS itself send information to the TPM chip about the hashes of the software that is loading. User software can then, if it chooses, query the TPM chip and get a cryptographically signed message telling what these hashes are. The software can use this to report the software configuration that booted.

        The root certificates get involved because the TPM crypto key never leaves the chip. The TPM manufacturer has a root certificate which it uses to sign each TPM key. This way people can tell that a message actually comes from a valid TPM and not a fake. It prevents virtualization of TPMs. This is what allows software to report its configuration in a trustable way. It is what gives the system its name, Trusted Computing.
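        The measurement chain the parent describes can be sketched in a few lines. This is a toy model of the TPM 1.2 PCR-extend operation, not a real TPM interface, and the component names are made up:

```python
# Sketch of how a TPM "measures" boot software (simplified model of the
# TPM 1.2 PCR-extend operation; not a real TPM driver or API).
import hashlib

def pcr_extend(pcr_value: bytes, measurement: bytes) -> bytes:
    """Fold a new measurement into a Platform Configuration Register.

    A PCR can only be extended, never set directly, so the final value
    depends on every component measured since boot, in order.
    """
    return hashlib.sha1(pcr_value + measurement).digest()

# PCRs start at all zeros on reset (20 bytes for SHA-1).
pcr = b"\x00" * 20

# The BIOS measures the boot loader, the boot loader measures the
# kernel, and so on; each stage extends the PCR before handing off.
for component in [b"bios-image", b"boot-loader", b"kernel-image"]:
    pcr = pcr_extend(pcr, hashlib.sha1(component).digest())

# A remote party comparing this value against a known-good chain can
# tell whether any stage was altered; changing one byte anywhere in
# the chain yields a completely different final PCR.
print(pcr.hex())
```

        Because each extend hashes in the previous value, the final PCR commits to the whole boot sequence, which is what makes the signed quote meaningful to a remote verifier.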
        • The TPM manufacturer has a root certificate which it uses to sign each TPM key. This way people can tell that a message actually comes from a valid TPM and not a fake. It prevents virtualization of TPMs.

          Unless the root certificate gets stolen.

          Not that I would ever advocate such a thing, goodness, no ! It would mean that we, the computer owners, would have complete control over our property - and then Disney might lose potential future profits ! Clearly Disney's intellectual property rights trump our p

          While doubtless you are technically correct, for desktop computing I'm not sure it makes much difference,
          since only the Windows hash will allow secured files to be opened and secured apps to be run.

          Microsoft will easily be able to convince the MPAA/RIAA that the only safe hash is the Windows one and make the Office formats "secured" to the Windows hash. Some organisations like Debian may not wish or be able to restrict people's rights to their own machine, so there will be no reason for anyone to value their
    • Several members have Linux-based software stacks available.

      Much like the NVidia drivers, though, these stacks might involve a GPL shim and a non-GPL binary that's checked and verified by the TPM. Probably why GPLv3 is being readied so quickly.

      You try customising the kernel and alter the stack, and your hardware (the TPM) refuses to run it. End of Linux as we know it.
      • these stacks might involve a GPL shim and a non-GPL binary that's checked and verified by the TPM

        No, the main one is TrouSerS [sourceforge.net]. It's fully open source and GPL'd. Contrary to the many lies which have been circulated about it, TC is fully compatible with Linux. In fact, that's where most of the research and development work is at this time. Trusted Grub [sourceforge.net] is another good example. It hashes the Linux kernel and some of the config files into the TPM chip before booting it. This way Linux systems can prove what ker
    • The paragraph after the one you quoted offers us additional hope:
      • "The TCG design does not have any requirement that software be "certified" in order to use it. The specification talks in some length about ways of using the platform to create certificates for keys that are provably secure and yet not identify the platform they came from."

      In principle then, FOSS operating systems should be able to use TPM to enhance the trust that their owners have in them, in contrast to the way in which MS systems will use it to enhance the trust that content providers have in the platform.

      • ...in contrast to the way in which MS systems will use it to enhance the trust that content providers have in the platform.

        I personally think of this as FUD to some degree, simply because if one does not buy DRMed media, it doesn't affect MS users in any way. People seem to confuse a system supporting something with its mandatory use, which hasn't even been proposed.

    • That does not mean that they won't simply require the TPM to validate the system as running Windows.
  • ...I think of one of those dirty con guys that wants you to play three-card monte or something. "Come on, it's not rigged....trust me." Yeah, sure buddy.
  • Macs only? (Score:3, Interesting)

    by sakusha ( 441986 ) on Friday July 28, 2006 @01:02AM (#15796835)
    Is TPM actually shipping in any product other than the Intel Macs?
    • Re:Macs only? (Score:5, Informative)

      by lukas84 ( 912874 ) on Friday July 28, 2006 @01:08AM (#15796857) Homepage
      Lenovo Thinkpads and Lenovo ThinkCentres. (Select Models).

      My R51 has one.
    • My friend's Gateway laptop (17" with Intel Core Duo 1.83GHz) has a TPM chip, but he says that it is nonfunctional.

      • Re:Macs only? (Score:3, Interesting)

        by jrumney ( 197329 )
        I have a Dell laptop with a TPM chip, which was also non-functional until explicitly enabled in the BIOS. I enabled it to play with the file encryption functionality it offered, but it turned out to be impractical. Judging by the performance I get, the TPM chip seems to have a 9600 bps serial bus connecting it to the motherboard.
    • If you buy a business-oriented motherboard from Intel, there is generally an option for a board with TPM. My 915GEVLK has the integrated video and audio and gigabit LAN I wanted, along with TPM which I can disable in BIOS. So long as it's not drastically raising the price of the board, there's nothing wrong with letting the end user have an extra chip or two that he can choose to use or not.
    • There may still be some controversy about whether TPMs are in all Intel Macs. In any case, there doesn't seem to be any software way to access them, unlike PCs.

      TonyMcFadden.net [tonymcfadden.net] has a reasonably up to date list of systems that have TPMs in them, as well as manufacturers of the chips themselves, software suppliers, etc.
  • As Pitr would say (Score:2, Insightful)

    by Lord Kano ( 13027 )
    Am thinkink that someone with a lot of pull is ownink shares in TPM vendors.

  • I personally abhor the notion of Trusted Computing on my personal computer, but if you're using a computer provided to you by the government or a corporation for the express purpose of working, it's their right to control what goes on on that computer. It's possible that this will help to stem the tide of malware (at least in corporate environments) by rejecting execution privileges, and allow IT staff to better enforce policies about what can and cannot be run on their computer. It would also help stop things like the Free USB Key Attack [darkreading.com] (formerly discussed [slashdot.org] on slashdot).

    Of course, this could also make users feel like they are not trusted, and could even lead to overconfidence in the security of the system. Still I see it as a major plus, at least unless I get saddled with it at home.
  • Given the way DRM is implemented it amounts to a serial chain of single points of failure, but that's what TPM is supposed to be the basis of. As errors in military procurement are standard, not an exception, this strikes me as, um, just a tad stupid (I think this may later emerge as the understatement of the century).

    In addition, for a sovereign nation it is, of course, a perfectly sensible idea to hand the on/off switch of your entire infrastructure to another nation, potentially giving rise to a whole
  • All of Apple's Intel-based Macs have a TPM module, in order to restrict Mac OS X to running on genuine Apple hardware.
    Does this decision pave the way for Apple to become a preferred supplier as shortly their entire model lineup will feature TPM modules with a relatively secure operating system?
  • Can someone explain to me what's bad about this Trusted Platform thing? Is it a windows thing only or would it be in linux too? Does it relate to Microsoft's trustworthy computing? Thanks.
    • It's deeper than the operating system, it goes right to the core of the system. The best explanation I've seen of it is from Ross Anderson's Trusted Computing FAQ. [cam.ac.uk]

      Other comments from Richard Stallman's Can you trust your computer [gnu.org] and the EFF's [eff.org] paper Trusted Computing: Promise and Risk [eff.org].

      Another good summary is Benjamin Stephan and Lutz Vogel's video Misconceptions [youtube.com]

      From Anderson's FAQ:

      2. What does TC do, in ordinary English?

      TC provides a computing platform on which you can't tamper with the application software...

      • by SiliconEntity ( 448450 ) on Friday July 28, 2006 @03:35AM (#15797167)
        TC provides a computing platform on which you can't tamper with the application software...

        That's a total lie. Almost everything in that piece of propaganda masquerading as a FAQ is a lie.

        If you want the truth about TC, try Seth Schoen of the EFF. He has a good summary in his recent blog entry [loyalty.org]:

        What the TPM does do is support remote attestation so that a computer user can tell the computer to prove to a remote party what software it is running (if the software that's running also supports being proven in a way that the remote party understands). Then the remote party can make its own decision about whether the software is good or bad, and what it wants to do about that.

        This sounds innocuous in a certain sense. We have learned to mistrust the notion of a single centralized entity that decides what we can and can't do. TCG is not that entity, and TCG is not chartering that entity; instead, we have an unlimited number of entities that potentially make their own decisions, on various scales, about what we can and can't do in particular contexts, small and large. (We don't know yet which of those entities will turn out to have enough power to set which kinds of policies, or how the network externalities will shake out. Some entities with a lot of power, like Microsoft, can try to delegate some of their power, but there are plenty of technical and business obstacles to be worked out on both sides of that sort of delegation.)

        The user could also choose not to offer any proof at all; however, although the user has the right to remain silent, the user's silence can and will be used against her. Not offering proof is, of necessity, the functional equivalent of offering proof of the most unacceptable and contrary-to-policy facts imaginable.

        That does offer an avenue for a lot of control over you via your computer -- if someone else controls a resource that you need, there is a prospect of conditioning your access to that resource upon the provision of proof that you're running software that the resource controller considers "good". Not TCG, but the individual entities that you deal with: a bank, an entertainment company, an employer, an ISP. Furthermore, each of them could have its own independent definition of what "good" means, because there is no central signing or certifying authority. It is logically quite possible that one entity might refuse to talk to you if you're running configuration A instead of B, whereas another entity would refuse to talk to you if you're running B instead of A. (This is trivially true if each entity gave you a bootable CD and said "you can only communicate with us while you're running from our CD" -- with a TPM and the appropriate software, they can actually tell, and you probably can't fool them.)

        The ISP scenario is the point at which the most pervasive possible control could be exercised. TCG has already developed a specification called Trusted Network Connect which is based on the idea that you can be forbidden to connect to a network unless you're running a software configuration that the network operator approves. This is designed for use in corporations, most of which are accustomed to having a high (but imperfect) degree of control over the software running on their employees' PCs. Of course, the technology is more general, and, as TCG told me, there is nothing to stop it from being used by the People's Republic of China, or by a commercial ISP.

        Imposing this requirement on a general population has a very high cost; for one thing, it mea

          As we all should know by now, the interesting thing is what happens when you start to add features to a system.

          Anybody care to consider what happens when we get the following:

          (1) "Trusted" Computing
          (2) "Trusted" Network Connections
          (3) A non "net neutral" Internet?

          You could well end up with a choice of only two sources of information: the media conglomerate that owns your cable company, local news paper and local network affiliate television station, or the other conglomerate that owns your DSL service, most of the radio sta
  • by Flying pig ( 925874 ) on Friday July 28, 2006 @02:08AM (#15796995)
    We recently visited a customer who seems to be on the verge of announcing that anybody accessing their systems with any sensitive information will be required to use e-Gap, a dongle-based security system from a Microsoft subsidiary (and not to be confused, as Google does, with electronic Grant Application and Processing). The internal IT people told us e-Gap would refuse to allow a client to connect if it did not have working anti-virus installed, and that in order to verify this, ActiveX objects would be downloaded to inspect the system. If I have this wrong, apologies, but I'm reporting what I was told.

    This is a worrying scenario. Apart from the minor issue that external users will not want to pay for the dongles and that the internal customer is seeing his IT bill spiral, Trusted Computing seems to be heading to a Mexican standoff situation as follows:

    Device 1: Permit me to inspect your system by downloading and running this program.
    Device 2: Only after YOU have allowed me to verify your credentials by uploading and running this program.
    Device 1: No, it is I who am deciding whether you are to be trusted!
    Device 2: No, it is I who am deciding that!
    Device 1: Anyway, my content is digitally signed by Microsoft, and you must trust it.
    Device 2: Microsoft? Not a hope in Hell. I require all downloads to be digitally signed by Steve Jobs in person with a DNA signature.

    And so on. Quis custodiet ipsos custodes? And how long before an army unit gets wiped out because of a defective dongle?

    • What's most amusing about that is the number of fake 'verification' sites that it will lead to, loaded with ActiveX controls that actually are disguised rootkits ... grab some large company's key, and then you could pose as them and -- since users would be used to just running ActiveX controls from that company -- nab their computers during the "security sweep."

      I love the irony. Use a technology probably responsible for more zombiefied machines than any other ... in order to ostensibly secure them.

      Somewhere
    • Unfortunately, if this type of tech gets into citizens' living rooms, they will probably not have the option of requesting credentials from all the important services. Governments/corporations do not want to be forced to provide actual, working credentials that can hold them accountable, so I really doubt they would allow the tech (read: Wintel) to do that.

      Of course, then this opens up the whole issue of a service getting 0wned and then securely propagating trusted malware.
  • "Federal Computer Week is reporting that the US Army will require hardware-based security via the Trusted Platform Module standard in all new PCs. They are a large-enough volume buyer that this might kickstart an adoption loop."

    Let's say the US Army buys a million night-vision goggles. Would that mean bird-watchers would throw away their good old binoculars and go in for this one?

    The TPM is actually a very sound functional and business requirement in the Army... it provides for centralised surveillance and
  • So now the question is, will it be legal to transfer (or as they say, "convey") GPLv3 [slashdot.org] software to a Trusted Computer? It violates the principle that users must be able to alter their software in such a way that remote servers can't tell. Will that make it illegal to run GPLv3 software on a TC?
  • by Opportunist ( 166417 ) on Friday July 28, 2006 @03:28AM (#15797148)
    It makes sense for the Army to require TCP. Stolen/lost laptops wouldn't immediately result in a security leak. But this can be achieved cheaper, quicker and (and here comes the key point) with more control on the Army's side. Linux can encrypt documents in just the same way TCP wants to offer; the difference lies in the open source concept: this inherently gives you the ability to check the security of your system (provided you can read code, but I guess the Army can afford hiring someone who does).

    TCP requires you to trust the person/group that made the security for you. You put yourself completely into the hands of the corporation(s) that create your TCP platform, and you are fully dependent on their ability to come up with a good protection scheme. Not to mention that you have to trust them, implicitly, that they do not want to spy on you and that they are better than their adversaries.

    With TCP you hand over the responsibility for security. But you also hand over control. And it has the potential to lull you into a false sense of security, which invariably leads to slacking. More than once I've seen neglectful behaviour in a high-security area (I've had my share of time in that field), with people relying so heavily on the technical implementations that they forgo the most basic security measures called for by common sense, because "Hell, what DO we have that security concept for, if I can't trust it fully?"
  • better one innit (Score:4, Insightful)

    by ajs318 ( 655362 ) <sd_resp2&earthshod,co,uk> on Friday July 28, 2006 @03:31AM (#15797152)
    A country's armed forces ought to have the power to demand the full source code of every application running on their computers, and the resources to write all their own software wherever necessary. There is no shortage of Open Source applications they could use for starting points .....
  • Film [lafkon.net]

    Advocacy [againsttcpa.com]

  • If I gather correctly, the TPM takes care of providing decryption keys to the operating system once it can confirm the system is in a known state. What I still don't understand is how this "known state" together with the necessary decryption keys are communicated to the TPM in the first place. Is there a central authority that takes care of this? If so, how would this affect Open Source operating systems?
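    A rough model of that "known state" idea: the owner seals a secret against the current platform measurement, and the chip releases it only when the measurement matches. This is a hypothetical simplification of the real TPM_Seal/TPM_Unseal commands, which additionally encrypt the secret under a key that never leaves the chip:

```python
# Toy model of TPM "sealing": a secret is bound to the platform state
# (PCR value) at seal time and withheld if that state later differs.
import hashlib

class ToyTPM:
    def __init__(self):
        self.pcr = b"\x00" * 20   # PCR starts at all zeros on reset
        self._vault = {}          # sealed blobs live inside the chip

    def extend(self, measurement: bytes) -> None:
        """Fold a measurement into the PCR (extend-only, never set)."""
        self.pcr = hashlib.sha1(self.pcr + measurement).digest()

    def seal(self, name: str, secret: bytes) -> None:
        """Bind the secret to the *current* platform state."""
        self._vault[name] = (self.pcr, secret)

    def unseal(self, name: str) -> bytes:
        """Release the secret only if the platform state still matches."""
        expected_pcr, secret = self._vault[name]
        if self.pcr != expected_pcr:
            raise PermissionError("platform state changed; secret withheld")
        return secret

tpm = ToyTPM()
tpm.extend(b"known-good-os")          # boot chain measures the OS
tpm.seal("disk-key", b"hunter2")      # owner seals a disk key to that state
assert tpm.unseal("disk-key") == b"hunter2"   # same state: key released
```

    If the machine later boots different software, the PCR comes out different and `unseal` refuses, which is how a disk key can be tied to one particular software configuration without any central authority being involved in that step.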
  • it's a great idea (Score:2, Insightful)

    by alizard ( 107678 )
    if the intent is to create spaces within computers where malware can run invisibly and with no possibility of elimination even if the users find out about it.

    Reminds me of the decision made to run modern US warships on Windoze.

    Military procurement and ripoff were probably synonymous as of when Sargon the Great's people were buying spears and grain to feed troops. The tradition has continued.

    The only question I've got here is how many members of the US Armed Forces are going to get killed by this set of m

  • just in case... (Score:5, Informative)

    by joe 155 ( 937621 ) on Friday July 28, 2006 @04:19AM (#15797275) Journal
    ...you're interested, I read a rather interesting article about trusted computing the other day ( http://www.gnu.org/philosophy/can-you-trust.html [gnu.org] ). Stallman makes some good points.
  • by jsse ( 254124 ) on Friday July 28, 2006 @04:24AM (#15797282) Homepage Journal
    The following conversation, heard during my college days, might help to answer (or not):

    "Sir, what is a trusted system?"

    "A system where we can't trust each other."

    A brief silence...

    "Then what would it be like in an untrusted system?"

    "That we can trust each other."

    A long, dead silence...

  • Great idea (Score:3, Funny)

    by Anonymous Coward on Friday July 28, 2006 @04:34AM (#15797298)
    Give the power to disable software used by the US military to tech companies. Brilliant. Why didn't anybody think of this earlier? Will software vendors be permitted to run validation servers on SIPRNet?
    ATTENTION DOD EMPLOYEE:
    MICROSOFT HAVE DISABLED THIS SYSTEM AS WE ARE IN THE PROCESS OF NEGOTIATING A GOVERNMENT CONTRACT WITH IRAN. THE FUNCTIONALITY REQUIRED TO WAGE WAR WILL BE RESTORED WHEN THIS TRANSACTION COMPLETES.
    Did nobody in the DOD see that god-awful I, Robot film?
  • by trend007 ( 683790 ) on Friday July 28, 2006 @04:51AM (#15797331) Homepage
    Hi all,

    TCG/TPM stuff, though not completely finished (the DAA mechanism that was introduced in v1.2 is a good example of how the TCG adapted to outside criticisms, and they're starting to work on v1.3) and surely not fully understood (the word "trust" is a huge factor in that), is having the same effect as PKI a few years back. Except that nowadays, times of ignorance and fear (in particular of the big companies behind the TCG) multiply this effect by thousands. "Trust" is more and more acting like the point of concentration of the security problems, its complexity being coupled with new emerging (and very innovative) threats.

    First think of the TPM as a chip that provides standard cryptographic functions (RSA, SHA-1, HMAC, AES), so instead of doing it in software anyone will be able to use hardware implementations. Furthermore there are facilities for key creation and management. With the special focus on this "security chip" (such chips already existed in various forms), the designers hope to drastically improve the level of security of modern computing (95% of emails are spam, botnets of millions of computers, hackers make huge money out of their job, ransomware, etc.).
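    Two of the primitives listed above can be illustrated in software. This is illustration only: the whole point of the TPM is that these operations run in hardware and the keys never leave the chip; Python's standard library merely stands in for it here.

```python
# Software stand-ins for two TPM primitives: SHA-1 hashing (the basis
# of TPM 1.2 measurements) and HMAC (used to authorize TPM commands
# with a shared secret). Illustration only; no real TPM is involved.
import hashlib
import hmac
import os

# SHA-1: hash a (made-up) boot-loader image, as a measurement would.
digest = hashlib.sha1(b"boot-loader image").hexdigest()
assert len(digest) == 40  # SHA-1 yields a 160-bit (20-byte) digest

# HMAC-SHA1: authenticate a (made-up) command blob with a shared secret.
auth_secret = os.urandom(20)
mac = hmac.new(auth_secret, b"TPM command blob", hashlib.sha1).digest()

# The receiver recomputes the MAC and compares in constant time;
# without the secret, a forger cannot produce a matching MAC.
expected = hmac.new(auth_secret, b"TPM command blob", hashlib.sha1).digest()
print(hmac.compare_digest(mac, expected))
```

    Doing this in a dedicated chip rather than in software changes little cryptographically; what changes is where the secrets live, which is the part the rest of the thread is arguing about.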

    Obviously this TECHNOLOGY is not perfect; it will evolve. (And please always keep this in mind: it's a tool, to be used by other applications, most importantly OSs, to improve security. Apart from secure boot, which is not compulsory at the moment, there is no obligation to use the TPM even if it is present.) It will have to CONVINCE, to earn TRUST. As I say to most of my Trusted Computing colleagues, I think the challenges set by TCG's opponents are actually a means of improving the security of this technology (but beware of popularity-seeking criticisms; not all of them are well founded).

    Read the FAQ:
    https://www.trustedcomputinggroup.org/faq/TPMFAQ/ [trustedcom...ggroup.org]
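    As a software-only illustration of two of the primitives the parent lists (SHA-1 and HMAC), here is a minimal sketch in Python. This is not TPM code: with a real TPM the key material never leaves the chip, whereas here the key is an ordinary in-memory bytestring.

    ```python
    import hashlib
    import hmac

    # Software analogues of two primitives a TPM implements in hardware.
    # Illustration only: a real TPM keeps keys inside the chip and exposes
    # them through its command interface, not as Python bytestrings.

    def tpm_style_digest(data: bytes) -> str:
        """SHA-1 digest, the hash algorithm used throughout TPM 1.2."""
        return hashlib.sha1(data).hexdigest()

    def tpm_style_hmac(key: bytes, data: bytes) -> str:
        """HMAC-SHA-1, the construction TPM 1.2 uses for authorization."""
        return hmac.new(key, data, hashlib.sha1).hexdigest()

    print(tpm_style_digest(b"hello"))
    print(tpm_style_hmac(b"secret", b"hello"))
    ```

    The point of the hardware version is not the math (which is identical) but where the key lives and who can extract it.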
  • Please correct me if I'm wrong.

    AFAIK in revision 1.2 it is possible to replace the master-key in the TPM module. This was a major point of criticism of previous revisions. Of course you then lose the "benefits" of the trust-web.
  • maybe not... (Score:3, Insightful)

    by ecalkin ( 468811 ) on Friday July 28, 2006 @04:55AM (#15797346)
    They also created a language called Ada that was meant to replace COBOL. Everyone thought that the DoD requiring new programming in Ada would cause the replacement of COBOL programming everywhere.

    Where is Ada now?

    eric
    http://youtube.com/watch?v=K1H7omJW4TI&search=trusted%20computing [youtube.com]

    Who will decide for them what is trustworthy and what is not? Are they going to have a backdoor? I suppose the BSA http://www.bsa.org/ [bsa.org] just got a new enforcer!
  • by mcc ( 14761 ) <amcclure@purdue.edu> on Friday July 28, 2006 @05:38AM (#15797440) Homepage
    This would be a really worrying thing, but the fact is TPM has already won. It won the instant Apple adopted TPM and the communities who had spent all those years publicly worrying and complaining about Palladium and Trusted Computing suddenly went silent and shrugged the moment nebulous notions like "freedom" came into conflict with solid, purdy white plastic.

    Here is the thing: TPM's adoption was waiting not on an adoption cycle exactly, but an apathy cycle. TPM was never something that the consumer was supposed to approve of, want, or even really know was there. The adoption of TPM was mostly counting on the consumer not having any idea what they were buying, counting on the blinking 12:00 effect, counting on the idea that most consumers would not even know TPM was in their computer until the first time that they try to do something and the computer says "no".

    TPM isn't there for the consumer. It's there to protect the computer from the consumers. It's there to allow software and content vendors to trust your computer, to trust your computer to ensure it will act in their interests and not yours. These vendors are the ones that TPM is being done for the benefit of, not the consumer. This means that in order for TPM to win, it isn't necessary for the consumer to "adopt" it. All that has to happen is for the consumer to fail to actively reject it when it is quietly dropped into the hardware they were going to buy anyway.

    And that's already happening. So although the military would legitimately represent an adoption cycle-- the military, of course, has a legitimate and logical need to create networks within which the machinery is trusted and the user is absolutely not-- it doesn't really matter. The military isn't the kind of adoption TPM needs to reach enough critical mass that vendors can begin requiring it in new applications, I don't think-- it's not like military hardware is going to be used to run lots of games and DRMed consumer media, as far as I know. The worrying thing is TPM's level of adoption in the consumer segment, since that's where it has potential to do actual harm. And that's already begun, and so far nothing is happening to stop it...
    You're in the Army. You're in the field under fire. You have a hardened Army laptop. You are sending and receiving vital messages back and forth with another unit directing fire around your position. Your laptop doesn't have any software or files on it that are personal to you. Not your music. Not your games, etc. What it has is a trusted and fool-proof means of sending and receiving messages that you can trust with your life and the lives of your unit.

    Therefore, you trust the info on your Army-issued laptop.
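    The message-trust part of this scenario boils down to authentication: only accept a message whose tag verifies under a shared key. A hypothetical software sketch (key names and the key-distribution step are made up here; in the scenario above the hardware root of trust would protect the key):

    ```python
    import hashlib
    import hmac

    # Hypothetical shared key between two units; real systems would derive
    # and protect this via the hardware root of trust, not a literal.
    SHARED_KEY = b"unit-pairwise-key"

    def tag(message: bytes) -> str:
        """Compute an HMAC-SHA-256 authentication tag for a message."""
        return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

    def verify(message: bytes, received_tag: str) -> bool:
        """Accept the message only if its tag verifies (constant-time compare)."""
        return hmac.compare_digest(tag(message), received_tag)

    msg = b"shift fire 200m north"
    t = tag(msg)
    print(verify(msg, t))                       # genuine message
    print(verify(b"shift fire 200m south", t))  # tampered message
    ```

    The TPM's role in such a design would be keeping `SHARED_KEY` (or the private key that establishes it) out of reach of whatever else runs on the laptop.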
    The army is stupid. It should mandate its own standard for this using NSA-approved hardware.

    Sheesh
  • by briancnorton ( 586947 ) on Friday July 28, 2006 @03:27PM (#15801692) Homepage
    To say that "the army" is requiring all PCs to do anything is questionable at best. What this appears to apply to is the enterprise systems. That's maybe a couple hundred servers that fall under the command of NETCOM. I see no mention of NETCOM having responsibility for things like desktops, agency-by-agency servers, etc. Never can tell, though.
