Security

Interesting Uses for Trusted Computing 323

An anonymous reader writes "The Unlimited Freedom blog has published a new article describing 'interesting' uses of Trusted Computing. (Google cache here). Trusted Computing, as implemented in Microsoft's NGSCB (Palladium) or the Trusted Computing Group (TCPA), has been one of the most controversial technology proposals of recent years, to put it mildly. But the article on Unlimited Freedom offers a new perspective. The author examines 12 different applications which could benefit from access to Trusted Computing technology. And most of them are uncontroversial or would actually improve privacy and anonymity. Among the examples listed are multi-player games, online casinos, P2P networks, anonymous remailers, distributed computing and mobile agents. The analysis provides an interesting contrast to the usual focus on Trusted Computing's impact on control over digital content."
This discussion has been archived. No new comments can be posted.
  • Alternatives (Score:4, Informative)

    by BWJones ( 18351 ) * on Thursday March 25, 2004 @12:20PM (#8668886) Homepage Journal
    Hmmm, it seems that another [slashdot.org] approach might also provide these desirable side benefits while working to secure the Internet as a whole, without having to use "Trusted" architectures. There are new controversies with the following approach, though. In short, from my journal: "An emerging Internet security company, Symbiot [symbiot.com], is taking an entirely new, albeit controversial, approach to Internet defense and cyberwarfare that should appeal to cyberpunks everywhere. Rather than the traditional passive response used by sysadmins and CTOs worldwide, Symbiot takes a more "active" defense approach: subscription-based access to a shared "threat database" lets participating networks gauge the degree of a threat and respond democratically (using the shared resources of other participating networks) and proportionally, with a graduated response to cyber attacks. The potential of an asymmetrical response to a threat is also not out of the question.... Links for additional information are here [netcraft.com] and here [onlamp.com]."

    • Re:Alternatives (Score:3, Insightful)

      Um, the two concepts are utterly and completely unrelated to each other. They are suited to completely different purposes.

      Some might argue that given the spoofable nature of TCP/IP, Symbiot's concept is suited to zero purposes, but that's orthogonal to the point:

      Perhaps Symbiot considers their database of threats to be not only extremely valuable to competitors, but also extremely valuable to their targets. THEN THEY'RE GOING TO NEED SOME DRM, AREN'T THEY, SMARTIE?

      There are many, many acceptable uses of
  • DRM (Score:5, Funny)

    by Anonymous Coward on Thursday March 25, 2004 @12:21PM (#8668905)
    DRM == Deprive Rights from the Masses

    Just like Sauron's ring, DRM cannot be used for good.

    • or == Draconian Rules for Me. Sure, DRM can be used for good. DRM has made you powerful; now fulfill your destiny and take Ballmer's place at my side.
    • So we should carry Microsoft into its nearby Cracks of Doom [nps.gov] and throw them in? Woohoo! On the count of three, lift!

    • Funny? (Score:5, Insightful)

      by Anonymous Coward on Thursday March 25, 2004 @12:36PM (#8669122)
      As the original poster, I find it rather disturbing that my post was modded up as Funny.

      If DRM catches on and gets legitimized, we'll soon have closed and regulated hardware: network cards, audio and graphics cards that won't transfer data, play music or show graphics unless the mandatory DRM chip gives permission to do so.

      • Re:Funny? (Score:3, Insightful)

        by Anonymous Coward
        I think it was modded that way because sometimes you have to laugh at some of this BS (not your comment, the DRM), just to keep your sanity. If not, I know I would walk around angry every second.
    • My Righteous Leader RMS says DRM is Digital Restrictions Management [gnu.org].
    • Don't Remind Me: Dumb Reasoning Misshapes Deplorable Reality Mindsets.
  • Giftwrapped bullshit (Score:4, Interesting)

    by onyxruby ( 118189 ) * <onyxruby@ c o m c a s t . net> on Thursday March 25, 2004 @12:22PM (#8668907)
    I don't think so. Trusted computing is based in principle on evil. It should not be legitimized by finding ways to use it that were unintended. Endorsing something rooted in evil does not change the morality of the base. I don't care how shiny you giftwrap bullshit, it's still bullshit.

    Think of it this way: Germany and Japan conducted much medical research in WWII, but because they conducted experiments that were inhumane, torturous, and used unwilling subjects, the medical community won't touch their research. Not because it is fundamentally flawed, but because it was fundamentally evil.

    Stand up for your morals here and fight trusted computing.
    • by garcia ( 6573 ) * on Thursday March 25, 2004 @12:24PM (#8668936)
      Whether we like the intended uses or not it's coming to a BIOS/OS near you. We might as well find "good" uses for it.

      Although I don't see how telling another system what process you are running could be a good thing.
      • by BlackHawk-666 ( 560896 ) on Thursday March 25, 2004 @12:26PM (#8668968)
        You can always flash that BIOS away and replace it with a new one that doesn't have the trusted computing crap in it. There are some open source alternatives out there already.
        • by garcia ( 6573 ) * on Thursday March 25, 2004 @12:30PM (#8669041)
          Again, people keep coming back to this. I have already stated that if MS wants to take DRM in the direction they seem to be going, they are going to require the BIOS to be trusted as well, which means something that isn't LinuxBIOS or free. It's going to be MS/Phoenix or whatever.
      • And I'm sure there will be an option to disable it in that bios. And when that option disappears, Macs and their OpenFirmware will look very attractive.
        • by October_30th ( 531777 ) on Thursday March 25, 2004 @12:33PM (#8669067) Homepage Journal
          And I'm sure there will be an option to disable it in that bios. And when that option disappears, Macs and their OpenFirmware will look very attractive.

          Attractive to whom?

          The majority of people using computers? Hardly. If the software they run (like Windows, for instance, or media players) either doesn't work or works poorly without DRM, you can bet that they'll find a DRM BIOS more attractive.

          • The majority of people using computers?

            If the majority of people using computers were told, in plain English, that their innocence was just thrown in the creek out back (read: DRM), you would certainly see a backlash. Nobody wants prying eyes, not even for MP3s.
    • Your analogy of Trusted Computing to medical research in concentration camps is shockingly inappropriate.

      Moreover, your assertion that Trusted computing should be fought because it is "immoral" and "evil" smacks of the very same totalitarianism you appear to despise. Are you the sole person to determine what is immoral and evil? What if I have a different morality or viewpoint? Will you compare me to a WWII doctor, then, too?
    • Score:5, Clinically insane

      What on earth does WWII have to do with trusted computing? It's a way to remove a lot of the blind faith people have in computers. Which, funnily enough, is the same blind faith that ends up screwing everyone when something goes wrong.

      To the paranoid, trusted computing is "evil". To those with their heads screwed on properly, it's just another tool in their belt.

      I'm not having a go at you, but at the hysteria /. and other sources have built up around this topic. The same things

      • by onyxruby ( 118189 ) * <onyxruby@ c o m c a s t . net> on Thursday March 25, 2004 @12:55PM (#8669403)
        It's called an analogy. Don't take it out of context, OK? That being said, trusted computing is evil because it is about taking away the rights and choices of others in the name of profit. The fact that it is intended to be introduced in a manner such that there is no alternative only attests to its evil nature. There is absolutely no consumer benefit to trusted computing. Even the name is inherently dishonest, as trusted computers can't be trusted by their owners. My point was that trusted computing is fundamentally evil, and my point stands. Benefiting from the evil does not make it any less evil. Got it?
      • Ok... to me trusted computing is that my computer warns me when any application is trying to send a data packet outside the computer, and allows me to inspect that data packet and either allow or disallow it, but in any case log it for me.

        It asks me if I want to run a certain application, warning me strongly if it has never run before and may be a virus or spyware.

        It needs to have the tools to set up secure and "trusted" connections to other computers and networks and protect me from the software developers
    • by korielgraculus ( 591914 ) on Thursday March 25, 2004 @12:40PM (#8669183)

      Actually, the Allied powers made extensive use of the Axis research projects after the war. One example was Unit 731, responsible for the research and development of biological weapons through human testing. Not only were the perpetrators not prosecuted for war crimes, Shiro Ishii, the commandant, was given a job by the US military! Makes you wonder what that fight for decency was really all about, doesn't it?

      Further details on Unit 731 can be found here [marshallnet.com].

    • Principle of evil? What type of FUD is that? There is a need in the computing world for secured/trusted computing; it's called business. I sure as hell want my servers to run only code that is signed by my company. Not only within my company, but for companies I do business with. I want my bank to have the most secure system out there.

      This shit ain't going to take off on the consumer side of things. I sure as hell wouldn't buy one for home use.

      • I sure as hell wouldn't buy one for home use.

        Except that you will have no choice, there will be no alternative, and that's what trusted computing is about: taking away choice. It doesn't matter if you're a business or a home user, you will have to have trusted computing. Just curious, are you really able to get all of your code signed? You are aware that you don't have to have special hardware to run signed code, aren't you?
    • These defenses of TC always make me think of someone advocating National Socialism as a solution to that problem with the trains.

    • by Have Blue ( 616 )
      I don't think so. P2P is based in principle on theft. It should not be legitimized by finding ways to use it that were unintended. Endorsing something rooted in theft does not change the morality of the base. I don't care how shiny you giftwrap bullshit, it's still bullshit.

      There, fixed that for you.

      It just goes to show that technology is still just a tool. It can be used by people for good or evil.
  • by Anonymous Coward on Thursday March 25, 2004 @12:22PM (#8668913)
    ...I'm cool with Trusted Computing.
    • Exactly. Who controls the "trusting" is the difference between the Microsoft plan and what TCPA is... they are not even tangentially related...

      People need to get a clue on the MASSIVE differences between TCPA (and ESS), Palladium and DRM -- they are all separate technologies. TCPA is the follow-on to ESS.

      Luckily, IBM has posted research to help those who like to scream and yell, but don't like to read...

      tcpa rebuttal [ibm.com]

      More TCPA research [ibm.com]
    • Exactly. And the point of Trusted Computing is that you are denied any control at all, short of the ultimate control of yanking the plug and tossing your machine out the window.

      The "Trust" in Trusted Computing is trust that YOU, the owner of your own machine, CANNOT tell your computer to do what you want it to do. If you try to change anything in the computer the Trust chip returns a filed result and nothing works.

      -
  • As long as... (Score:5, Insightful)

    by BHearsum ( 325814 ) on Thursday March 25, 2004 @12:23PM (#8668928) Homepage
    As long as my computer is being told what it can or cannot do by someone other than me, I DON'T WANT IT.
    • This is the key that keeps getting lost.

      There is some discussion of TCPA from some guy from IBM, insisting that it isn't all bad, *as long as the user retains control*. What we all really fear about "Trusted Computing" is that WE (the computer owner/user) are the ones who are *not* trusted.

      The real pain in all of this is that there is some good in Trusted Computing, if done properly. Unfortunately things are polarizing into two camps, corporations using DRM to protect THEIR property against their customer
    • Don't ever use a computer then, unless you wrote every line of code running on it, because being told what to do by someone else is exactly what it's always doing. (What do you think a program is, anyway?)
    • As long as my computer is being told what it can or cannot do by someone other than me, I DON'T WANT IT.
      I guess you don't install any software on it then?
  • FYI (Score:2, Informative)

    by pinkUZI ( 515787 )
    A nice faq [cam.ac.uk] on Trusted Computing.
  • Wishful thinking (Score:5, Interesting)

    by Ed Avis ( 5917 ) <ed@membled.com> on Thursday March 25, 2004 @12:27PM (#8668997) Homepage
    Applications like online casinos would also benefit from a magical honesty pill which users could take to prevent them from cheating - but it's not going to happen. The idea of trusted computing is to require a specially restricted client machine, but there's no way this could work and be secure enough for something like online gambling. An important rule of online security is *you cannot trust the client*, and even if the standard Dell PC that grandma buys is locked down with all sorts of nastyware, this will do nothing against a determined attacker who is able to program a computer to do what he, its owner, wants.

    Although trusted computing could never provide real security, it can give a lot of inconvenience to 90% of the population to stop them doing things with their computer that Microsoft would prefer them not to do. Just like other copy-protection measures over the years, its purpose is to keep the majority of users under control, not to stop the real criminals.
    • by dave420 ( 699308 )
      Did you actually read the article? (what am I thinking... this is /.)

      That's the whole idea of trusted computing (amongst other things): allowing a trusted remote service to know full well that the computer it's talking to is on the level. It's based in hardware, and is drenched in encryption and intelligent process control.

      Trusted computing will provide more security than you've got now, by far. And if you don't like it, you can turn it off. It's that simple. No-one's going to force you to use i

      • Re:Wishful thinking (Score:2, Interesting)

        by Ed Avis ( 5917 )
        It's a fair cop. I did RTFA but _after_ posting my comment. Got to get in early...

        That's the whole idea of trusted computing (amongst other things): allowing a trusted remote service to know full well that the computer it's talking to is on the level. It's based in hardware, and is drenched in encryption and intelligent process control.

        It's been a while since I read up on TC, and that was only from doom-mongering sites mentioned on Slashdot, but I just don't understand this. If you have control of t

        • The "trust" occurs below the OS, at the BIOS level. So you could modify Bochs to fake a trusted BIOS but you'd still need a valid key to ping against the key servers. Regardless, DRM sucks.
      • Yes, and that is certainly good enough to delay the development of your average Counterstrike Aimbot.

        However, when significant amounts of money get involved, it's a whole different ballgame. Silicon can be debugged remotely [optonics.com]. And given how sophisticated the schemes casinos already deal with are, going on right under their noses, it eventually would be.

        Further, unlike a game hack, a true professional wouldn't necessarily broadcast the HOW-TO to the world. More likely, he'd just sit back, shuffle account
      • No-one's going to force you to use it, unless you want to run their software. That seems fair enough to me.

        It's been argued (to death, actually), that this is in itself a major problem. If you're in the IT industry, you've heard/spoken the phrases ten thousand times. "Vendor lock-in", "[forced] migration path", "monopoly", "barriers to entry", ..."Microsoft", etc.

        Point being, while no one will force you to use apps/systems X, Y, and Z, tomorrow it could be practically impossible to function in society

      • ...if you want to use My operating System.

        Seems quite possible to me.
      • It's based in hardware, and is drenched in encryption and intelligent process control.

        You forgot to mention that the silicon is mixed with fairy dust to make it 107% tamperproof.
    • by dekashizl ( 663505 )
      The article actually talks about gambling clients trusting casino servers, which is an interesting reversal of the typical applications of DRM we usually hear about. Feel free to read the article, or you can just post again along party lines and hope to pick up some cheap karma. From the article:

      Using remote attestation, player software could confirm that the casino was using a certified and validated software package for its game play calculations, one known to be free of bias and to give the player an hones

      • Re:Wishful thinking (Score:3, Interesting)

        by bnenning ( 58349 )
        The article actually talks about gambling clients trusting casino servers, which is an interesting reversal on the typical applications of DRM

        As usual, DRM isn't needed to achieve this; we already have existing algorithms. Here's how a casino can prove that it's shuffling a deck of cards fairly:
        1. The casino generates 225 random bits, enough for all permutations of a 52-card deck.
        2. For each bit, if it is a 1, the casino server generates two 600-bit prime numbers and multiplies them together. If it's a 0, the
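
        A simpler commit-reveal sketch of the same fairness idea (not the parent's exact prime-based bit commitment): the casino commits to a seed before play, the player adds entropy, and after the hand the casino's reveal lets the player recompute the deck and confirm nothing was rigged. All names here are illustrative.

        ```python
        # Commit-reveal sketch of a provably fair shuffle (assumption: both sides
        # contribute entropy and the casino reveals its seed after the hand).
        import hashlib
        import os
        import random

        def casino_commit():
            seed = os.urandom(32)
            return seed, hashlib.sha256(seed).hexdigest()   # publish only the hash

        def shuffle_deck(casino_seed: bytes, player_seed: bytes):
            rng = random.Random(hashlib.sha256(casino_seed + player_seed).digest())
            deck = list(range(52))
            rng.shuffle(deck)
            return deck

        casino_seed, commitment = casino_commit()           # commitment sent before the game
        player_seed = os.urandom(32)                        # player's contribution
        deck = shuffle_deck(casino_seed, player_seed)

        # After the hand, the casino reveals casino_seed; the player checks it
        # matches the commitment and recomputes the identical deck.
        assert hashlib.sha256(casino_seed).hexdigest() == commitment
        assert shuffle_deck(casino_seed, player_seed) == deck
        ```
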
  • by ifreakshow ( 613584 ) * on Thursday March 25, 2004 @12:28PM (#8669003)
    I understand all of the benefits of trusted computing, but still find it hard to accept for two reasons.

    First, I don't believe that any system that is physically in the user's hands is secure. Given enough time and motivation, crafty end users will crack the system. For an example, we need look no further than mod chips and video game systems.

    Second, I'm a tinkerer. I love to play around with new technology and software. Ultimately this technology would be in everything from your computer to your dishwasher. I'd hate to lose that ability to dig around the machine and software myself or have to pay extra to modify my computer and devices to gain that back.
    • Mod chips (like the ones for the PS2) are detected when you play online, and the service disconnects you.

      That's one point of trusted computing people don't mention much - it doesn't stop you from running dodgy apps or hacking your machine to pieces, but it tells anyone you interact with when the integrity of the application through which you're interacting has been violated.

      The way you decide what software sits on your box won't change. If you don't trust Microsoft, don't put their software on your box.

        I certainly have thought for more than two seconds about trusted computing and am not giving a knee-jerk reaction. I don't believe I implied that MS would be looking at my porn.

        In fact, in the future I'd like to be able to run MS software (which would be a TC app) when there is a need, and non-TC apps when there isn't. I don't want to have to modify my computer or BIOS just to do that.
      • The aspect where I can control what software runs on my machine is fine with me. The fact is I know what is running on my machine though, so its kind of irrelevant.

        My concern is the direct and indirect ability of others to affect what I can do with my machine, things that today are perfectly legitimate. It is completely possible, with DRM in the picture, to begin regulating what software may be used to communicate over the Internet, for example. If I make a patch to Konqueror and then try to use that to con
      • Mod chips (like the ones for the PS2) are detected when you play online, and the service disconnects you.

        That is merely because no one made the effort to get around that.

        it tells anyone you interact with that the integrity of your application through which you're interacting has been violated

        Once you have extracted a key from one of the chips, that is no longer true. You can use that key to run anything you like, and it is impossible for anyone else to detect that you have defeated the system if you pr
  • Who and how many? (Score:4, Insightful)

    by EndlessNameless ( 673105 ) on Thursday March 25, 2004 @12:28PM (#8669011)
    As long as there are multiple competing trust providers, and administrators can choose which ones to certify for interoperability with their systems, I don't see much of a problem.

    Of course, the problem is that right now there is essentially only one trust provider, and its previous behavior doesn't incline me toward trusting it.

    The benefit of using multiple trust certifications is that OSS could get in on the game... if someone wanted to set up a way to submit source and receive signed compiled binaries for a small fee. A bit of a hassle, and ineffective in the event a licensee wants to modify the code, but then again the licensee could pay the original OSS coders or submit the modified source for signing themselves.
    • by dave420 ( 699308 ) on Thursday March 25, 2004 @12:56PM (#8669413)
      There isn't one "trust provider". Microsoft won't have any more rights to get into a TC'd up computer than you will. They provide the layer, and you install whatever software you want on top of it.

      Does VIA dictate what OS you use, simply because they made your chipset? No - it's the same with TC.

      YOU are the trust provider. If you don't trust microsoft, don't install windows. Without that installed, Microsoft can't touch you. In fact, without Windows installed, they're not trusted by your computer AT ALL.

      This is why it's getting a bad press - these facts are not made public as much as the "ooh! bill gates can see you in your underwear!" hysteria. TC is defined not by the hardware you use, but by the software you choose to install. No Windows? No Microsoft.

      • Yes, and if you don't install Windows, what are you going to use to open those TC-protected Word documents, which only open in TC-protected MS Word on TC-protected Windows with a TC-protected BIOS? If TC takes off, the general public will be too dumb to make the choice against it, and will screw the rest of us along with it
  • trust this (Score:4, Funny)

    by maxbang ( 598632 ) on Thursday March 25, 2004 @12:28PM (#8669012) Journal

    I got yer trusted computing right here [debian.org], pal.

  • digital certificates (Score:4, Interesting)

    by call_me_susan ( 765345 ) on Thursday March 25, 2004 @12:29PM (#8669019)
    I've read about half of it. So far, the gist is that Trusted Computing will require digital certificates for all executables, documents, emails, and web pages (along with images). He claims that since a repository system of certificates will need to be formed (much like we have SSL certs like Thawte now), the power to deny publishing will be concentrated in the hands of the certificate repositories, which presumably will be large corps and governments. He claims this is the "Good Old Days" of producer/consumer media that the entrenched powers prefer, unlike the supposed new era of peer-to-peer internet publishing, whereby anyone can create their own web pages.

    Actually, having signed certificates on documents and email is not a bad thing. I've wondered for years why the US Postal Service hasn't created a trusted email system for a small postage fee. I use PGP signatures all the time to verify downloads from the Internet (a sketch of the underlying verification step appears after this comment). A certificate/signature repository is just a convenience so I don't have to constantly email or call people asking for their public keys. In all likelihood these repositories will be competitive-but-cooperative databases like DNS, so there will probably always be alternative or bargain signature repositories.

    Yes, things will likely get buckled down as the Internet gets more mainstream and govts get their heads around it, but I don't see the gloomy future he does. Maybe he just had overly idealistic dreams of the future. The bottom line is that most people don't want to publish their own content, and wouldn't even if they knew how. Blocking inbound port 80 to consumers is not the equivalent of book-burning or censorship, especially if port 80 is largely unused by consumers except as a vector for worms. If you want to publish, you'll just have to find a plan that allows you to do so. The fact that the large ISPs are figuring out that they can charge an extra $10-20/month for this is not the end of the world, so long as more than one competing ISP exists.
    Also, no matter how much the Internet falls under control of central authorities, new technologies will arise for the tech elite to go about their business as always. After all, we somehow managed to build the Internet and BBS's in spite of the fact that publishers and the media had total control of print and the airwaves. History will repeat.
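
    The signature-verification step the parent relies on can be done today in a few lines. A sketch using the `cryptography` package and Ed25519 keys (PGP itself uses its own packet format; this only shows the underlying primitive, and all names are illustrative):

    ```python
    # Detached-signature verification sketch: the publisher signs the file,
    # the downloader verifies with a public key obtained from a repository.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    publisher_key = Ed25519PrivateKey.generate()   # held only by the publisher
    public_key = publisher_key.public_key()        # fetched from a key repository

    download = b"contents of the downloaded file"
    signature = publisher_key.sign(download)       # shipped alongside the file

    try:
        public_key.verify(signature, download)     # raises if either was tampered with
        print("signature OK")
    except InvalidSignature:
        print("file or signature has been altered")
    ```
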
  • Comment removed based on user account deletion
  • "a recent software update for Windows Media Player has caused controversy by insisting that users agree to future anti-piracy measures"

    I think it's time I start looking into Linux; the only thing that keeps me with MS is the games.
  • Freedom of speech is a small price to pay for a cheater-free online gaming environment... Seriously, are these few good uses supposed to outweigh the bad?
    • Why can't they? It's the exact same argument people make in favor of Kazaa.

      "Well, I know 95% of the traffic on Kazaa is violating someone's IP, but, hey, 5% of that _is_ legal, so there is substantial non-infringing use! RIAA IS TEH SUCK LONG LIEV KAZAA FREE MUSIC!"

      And, please, this is not about _freedom of speech_. I love it when people argue by exaggeration. No one's taking away your ability to argue. They're, at worst, taking away your fair use "rights" (which, incidentally, are mostly in your head any
  • by gpinzone ( 531794 ) on Thursday March 25, 2004 @12:34PM (#8669090) Homepage Journal
    Among the examples listed are multi-player games, online casinos, P2P networks, anonymous remailers, distributed computing and mobile agents.

    The problem with the typical Slashdot users' attitudes to Trusted Computing is that these obvious benefits get ignored while they harp on all the negatives. That's why articles like this get written. There's good reason to point out the problems with Trusted Computing. For example, a multi-player game success story would be the XBOX Live system. By ensuring the games are signed copies and blacklisting modchipped XBOXes, they've effectively eliminated cheating and helped prevent piracy. The problem is that they also prevent third party development for a machine that customers want apps to be developed for. The Xbox Media Center is an incredible accomplishment that's stymied by the tight control Microsoft has over this particular form of Trusted Computing.

    If our opinions were more balanced, perhaps the inevitability of Trusted Computing would be more favorable to consumers and developers.
    • How do cryptographic checksums and hashes on chips cause 3rd party developers to be locked out of making things like peripherals? How would checking the codes on games against a database stop innovation? I'm confused.
    • Don't forget - you choose what software is on your PC, so you and only you decide what's "trusted" on your computer.

      Not a fan of Windows? Don't install it. Hey-presto! Microsoft are not trusted on your PC.

      If we get rid of this damned hysteria that surrounds this truly useful technology, we'll be able to enjoy its uses sooner. If everyone keeps bitching about how it's going to let the feds climb into your ass, we might never see it.

      • I've seen this sentiment several times so far on this topic... and it makes me chuckle.

        I'm going to make an assumption for a moment, which is not intended as a slight, just something to clarify a guess of mine. The assumption I'm going to make is that you are relatively new (within the last 10 years) to 'heavy use' of computers. I assume this because you seem to take the current ease of 'alternate OS install' for granted. This has not always been the case, and I'm not sure that I see that it logically mus

    • Bullshit. Trusted Computing means that corporations know exactly what software you're running. When they say "we only support IE6 on Windows" it will mean that if you're not running approved software -- all the way down to the hardware, no interpreters or VMWare or any such -- you can't go to their web site or use their services. Free software will go back to being a hobby instead of a commercial contender. And DRM enforcement becomes trivial. Trusted Computing buys me nothing.
    • Virtually all claimed benefits of Trusted Computing, including the ones listed in the linked story, fall to one of two arguments.

      (1) In cases where the system is working for the benefit of the owner of the machine - protecting it against outside attacks - you can do the exact same thing with identical hardware where the owner of the machine is given a printed copy of the key hidden in the trust chip.

      There is NO POSSIBLE WAY that simply knowing your key can reduce your computer's ability to protect you. T
  • by 3Daemon ( 577902 ) on Thursday March 25, 2004 @12:40PM (#8669179)

    Whilst people seem to have a knee-jerk reaction against "Trusted Computing", I think there is one crucial issue that actually determines whether or not it's a Good Idea(tm). And that is: who holds the master keys to my computer?

    Point being that hardware level security features can be a great boon, as long as I decide what to trust and what not to trust.

    Of course, that's pretty much guaranteed not to be what MS wants to push, but still - when discussing "Trusted" architectures in general, I think it's a valid point. It could for instance enable me to say that I trust the FSF's list of trustworthy applications - and viruses and other malware would actually be physically unable to run on my workbox (a sketch of that allow-list idea appears after this comment). How could that be wrong?

    Another issue I've thought about is - how can anyone be so sure it won't be cracked? People seem to be thinking that hardware-enabled "security" (DRM, whatever) will finally give watertight security. Yet, to my knowledge, both PlayStations and Xboxes have tried that trick - to no avail. (In the sense that those wanting to subvert the protection mechanisms seem perfectly able to do so.)

    Oh well, just my thoughts at least. If I have misunderstood anything, feel free to correct me :)
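
    A userland sketch of the allow-list idea mentioned above (run only binaries whose hashes appear on a list the owner chooses to trust). Real trusted-computing designs enforce this below the OS; this only shows the comparison step, and the path and hash value are hypothetical.

    ```python
    # Hash allow-list sketch: refuse to launch anything not on a trusted list.
    import hashlib
    import subprocess
    import sys

    # Hashes of applications the owner has chosen to trust (hypothetical values;
    # in practice this list would come from whichever publisher the owner trusts).
    TRUSTED_SHA256 = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def run_if_trusted(path: str) -> None:
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest not in TRUSTED_SHA256:
            sys.exit(f"{path}: not on the trusted list, refusing to run")
        subprocess.run([path], check=False)

    run_if_trusted("/usr/local/bin/some-tool")  # hypothetical binary path
    ```
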

    • Who holds the master keys to my computer?

      The central design criterion for Trusted Computing is that you are forbidden to know your own keys. Effectively the Trusted Computing Group controls them.

      Of course Trusted Computing is a purely "opt-in" system. You are given a choice - you can "voluntarily" opt-in and turn over total control of your machine to someone else, or you can opt-out and that entire portion of the computer WILL NOT WORK AT ALL. It would then be impossible to run (or even to install) any of
  • Trusting Software (Score:5, Interesting)

    by Sloppy ( 14984 ) * on Thursday March 25, 2004 @12:44PM (#8669238) Homepage Journal
    The analysis provides an interesting contrast to the usual focus on Trusted Computing's impact on control over digital content.
    I don't see much contrast. They all have one thing in common: it's about not trusting the machine's owner, and using someone's computer to serve someone else's interests.

    A lot of these examples are really creepy, and one point keeps coming up: making sure someone on the other side is running "legitimate" versions of software that are known to be unmodified. I just don't think that's a legitimate thing to care about. Specific software fingerprints shouldn't matter; interfaces should. Insisting on specific software instead of standardized interfaces holds back innovation and flexibility. It's almost like the very point of "trusted computing" is to help create and sustain a software monoculture. I think that's disgusting, and I know it's destructive to progress.

  • CPUID (Score:3, Insightful)

    by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Thursday March 25, 2004 @12:47PM (#8669286) Homepage
    Aren't all of those things ones that could/would have been done with the CPUID that Intel tried to put in the P3 that got privacy people so freaked out?

    Why then and not now? It's basically the same thing.

    • Because so many people freaked out, they had to allow it to be turned off, crippling its effectiveness.

      In effect, objections were so numerous that they gave up, went away, and have now come back with a lot more friends to have another go.

  • by innerweb ( 721995 ) on Thursday March 25, 2004 @12:48PM (#8669297)
    ... many products that businesses are not willing to put on the net. It will also enable greater abuses by those who know how. I would not mind having one machine that is enabled, but it would be the only one, and only useful for certain things.

    I would say relax. TC(Trusted Computing) will actually be a great thing for open source. When people start paying full price for all their "warez", they will start to find that the wish list is bigger than the piggy bank. This technology will enable a great many things, and it does not have to be used (AFAIK). It will also be great for OSS development. It helps to know that the correct TC is being used to submit the code. It will make John Q Public feel safer.

    I am no expert on the ramifications of TC, but I do *much* work with companies that want to use the online world, and most of them limit their services due to the issues that TC will solve. Even in OSS, we have to make money. It is how the world goes 'round, puts food on the table. TC will make it easier in some ways to make money. It will also make it easier for the small guy to make money.

    That said, there are serious potential abuses of this technology, and I am still hesitant to boldly go forward. It will probably go forward without me if I do not though, so, all things being as they are, I need to learn how to use it and give it to my clients. They will want it. They have been wanting something like it for a while now.

    InnerWeb

    • A lot of the responses to this article are (rightfully, IMO) concerned that TC will only be used to serve the interests of corporations. So what I want to know is: why couldn't an open source trusted computing platform be created? One that we can be sure can be turned off at will and will sign applications for free and without bias.

      At first glance, TC may be incompatible with the concept of open source because the system is useless if everybody can sign their own code. However, if the signatures are contro
  • by Decameron81 ( 628548 ) on Thursday March 25, 2004 @12:48PM (#8669298)
    Oh, so that's what "trust" is all about? It's all about being able to trust ME?

    So my PC needs to be locked so I don't cheat in multiplayer games, steal from banks online, or modify my programs...? But why would I do that in the first place? Next thing they'll do is take away the knives from my kitchen to make sure I don't kill anyone?

    "Trusted computing" is all about remote hosts trusting YOU. The only way in which this can happen is by making sure YOU won't be able to behave as you want. Those who are pushing this initiative forward are doing so because they want to control what you do, they want to be able to certify what you can do with your PC. While it may be a good thing to try to make online games, online gambling, online banking and others as secure as possible, personal freedom shouldn't be limited in such ways!

    It's all a big paradox, because on one hand you get Microsoft releasing an OS that no-one trusts on a security level, while on the other hand they (and others) want to tell you how to use your computer to make sure you can be trusted?

    I don't know if you feel the same way, but those examples that would make "trusted computing" such an interesting idea make me feel like a cybercriminal of some sort.

    Diego Rey
  • A Shotgun (Score:4, Insightful)

    by headbulb ( 534102 ) on Thursday March 25, 2004 @12:49PM (#8669321)
    A shotgun is a good use for any hardware made with Palladium.

    Anyway, something more serious. They (Palladium) are trying to implement something that should be totally in software, not hardware. It's kinda like throwing hardware at viruses (which is what they are really doing).

    Like almost everything Microsoft does, they are pretty vague about their technologies. For example, can anyone give me a concise answer on .NET that the public would understand? Yep, that's right: it's mostly a marketing word. (I shudder to call it that.)

    Let's go through what Palladium does.
    1. "Critical data is in the user's control"
    Wow, so that's what DRM is all about... I would never have known. Seriously, why are they trying to implement part of DRM in hardware? It's not a portable device, and even then... Let's get back on topic: users are already in control of their files. Is it that the GUI is confusing to users? Well, what's to say that this new DRM GUI won't be either? I think this is more a case of GUI design.

    2. 'Programs and computers can prove their identity to the other computer/program'
    Seems to me that we can do that in software too (see the sketch after this comment). SSH verifies the other computer when you connect. It's called keeping the private key private.

    3. Something about allowing certain users access to certain documents.
    We have this too. It's called permissions, and using PGP to send files.

    Well I am going to stop there.

    I am way past my original post.

    I will not support any manufacturer that supports Palladium; I will go with Apple before that happens.

    All in all, this concept Microsoft is pushing is overkill. If they only wrote secure code (they are doing better than in the past), they wouldn't need to take such drastic measures.
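
    Point 2 above can indeed be done in software alone. A sketch of the SSH-style challenge-response idea, using the `cryptography` package with illustrative names: the server proves it holds a private key the client already trusts, no special hardware involved.

    ```python
    # Challenge-response identity proof: sign a fresh nonce with a private key.
    import os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    server_key = Ed25519PrivateKey.generate()   # never leaves the server
    known_host_key = server_key.public_key()    # the client learned this on first contact

    # Client side: issue a fresh random challenge so old proofs can't be replayed.
    nonce = os.urandom(32)

    # Server side: prove possession of the private key by signing the challenge.
    proof = server_key.sign(nonce)

    # Client side: verify against the key it already trusts (raises on mismatch).
    known_host_key.verify(proof, nonce)
    print("server identity confirmed, no trusted hardware required")
    ```
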
  • by chatooya ( 718043 ) * on Thursday March 25, 2004 @01:02PM (#8669493)
    If trusted computing depends on authentication via hardware, won't this function become less and less useful as computing becomes distributed across more devices and individuals are less tethered to specific machines? Or would we all carry a little TC device that plugs in to various 'toolbox' hardware? Any thoughts?
  • Multi-player Games
    So, putting all these "security" features in the consumer's PC is supposed to stop cheating? Far from it. Instead, it does two things:

    1. Makes cheaters more determined to find a way to cheat. It's a new challenge, nothing more. So, you can't run a software debugger. Well, what about a little home-made hardware plugged into the bus and a second PC (Trusted Computing PC, no less) acting as a remote debugging station with all the horsepower to analyze the data on the bus and send input
  • by Dr. Manhattan ( 29720 ) <sorceror171@nOsPAM.gmail.com> on Thursday March 25, 2004 @01:04PM (#8669525) Homepage
    There are better ways [ogi.edu]. (PDF, sorry.) It's also interesting to see other papers and such that reference [google.com] this paper.
  • This could all be done today - Microsoft would just have to download a patch into your player - but once TC makes it hard for people to tamper with the player software, and easy for Microsoft and the music industry to control what players will work at all with new releases, it will be harder for you to escape.

    I think I finally just understood TC, and I'm not quite as scared as I used to be. All these software lockdowns could happen today, but people would find ways around them. If TC came about, people
    • Try explaining to any person why they can't use their computer to do something they want to. In the end, computers are sold to their owners, not licensed to them, and the owners will not settle for being treated as though they were.

      People are already treated like sheep by corporations like microsoft. When something doesn't work or crashes, the customers blame themselves for not doing it right.

      When the music industry tells them that their DRM 'protected' plastic disk won't play because their in-car CD player doesn't a

  • clods (Score:3, Funny)

    by hellmarch ( 721948 ) on Thursday March 25, 2004 @01:14PM (#8669655)
    you insensitive clods!!! large corporations are only trying to help us!!! now shut up and take your pill, they're watching us
  • Maybe I'm just naturally a suspicious person, but that blog seems like MS astroturf to me; even the words used echo the PR language.

    For example:

    Fair Use is Not a Right
    It challenges claims made by some that DRM is evil because, among other things, it can take away "fair use" rights

    Linus is OK with DRM
    There's a great discussion on Slashdot this morning about Linus Torvalds approving Linux kernel support for DRM.

    A Canadian survey shows that while baby boomers generally agree that unauthorized downloads of Inte
  • by randomwalker ( 758064 ) on Thursday March 25, 2004 @01:18PM (#8669725)
    It is good to see a few more articles that look at possible uses of Trusted Computing as opposed to just stating that MS is evil. I feel Trusted Computing is a natural evolution of PC design. The PC architecture has traditionally been developed (like almost everything else that old) without any security in mind. Without security in hardware on a PC, there are definite limits to how secure a system you can build on top of it.
    I looked at the NGSCB plans in detail. Most of the things that people complain about or fear in NGSCB or Trusted Computing are not justified by the architecture. It is well designed, does not remove any privileges from the owner, does not lower privacy, but does enable new levels of security to be built into a PC-based system. Without initiatives like NGSCB and Trusted Computing, some systems will have to be built on proprietary hardware designs (with security in hardware, and additional cost).
    I would like to see the Linux community use Trusted Computing features also. I fear that if Linux does not act on this opportunity, MS will gain some advantage in the server market by offering more secure services based on Trusted Computing which Linux does not.
    More details on NGSCB and Trusted Computing can be found at http://www.marzenka.com/technology/security/NGSCB.htm [marzenka.com]
  • by Anonymous Coward
    As a record store owner, I have to say I am very pleased at the ideas of trusted computing and "DRM". I don't know a lot about computers, but I do know that following the advent of CD burning and file sharing, my sales have dropped nearly 40%. To make ends meet, I have to moonlight at a phosphorous processing plant; my health has deteriorated rapidly as a result. My wife has been forced to sell soiled panties on eBay, and my son just got his arm lopped off working in a lumber mill. So while this idea of sha
  • by Insount ( 11174 ) * <slashdot2eran&tromer,org> on Thursday March 25, 2004 @01:35PM (#8669963) Homepage
    The fallacy in this article is the assumption that NGSCB is perfectly secure and unbeatable. This isn't the case, and in fact there are reasons [weizmann.ac.il] to believe that at least some of its functions are theoretically impossible.

    NGSCB can be broken; you'll just have to go through a lot of trouble to do so (scrape off chip packaging and decode its internals without triggering intrusion detectors, etc.). This is sufficient to stop casual copyright infringement, or to keep your workers in check. But one ought to ask whether the expense of breaking NGSCB isn't worthwhile for online gambling, elections or other applications where the incentives are very high.
  • by BobGregg ( 89162 ) on Thursday March 25, 2004 @01:49PM (#8670152) Homepage
    Here was my favorite part of the article.

    >>Trusted Computing will totally change the security situation for financial transactions. For the first
    >>time, personal computers will be suitable platforms for financial operations. Compared to the
    >>security provided by TC, today's computers are defenseless against attacks, and it would be foolish
    >>to perform online banking transactions of any significant amount of money.

    Right, so I count three points here:
    1) TC is going to solve all the problems with online commerce.
    2) Today's computers are relatively defenseless.
    3) Doing online banking at present is foolish.

    Okay, I'll bite. First, I'd dispute the first conclusion, just because *no* technology solves problems of trust outright. I actually work for a major financial institution, and I help manage and maintain our online banking system. So I know, without doubt, that security problems in today's world are about 10% technology-related and about 90% people-related. From people doing foolish things with passwords to not checking the status of accounts when relationships turn sour, that's what causes the vast majority of security issues with banking, online or not. So no, TC would *not* revolutionize online financial transactions.

    I'd dispute conclusion #2 too. Maybe TC computers would be more secure - but to say that today's are "relatively defenseless" ignores not only the reality of today's online environment (that eCommerce works pretty darn well), but also ignores point #1 (that most problems aren't tech-related anyway).

    As for number 3 - you can guess what my opinion of that is. Thbbbbpppbpbttt.

    Whatta crock.
  • Don't we already have solutions to all these issues? Isn't it already possible with software? There are already public/private keys for communication, certificate authorities, etc.

    If someone doesn't want to use it, why should they be forced into it through their hardware? Why don't the companies that would like authentication just use the current methods?

    Example: Blizzard wants to check that their code is unmodified? Create a certificate, sign their code, and check the signature.

    Is it just me, or
  • To 'paraphrase' Field Of Dreams [imdb.com]

    In a nutshell, that's what the issue of Trusted Computing all boils down to anyway, right?

    Who controls the power of duplication inherent in a personal computer?

    The owner/user of that computer?

    Or the hardware/software makers at the behest of the media cartels/corporate conglomerates/Federal government?

    Stock up on non-DRM hardware/software now and refuse to buy DRM/Trusted Computing encumbered hardware/software.

    That way, you will be voting against Trusted Computing using
  • Please correct me if I'm wrong on this. I don't think remote attestation can be used for peer-to-peer applications.

    Remote attestation is an inherently client-server "feature" because it requires that the server know the hash of the client. Therefore, the client must exist before the server is built, because the client's hash is a piece of data in the server's code. The client then asks the OS to sign this hash and send it to the server, which can verify it. However, it's not possible for a piece of software t
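
    A toy sketch of the flow being described, heavily simplified (a real TPM quote covers PCR values and an AIK certificate chain rather than a bare key pair); it shows why the verifier must already know which client hash it considers good. All names are illustrative.

    ```python
    # Remote attestation sketch: the client signs a measurement of its own code
    # plus a server-supplied nonce; the server checks against the hash it expects.
    import hashlib
    import os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    tpm_key = Ed25519PrivateKey.generate()       # stands in for a key sealed in the client's chip
    attestation_pubkey = tpm_key.public_key()    # the server trusts this via some certificate chain

    # The server was built knowing the hash of the "official" client binary.
    EXPECTED_CLIENT_HASH = hashlib.sha256(b"official client v1.0").digest()

    # --- client side ---
    nonce = os.urandom(16)                       # challenge supplied by the server
    running_code = b"official client v1.0"       # whatever is actually loaded
    measurement = hashlib.sha256(running_code).digest()
    quote = tpm_key.sign(measurement + nonce)    # "this is the code I am running"

    # --- server side ---
    # Verification only succeeds if the measurement matches the hash the server
    # already expects; a modified client produces a quote that fails this check.
    attestation_pubkey.verify(quote, EXPECTED_CLIENT_HASH + nonce)
    print("client attested as running the expected software")
    ```
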
  • for some uses.

    I say this is bullshit. I won't accept an oppressive system, neither for good nor for bad deeds. I will not give in to smallish benefits that come with a hefty impact on freedom and usability.

    No cheater, no hacker, no worm, no virus, nothing can annoy me so much that I will give up the rights to a computer I fully paid for and own. I know what the real aim of the TCG is, and I won't accept anything from them. No bargains, no rebates or extras on Palladium computers, no benefits from thei
  • by r5t8i6y3 ( 574628 ) on Thursday March 25, 2004 @03:31PM (#8671565)
    i very much appreciate the author's insights. but just as AARG! noticed the EFF report's shortcomings, so his/her analysis is also lacking at least one important perspective. what AARG!'s analysis fails to duly acknowledge is the idea that trusted computing supplies Microsoft (replace "Microsoft" with the existing powerful entity of your choice) with a tool to maintain their power over others.

    if Microsoft can enable *wide-spread* lock-in prior to alternatives sufficiently establishing themselves, alternatives may never appear. and if they do appear they may never become a true alternative due to Microsoft's ability to control the environment in which any alternative exists.

    we live in a society that allows the existence of monopoly corporations with more rights than people [66.102.7.104]. this allows environments to be created where choice is even harder to come by. customer lock-in means not only limiting/eliminating choice, it also means making it too painful to choose freedom.

    Microsoft will continue to attempt to lock-in customers by manipulating the environment so there is less choice. they may or may not succeed to one degree or another. trusted computing gives Microsoft a new tool (in addition to their immense leverage over the computing industry, their political power, their financial resources, and their existing monopoly position) in establishing an environment where choice effectively does not exist.

    in my mind this is a much more glaring omission than the technical misunderstandings of the EFF report. what's obvious is that the EFF is interested in being a watchdog for freedom, whereas AARG! seems to assume freedom will just happen.

    again, trusted computing gives corporations another tool that allows them to consolidate their power, increase their control, and create environments where alternatives exist only in name.

    i choose freedom, and will do all i can to rollback the expansion of corporate rights to pre-1886 levels.

    P.S.
    AARG!, if you read this i'd love to hear your reply (publicly as i don't use the email address attached to this account) to this concern. btw, is there a way to get a message to you?
