Can Poisoning Peer to Peer Networks Work?

andrewchen writes "Can poisoning peer to peer networks really work? Business 2.0 picked up my research paper from Slashdot and wrote an article about it. In my paper, I argue that P2P networks may have an inherent "tipping point" that can be triggered without stopping 100% of the nodes on the network, using a model borrowed from biological systems. For those who think they have a technical solution to the problem, I outlined a few problems with the obvious solutions (moderation, etc.)."
  • by Blowit ( 415131 ) on Tuesday September 03, 2002 @10:08AM (#4188551)
    Have each user vote on each server they download from. If a specific server gives out bad files, users would vote it down as a bad server, and it would no longer be able to connect to the P2P network.

    This would be moderation; however, it would be the smartest way, as each user would have a say in who is and isn't allowed on the network.
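
    A toy sketch of that tallying in Python (the thresholds and names are made up for illustration, not any real client's API):

      from collections import defaultdict

      # good/bad download reports per server, as voted by users
      votes = defaultdict(lambda: {"good": 0, "bad": 0})

      def record_vote(server, good):
          votes[server]["good" if good else "bad"] += 1

      def may_connect(server, min_ratio=0.25):
          """Refuse servers whose vote history is mostly bad."""
          v = votes[server]
          total = v["good"] + v["bad"]
          # unknown servers get the benefit of the doubt
          return total == 0 or v["good"] / total >= min_ratio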
    • Doesn't that require either centralization (which attracts lawyers and introduces a single point of failure) or trust (P2P propagation of votes, which might be spoofable by a small conspiracy)?
    • Unfortunately, that would lead to bias from would-be downloaders of music, as well as to manipulation of ratings by an individual or a group of individuals. Ultimately, this would only serve to flush out targets for would-be P2P 'hunters', i.e. RIAA agents.

      If I see a list of servers, and a rating, I'm instinctively going to select one of the top rated servers. Most people's ratings of such servers would be a function of two distinct factors:

      - Does the server have what I'm looking for?
      - How quickly can I get this file from this server?

      If both factors are very favorable to me, I'm going to give this server a good rating. If I can't connect, or the server doesn't have what I'm looking for, I'm going to give the server a poor rating.

      If a server wants to become highly rated in this type of a system, the operators must provide

      - Lots of bandwidth
      - Lots of files

      Not many people can afford to do both. As a result, a 'cartel' of sorts would form, where the top few servers serve a majority of the users, and the rest of the servers, of which there may be twenty times as many or more, serve the minority.

      If the 'hunter' wants to kill this group, what does he do? He wouldn't want to poison each one systematically -- he'd want to go after the big targets that everyone feeds from. This rating system would only help him expedite this process.
    • But if I were the RIAA, my legions of henchmen would be voting down the servers that supply "stolen" music, and voting up the servers that supply poison. And they would meta-mod down anyone who disagrees with their votes.

      So to be useful, votes would require authentication in order to avoid ballot box stuffing. But authentication goes hand in glove with identification, and that's something the users of the P2P networks seem to be trying to avoid.

      Bottom line: voting is subject to the same poisoning that the files are subject to. It adds a layer of complexity that simply delays poisoning, but probably not for long. Hell, with the inevitable bugs (that end up denying users unpoisoned files) and long-term ineffectiveness, voting would probably be smiled upon by the RIAA.

      • However, if the voting is ONLY allowed after a download, then this poisoning can be significantly reduced...
      • If I were the goddamn RIAA or the MPAA (Jack "Maddog ... Grrrr!!" Valenti, I mean) I'd focus a little bit on image enhancement.

        If I were the RIAA, I'd tell my employees to stop acting like a bunch of two-bit hackers and start giving the customers what they want.

        Really, this whole thing -- from poisoning P2P networks to authorizing legal hacks on 14-year-old users -- is absurd.

        Hilary and Jack "Maddog ... Grrrr!!!" Valenti oughta take their fingers from the sockets and start talking with users and figuring out how they can get users what they want and the users can give the RIAA and MPAA what they want.

        It's a long process, but I'll tell you one thing: the more the RIAA and MPAA keep employing the shock-trooper tactics, the less goodwill and grace (if such goodwill and grace ever existed, but I think it did -- at least in part) they're gonna get from Joe and Joe-elle Consumer.

        • Not gonna happen...

          The RIAA and MPAA want money. Lots of money. The kind of money they're used to. The P2P sharers want music. Lots of music. For free, just like they're used to.

          Everybody keeps ranting "why don't they find a business model that works?" Here's your answer: There isn't one; there won't be one; there can never be one. First, it's an argument of corporations vs. the marketplace. Can you speak for every P2P user? Can anyone even claim to? Of course not, no one can. So it's already a one-sided discussion. The industries have no incentive to "talk" to the marketplace, since their only feedback comes in the form of "no revenue, no sales" in any case.

          Jack and Hilary aren't stupid -- they've already figured that much out, so I think they've come up with a simple plan. They've decided to squeeze every last nickel from every last legitimate consumer until the whole production system implodes from lack of revenue. Their business plan is to get to be so rich now that they won't care when it implodes.

          Under this plan, Jack and Hilary have no need to talk to anybody except to placate their respective industries. "Studios, crank out those movies. Recording companies, press those discs. We're taking good care of the whole Internet for you. We promise we'll have this piracy thing licked about the same time we reach $1,000,000,000 net worth (each.) So keep your stock prices up, please."

          • Look at DVDs... provide so much material that it's more work to pirate than to buy. Why does a DVD cost the SAME as a CD? Last time I checked, a movie was SIGNIFICANTLY more expensive to produce than an ALBUM, and yet DVDs sell for the same or LESS, and quite often contain the BLOODY soundtrack as well. If a CD included multimedia stuff, editing-room-floor tracks, useless bio info and oodles of extra crap at a reasonable price, it would be more trouble to rip it than to buy it. When the RIAA wakes up and realizes that, maybe, just maybe, things will turn around; otherwise, one way or another, the industry is dead. The MPAA is actually beginning to come around, slowly and not without a FIGHT, but they are evolving. I don't hold out the same hope for the record industry.
    • This would be changing constantly. First of all, joining a P2P network is pretty easy, assuming it is open to the public. And as I am out searching for Eminem (they throw out a lot of poison), I download a poisoned file, and now I am a) blocked from the network or b) passing out poison myself.

      A blocking system can't work fast enough.
    • Have each user vote on each server they download from. If a specific server gives out bad files, users would vote it down as a bad server, and it would no longer be able to connect to the P2P network.

      A voting system can be abused by creating a large group of malicious users giving each other positive feedback. Andrew already mentioned this on his webpage. Routing on a P2P network may not be direct, so you may not be able to give a site bad feedback anyhow.
    • But your solution is going to involve too much interaction; it's just moderation, and it lets kids control the network.

      What I want to have in the future of P2P is system level protocols which require no user interaction.

    • Unfortunately, I doubt there is one easy way of keeping P2P unpoisoned. It's one of those thorny issues that appear simple but really turn out to be big bastards, like cryptography.

      I was reminded of one of the AI Koans [everything2.com]

      One day a student came to Moon and said: "I understand how to make a better garbage collector. We must keep a reference count of the pointers to each cons."


      Moon patiently told the student the following story:
      "One day a student came to Moon and said: `I understand how to make a better garbage collector...

      [Ed. note: Pure reference-count garbage collectors have problems with circular structures that point to themselves.]
  • by Anonymous Coward on Tuesday September 03, 2002 @10:09AM (#4188555)
    Many users, when they download a "poisoned" file, get a little angry... and then they move on WITHOUT deleting the file! This leaves it in the system on yet another node and increases the chances that someone else will download it from them. If users take a little more responsibility for the network, these files wouldn't spread very well at all.
  • by curseur ( 567725 ) on Tuesday September 03, 2002 @10:12AM (#4188566)
    Because most users download files and never check them.
    Really annoying, especially with large files you've downloaded at 1 kbps.
    • I could never understand the LONG lists of available files which are not usable.

      In addition, anyone using ATTBI should be forewarned that you should remove ANY and ALL movies from your shared folder on any P2P network. The MPAA is reporting violations to ATTBI's legal demands center and ATTBI *is* disabling users who violate rules.

      I suggest the removal of all shared movies if you are on ANY ISP, but especially large cable modem networks.
      • I could never understand the LONG lists of available files which are not usable.

        You've never run a BBS right? :^) The number of junk files uploaded even when they didn't need it for download ratios was amazing. Or uploading renamed copies of software already uploaded (with fscking BBS ads inserted into the zip to make size checks impossible.)

      • "I could never understand the LONG lists of
        available files which are not usable. "

        Are you talking about the people who post their playlists on a website, which is what you find when searching for a song title, but has the files themselves elsewhere?

  • by Kragg ( 300602 ) on Tuesday September 03, 2002 @10:14AM (#4188587) Journal
    Although this idea [checksums] works for newsgroups and some other centralized services, it does not with P2P. Basically, it comes down to the fact that you must trust whomever is actually doing the checksumming, or else they can just lie and publish false checksums. In the case of P2P networks, the checksumming is done by the same person you want to figure out if you can trust! As far as I know, this is an unresolvable problem.


    So, um... how about this... If it's a standard file, such as, say, the Deviance rip of Neverwinter Nights, or the new MPEG of Two Towers, then it should always have the same checksum.

    Somebody somewhere needs to maintain a website with these checksums on it. Then there's no dependence on the person you're pulling the file from.

    Obviously doesn't work for random porn videos (although it would for more popular ones... which might also tell you whether they're any good).

    And there's nothing illegal about it.

    Problems?
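
    As a sketch of the client side of that idea, in Python (the site contents and checksum value are placeholders, not a real directory):

      import hashlib

      def file_sha1(path, chunk_size=64 * 1024):
          """Hash the file in chunks so large rips needn't fit in memory."""
          h = hashlib.sha1()
          with open(path, "rb") as f:
              while chunk := f.read(chunk_size):
                  h.update(chunk)
          return h.hexdigest()

      # checksums as they might be published on the hypothetical website
      KNOWN_GOOD = {
          "standard_rip.avi": "0000000000000000000000000000000000000000",  # placeholder value
      }

      def looks_genuine(path, name):
          expected = KNOWN_GOOD.get(name)
          return expected is not None and file_sha1(path) == expected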
    • Yes, but by the time you've downloaded it to check the checksum you've wasted n hours downloading trash.
    • eDonkey 2000 / http://www.sharereactor.com do this. The eDonkey network works by using links (as in clickable on web pages) that contain MD4 sums of the file + file size to let users know about files on the network. It does have some searching capabilities, but they are limited. This is presumably fixed in the new Overnet project the guy is doing.

      The files are all downloaded in segments from multiple sources, and you sometimes get bad segments, but they are only a fraction of the total file size, so you don't really care.

      You just plain can't poison eDonkey / Overnet - it won't work. It is also the only network that I would be tempted to use to distribute real content, since it is guaranteed that the user will get what you want them to get.
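
      Roughly how a client pulls those links apart, as a Python sketch (the format is ed2k://|file|name|size|md4|/; error handling omitted, example values invented):

        def parse_ed2k(link):
            # ed2k://|file|<filename>|<size in bytes>|<md4 hex>|/
            parts = link.split("|")
            assert parts[0] == "ed2k://" and parts[1] == "file"
            return {"name": parts[2], "size": int(parts[3]), "md4": parts[4].lower()}

        info = parse_ed2k("ed2k://|file|example.bin|559778352|1b153e31f5fdbe829488989d04dda2b1|/")
        # a client can now refuse any source whose advertised size or rehashed
        # segments disagree with the link, before wasting bandwidth on it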
      • I just started using it last week -- I think I remember something whereby each file has some type of key / checksum (I'm not too familiar with the nuances of encryption)... but I could be wrong.
      • As mentioned above, sites like ShareReactor present a single point of failure. The RIAA could (arguably) close down SR, which would be a tremendous loss for the ED2k P2P community. Of course, other sites or other ways of checksum distribution would spring up to try and fill that void - while not unpoisonable, I'd agree that the eDonkey network is very well defended against it.
    • Not at all. I am in fact considering coding something like that. I'm envisioning a separate p2p network where md5 checksums along with moderated content is kept as synchronized as possible. Users can submit new files/checksums, but those should be peer reviewed in some yet-undecided manner. It should be possible to blacklist md5s (VERY efficient in stopping virus propagation, bad mp3s etc).

      Then, the different clients can interface to the content p2p network, so that users that are considering downloading a file can have a better guess at the authenticity and quality of the file - given that they build in support for passing hashes along with the general search results.

      I would actually like to see a system where the content database is so well maintained that all systems can use it as a central QA tool, enhancing the file sharing experience.

      And folks - just because the technology can be abused, does not mean that it is inherently evil. I just would like to have the quality raised in some way.

      The downside is that it will probably become a way to censor information to some extent. We just need to minimize the risks, and maximize the benefits.
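
      A minimal sketch of the blacklist lookup (Python; the hash shown is just the MD5 of an empty file, standing in for a flagged decoy):

        BLACKLISTED_MD5S = {
            "d41d8cd98f00b204e9800998ecf8427e",  # MD5 of an empty file, as a stand-in entry
        }

        def filter_results(results):
            """results: iterable of (filename, md5_hex) search hits."""
            return [(name, md5) for name, md5 in results
                    if md5.lower() not in BLACKLISTED_MD5S]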
  • by Pedrito ( 94783 ) on Tuesday September 03, 2002 @10:15AM (#4188591)
    I disagree with your suggestion that checksums can't work. A way they could work is as follows.

    Create a website with logins for the users. Users of this website can create lists of checksums for the files they create or have downloaded and verified as valid.

    Other users can check any given user's list, and perhaps even post comments about the user's list, a form of moderation, if you will.

    The validity of any single file on any random user's list would certainly be questionable, but some lists would become "trusted" by the community through trial and error. Others would be recognized as bogus and ignored.

    Just a thought. Give me more than a few minutes and I might be able to come up with a better one.
    • This is exactly what is addressed in the second part of his answer to this question in the FAQ:
      Another idea that is often proposed is moderation, specifically "webs of trust." That is, people keep lists of people they trust, and then they implicitly trust (often with diminishing degree) the people they trust, and so on. In the context of P2P, each user would then receive a "trust rating," reflecting the number of people that trust them. However, this can also be defeated fairly easily, by creating groups of malicious users that trust each other - then, untrustworthy users may have high scores, leading to problems in the future. This kind of fraud has happened on eBay, where people give themselves recommendations to mislead future partners.

      • This kind of fraud has happened on eBay, where people give themselves recommendations to mislead future partners

        Yeah, and it's run e-bay into the ground. Oh wait, no it hasn't.
        • EBay has benefitted significantly from PayPal in this regard, where even if you get screwed (which seems to happen regularly) you can recover your investment. How are you going to recover lost bandwidth once you've already downloaded something?
          • I've purchased dozens of items from E-Bay over a period of years. I've never once been screwed, though I have spotted some obvious fraud in the past.

            As for recovering lost bandwidth, you can't, but you can use checksumming along with moderation to improve reliability.
  • Always a way (Score:5, Insightful)

    by Lumpy ( 12016 ) on Tuesday September 03, 2002 @10:21AM (#4188625) Homepage
    Most of us who have been on P2P looking for files are used to the fact that a large number of users are misconfigured (their firewall blocks your incoming request but happily tells you they have the file you want) or are trading crap-quality files. At that point you resort to brute force, using a bot to just grab everything it can to a large holding drive... a 40-gig IDE drive is dirt cheap and can easily hold the results of running a bot searching for "radiohead mp3" and grabbing EVERYTHING it finds over the course of about 3 days. But then you have to manually go in and delete all the crud, cruft and garbage. It's still faster than the old days of IRC trading, but the signal-to-noise ratio has always been really bad.

    Granted, poisoning can start to drive away the gimmie-gimmie crowd or the newbies... but the hardcore and old-timers will stay and simply find a way around it. Hell, a group of about 100 of us now have our own private OpenNap network going, and we have only high-quality, known-good files. Any clients connecting that aren't sharing, or are sharing crap, are instantly banned/blackballed... so we do the moderation thing... with a side requirement that you must be asked to join and prove your worthiness to us. Maybe that is the direction P2P will go... back to the roots of IRC, where you had to prove your worthiness, ratios were enforced, and real people made decisions to keep out the troublemakers... (RIAA) Granted, you don't get 30 bajillion users that way, but then you don't have to spend a night and 10 gigs trying to find that song or file you want.
    • Re:Always a way (Score:3, Insightful)

      by warpSpeed ( 67927 )
      Hell, a group of about 100 of us now have our own private OpenNap network going, and we have only high-quality, known-good files.

      You hit upon a good theme here. To counteract the problems - the signal-to-noise ratio, poisoning, etc. - users will have to PUT MORE EFFORT into downloading warez and MP3s. The P2P networks will thrive, but you will not have as much of the global swap fests and free warez that you can get now. The most the people poisoning the P2P world can hope for is to increase the level of effort required to use P2P effectively. And along the way they will create some stronger social ties between the users. Ultimately they will end up strengthening the whole P2P movement...

    • Re:Always a way (Score:2, Insightful)

      by wa1rus ( 605203 )
      Granted, poisoning can start to drive away the gimmie-gimmie crowd

      To be fair though, that's pretty much the point, isn't it?
    • If this is what people are forced to do to achieve Napster-like results, then the RIAA et al. have basically won all that they set out to achieve. By raising the bar high enough and forcing higher transaction costs on the users, the industry effectively shuts internet piracy out for 99.9% of the population. Of course people like me, that 1% or whatever it is, will always be able to circumvent whatever they throw in my path (presuming that I'm willing and wanting to do so, of course). However, that number is so small that they really would not bother spending much effort on enforcement, from a simple cost/benefit point of view. Why spend millions in legal and related fees to track down a group of consumers that only account for half that amount? They won't bother, just as they didn't really before Napster came along.

      In fact, I would further argue, against the conventional wisdom on Slashdot, that the RIAA has basically won the war against P2P and other forms of mass piracy. At least once they shut out networks such as FastTrack, and let it be known that there will be no financial return for those that fund the development of piracy networks. Certainly the average schmoe can download that super-popular song via Gnutella with some effort, but getting much more than that - like, say, the entire album at decent quality from the same artist - is like trying to extract blood from a rock. That is not to say that they will retire their guns, but rather that it will just be an ongoing series of small battles, more like maintenance, to hammer down any network, system, or device that pops up and starts to hemorrhage their intellectual property.
      but getting much more than that - like, say, the entire album at decent quality from the same artist - is like trying to extract blood from a rock.

        I (sadly) only started using Napster about a year before it got shut down, but I never found it a particularly good source for downloading an entire album, especially one in the same bitrate and overall quality. I thought that was nearly impossible.

        I'd say overall that only about 75% of the stuff was worth keeping (eg, 128kbps+, no skips/cutoffs/distortion) and I searched for mostly mainstream stuff (rock n roll). I got a fair amount of cutoff tunes, tunes with skips in the middle or just bad overall audio quality.

        I'd agree, though, that the RIAA has effectively killed off P2P, except for people that make a serious effort at maintaining their own networks or put real resources toward mining Gnutella-type networks.
        • I (sadly) only started using Napster about a year before it got shut down, but I never found it a particularly good source for downloading an entire album, especially one in the same bitrate and overall quality. I thought that was nearly impossible.

          I'd say overall that only about 75% of the stuff was worth keeping (eg, 128kbps+, no skips/cutoffs/distortion) and I searched for mostly mainstream stuff (rock n roll). I got a fair amount of cutoff tunes, tunes with skips in the middle or just bad overall audio quality.
          While I agree that Napster was hardly ideal at this, it was VASTLY better than the current alternatives, and it was actually quite workable if you knew how to take advantage of it. Namely, you find all the users that have a good, organized collection of the kinds of files you're interested in on a decent network connection, add them to your hotlist, browse their lists directly, and download exclusively from them. I discovered these users in the first place by improving my search method: searching for directories (folders) rather than files, and searching for higher-bitrate MP3s (since high quality tends to imply a more caring user). When you sort by path and/or username it becomes quite evident when someone has a large collection of good music. Of course, this kind of technique was out of the technical reach of most of Napster's users at the time... but it was effective.

          These same techniques are crippled on today's "P2P" networks because you have (in reality, not in their claims) a much, much smaller set of users to search from, horrible latency, and the volatility of the network makes finding a user 5 minutes later, never mind a couple weeks later, quite unlikely... plus the bad searching and listing interfaces... ick.
    • Hell, a group of about 100 of us now have our own private OpenNap network going, and we have only high-quality, known-good files.

      And that's what the *AA want. As long as the networks split and isolate, they can monitor them and pick them off as they become big enough. Also, since being a member of a closed "pirating ring" is as good as an admission of conspiracy, they can start to use RICO laws, too. Yummy...

      In reality, the only safety in P2P for illegal sharing was its ubiquity. Once that's gone, you become an easy target. It's a lot easier to control five people than a mob of thousands.

  • Why not block all IPs in RIAA/MPAA IP ranges, and any ranges that are putting crap onto the network?
    • What if they take a few AOL accounts to do the poisoning? Mind you, these have dynamic IP addresses. Therefore you have to block all of AOL, which is A-OK by the RIAA, I suppose...

      Or you could not live in the US and have no problem

      • Therefore you have to block all of AOL, which is A-OK by the RIAA I suppose...

        That would be nice to see, the RIAA sat on by AOL... cos ultimately that would be a breach of AOL's terms of usage.
  • by decarelbitter ( 559973 ) on Tuesday September 03, 2002 @10:21AM (#4188628)
    From the webpage:
    In particular, our analysis of the model leads to four potential strategies, which can be used in conjunction:
    1. Randomly selecting and litigating against users engaging in piracy
    This seems to be the option which involves the least technological action. However, random selection wouldn't work, if only because P2P users don't all live in the same country, hence different laws apply. So some sort of not-so-random selection process has to be implemented.

    2. Creating fake users that carry incorrectly named or damaged files
    Modern P2P programs support downloading files from multiple sources. If someone downloads such a fake file and discovers it, the file will almost always be deleted. So these files will not propagate through the network, or at least not as fast and as far as the correct files. So a search where one file can be downloaded from many sources is in this case preferable to one where few nodes serve the same file.

    3. Broadcasting fake queries in order to degrade network performance
    Now this is an interesting thing. The makers of the P2P programs who are being targeted by fake queries could ban such users, or could build in a feature where the user of a P2P program can ban a host themselves, so that it will be excluded from further searches.

    4. Selectively targeting litigation against the small percentage of users that carry the majority of the files
    Some users carry gigs and gigs of files, but that doesn't mean they're very popular. If I set up a server hosting my 20-CD collection of Mozart works, I probably won't get as much traffic as when I publish the Billboard 100. It's not the quantity but the content of the files served that counts. Search for Britney and you'll receive thousands of hits. Search for Planisphere and a lot fewer results will show up.

    Nevertheless it's a good paper.
      This seems to be the option which involves the least technological action. However, random selection wouldn't work, if only because P2P users don't all live in the same country, hence different laws apply. So some sort of not-so-random selection process has to be implemented.
      I disagree. I think this would be a highly effective means, should it become necessary. Once you eliminate US-based servers, you've already removed some 90% of the acceptable providers for US citizens. When you further remove those highly developed countries that have close ties with the United States, which are apt to go along with the RIAA when force is brought to bear, then you will reduce the remaining pool of servers to 1% or so of what it was. That 1% cannot sustain the demand, though, because the demand will be so high that it will effectively block all practical use. Now, mind you, this cooperation need not require super-active law enforcement or anything to that effect. In fact, I would argue that the relatively simple compulsion of a prompt response from the servers' ISPs - say, a 90-day suspension of service - would be more than enough to deter the file servers, given that there is no benefit to being a file server and every reason not to be.

      Modern P2P programs support downloading files from multiple sources. If someone downloads such a fake file and discovers it, the file will almost always be deleted. So these files will not propagate through the network, or at least not as fast and as far as the correct files. So a search where one file can be downloaded from many sources is in this case preferable to one where few nodes serve the same file.
      Again, I disagree. It has been my experience that many users do not delete damaged files; they simply leave them. The so-called swarmed downloads only further expose the downloads to corruption, since all it really takes is one corrupt segment to either cause the program to crash or at least play really unbearable sound (or whatever the media is). To further compound the problem, the industry could use their cash and their legitimacy to be the most available and desirable servers (so that your swarmed downloads are almost certain to select their servers).

      Now this is an interesting thing. The makers of the P2P programs who are being targeted by fake queries could ban such users, or could build in a feature where the user of a P2P program can ban a host themselves, so that it will be excluded from further searches.
      This is impossible in any current decentralized P2P scheme, don't you get it? How is any routing servent to know that the other servent it is connected to is not passing legitimate requests from the hosts it purports to represent? It can't. It might attempt to throttle the traffic from any given node, but then that would necessarily mean throttling the ENTIRE network, which would be self-defeating.

      Some users carry gigs and gigs of files, but that doesn't mean they're very popular. If I set up a server hosting my 20-CD collection of Mozart works, I probably won't get as much traffic as when I publish the Billboard 100. It's not the quantity but the content of the files served that counts. Search for Britney and you'll receive thousands of hits. Search for Planisphere and a lot fewer results will show up.
      While it is almost certainly true that only 1% of the content accounts for 99% of the traffic, it is also true that only 10% of the hosts account for almost all of the serving. Of those 10%, roughly half - those that HAVE the popular files, are SHARING, are on a truly HIGH-speed network, and are NOT FIREWALLED - account for the majority of it. If you take the biggest servers out first, you will have a big impact. What's more, once it becomes established that there are likely consequences for being an effective server of files, the industry need not literally attack every last one of them. They need only use fear to their advantage and allow the servers' own self-interest to take over.
  • by FreeUser ( 11483 ) on Tuesday September 03, 2002 @10:24AM (#4188643)
    The answer is quite simple, and would be very difficult for the saboteurs to subvert.

    GPG signatures (which BTW include a checksum) of content, with said signatures referring to an online alias rather than a real person (thereby maintaining anonymity).

    A web of trust is formed, in which HollywoodDude is known and trusted, and has signed RipperGod's key, who in turn has signed FairUser's key, and so forth.

    Provide a separate way of obtaining the keys (e.g. multiple independent websites, multiple independent keyservers, and so forth), and people can simply filter out anything submitted by untrusted users. If something is submitted by someone outside of the trust ring, and someone who is trusted sees the item and determines that it is worthwhile/good/whatever and not a decoy, they could sign the item themselves.

    Gaining trust would of course take time, probably requiring many worthwhile submissions, but that is true in real life anyway, so why should it be any different online?

    If someone violates their trusted status (or their private key is stolen, which BTW would be a violation of the law), others in the ring of trust could revoke their trusted access and blacklist their signature.

    It isn't as convenient as just being able to share something with little or no thought, but it is eminently doable, and there really is no straightforward way to undermine such an approach.
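
    To make the bookkeeping concrete, a toy sketch of the ring-of-trust check in Python (the aliases are the ones used above; the edges and hop limit are invented for illustration):

      from collections import deque

      # each alias lists the aliases whose keys it has signed
      TRUST_EDGES = {
          "HollywoodDude": ["RipperGod"],
          "RipperGod": ["FairUser"],
      }

      def trust_hops(me, signer, max_hops=3):
          """Hops from `me` to `signer` along signed keys, or None if unreachable."""
          frontier, seen = deque([(me, 0)]), {me}
          while frontier:
              alias, hops = frontier.popleft()
              if alias == signer:
                  return hops
              if hops < max_hops:
                  for nxt in TRUST_EDGES.get(alias, []):
                      if nxt not in seen:
                          seen.add(nxt)
                          frontier.append((nxt, hops + 1))
          return None

      # trust_hops("HollywoodDude", "FairUser") == 2 -> accept, with trust
      # diminished by distance; an unknown alias returns None and stays filtered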
    • It isn't as convenient as just being able to share something with little or no thought

      That's exactly what the paper's authors said, pointing out that the decrease in convenience is in itself a real danger, and they were right.

    • You don't need to know who authored the file. My suggestion from long ago is this: maintain a service separate from Gnutella that rates content. Refer to that content by name, but include MD5 signatures. The first signature is for the first 1k, the second signature is for the first 10k and so on, through the logarithmic orders of magnitude, base 10. One final signature would be for the whole file.

      Now web sites can present reviews that tie into this new protocol with a URL (something like "gsig://sigs.mediahype.net/ab3827d9827eab39f2c-1") and that URL is then submitted to a signature-aware gnutella client which contacts the signature server, downloads the filename and signatures and then gets that file from Gnutella. The file download will be aborted if the signature fails to match at any of the signatures, or it will be aborted immediately if the file size is smaller or larger than the one in the signature server.

      Sure, you can still put out a 10-second clip with empty noise after, but the download will stop at that 10-second mark. What's more: a smart client can keep the section that DID match the signatures and look for an intact copy to CONTINUE from. Thus, truncated versions will now be ignored immediately.

      This introduces a centralized client-server model for trust purposes, but reviewers are not providing content, just reviewing it. The MPAA and RIAA could even put up servers that review valid promotional content, and warn users of copyright violations in other files! *This* is the way to solve everyone's problems at once (unless of course your problem happens to be a failing business model).
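
      A rough cut of computing those signatures in Python, using MD5 as suggested (the 1k/10k/100k marks follow the description above; everything else is invented detail):

        import hashlib

        def prefix_signatures(path, chunk_size=4096):
            """Return [(prefix_len, md5_hex), ...] for the 1 KB, 10 KB, ...
            prefixes, ending with one signature for the whole file."""
            h = hashlib.md5()
            sigs, read, mark = [], 0, 1024       # marks: 1 KB, 10 KB, 100 KB, ...
            with open(path, "rb") as f:
                while chunk := f.read(chunk_size):
                    pos = 0
                    while mark <= read + len(chunk) - pos:
                        take = mark - read       # bytes needed to reach the mark
                        h.update(chunk[pos:pos + take])
                        pos += take
                        read = mark
                        sigs.append((mark, h.copy().hexdigest()))
                        mark *= 10
                    h.update(chunk[pos:])
                    read += len(chunk) - pos
            sigs.append((read, h.hexdigest()))   # final signature: whole file
            return sigs

      Because the running MD5 state is snapshotted at each mark, a client can abort at the first prefix that disagrees, exactly as described.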
      • Bitzi [bitzi.com] stores information on files found on P2P networks, indexed by a TigerTree hash appended to a SHA1 hash. Support for it has been integrated into several Gnutella clients (ShareAza, Limewire, etc.), which have also come up with their own URL systems (gnutella:// and magnet:// are the two existing ones right now).
    • Great.

      So when the MPAA downloads Star Wars Attack of The Clones they know that I'm the one who ripped it!

      I'm not going to put my GPG (PGP) signature on a document with plans to hijack planes either.

      • Great.

        So when the MPAA downloads Star Wars Attack of The Clones they know that I'm the one who ripped it!


        Go back and read my comment. The comment, not the title. To wit:


        GPG signatures (which BTW include a checksum) of content, with said signatures referring to an online alias rather than a real person (thereby maintaining anonymity).


        There is absolutely nothing about GPG that requires the key to refer to an actual, human identity. If everyone knows that TrustedDude is a trustworthy person, that is sufficient. No one needs to know that TrustedDude is in fact a 15 year old kid in New Jersey who spends his free time violating copyright (or perhaps not, there are all kinds of legitimate uses for P2P networks, not least among them improved accessibility to popular legal content, like free software whose primary ftp servers are often overloaded).
    • No, I'm afraid that system would just not be able to take off. Far too much initial setup, and a lot of continual maintenance.

      What is needed is source obfuscation. Instead of connecting directly to a node with the files, we lay the network out like a system of routers. Each node can only communicate with its neighbor, which means you can only know the next hop, never the source. Without a doubt this drastically decreases download speeds (each node downloads and uploads the file before it gets to you), but with swarming, and dynamic metrics for each node, it could work out. Of course, caching would be the obvious next step.

      As for overloading the network with crap, go right ahead. I'm more than happy to waste RIAA/MPAA bandwidth time and time again. But that's just me.

      For a solution, let's be democratic about it. When we search for a file, we find that 10 people have it. If 8 have marked it as being a fake, I probably won't download it.

      So that takes care of everything except 3. Broadcasting fake queries in order to degrade network performance. The downside of p2p networks is that they have a lot of overhead. Privacy demands overhead. However, if one is sending excessive requests, they may be blocked temporarily. Of course the lower the TTL, the higher the tolerance for a large number of searches.

      Any questions?
    • Do GPG signatures on blocks (about 50-100k) of files instead of entire files. When you have a contradiction of checksums on blocks of files, alert the user that someone is a liar. Take all the results of the search for that file, and all the GPG signatures, and present the user with two options that are the sum of their trust levels. Most files can be previewed to check whether they're bogus, and the user can blacklist anyone that even trusted that host, and their IPs as well. From then on, none of those IPs will be allowed to connect to this host. Eventually, they'll exhaust their IP supply before they end piracy.

      Obviously the user would get to select the appropriate action if one of the files are just better than the other with a rating mechanism as well :) (A per file rating instead of a per host rating)

      Other advantages to this method are:
      *Checksums can't be faked without an infeasible amount of computation. (Use a random block size to thwart a supercomputer precalculating bad blocks that MD5 to the right hash... use multiple hashes.)
      *Multiple-host download is guaranteed to give the same file (even when being poisoned).
      *A computer need not have the entire file to share a block of the file, therefore files propagate through the network in a more exponential manner. (Host A gets block 1 from B. Host C gets block 2 from B. Hosts C and A trade blocks 1 and 2. Host D comes along and wants the same file, and can download from A and C instead of bogging down B. Works even better because all connections that I've seen are duplex, even if they have a slower upstream. Conserve network bandwidth by referring downloaders to other people who have downloaded before... search for the GPG signature of the hosts on the network.)

      Overall, I see this kind of thing being implemented very soon because it's not that difficult, and it's pretty obvious. Maybe the next edition of Gnutella will support this.

      Of course there are loopholes where the RIAA/MPAA could buy half a million IP addresses or have a lot of computers on the network, but you don't have to have an unbreakable system, just a system that costs more to break than they think they will see in profits from breaking it.
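
      A small sketch of the liar-detection step in Python (host names and hashes are invented for illustration):

        from collections import defaultdict

        def find_contradictions(offers):
            """offers: {host: [block_hash, ...]} for the same advertised file."""
            by_block = defaultdict(set)
            for host, hashes in offers.items():
                for i, h in enumerate(hashes):
                    by_block[i].add(h)
            return [i for i, seen in by_block.items() if len(seen) > 1]

        disputed = find_contradictions({
            "hostA": ["aa11", "bb22", "cc33"],
            "hostB": ["aa11", "bb22", "cc33"],
            "hostC": ["aa11", "ZZZZ", "cc33"],   # hostC disagrees on block 1
        })
        # disputed == [1]; the client can then present both candidates with
        # their summed trust levels, as suggested above, instead of guessing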
  • faked hashes (Score:3, Interesting)

    by vurtigo ( 605110 ) on Tuesday September 03, 2002 @10:24AM (#4188647) Homepage

    The problem of faked hashes can be addressed using trees of checksums rather than just a simple checksum, although a workable implementation would require embedding into the P2P protocol.

    The idea is you break the file up into smallish sized blocks (100k or so) and generate a hash for each one of these. For each 8 first level hashes, you feed them into a crypto hash function to generate a second level hash. For each 8 second level hashes... you generate a third level hash. This allows a continuous (per 100k blocks) proof that the content is valid... The size of the proof grows with the log of the content so it is not much of a problem.
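
    In Python, the tree itself is only a few lines (a toy sketch with the 100k blocks and fan-out of 8 described above; real specs like THEX, mentioned in the reply below, pin down padding and leaf/internal hashing more carefully):

      import hashlib

      BLOCK = 100 * 1024   # "smallish" blocks, per the description above
      FANOUT = 8           # 8 child hashes combine into one parent hash

      def hash_tree(path):
          """levels[0] holds one hash per block; levels[-1][0] is the root."""
          level = []
          with open(path, "rb") as f:
              while block := f.read(BLOCK):
                  level.append(hashlib.sha1(block).digest())
          levels = [level]
          while len(level) > 1:
              level = [hashlib.sha1(b"".join(level[i:i + FANOUT])).digest()
                       for i in range(0, len(level), FANOUT)]
              levels.append(level)
          return levels

    Publishing just the root hash commits to every block, and a downloader can verify each 100k block as it arrives against the per-block level.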

    • The crew at the Open Content Network [open-content.net] have released a specification for serializing hash trees. The specification is called the Tree Hash EXchange (THEX) [open-content.net] and is being implemented in both the Open Content Network and Gnutella. Furthermore, this specification is compatible with the TigerTree hashes used for Bitzi [bitzi.com].
  • by Anonymous Coward on Tuesday September 03, 2002 @10:25AM (#4188652)
    The RIAA/MPAA don't need to poison P2P networks. Nor do they need to use lawsuits and the threat of DMCA. The easiest, best way to stop illegal sharing of copyrighted materials is to provide a legal, reasonably priced electronic distribution alternative.

    Really. Most users, given the choice, will pick the "honest" legal way to get their music and videos. Will there still be pirates? Of course, but you can never stop them and, heck, you're not losing money on them anyway. They wouldn't spend the money on the music.

    Treat honest customers as honest, embrace new distribution methods. The problems go away. Think of the cost savings: they wouldn't have to buy any more senators.
    • by mark-t ( 151149 ) <markt AT nerdflat DOT com> on Tuesday September 03, 2002 @11:27AM (#4189074) Journal
      Really. Most users, given the choice, will pick the "honest" legal way to get their music and videos. Will there still be pirates? Of course, but you can never stop them and, heck, you're not losing money on them anyway. They wouldn't spend the money on the music
      In fact, really... most users, given the choice, will take the least expensive road available to them as long as their chances of being caught are minimal, and as long as it doesn't involve stealing anything tangible. If you think most people are decent, law-abiding citizens, why not take a poll and see what percentage of drivers knowingly speed? The fact is that piracy is perceived by many as a "victimless crime", so there's no justification for a law against it in most people's opinions. These people will continue to violate the law so long as they feel they can continue to get away with it.

      While lowering the price of the media would make *some* difference, it wouldn't make enough of a difference to be worthwhile.

    • I'd rephrase that to "most users will pick the *easiest* way (not necessarily the cheapest or most honest). But the principle is the same. Make it *easier* to find the desired MP3 from an RIAA server, make the downloads reliable-quality, dirt-cheap, and encumbrance-free, and no one will bother with the perils and pitfalls of P2P.

    • I totally agree with you... The problem is that the accessibility of the internet flies in the face of traditional licensing models for film and music. Record and film studios are accustomed to portioning out the rights to their products in nice tidy chunks - US broadcast rights for year X, European broadcast rights for year Y, VHS distribution rights for Africa, airline/radio play rights, sequel rights, rights to re-use the content in other productions, etc. (It used to be that natural borders kept these categories separate; now they have to use artificial borders like the DVD region-coding system.)

      But with internet distribution there is only one "right" to sell: once the content is on the net, anyone can get it, anywhere, anytime. While a tremendous boon for consumers, this completely destroys the old, picture-perfect system of nice little independent packages of "rights." And that is why traditional media companies are keeping their heads buried in the sand, horrified at the collapse of their nice neat rights packages, hoping that this whole internet distribution thing will finally blow over. They are praying for ubiquitous DRM systems to re-create all those nice little borders...
    • Two Problems (Score:2, Interesting)

      I see two problems with this idea.

      1. Their problem: They don't want to change. They don't want to give in to this non-physical technology. They don't understand it, so they condemn it. It's human nature. They aren't simply hard-headed.
        -or-
      2. Our problem: They will sell it to us for $5 per 64 kbps MP3 to make up for the "lost sales" on the "pirated" copies. 128 kbps will cost you $10. They won't offer any higher quality because it would "take away from CD sales."
  • by Anonymous Coward on Tuesday September 03, 2002 @10:25AM (#4188655)
    tune, I may end up with something that's bland, repetitive and annoying.

    And, pray tell, how am I supposed to know the difference?

  • Simple! (Score:5, Funny)

    by Eric_Cartman_South_P ( 594330 ) on Tuesday September 03, 2002 @10:31AM (#4188689)
    Everyone posting a real song should name it beginning with "RIAA sucks, fair use is good, and Disney love$ politicin$". They would never want to spread such text, so every song name beginning with the text simply MUST be real.

    • Re:Simple! (Score:4, Interesting)

      by decathexis ( 451196 ) on Tuesday September 03, 2002 @11:12AM (#4188974) Homepage
      A more 'toothful' modification of this idea would be to require all files to include some DMCA-protected text, like DeCSS.

      Or, maybe, a "licence":

      By making this File available on the Network, directly or through an Agent, the Distributor hereby
      gives up any and all Rights to its Content, as well as any other Works of Art matching this File in name.


      Having distributed content together with such licenses (or hired someone to do so), it might be a bit harder for the labels to defend copyright claims for individual songs.

  • I think webs of trust are a good idea.
    Poisoning such a web could prove difficult. I trust personal friends highly; they aren't a poisoning group.
    People I or they don't know well won't get a high trust rating, and would be suspected if they were poisoning the group.

    I think Slashdot-type moderation works well too; combined with a decent-sized web of trust it should be a pretty stable system.
  • Flooding a network with spoofed files would drive users to more reliable music sources -- like the labels' own online sites.

    The problem is the labels don't have their own online sites. Sooner or later (it's bound to happen) the labels are gonna hire some college grads who grew up on sharing and understand the problem. Maybe then a compromise will be reached.
  • by Kjella ( 173770 ) on Tuesday September 03, 2002 @10:51AM (#4188801) Homepage
    Checksumming - no good. Any program could pretend to have the right checksum, but send false data. No point in figuring out *afterwards* the download is corrupt.

    Webs of trust - hardly. Imagine a network of antis giving each other good reviews; they'd certainly be better off than someone without any reviews at all. It's very *unlikely* that the one you're P2P'ing with has a trust chain you accept.

    "Database" of who are good traders and not - Fake databases would screw that, you wouldn't know which ones to trust as you have no central server. The problem is that if there's to be any real P2P exchange happening, it's usually *strangers* meeting.

    My friends could do a web of trust or a database, but then we'd be much more likely to set up some mutual leech FTP servers instead and skip the P2P networks entirely.

    Kjella
    • If you find a poisoned file in a trusted chain, you can now discount that person, and that entire chain.
      Trust should work both ways.
      Several unrelated "I got a good file" ratings could give you a cloud of trust. I think it could work.
  • Use Limewire (Score:4, Informative)

    by asv108 ( 141455 ) <asv@nOspam.ivoss.com> on Tuesday September 03, 2002 @10:59AM (#4188868) Homepage Journal
    The latest versions of LimeWire use hashes from a specification called HUGE that probably defeat this type of poisoning attack. You can check out a recent interview with the LimeWire team here [zeropaid.com]. Go here [limewire.org] if you want to download the code or check out the dev docs (which are pretty outdated).
  • by KelsoLundeen ( 454249 ) on Tuesday September 03, 2002 @10:59AM (#4188873)
    What about the second-to-last paragraph in the paper? There's a missing word. A pretty important word, too. (How can this paper be featured all over the map and have an error like this?)

    Anyway, is it:

    "Or perhaps the carrying capacity of a well-designed P2P network is huge, and *NO* amount of flooding can overwhelm the network."

    Or:

    "Or perhaps the carrying capacity of a well-designed P2P network is huge, and *ANY* amount of flooding can overwhelm the network."

    Which is it: "no" or "any?"

  • by Jim McCoy ( 3961 ) on Tuesday September 03, 2002 @11:19AM (#4189025) Homepage
    I love the smell of undergraduate sophistry in the morning...

    The author of this paper seems to suffer from a belief common among those in a hurry to finish their term papers: that if they somehow ignore the elephant in the room that disproves their point, they might end up getting partial credit for impressing people with how well they can tap-dance around the elephant. In this case, the elephant is the well-established practice of using a secure hash function as a self-verifying mechanism to prevent DoS attacks that try to flood a network with garbage files.

    In his FAQ regarding the paper, Mr. Chen correctly addresses the problem of a lack of centralized authority when using hash functions in distributed/P2P systems, but apparently did not make more than a cursory examination of the subject, or else he would have seen the various methods available for solving such a problem. I can only assume this is the case because reputation systems beyond simple moderation are not addressed, and flow-constrained trust networks are never mentioned in this section.

    As someone who seeks to pass off a "bad" file (this report) as a "good" file, perhaps Mr. Chen will learn, sooner rather than later, how the distributed moderation and trust system known as peer reputation works. Surely I am not the only one who finds it more than a little ironic that a paper whose author claims distributed moderation doesn't work is being submitted to a peer-reviewed journal in an attempt by that author to bootstrap his own reputation?
    • Overkill (Score:3, Informative)

      by Cryogenes ( 324121 )
      Distributed trust and peer review are fine and good but not even needed for the simple task at hand.

      Look at the warez scene to see how it goes. A handful of release groups whose names are known to everybody who is even vaguely interested is sufficient to ensure supply. If these groups are attacked by fake releases (rarely happens) they can use hash keys as you suggest (some already do).

      Websites like www.sharereactor.com also safeguard against fakes - another mechanism which is strong enough to defeat the entire problem by itself.

      What I am saying is that distributed moderating à la slashdot will not evolve. Instead, we will have a handful of "authorities" - Web sites or public keys - that everyone trusts.

      Note that authority - when not combined with power - is a Good Thing (TM).
    • It's also funny to see you present as a solved problem something that's actually a very active area of research and pretty much still in its infancy. If you ask ten people who've been working on peer reputation how it works, you'll probably get five saying "it doesn't...yet" and the other five giving you five (or more) different algorithms. You're probably correct that there's a solution in there somewhere, but please don't make people think all the interesting stuff in that area has already been done.

      In other words, watch out for that elephant. ;-)

  • by jidar ( 83795 ) on Tuesday September 03, 2002 @11:31AM (#4189094)
    Taken from Andrew Chen's responses to the solutions:

    Although this idea works for newsgroups and some other centralized services, it does not with P2P. Basically, it comes down to the fact that you must trust whomever is actually doing the checksumming, or else they can just lie and publish false checksums. In the case of P2P networks, the checksumming is done by the same person you want to figure out if you can trust! As far as I know, this is an unresolvable problem.

    Actually, I believe the checksums should still work, in much the same way that file sizes work now. Consider the reason the injected files are set to the same size as the real file: the purpose is to mask these files to the naked eye. Checksums could be used for the same purpose.
    The reason for this is that as people find good files they will tend to keep them while deleting the bad files. Sure, if we only get 1 result back then we don't know one way or the other, but if we get 10 results back and 8 of the 10 have the same checksum, we can assume those 8 are the good files.
    Of course, the problem with this is that a great many people don't bother to delete bad files after downloading, but should the poisoning become too much of a problem we can entice more people to clean up their shared files by way of the client interface.

    All in all, I think this would combat poisoning very well.
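
    The heuristic is a one-liner to prototype; a Python sketch (the 8-of-10 threshold is just the example from above):

      from collections import Counter

      def likely_good_checksum(results, min_share=0.6):
          """results: list of checksums, one per search hit for the same title."""
          if not results:
              return None
          checksum, count = Counter(results).most_common(1)[0]
          return checksum if count / len(results) >= min_share else None

      # 8 of 10 hits share one checksum -> that version is probably the real file
      print(likely_good_checksum(["abc"] * 8 + ["xyz", "qrs"]))  # -> "abc"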
    • Fake Checksums (Score:3, Informative)

      by nuggz ( 69912 )
      Here is a file
      Bobs_Song.mp3 5 M Hash -XXXXXXX
      You don't know that I gave you the wrong hash till you're done.
      It can only tell you that you have the wrong file after you already have it.
  • I hope the same people who defend the right to distribute MP3s they don't own the copyright for will be the same people who defend a person's or company's right to violate the GPL.
  • The RIAA and all the lawyers in the world will never be able to completely stop pirating. Look at how much money the feds throw at drugs and the number of addicts on the street. If enough people want something, they'll get it.

    I know one of my chief frustrations is to search for a song and either have it incomplete, or be of poor quality (e.g. pops or other defects) or to simply have it not be the same song that I downloaded. If I could search for a song, pay $SOME_SMALL_AMOUNT (e.g. $1US) for it and download a 'known perfect' copy at my choice of bitrates (e.g. 128, 160, etc.) then sure as heck I'd do it.

    Distributing these poisoned files would take an enormous amount of bandwidth, so they'd have to have some sort of agreement worked out with ISPs and a mass-content provider, say Akamai. Akamai has tens of thousands of servers located in hundreds (if not more) of ISPs throughout the nation. I think on peak usage they're pushing out 100 GB/sec. in the US (if not more). Simply say "Ok Akamai, can we buy 10GB on each of your servers and push all these MP3s out?". Then you write a gnutella client for each box which offers all the MP3s up for distribution.

    I can't remember how the gnutella protocol works but I think it broadcasts search requests to the nodes that store a cache of what they have and what their neighbors offer and then can pass the request off. Have your client log all the requests (so you can tell the record companies which songs were requested more) and of course offer up your files when requested. If you do this with 10,000 boxes full of identical content chances are you're going to drown out any signal out there.

    If you're really tricky, you can even have the client 'fake' files so you don't actually need to have the file on the box; you could send a pre-existing obfuscated file, or even dynamically build and stream the poisoned MP3.

    Of course, all of this is moot if you still don't have a very easy, cheap method of offering MP3s online for the mass public. You could pitch it like this "Yeah, so you won't make much money off of offering $SOME_SMALL_AMOUNT for each MP3. But you're a fool if you think simply shutting Morpheus off will result in even 10% of the Morpheus users buying the actual CD or using a painful, userUNfriendly pay-per-MP3 system. However, what if we have a method to net you 20 or 30% of users who wouldn't pay you anyway?" So the pitch would be "We can't get you all of them, but our method would give you more than you're getting now!". Frankly the people who post on SlashDot (from the very negative response to the Subscription model) are not a good cross-section of the vast majority of internet users out there :).

    So in your obfuscated file you have it play maybe 20 seconds of the file and then say "Sorry, this is a copyrighted file. Pirating files costs artists money. If you want to buy this MP3 for $SOME_SMALL_AMOUNT, please visit http://www.somestore.com. 80% of $SOME_SMALL_AMOUNT earned will go directly to the artist."

    It gives them a reason to buy it - not only do you have SomeStore.com very easily accepting payment, but you ACTUALLY PAY THE ARTISTS A MAJORITY OF THE MONIES EARNED! So it can quell the naysayers who say "Well the artist wouldn't receive anything anyway!" (rant: but who are you hurting more, the billion dollar-industry or the Artist who NEEDS even the small cut they receive from each CD sold?).

    Some drawbacks, of course: someone could write a 'detector' to find and ignore the invalid MP3s, or block the IP addresses of the servers, etc., but that is easily dealt with. Most non-power users (e.g. the great and huddled masses of the internet) don't want to update their Morpheus client every time a new version is released. Heck, even programs which offer hassle-free updating (e.g. antivirus, windowsupdate.com) are very rarely updated by the majority of internet users. Also, you'd work out the server IP settings with each ISP so that the servers rotate to random IPs in the ISP's pool - since the servers sit inside most ISPs, you couldn't ban a single IP, only perhaps a subnet. But since those IPs are inside the ISP, you would then be banning a large chunk of that ISP's users. And if the servers are in every ISP, you would have to ban every ISP (see the problem in banning IPs?).

    So, to boil it down to a sentence:
    Have a very easy-to-use, hassle-free, cheap, reliable method for users to buy MP3s, and they WILL
  • by dotslash ( 12419 ) on Tuesday September 03, 2002 @12:04PM (#4189350) Homepage
    A P2P program called eDonkey (don't laugh) has partially solved this problem.

    In order to download a file, you can use a URI such as ed2k://|file|The_Adventrues_Of_Pluto_Nash(2002).CD1.FTF.eDKDistro.Sharereactor.bin|559778352|1b153e31f5fdbe829488989d04dda2b1|/ [ed2k]. The URI contains the "local filename", the size, and an MD4-based ed2k hash (note the 32 hex digits: a 128-bit hash, not SHA-1). A companion web site [sharereactor.com] acts as a directory of URIs for popular content. The content is screened by the folks running the site. It has now reached the point where the "pirate" teams have accounts and post hash-bearing URIs before releasing the content into the wild. Most eDonkey users don't use the embedded search and instead use directories such as ShareReactor.
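
    For illustration, here is a minimal sketch of pulling the (name, size, hash) triple out of such a link; the field layout is assumed from the example above, and the hash is treated as an opaque hex string.

    def parse_ed2k(uri: str):
        # Assumed layout: ed2k://|file|<name>|<size in bytes>|<hash>|/
        parts = uri.strip().rstrip("/").split("|")
        if parts[0] != "ed2k://" or parts[1] != "file":
            raise ValueError("not an ed2k file link")
        name, size, ed2k_hash = parts[2], int(parts[3]), parts[4].lower()
        return name, size, ed2k_hash

    name, size, h = parse_ed2k(
        "ed2k://|file|The_Adventrues_Of_Pluto_Nash(2002).CD1"
        ".FTF.eDKDistro.Sharereactor.bin|559778352|"
        "1b153e31f5fdbe829488989d04dda2b1|/")
    print(name, size, h)

    A client can then refuse any download whose recomputed hash doesn't match, which is what makes the directory-of-URIs model resistant to poisoning.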

  • by mikec ( 7785 ) on Tuesday September 03, 2002 @12:15PM (#4189428)
    Mr. Chen apparently does not understand public-key cryptography. Using a "web of trust" does in fact work.

    The author writes

    For the uninitiated, checksums work by examining a file and creating a string that "fingerprints" the data. It can be used in many situations, but the most common application is to verify that a file has been correctly transferred. The basic idea, in relation to P2P, is that every file on a user's computer is checksummed, and this checksum is then published to everyone else. Then, it may be possible to create a directory of "correct" checksums, to make sure you are actually downloading what you want. Although this idea works for newsgroups and some other centralized services, it does not work with P2P. Basically, it comes down to the fact that you must trust whomever is actually doing the checksumming, or else they can just lie and publish false checksums. In the case of P2P networks, the checksumming is done by the same person you want to figure out if you can trust! As far as I know, this is an unresolvable problem.

    This is not an unresolvable problem at all; this is where the web of trust comes in. The basic idea is for the publisher to sign the checksum using his or her private key. Others can then verify the signature using the publisher's public key. This allows me to verify, using only a few bytes of information, that a publisher named SecretAgent did indeed publish a file. If I know that SecretAgent has previously published a lot of "good" files, then the file is probably good. If I don't have any experience with SecretAgent, but I do know that PrivateBenji is trustworthy, and PrivateBenji vouches for SecretAgent, then the file is probably good.
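
    As a concrete (if simplified) sketch, here is the sign-and-verify step using Ed25519 from Python's "cryptography" package. The file name is made up, and a real system would distribute the public key through the web of trust rather than generating both keys in one process.

    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    # Publisher side: hash the file, sign the hash.
    publisher_key = Ed25519PrivateKey.generate()
    data = open("song.mp3", "rb").read()
    signature = publisher_key.sign(hashlib.sha1(data).digest())

    # Downloader side: re-hash what actually arrived and check the
    # signature against the publisher's *public* key, which we trust
    # because of past good files (or a vouching chain).
    public_key = publisher_key.public_key()
    try:
        public_key.verify(signature, hashlib.sha1(data).digest())
        print("checksum was signed by the publisher we trust")
    except InvalidSignature:
        print("poisoned or tampered file")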

    The author fundamentally misunderstands webs of trust:

    Another idea that is often proposed is moderation, specifically "webs of trust." That is, people keep lists of people they trust, and then they implicitly trust (often to a diminishing degree) the people those people trust, and so on. In the context of P2P, each user would then receive a "trust rating," reflecting the number of people that trust them. However, this can also be defeated fairly easily, by creating groups of malicious users that trust each other - then, untrustworthy users may have high scores, leading to problems in the future. This kind of fraud has happened on eBay, where people give themselves recommendations to mislead future partners.

    A web of trust is not a "trust rating" a la eBay. A web of trust is a specific group of people who vouch for each other. Creating a malicious group of people who trust each other does not cause problems. (In fact, it can actually help.) If I trust A, based on experience, and if A trusts B, based on experience, then I can probably trust B. The fact that C, D, and E are malicious doesn't cause problems, because neither A nor B trusts them.
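
    For illustration, here is a sketch of that vouching step, again with Ed25519 (the names come from the comment above; the "voucher" format is invented): PrivateBenji signs SecretAgent's public key, and I extend trust only if the signature checks out against a key I already trust.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )
    from cryptography.hazmat.primitives.serialization import (
        Encoding, PublicFormat,
    )

    benji = Ed25519PrivateKey.generate()         # a key I already trust
    secret_agent = Ed25519PrivateKey.generate()  # an unknown publisher

    agent_pub = secret_agent.public_key().public_bytes(
        Encoding.Raw, PublicFormat.Raw)
    voucher = benji.sign(agent_pub)  # Benji vouches for SecretAgent's key

    # Later, on my machine: accept SecretAgent's key only if someone I
    # already trust has signed it.
    try:
        benji.public_key().verify(voucher, agent_pub)
        print("vouched for -- extend (diminished) trust to SecretAgent")
    except InvalidSignature:
        print("no trusted voucher -- treat as unknown")

    A malicious clique can sign each other's keys all day; without a signature from a key inside my own web, none of those vouchers verify against anything I trust.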

  • If they get to poison the networks, then that means that they are using the networks --just as we are.

    I wonder what would happen if some ordinary user did the same things? Right or wrong?

    Dealing with the problem this way is far better than using the law, because it is hard to define the law in a way that makes good sense for everyone long-term, particularly when we don't yet know how P2P could benefit us all.

    Besides, they can place any amount of promotional information into their files just as easily as they can garbage, and they should. Why not? They might even be able to write off more of the expense.

    What the media companies need is good marketing. They are the content source (for now). All they need to do is add value in ways that leverage the network effect that P2P offers, and they *will* make money.

    Anyway, the result of this is likely not all bad: file sharing gets somewhat marginalized, we all learn to preview before downloading large files, and everyone is reasonably happy and free to use the net in creative ways.

  • Popular files are more likely to be valid. Poison is less likely to be popular. Poison sinks to obscurity.

  • block checksum (Score:3, Interesting)

    by bogado ( 25959 ) <bogado&bogado,net> on Tuesday September 03, 2002 @01:02PM (#4189747) Homepage Journal
    One could keep a trusted block-signature file for each shared file: say, a signature file with one MD5 for each x bytes of the file. This signature file and its own MD5 hash become the identity of the file. One would then download the signature file first, and then download the blocks of x bytes in randomized order, possibly from different nodes. I guess this adds some otherwise unneeded downloads, but it would help restart stopped downloads and would detect poisoned nodes easily.

    Too bad I am so late in posting this...
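
    A minimal sketch of that idea, with the block size and signature-file format invented for illustration; the per-block check is what lets a downloader finger a poisoned node after a single bad block instead of a whole bad file.

    import hashlib
    import random

    BLOCK = 64 * 1024  # the "x bytes" block size; chosen arbitrarily

    def make_signature_file(data: bytes) -> list[str]:
        # One MD5 per x-byte block; this list (plus its own MD5) is
        # the identity of the file, fetched from a trusted source
        # before the file itself.
        return [hashlib.md5(data[i:i + BLOCK]).hexdigest()
                for i in range(0, len(data), BLOCK)]

    def verify_block(index: int, block: bytes, sigs: list[str]) -> bool:
        return hashlib.md5(block).hexdigest() == sigs[index]

    # Simulate a publisher's original data and its signature file.
    original = random.randbytes(300 * 1024)
    sigs = make_signature_file(original)

    # Downloader: fetch blocks in randomized order (in reality, from
    # different nodes) and check each one on arrival.
    order = list(range(len(sigs)))
    random.shuffle(order)
    for i in order:
        block = original[i * BLOCK:(i + 1) * BLOCK]  # stand-in for a fetch
        assert verify_block(i, block, sigs), f"block {i} is poisoned"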
  • Is to create a network specifically dedicated to trading, say, opensource code, research papers, personal public diaries, and the like.

    (Bye bye, karma) I may sound like a troll, but at least I'm being honest.

    Peer-to-peer filesharing has a great deal of potential, but if its only popular use is piracy, well, we already get enough bad press, don't we? It'll only get worse.

    (Sorry about the soapbox I'm standing on...)
  • I wonder if the author has considered that the primary applications of this work are probably not in influencing file-sharing networks so much as in politics. The P2P network that first comes to mind is ordinary web access within China. This is a situation where the government has an active interest in preventing any politically sensitive information from being propagated within the country, and so the ideas of this paper are directly applicable.

    I'll leave the relevant ethical issues as a matter of discussion -- but I would suggest that this is a far more serious reason to be concerned about corporate research into network interruption.
  • Even simpler than all these attack strategies: simply produce the product the way customers want it.

    Enough people will defect to faster, more direct, legitimate servers, where they can get the whole album and a movie in 2 hours instead of 2 weeks. The price should be good enough to encourage this.

    A P2P network relies on enough users mirroring enough copies of enough products. Reduce the user base and the number of nodes drops until it just doesn't work anymore.
    You can see this on the unpopular P2P networks now.

    So either you will end up with:

    1. a few users sharing lots of files (which can be picked off with civil copyright laws).

    2. a few users sharing few files (which means people can't find the files they want on the network, so they're less likely to run a P2P client just to support other users, and the number of people spirals down).

    The one thing I don't think you will end up with is many people legitimately downloading and then sharing the files. Quite simply, sharing would eat up the very bandwidth you need to do your own downloading.

    Another factor is charging: many ISPs are moving to download limits. T-Online, for example, is moving to a 5 GB limit per month, then 1.5 cents per MB.

    So a movie (call it ~450 MB at 1.5 cents per MB) would cost about $7 to download after you've used up the first 5 GB - or, for that matter, to upload to another user!
    So you could pull maybe 7 movies a month on the flat fee.
    A lot of users on P2P systems will disappear as this becomes the norm.

    So P2P is really just a temporary problem for copyright holders, as long as they get their legitimate sales systems in place and don't go pissing off consumers with DRM, funny licenses, etc.

  • by bwt ( 68845 ) on Tuesday September 03, 2002 @02:50PM (#4190556)
    In particular, our analysis of the model leads to four potential strategies, which can be used in conjunction:

    1. Randomly selecting and litigating against users engaging in piracy
    2. Creating fake users that carry incorrectly named or damaged files
    3. Broadcasting fake queries in order to degrade network performance
    4. Selectively targeting litigation against the small percentage of users that carry the majority of the files


    This mostly summarizes the war on drugs and the government's enforcement of alcohol prohibition in the 1920s. Neither worked, and the countermeasures are simple and straightforward.

    A "directed" web of trust, objective quality measurement, and knowledge compartimentalization defeat the above strategy. The countermeasure of creating large numbers of mutally trusting attackers doesn't work when trust "flow" is taken into account. The keys to such a system are:
    1) trust is assymetric
    2) nodes define and change who they trust based on their own assessments
    3) Nodes protect their knowledge of the web of trust

    To see how this works, consider the cops and the drug dealers. The fact that the cops all trust each other does not result in the drug dealers trusting them. When a dealer is compromised, no matter how high up the chain it goes, trust shifts to rivals. Even when a kingpin falls, lines of trust will still exist that aren't compromised.
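
    For illustration, here is a toy sketch of that trust flow: trust is a directed, weighted graph, scores multiply (and so diminish) along edges, and a clique of mutually trusting attackers scores zero unless someone I already trust points into it. The weights and the multiplicative decay rule are invented for this example.

    # Directed trust graph: who -> {whom: direct trust in [0, 1]}.
    TRUST = {
        "me":    {"A": 0.9},
        "A":     {"B": 0.8},
        "B":     {},
        # A clique of attackers who only trust each other:
        "evil1": {"evil2": 1.0},
        "evil2": {"evil1": 1.0},
    }

    def trust_in(source: str, target: str) -> float:
        # Best trust reachable from source to target, multiplying
        # weights along directed edges so trust diminishes with
        # distance.
        best = {source: 1.0}
        frontier = [source]
        while frontier:
            node = frontier.pop()
            for neighbor, weight in TRUST.get(node, {}).items():
                score = best[node] * weight
                if score > best.get(neighbor, 0.0):
                    best[neighbor] = score
                    frontier.append(neighbor)
        return best.get(target, 0.0)

    print(trust_in("me", "B"))      # 0.72 -- vouched for through A
    print(trust_in("me", "evil1"))  # 0.0  -- no edge from my web reaches it

    Because trust only flows along edges I (directly or indirectly) created, the attackers' mutual admiration adds nothing; the asymmetry is doing the work.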

    Drug dealing is not as popular as file sharing, is substantially more damaging to people's lives and society, and has motivated levels of funding that are not matchable by publicly traded firms (who must demonstrate at least mid-range ROI). Despite all of these advantages, the war on drugs has been a dismal failure. The bottom line is that the internet makes distribution of content a commodity, where it was formerly a task of enormous complexity and added value. Economics will determine the rest, unless the US adopts and maintains a totalitarian government.
