The Internet

Collapsing P2P Networks

Andrew writes "I'm an undergraduate at the University of Washington, and after seeing this article on Salon, I dusted off a paper I had written last year. I examined P2P networks under a model usually used to describe animal populations, and found that it may be possible to cause a collapse of a network based on the intrinsic nature of the technology. Just as in animal populations, P2P networks require a sizable "critical mass" of users, and overharvesting can cause a systemic collapse - what if this were done on purpose? Quite ominously, my second recommendation for disrupting a network was carrying damaged or incorrectly named files. You can read the abstract and the actual paper."
  • by kraven_73 ( 586236 ) <kraven_73NO@SPAMoperamail.com> on Tuesday June 18, 2002 @07:35AM (#3721045)
    As stated in the Salon piece, there are already a lot of bogus files out there. For now there is still enough good stuff around to get the song you want. However, maybe this is just a first attempt by the music industry to frustrate users of P2P networks. Once they get things going, they could flood the networks with bogus songs that users would have no means of distinguishing from the good ones.
  • by Albanach ( 527650 ) on Tuesday June 18, 2002 @07:37AM (#3721052) Homepage
    Would a Slashdot-style system of user moderation of shared files be a solution? Perhaps public and private keys could be used to sign files under your online handle. Well-known names would soon spring up, and their signatures could be used to verify the quality of a shared file before downloading it. Of course, there are many reasons people wouldn't want to sign files they might be sharing or have downloaded...
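    A minimal sketch of what that signing could look like, assuming a modern signature scheme (Ed25519 from the Python "cryptography" package); the file contents below are a hypothetical stand-in:

        import hashlib
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        # A well-known moderator generates a keypair and publishes the
        # public key under their online handle.
        moderator_key = Ed25519PrivateKey.generate()
        public_key = moderator_key.public_key()

        # The moderator vouches for a file by signing its content hash.
        file_bytes = b"contents of the shared file"  # hypothetical stand-in
        digest = hashlib.sha1(file_bytes).digest()
        signature = moderator_key.sign(digest)

        # Anyone who trusts this handle can check the signature against
        # the advertised hash before bothering to download the file.
        public_key.verify(signature, digest)  # raises InvalidSignature if bogus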
  • by GnomeKing ( 564248 ) on Tuesday June 18, 2002 @07:38AM (#3721055)
    P2P as a concept is unlikely to collapse

    Networks come and go, and encounter obstacles as the number of people using the network increases, but as one reaches "critical mass" another is born because the first has become too unstable

    There are a large number of P2P networks at the moment. Some are more successful than others, but importantly they use very different technologies, some of which are less affected by increasing numbers of users.
    The FastTrack model appears quite comfortable with several million users, whereas the original gnutella protocol couldn't cope with that number (IIRC)

    I'm sure that a number of p2p protocol designers will attempt to use the ideas in the paper to avoid the various pitfalls
  • by CaptainAlbert ( 162776 ) on Tuesday June 18, 2002 @07:43AM (#3721070) Homepage
    From the introduction in the paper:

    > This paper aims to address the following questions:
    > 1. How must the depensation model be modified in order to account for conceptual differences between P2P networks and animal populations?
    > 2. What are the conditions necessary to cause catastrophes in P2P networks?
    > 3. What does the model imply about other ways to limit or stop P2P networks?
    > 4. What is the most effective method to stop P2P networks?

    I bet if you'd set out to answer a more interesting question, you'd have obtained a more interesting answer.

    Natural populations are well known for their ability to adapt to their environment; to mutate or change their behaviour in response to stimuli (threats) in their surroundings. If you truly wish to study P2P networks as if they were ecosystems or populations, there are plenty of more productive ecological questions to be asked.

    This paper reads like a biologist saying "given, say, fish - how can we go about killing them?"

    Nice to see *some* scientific analysis of this subject, however misdirected.
  • Definition? (Score:3, Insightful)

    by Mr_Silver ( 213637 ) on Tuesday June 18, 2002 @07:52AM (#3721083)
    Whilst one man alone is not going to change things, I get a little annoyed by the fact that people call Napster a peer-to-peer application.

    In fact, it was really a client/server application that only made a direct connection to another user at the moment of downloading a file.

    True P2P has no server and needs no server. Napster had one, and needed it to work.

    Personally I wouldn't call it peer-to-peer at all, but if I were forced to, I'd far rather call it a hybrid P2P and client/server solution.
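    To make the distinction concrete, here is a toy sketch of the Napster-style hybrid (all names and the in-memory "network" are hypothetical stand-ins for real sockets): the central server answers searches, but the file bytes flow directly between peers.

        # Central index: the client/server half of the hybrid.
        central_index = {}  # filename -> list of peer addresses

        def register(peer_addr, filenames):
            # Peers announce their shared files to the central server.
            for name in filenames:
                central_index.setdefault(name, []).append(peer_addr)

        def search(filename):
            # Every search goes through the server...
            return central_index.get(filename, [])

        def download(filename, peer_addr):
            # ...but the transfer itself is peer-to-peer.
            return "<bytes of %s fetched directly from %s>" % (filename, peer_addr)

        register("10.0.0.5:6699", ["freebird.mp3"])
        sources = search("freebird.mp3")
        data = download("freebird.mp3", sources[0])

    Kill the index and every search fails, even though the peers still hold all the files -- which is exactly why the central server was Napster's single point of failure.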

  • 1. Randomly selecting and litigating against users engaging in piracy

    countermeasure: encryption, plus the bad press that randomly suing upstanding citizens would bring.

    2. Creating fake users that carry incorrectly named or damaged files

    countermeasure: webs of trust & MD5 hashes (see the sketch after this list).

    3. Broadcasting fake queries in order to degrade network performance

    countermeasure: evolve to shun the DoS nodes (again, webs of trust & a 'witness system' are needed).

    4. Selectively targeting litigation against the small percentage of users that carry the majority of the files

    countermeasure: this being the most effective [scare] tactic of the four, the best way to deflect it would be hiding your identity, or somehow spreading everything available very thin (Freenet-style) for plausible deniability, or serving from offshore, or rotating selections...

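    A minimal sketch of the hash countermeasure from point 2, assuming the trusted digest arrives out of band via the web of trust (the file contents and expected value below are just the classic MD5 test vector):

        import hashlib

        # Digest published by a source you trust, e.g. alongside the search result.
        expected = "9e107d9d372bb6826bd81d3542a419d6"

        downloaded = b"The quick brown fox jumps over the lazy dog"

        if hashlib.md5(downloaded).hexdigest() == expected:
            print("file matches the advertised hash; keep it and re-share it")
        else:
            print("bogus or damaged file; delete it and try another source")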

  • by CoderByBirth ( 585951 ) on Tuesday June 18, 2002 @08:08AM (#3721138)
    I agree that it would probably be quite easy to kill any P2P network; imagine one of the nodes in a Gnutella-type network sending faked information all over the place, or some kind of malignant Direct Connect client.

    But let's say the music industry (or whoever) did this: would it be legal just because P2P networks are "possibly used" for distributing copyrighted material?

    I don't see the difference between sinking someone's Direct Connect hub and launching a DoS attack against a webserver.
  • Er, what? (Score:4, Insightful)

    by Rogerborg ( 306625 ) on Tuesday June 18, 2002 @08:08AM (#3721139) Homepage

    This is hardly news. I can't remember the last time I downloaded a music file from gnutella that was correctly named, labelled, untruncated and not a radio edit (mea non culpa: the first thing I do is fix the damn things before making them available for re-sharing).

    For executables, it's even worse. There seems to be deliberate misnaming of some files, e.g. GTA2 labelled as GTA3, or in some bizarre cases files named "game Foo (actually game Bar)". What on earth is the point of that? If you're warning that there are misnamed versions out there with this filesize, then say so; otherwise just name it correctly and be done with it.

    Porn is the worst of all. I've lost count of the number of god damn bangbus promos (or worse, trailers that spawn popups) that I've shared and ditched, and I'm now so sick of it that I won't download anything under 5MB (most of the trailers are smaller than that).

    What I can't understand in all this is that I'm getting these files from other gnutella users. Sure, they were injected through malice (or avarice), but what is wrong in the heads of users that they don't understand that this is our network, and that it's our responsibility to clean up the file naming? Nobody is going to step in and do it for us. It's only going to get worse over time, and I'd rather download three different but accurately named versions of the same file than one misnamed version that turns out to be another badly encoded asian lipstick lesbian popup-spawning commercial.

    Repeat the mantra: our network, our responsibility.

  • by Joakim A ( 313708 ) on Tuesday June 18, 2002 @08:21AM (#3721165)
    Yes. Distributed networks without central servers are the way to go. A protocol for fingerprinting all files, combined with user moderation, would be really neat. It might be pretty hard to implement, though, as we can't trust client-side software to calculate the fingerprint, especially since we want to check the file before it is downloaded. Signing might be the way to go, but then the rip can be tied back to its 'creator', and we don't want that...
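    One way to check a file before the whole thing has arrived is to fingerprint it in chunks, so each piece can be verified as it downloads. A rough sketch (the chunk size and helper names are invented for illustration):

        import hashlib

        CHUNK_SIZE = 256 * 1024  # hypothetical chunk size

        def chunk_hashes(data):
            # Per-chunk digests: each piece is verifiable on arrival.
            return [hashlib.sha1(data[i:i + CHUNK_SIZE]).hexdigest()
                    for i in range(0, len(data), CHUNK_SIZE)]

        def fingerprint(data):
            # Whole-file fingerprint: hash of the concatenated chunk hashes.
            return hashlib.sha1("".join(chunk_hashes(data)).encode()).hexdigest()

        # A peer advertises (fingerprint, chunk hashes); the downloader
        # checks each chunk against its digest as it arrives and aborts on
        # the first mismatch instead of wasting the whole download.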

    Anyway, these things tend to get solved by smart people with way too much spare time. I am truly amazed at the amount of work dedicated to cracking and warez.

    /J
  • by Hellkitten ( 574820 ) on Tuesday June 18, 2002 @08:48AM (#3721245)

    what's needed, then, is a /distributed/ moderation system

    And how do you intend to stop them from 'spamming' this distributed system with fake moderations?

  • by governorx ( 524152 ) on Tuesday June 18, 2002 @08:50AM (#3721258)
    Well, as if it isn't bad enough already, I must say that a lot of files on a P2P network are already incorrectly named. Ever tried to download pr0n and found that what's actually in the video is quite the opposite of what's in the filename? Seriously though, I've seen people with files on their computers that are false in one way or another.

    So let's go through the author's points and refute them.

    1. Incorrectly named files. Been done. Been done on purpose. People keep on d/ling. For some reason this example comes to mind: Busy Child by The Chemical Brothers sounds exactly the same as Busy Child by The Crystal Method. And I have The Crystal Method CD. BTW, before the Napster fiasco ended, it was possible to encrypt filenames.

    Also, if you name a file incorrectly, the search engines in P2P clients will probably never hit it anyway. And if the filename is misleading, other sources can be checked; if nothing else, the filesize and modification date can be compared.

    2. Broken files. It happens very often. But there are multiple sources on a P2P network, so even assuming clients get baited, they will delete the useless/damaged file and re-d/l it from another source. Comparing filesizes before downloading is also a good strategy.

    3. As for overharvesting, it's just a way to block some traffic, and I doubt it would be legal. If an organisation put resources toward this end, many surfers would get slower service than they expect, causing a backlash. If I got a group of people together and we blocked a street intersection, the cops would surely interfere. Essentially what he is saying is: let's spam the P2P network to hell until it's abandoned. Too bad we all know what we think about spam.

    Die llama spammers.
  • by catwh0re ( 540371 ) on Tuesday June 18, 2002 @09:45AM (#3721494)
    A noted idea was to spread around false files. This is similar to how software companies distribute blank CDs in local Asian piracy markets: after spending a few dollars on CDs that turn out blank or broken, a person is meant to conclude that buying pirated discs is a waste of money. Things work differently on P2P networks. On these networks you are usually downloading from people who are, technology-wise, near you; you rarely download from someone several countries away, and usually only when it's a rarer item. So what happens when you get a dud file? Normally you delete it and don't share it with other users. (This is also why 96k MP3s tend to get the curb while 128k+ and other more acceptable copies become more prevalent.) There is a natural tendency to purify our file collections, deleting the unacceptable and keeping the cherished. In short: as time goes on, the real version is always the most prevalent, and users will normally dig until they find it, purifying the virtual file base of a P2P network.
  • by bwt ( 68845 ) on Tuesday June 18, 2002 @10:42AM (#3721844)
    Aiding the enemy, huh?

    Only if you believe in security through obscurity.

    If these weaknesses exist, then sooner or later the RIAA & MPAA will find them. The RIAA will probably hire some "experts" and pay them big wads of cash for "consulting" to find such weaknesses. I wouldn't expect them to monitor Slashdot for research relevant to the P2P battles -- they are far too arrogant for that. Consider their CSS encryption scheme and their misguided attempts at watermarking, both derided as buffoonery here. This is a battle that will be won by the side with the better scientific analysis, and I believe open discussion is the better paradigm for scientific analysis.

    I think that ultimately, the weaknesses this author discusses must be addressed through some kind of peer review/rating system. A desirable attribute of a P2P system would be robustness to "attack". The internet has posed tremendously interesting problems in "signal-to-noise" improvement, and making networked systems filter noise better is a very desirable feature with important societal implications. Analysis like this can only spur the drive for solutions. If that drive is stronger on the P2P side than on the publishers' side, then P2P will perpetually be ahead.

    An open forum might achieve a state of "innovation dominance" over a "proprietary" opponent once it reaches a critical mass where the opponent can keep its practical capability maximized only by spending all of its time trying to "keep up" with innovations available in the open forum. Knowledge is power, so the more knowledge that enters the fray via an open forum, the closer that forum is to innovation dominance.
  • by NitroWolf ( 72977 ) on Tuesday June 18, 2002 @01:34PM (#3723108)
    While P2P networks may be similar to flesh-and-blood animals, the biggest difference is that evolution in P2P networking software occurs on timescales a biological system could not hope to match.

    Given a threat to its existence, a P2P network can adapt in a matter of hours at best, weeks or months at worst. Changing the behaviour, defenses, etc. of a biological animal would take thousands of years at best. The flip side is that new threats are developed almost as fast. But the bottom line is that the signal:noise ratio on a P2P system can eventually be tuned well enough to let a signal get through, no matter what problems might plague it.

    Worst-case scenario, you have a voting system that allows *very* different users to vote on certain file-share hosts; the ones with the most votes are generally going to be a valid source of the files. While this presents a higher-profile target for the major corporations, if you have 10,000 of these high-vote hosts, going after them is financially problematic.

    Even if you have one or two (or 50) cases of ballot-box stuffing among the high-vote hosts, an authorized admin of some sort could flag a particular host as bogus.
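    A back-of-the-envelope sketch of such a vote tally (all names and the threshold are invented for illustration):

        from collections import defaultdict

        votes = defaultdict(set)  # voter -> hosts they endorsed
        flagged = set()           # hosts an admin has marked as bogus

        def cast_vote(voter_id, host):
            # In a real system each vote would be signed, so a single
            # identity couldn't stuff the ballot box unnoticed.
            votes[voter_id].add(host)

        def trusted_hosts(min_votes=3):  # hypothetical threshold
            # Hosts endorsed by enough distinct voters, minus flagged ones.
            tally = defaultdict(int)
            for endorsed in votes.values():
                for host in endorsed:
                    tally[host] += 1
            return {h for h, n in tally.items() if n >= min_votes} - flagged

        cast_vote("alice", "host-a"); cast_vote("bob", "host-a")
        cast_vote("carol", "host-a"); cast_vote("mallory", "host-x")
        flagged.add("host-x")   # admin spots the ballot stuffing
        print(trusted_hosts())  # {'host-a'}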

    There are many, many spin-offs of this concept that would make it next to impossible for any single entity to compromise the P2P network into non-existence. It may be cumbersome, but it would work.

"The four building blocks of the universe are fire, water, gravel and vinyl." -- Dave Barry

Working...