The Internet

Collapsing P2P Networks 226

Andrew writes "I'm an undergraduate at the University of Washington, and after seeing this article on Salon, I dusted off a paper I had written last year. I examined P2P networks under a model usually used to describe animal populations, and found that it may be possible to cause a collapse in the network based on the intrinsic nature of the technology. Just as in animal populations, P2P networks require a sizable "critical mass" of users, and overharvesting can cause a systemic collapse - what if this were done on purpose? Quite ominously, my second recommendation on disruption was carrying damaged or incorrectly named files. You can read the abstract and the actual paper"
This discussion has been archived. No new comments can be posted.

  • You know what we do with those types, don't you [bayinsider.com]?
    • Well, if it isn't bad enough already: a lot of files on a P2P network are already incorrectly named. Ever tried to download pr0n and found that what's actually in the video is quite the opposite of what's in the filename? Seriously though, I've seen people with files on their computers that are false in a certain way.

      So we have to go through the author's points and refute them.

      1. Incorrectly named files. Been done. Been done on purpose. People keep on d/ling. For some reason this example comes to mind: Busy Child by the Chemical Brothers sounds exactly the same as Busy Child by the Crystal Method. And I have the Crystal Method CD. BTW, before the Napster fiasco ended, it was possible to encrypt filenames.

      Also, if you name a file incorrectly, the search engines on P2P clients will probably never hit it anyway. And if the filename is misleading, other sources can be checked; if nothing else, the file sizes/modification dates can be compared.

      2. Broken files. It happens very often. But there are multiple sources on a P2P network, so even assuming that clients get baited, they will delete the useless/damaged file and re-d/l it from another source. Comparing file sizes before downloading is also a good strategy.

      3. As for overharvesting, it's just a way to block some traffic, and I doubt it would be legal. If an organisation put resources to this end, many surfers would get slower service than they expect, causing a backlash. If I got a group of people together and we blocked a street intersection, the cops would surely interfere. Essentially what he is saying is: let's spam the P2P network to hell until it's abandoned. Too bad we all know what we think about spam.

      Die llama spammers.
      • Will the real slim shady please shut up! [microdot.co.uk]

        You'll have to excuse my P2P ignorance (I've never used it), but couldn't a rating system like eBay's, or even Karma on Slashdot, be used to label bad hosts and good hosts for the desired files? Then the spoofing done by different members could be identified.
        • The problem would be: what makes the rating system any more trustworthy than the files themselves? Remember, both eBay and Slashdot have centralized control, a metasystem above the individual users.

          Such a metasystem, in a P2P environment, would need to be decentralized and yet trustworthy. (It must not be as easy for a spoofing client to say "I'm trustworthy" as it is for them to say "I have files to share! Download my pustulent VBS payload!") This is a complex research question, to which there's no one simple answer. A lot of people are trying, though... see some of the threads on this story for good links on the subject.

          --grendel drago
    • by bwt ( 68845 )
      Aiding the enemy, huh?

      Only if you believe in security through obscurity.

      If these weaknesses exist, then sooner or later the RIAA & MPAA will find them. The RIAA will probably hire some "experts" and pay them big wads of cash for "consulting" to find such weaknesses. I wouldn't expect them to monitor Slashdot for research relevant to the P2P battles -- they are far too arrogant for that. Consider their CSS encryption scheme and misguided attempts at watermarking, which were derided as buffoonery here. This is a battle that will be won by the side with the better scientific analysis, and I believe that open discussion is the better paradigm for producing it.

      I think that ultimately, the weaknesses this author discusses must be addressed through some kind of peer review/rating system. A desirable attribute of a P2P system would be robustness to "attack". The internet has posed tremendously interesting problems in "signal-to-noise" improvement, and making networked systems filter noise better is a very desirable feature with important societal implications. Analysis like this can only spur the drive for solutions. If that drive is stronger on the P2P side than on the publishers' side, then P2P will perpetually be ahead.

      An open forum might be able to achieve a state of "innovation dominance" over a "proprietary" opponent if a critical mass is achieved such that the opponent's practical capability is maximized only if they spend all of their time trying to "keep up" with innovations available in the open forum. Knowledge is power, so the more knowledge that enters the fray via an open forum, the closer that forum is to innovation dominance.
  • As you have spent some time studying this field, you have probably run into real-world P2P happenings that follow the "rules" stated in your paper. Could you name these, their causes and results, and the services in question?
  • Start of a bad trend (Score:2, Interesting)

    by rattler14 ( 459782 )
    True, the music industry could create tons of phony user aliases and bombard the servers with numerous useless queries and corrupt files. But where does it stop? This same technique could be used by companies to overload a competitor's internet servers and capabilities... This method, though very possible, seems more like a mild virus attack that could potentially lead to a backlash of similar attacks from some pretty pissed-off users.

    Seems like a plausible solution, with some negative side effects.
    • by oakbox ( 414095 ) on Tuesday June 18, 2002 @07:04AM (#3721123) Homepage
      Isn't that the point, though? You can't go to court and sue Sony because they created a lot of damaged versions of their own songs. How does this sound?

      "I was trying to download an illegal copy of their copyrighted music and it was damaged!"

      I think this is one case where they could simply set up some distributed PCs (different IPs in different class Cs) and just have P2P clients serving 'bad' versions of their own copyrighted music. Set up a little consortium of several different record companies, and it becomes DAMN hard to apply an effective filter.

      You might counter by setting up a central list of 'correct' MD5 checksums, but then THAT list becomes a target of litigation from the RIAA.

      I don't like it, but it is an elegant solution. Use the power of P2P against itself. Anonymity works both ways.
      • by ColaMan ( 37550 )
        I think this is one case where they could simply set up some distributed PCs (different IPs in different class Cs) and just have P2P clients serving 'bad' versions of their own copyrighted music.

        Somebody is already doing this, to some extent.
        Searches on gnutella (for just about anything) bring up hits with file names like "your search terms.MPG"... at 20k or so, I'm not interested. But still, it means somebody has written a client that deliberately replies to the P2P network with flawed data.
        • You've noticed this too? Is there any trend to the IPs of machines sharing these? Are they all at sony.com or something? (Hey, they could be grievously stupid...) In any case, perhaps some provider like Gnucleus [gnucleus.com] could provide a realtime ban-list of this kind of abuse. Centralizing this information wouldn't have any legal ramifications, and while it's a flawed, stopgap solution, it would work, at least for a while.

          I wonder if those results are virii or something. I usually just filter them out by requiring filesizes above 100k...

          Have you noticed the "[searchterms] free bangbus passes.htm" and .url files you get sometimes? I think it's just spammers doing some of this, and not the actual media industries.

          --grendel drago
      • I think this is one case where they could simply set up some distributed PCs (different IPs in different class Cs) and just have P2P clients serving 'bad' versions of their own copyrighted music. Set up a little consortium of several different record companies, and it becomes DAMN hard to apply an effective filter.

        Time to build the undernet.
        The issue with the internet today is that everyone is welcome, as it should be. But it also means that when devising open-ended software systems, any user can receive and make use of those tools, and by the same token, any user can misuse them.

        The solution would be an undernet. Existing alongside the current internet, it would rely on some extensions to the protocol that are not made widely available. Software could then be written that would function only for members of the undernet. Now, change the phrasing slightly, to undernets. Append a group identifier to all packet headers sent by undernet members to other undernet members. If abstracted widely enough, it could even allow different members to remain connected while cycling through spoofed IPs.

        This is most clearly desirable when the group that supports the undernet is working toward common goals or ideas. Then, if members begin to pollute the data pool with broken files, picking out and removing the offender becomes both easier and more effective.

        I'm sure one of the IP wizards could come up with something more graceful and effective, so don't judge the superficiality of the proposed solution so much as the concept of the closed group with regulated, but anonymous, access.

        -GiH
  • All these P2P programs can have a lot of problems:
    - downloaded music can be wrong or low quality
    - the music is often illegal
    - download speed is really slow

    What labels should do is let users download music for a small fee. For example, buying 100 songs for 100 bucks, with the songs chosen by the user at any time.

    I think a service like this could be really successful. The labels do not need to be afraid of piracy, given the crappy quality and low survival rate of these programs.
    • by ranulf ( 182665 )
      What labels should do is let users download music for a small fee.

      Good point. I actually very much approve of these tactics being used to hinder freeloading, despite being shocked at how expensive music and films are to buy.

      However, I am very much for record companies distributing music via the internet. By cutting out the end retailer, who typically takes 50% of the final price of the CD, and removing the cost of media manufacture, there is no reason why these goods shouldn't be available to those who want to download them. There'll always be hardcore fans who want the boxed editions (check out special edition box sets, etc.), but a lot of people are only concerned about the actual music. In fact, it could probably even be argued that if music companies sold the music in MP3 format, the die-hard music aficionados would still buy the real CDs just for the quality difference.

      But back to the P2P issue. You get what you pay for. If you expect to download things for free, you can hardly complain when those things aren't what you expected. If you use a warez search engine, chances are you'll spend the next 10 minutes closing all the popup windows, even if you never actually downloaded anything! You don't see many people up in arms about that.

      And if you think the record companies don't deserve their profits, think again... Why do you think there are always scores of new bands signing up with these labels? Because the record companies invest heavily in lots of bands, many of whom will flop dismally. They invest in advertising, gigs, promotional CDs, PR parties, you name it. If they end up making 10 times the profit you think is fair on a particular band, bear in mind that there were probably four other bands they promoted and gave the same chance that didn't make it.

      • by MoneyT ( 548795 )
        There's a slight difference here between the Warez sites and these new "tactics".

        With the warez sites, the ads are there because these guys can't find anyone else to host them, so they need the ad money to pay for hosting. The ads are not being put up by Bungie, or Blizzard, or EA, or any of the other companies.

        As for the p2p networks, however, these files are being placed with the intent of misleading the consumer. Unfortunately for the people trying this tactic, moderation works in p2p the same way it works on Slashdot. If a file is a crappy sound loop, no one (or very few people) will keep the file. They will simply go back out until they find the right file. Then once they have it, they'll keep it. So picture it like this.

        The company distributes 100 sound-loop files. After a month or so, the number of sound-loop files is probably still 100, give or take (and with certain programs like Limewire, identical files are grouped). Now, as soon as one person buys the CD, there is a legit copy (legit meaning real). One person downloads his copy; now there are 2. One person downloads from each of them: 4. One download from each of them: 8. 16, 32, 64, 128. Etc., etc. In the meantime, the sound loop is still at 100.

        Sure, the sound-loop tactic might be effective for the first few weeks, but afterwards it's more a waste of money.
  • As stated in Salon, there are a lot of bogus files. For now there is enough stuff out there to get the song you want. However, maybe this is just a first try by the music industry at frustrating users of p2p networks. Once they get things going, they could probably flood the networks with songs, without any means to distinguish them from the good ones.
  • by potcrackpot ( 245556 ) on Tuesday June 18, 2002 @06:36AM (#3721048) Homepage
    The practice of flooding the system with bad files is far more sinister than most of us realise.

    This is actually the next step in the Taliban's fight against capitalism. They are continuing their religious war, attempting to reduce our morale by preventing us listening to music, except in short frustrating bursts of the same 10 seconds.

    Their aim is to reduce us, to bring us down from within by sabotaging our right to Good Music In MP3 Format.

    We Will NOT give in.

    Uh, wait. Why did they start with 'No Doubt'?
  • So does that include my Goldfish and Parrot disrupting p2p?
  • by Albanach ( 527650 ) on Tuesday June 18, 2002 @06:37AM (#3721052) Homepage
    Would a Slashdot-style system of user moderation of shared files be a solution? Perhaps public and private keys to sign files under your online handle. Well-known names would soon spring up, and their signatures could be used to verify the quality of a shared file before downloading. Of course, there are many reasons people wouldn't want to sign files they might be sharing or have downloaded...
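    A rough sketch of what handle-based signing could look like (Python with the pyca/cryptography package; this illustrates the idea and is not a feature of any existing client):

    ```python
    import hashlib

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # A well-known handle generates a keypair once; the public key is what
    # other users come to associate with that handle's reputation.
    handle_key = Ed25519PrivateKey.generate()
    public_key = handle_key.public_key()

    def sign_file(path):
        """The handle endorses this exact file content."""
        digest = hashlib.md5(open(path, "rb").read()).digest()
        return handle_key.sign(digest)

    def verify_file(path, signature):
        """Check an endorsement before trusting a downloaded file."""
        digest = hashlib.md5(open(path, "rb").read()).digest()
        try:
            public_key.verify(signature, digest)
            return True   # the handle really did endorse this content
        except InvalidSignature:
            return False  # renamed/damaged content, or a forged signature
    ```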
    • The problem is that as soon as a registry of file signatures is set up, the RIAA will strike back, saying "you can identify the faulty files, so do it".
      • what's needed, then, is a /distributed/ moderation system - perhaps a bolt-on to the Gnutella protocol? RIAA/MPAA can't sue Gnutella, Inc., cos they don't exist - there are just people writing code and people running code. Yes?
        • Yes. Distributed networks without central servers are the way to go. A protocol for fingerprinting all files, plus user moderation, would be really neat. It might be pretty hard to implement, though, as we can't trust client-side software to calculate the fingerprint, especially since we want to check the file before it is downloaded. Signing might be the way to go, but then you can tie the rip to its 'creator' - don't want that...

          Anyway, these things tend to get solved by smart people who have way too much spare time. I am truly amazed at the amount of work dedicated to cracking and warez..

          /J
        • what's needed, then, is a /distributed/ moderation system

          And how do you intend to stop them from 'spamming' this distributed system with fake moderations?

    • This was discussed [slashdot.org] the last time slashdot covered [slashdot.org] this, and the consensus seemed to be that it would be trivial for an organization to create thousands of bogus users and stuff the ballots.

      To me, a web of trust [slashdot.org] makes more sense. It would take some time to get "into the web", but it would take even longer for an organization to build up enough trust to effectively distribute bogus files. As soon as they started, their trust would be ruined, and everyone would know not to download from that person.
  • by GnomeKing ( 564248 ) on Tuesday June 18, 2002 @06:38AM (#3721055)
    P2P as a concept is unlikely to collapse

    Networks come and go, encountering obstacles as the number of people using them increases; but as one reaches "critical mass" and becomes too unstable, another is born.

    There are a large number of p2p networks at the moment. Some are more successful than others, but importantly they use very different technologies, some of which are less affected by increasing numbers of users.
    The FastTrack model appears quite comfortable with several million users, whereas the original gnutella protocol couldn't cope with that number (IIRC).

    I'm sure that a number of p2p protocol designers will attempt to use the ideas in the paper to avoid the various pitfalls
  • by GnomeKing ( 564248 ) on Tuesday June 18, 2002 @06:41AM (#3721065)
    In particular, our analysis of the model leads to
    three potential strategies, which can be used in conjunction:

    1. Randomly selecting and litigating against users engaging in piracy
    2. Creating fake users that carry (incorrectly named or damaged files)
    3. Broadcasting fake queries in order to degrade network performance
    4. Selectively targeting litigation against the small percentage of users that carry the majority of the files
    • by Anonymous Coward
      ...Our four, four! potential strategies are:

    • choices 4 and 1 are the same.
      • Choice 4 is much more likely to give "good" results, since more of the major holders of illegal material are targeted.

        Hollywood would get better results by shutting down illegal DVD manufacturers of Spiderman in Korea (or wherever they are) rather than someone who makes a copy for his friends.

        Choice 1 gives everyone the same chance of being targeted, so small-time distributors/downloaders will be hit a higher percentage of the time, without as great an effect on the overall level of content available.
        • I've actually seen the VCD of Spiderman here in India, and it's rotten: Asian subtitles, bad color, stretched vertically, terrible sound, etc. VCDs are really popular here, though, because the population simply couldn't afford the exorbitant prices the MPAA would want to charge them.
    • 1. Randomly selecting and litigating against users engaging in piracy

      countermeasure: encryption, plus the bad press that randomly suing upstanding citizens would bring.

      2. Creating fake users that carry (incorrectly named or damaged files)

      countermeasure: webs of trust & md5 hashes.

      3. Broadcasting fake queries in order to degrade network performance

      countermeasure: evolve to shun the DoS nodes (again, webs of trust & a 'witness system' needed).

      4. Selectively targeting litigation against the small percentage of users that carry the majority of the files

      countermeasure: This being the most effective [scare] tactic of the four, the best way to deflect it would be hiding your identity, or somehow spreading everything available very thin (freenet style) for plausible deniability, or serving from offshore, or rotating selections...

      --

      • by LordLucless ( 582312 ) on Tuesday June 18, 2002 @07:21AM (#3721163)
        Yes, you can probably counter all these tactics, but they would still do their job.

        If the labels can force p2p networks into a more complex model, it culls the less technically able users. If the p2p music-sharing networks evolved into systems requiring md5 hash lookups, trust networks and other countermeasures, Joe Schmoe wouldn't bother using them. He wants something he can just hook up to, grab stuff from, and leave.

        Music piracy has always happened; it's just booming now. They just want to stop the boom, not eradicate it entirely.
        • If the labels can force p2p networks into a more complex model, it culls the less technically able users

          That really depends how complex the user experience becomes. Napster was far more technically complex than the traditional "download from a website" model, but it still attracted millions of regular users. That's because all of that complexity was hidden behind a cute little easy-to-install UI. Kazaa is even more complex than Napster, but the user experience is almost exactly the same (excepting the spyware, of course.)

          Many of the countermeasures suggested above would be fairly easy to integrate in a transparent way, and I imagine they will be. In the long term, I think this is a losing game for the record companies. The cost of maintaining the "war" on p2p systems is going to be far, far higher (by many orders of magnitude) than the cost of building a smarter p2p network (and for many p2p coders, it isn't even about cost.) Also, the more successful the labels are, the tougher and more resistant the p2p networks will become.

          In the short term, on the other hand, it might make sense. If the labels make a strong effort to pull people into cheap, legal music download services now, this sort of disruption will serve them well. But I'm not particularly confident that the labels have their act together on this. (And even if they do, the battle will still go on over video downloads.)

      • 2. Creating fake users that carry (incorrectly named or damaged files)

        countermeasure: webs of trust & md5 hashes.

        Hmmm. My understanding is that you can't verify an MD5 hash until you've got the whole file. So if the malicious host lies about the MD5 sum, you won't know until after you've downloaded the file.

        A workaround would be to publish checksums for the first 1/4 of the file, the first 1/2, the first 3/4, etc. If an intermediate MD5 sum fails to match, you abort further downloading. Perhaps the victim publishes a notification that a damaged file was found. (But then you have to worry about invalid, forged warnings.)

        This doesn't even solve the problem; it only limits the time wasted. A malicious host can create a file that is accurate for the first 50% and get the user to waste 50% of their time, and half a song is a lot less than half as valuable as a full song. Perhaps you add a "resume" function like FTP's, so that the user can download just the remainder of the song elsewhere, again comparing intermediate checksums along the way.
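        The checkpoint idea is easy to sketch with Python's hashlib (the quarter-point scheme is just this thread's proposal, not any real client's protocol):

        ```python
        import hashlib
        import os

        def quarter_checksums(path, chunk=64 * 1024):
            """MD5 digests of the first 1/4, 1/2, 3/4 and all of a file."""
            size = os.path.getsize(path)
            marks = [size // 4, size // 2, 3 * size // 4, size]
            digests, read = [], 0
            md5 = hashlib.md5()
            with open(path, "rb") as f:
                for mark in marks:
                    while read < mark:
                        data = f.read(min(chunk, mark - read))
                        if not data:
                            break
                        md5.update(data)
                        read += len(data)
                    # .copy() snapshots the running hash at each checkpoint
                    digests.append(md5.copy().hexdigest())
            return digests

        # A downloader compares each digest as the transfer passes that mark
        # and aborts (or resumes elsewhere) on the first mismatch.
        ```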

        • And this is where the "web of trust" gets built. If you download a file from a malicious host, it's time to label that host as a black hole. Let's say you're using gnutella and assume it has implemented the following feature: if you find that host A is malicious, you ignore its hostmask, similar to IRC host matching ("ignore host-a.domain.tld" or "ignore *.domain.tld"). Now, whenever a query arrives from that host, your client replies with something like "440: I'm ignoring you, because you have poisoned share file(s): weird_al-eatit.mp3."

          It's a simplistic approach, but it helps avoid the "unknown" on a first download from that host, and it gives the user a course of action for future interaction with that host.
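          A rough sketch of the IRC-style hostmask ignoring (Python's fnmatch; the "440" reply is the parent post's invention, not part of the Gnutella spec):

          ```python
          from fnmatch import fnmatch

          class IgnoreList:
              """Track hostmasks of hosts that served poisoned files."""

              def __init__(self):
                  self.masks = {}  # hostmask -> offending filename

              def ignore(self, mask, filename):
                  self.masks[mask] = filename

              def check(self, host):
                  """Return a refusal message if the host matches a mask."""
                  for mask, filename in self.masks.items():
                      if fnmatch(host, mask):
                          return f"440: ignoring you; poisoned share: {filename}"
                  return None

          ignores = IgnoreList()
          ignores.ignore("*.badhost.example", "weird_al-eatit.mp3")
          print(ignores.check("node3.badhost.example"))   # refusal message
          print(ignores.check("node1.goodhost.example"))  # None
          ```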


      • Within so few sentences...

        countermeasure: webs of trust & md5 hashes

        and

        the best way to deflect it would be hiding your identity

        Put simply, you cannot hide your identity and be a trusted partner in a transaction. And the cost of setting up "trust" mechanisms should not be understated: I can generate my own CA and my own X.509 certificates with openssl. How will you know which CA to trust? Commercial ones cost money.

    • 5. As a content provider: buy a cable company (AOL/TW?), control massive broadband marketshare, cap upstream bandwidth, deny static IPs, and tip off the FBI to folks who violate copyright.
  • by CaptainAlbert ( 162776 ) on Tuesday June 18, 2002 @06:43AM (#3721070) Homepage
    From the introduction in the paper:

    > This paper aims to address the following
    > questions:
    > 1. How must the depensation model be modified
    > in order to account for conceptual
    > differences between P2P networks and animal
    > populations?
    > 2. What are the conditions necessary to cause
    > catastrophes in P2P networks?
    > 3. What does the model imply about other ways
    > to limit or stop P2P networks?
    > 4. What is the most effective method to stop
    > P2P networks?

    I bet if you'd set out to answer a more interesting question, you'd have obtained a more interesting answer.

    Natural populations are well known for their ability to adapt to their environment; to mutate or change their behaviour in response to stimuli (threats) in their surroundings. If you truly wish to study P2P networks as if they were ecosystems or populations, there are plenty of more productive ecological questions to be asked.

    This paper reads like a biologist saying "given, say, fish - how can we go about killing them?"

    Nice to see *some* scientific analysis of this subject, however misdirected.
    • by capt.Hij ( 318203 ) on Tuesday June 18, 2002 @07:01AM (#3721111) Homepage Journal
      Some native populations have an amazing capacity for rebounding. This is especially true of insect populations which have a reputation of getting through population bottlenecks better than any other animal. However, the "Allee effect" is a well known biological phenomenon.

      Many populations have a critical population level, and if they fall below that level they have a low probability of rebounding. For example, fruit fly maggots feed more efficiently in groups, and the larvae cannot survive if too few eggs end up on the same fruit.

      By the way if you pick up an ecology journal you are likely to find at least one paper on this subject. Trying to understand the Allee effect is an important aspect of understanding an organism and how it interacts with its environment.
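      The textbook form is a logistic model with an extra factor that goes negative below a critical level A: dN/dt = r N (N/A - 1)(1 - N/K). A toy simulation (Python; the parameter values are made up):

      ```python
      # Euler sketch of an Allee-effect model: populations below the
      # critical level A collapse; populations above it recover toward K.
      r, A, K, dt = 0.5, 20.0, 100.0, 0.01

      def simulate(n, steps=20000):
          for _ in range(steps):
              n += r * n * (n / A - 1) * (1 - n / K) * dt
          return n

      print(simulate(19.0))  # below critical mass -> decays toward 0
      print(simulate(21.0))  # above critical mass -> rebounds toward K
      ```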

      • Some native populations have an amazing capacity for rebounding

        Okay, fine. At least the major P2P networks (Gnutella, Napster) have a fair amount of diversity, providing resistance against any single community-wide attack wiping out the community.

        Furthermore, all of them can rebound quickly. A patch can be distributed across the network very quickly -- in a week or so, updates can be installed by the majority of users. That's *fast* compensation, allowing easy rebounding.
    • This paper reads like a biologist saying "given, say, fish - how can we go about killing them?"
      Not really, it's more like "Given this species and its environmental factors, what change in those factors could lead to its extinction?", which is an entirely reasonable and useful question to ask.
  • by Anonymous Coward
    A few years ago a denial-of-service attack was launched against the gnutella p2p network. This was done by sending out large 'ping' packets, which were then relayed all throughout the network, effectively using up the entire bandwidth of many slower nodes. I don't recall how it was stopped - perhaps by a client update, or maybe the attackers just stopped. If the latter is the case, gnutella is probably still vulnerable to such attacks.
  • Why don't they.... (Score:1, Interesting)

    by HowlinMad ( 220943 )
    Just make a Beowulf cluster of these networks then?
  • Definition? (Score:3, Insightful)

    by Mr_Silver ( 213637 ) on Tuesday June 18, 2002 @06:52AM (#3721083)
    Whilst one man alone is not going to change things, I get a little annoyed by the fact that people call Napster a peer-to-peer application.

    In fact, it was really a client-server application; only when actually downloading a file did it make any connection with another user.

    True P2P has no server and needs no server. Napster had and needed such a thing to work.

    Personally I wouldn't call it peer-to-peer at all, but if I were forced to, I'd far rather call it a hybrid of P2P and client/server.

  • Because everyone knows it but no one has yet said it:

    ShareReactor/eDonkey cannot be flooded with damaged or renamed files, and neither can any other network/client that relies on hashes of the downloads to ensure the file is the same.

    As for burning loads of bandwidth with loads of automated useless searches, it would be very interesting to see how the different networks coped with this, especially the "next gen" edonkey, which is called "flock", is in beta, and is supposed to use no servers...

    graspee

  • it seems that a simple and relatively inexpensive measure, which Herron says requires no more than "an intern in a room," might be worth serious consideration on the part of the recording industry.

    Further questions about the proposed intern scheme were referred by Stacey Herron to her associate Mr William Clinton, recently put in charge of seeing the 'preferred files replicate and populate'
  • You'll ruin it for all of us!
  • if a user doesn't like a previewed track, "then the industry and that record would have benefited from [that user's] ignorance."

    I'm surprised the record labels let Britney Spears have any airtime at all in that case - hell, think of the teenage boy market if all they knew was what she looked like *grin*
  • The RIAA will use this to their advantage if they are not doing so already. If the P2P community does not stay one step ahead, the RIAA will literally make Gnutella and other file sharing systems useless.
    • P2P is in an ever-evolving state. Before Napster bit the dust, doomsayers were saying its demise would be the end of filesharing. Whoops, they missed the mark there. It's sort of like the Hydra: cut off one head and two more take its place. And in essence that's what will happen. If the RIAA ever managed to kill gnutella (arguably the largest system currently), a whole bunch of people would be scrambling to create new networks. The result would be 3 or 4 new and effective networks. Sure, they'd be smaller, but only temporarily.

      Like it or not, P2P is here to stay. It's a system with legal and illegal uses. The legal nature is what keeps the creation of a complete ban impossible, and the illegal nature is what keeps the system evolving.
  • The problem has always existed that P2P networks can be poisoned: misleadingly named files for multimedia, and viruses for warez.

    A system that permits sharing of copyrighted material is hardly going to provide a simple way to trace back the real originator of the material. It is difficult to prove, but probably likely, that many bad files come from persons connected with the production and distribution of the original material.

    There are several sites now that publish checksums and sizes of P2P files. If you trust the site, then you have a way of validating files.
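    Validating against a trusted site's published values is then cheap to automate (a Python sketch; the expected values would come from whichever site you trust):

    ```python
    import hashlib
    import os

    def validate(path, expected_md5, expected_size):
        """Check a finished download against published size and MD5."""
        if os.path.getsize(path) != expected_size:
            return False  # cheap size check first
        md5 = hashlib.md5()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(64 * 1024), b""):
                md5.update(block)
        return md5.hexdigest() == expected_md5
    ```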

    The main remaining issue is so-called leeching: those who take but do not give. This may be out of fear or selfishness, or it may even be just that the user is new. The community response seems to be to allow small downloads to anyone but to restrict larger downloads to those who do share themselves. I believe there are even some automated tools that will perform this check.

    • Re:MD5, etc. (Score:4, Interesting)

      by jawtheshark ( 198669 ) <slashdot.jawtheshark@com> on Tuesday June 18, 2002 @07:36AM (#3721204) Homepage Journal
      The main issue is so-called leeching.

      While I agree entirely with the fact that leeching is a problem, you should consider these facts:

      • Not many people have the bandwidth to share. I don't; I share nevertheless, but restrict upload speed to 3 KByte/second and 2 allowed connections. Why? I have only DSL 256/64kbps, which means I have about 8 KByte/second of upload, and I give away a potential 6. I find that generous. It is, however, not enough! People do not have the patience to wait at these speeds: of the uploads that start on my machine (I check from time to time), about 99% are cancelled by the remote side.
        Yet, I download! Mostly pr0n, and from time to time music (usually when I've heard a good song on the radio).
      • Firewalls. I have a firewall... and I will not under any circumstances turn it off just to run Gnucleus. This effectively reduces my own download choices: anyone who also runs a firewall is not able to communicate with my machine. If everyone ran a firewall, P2P networks like Gnutella would become useless. PUSH only works when the receiver does not have a firewall.
      So technically this makes me a leech: I want to share files, but due to bandwidth restrictions and firewall issues my sharing abilities are clearly diminished. I have the goodwill but not the resources.
      It wouldn't be the first time a P2P client advertising T1 performance aborts on me, and I find that very frustrating. Probably people using the tools you mentioned, and considering me a leech. Nice... :-(

      Oh, and one thing I don't like about the whole P2P scene is the insanely long filenames stuffed with idiot keywords. Keywords in filenames... tsss... Better would be a kind of database that associates keywords with files you choose on your harddisk. At least that way your files could have halfway decent-length filenames. Of course, maintaining that would be a bit of work, but maintaining a filesystem full of junk filenames isn't any better.
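      That keyword database is basically an inverted index (a toy Python sketch; all names are hypothetical):

      ```python
      tags = {}  # keyword -> set of local file paths

      def tag(path, *keywords):
          """Associate keywords with a file instead of cramming them into its name."""
          for kw in keywords:
              tags.setdefault(kw.lower(), set()).add(path)

      def lookup(keyword):
          return tags.get(keyword.lower(), set())

      tag("~/music/track01.mp3", "Crystal Method", "Busy Child", "electronic")
      print(lookup("electronic"))  # the filename itself can stay short
      ```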

      Finally, a little question for the P2P junkies out there: many people claim they get to learn new kinds of music by P2P sharing. I won't say it isn't true, but how? You still need a handle to search for new stuff. Do you just type in random keywords, or what? Just curious, because I'd like to broaden my musical horizons a bit.

      • people claim they get to learn new kinds of music by P2P sharing. I won't say it isn't true, but how?

        The best way is simply having people with eclectic tastes recommend random shit to you -- either IRL, IM, on message boards, etc. Another way, which I like, is using Amazon's recommendation system.

        Also, some file sharing apps have a "Browse User" option, and this is very handy for queueing up bands you've never heard of from a user with possibly similar tastes.

        Not everyone likes being spoonfed engineered culture...

        --

      • Finally, a little question for the P2P junkies out there: many people claim they get to learn new kinds of music by P2P sharing. I won't say it isn't true, but how? You still need a handle to search for new stuff. Do you just type in random keywords, or what? Just curious, because I'd like to broaden my musical horizons a bit.

        There are three things I do to find new music:

        1) Type in random keywords. This may seem silly, but it can yield interesting results.

        2) Search for a genre. You would be surprised at the amount of music that people categorize/name by genre. Pick a genre that you don't know very well (IDM, dub, afrobeat) and search for it. You will get a seemingly random selection of music. Download these, listen to them, and if you like them, search for the artist and/or stuff in the ID3 tag. You will find more of their work, plus usually stuff they did with their friends.

        3) Listen to KEXP [kexp.org]. KEXP is possibly the best radio station in the world. They stream CD quality over the web. They are a public station (I'm a member) from Seattle. Check their schedule (it's Pacific time) and check out DJ Riz. This guy is the most inventive, relaxing, best DJ around.
      • I run DSL too (a faster version, though) and limit my uploads to 2; I find that better than bandwidth throttling. The router/firewall shares the bandwidth nicely, so both uploads go reasonably quickly at about 8KB/sec each (that's bytes).

        Your point about file names is well taken. FastTrack might be scumware, but it allows you to meta-tag files, which is useful. This is what I miss most when I use Gnutella.

        OTOH, I allow inbound P2P connections through my firewall, and I have no problems sharing files.

  • The failure of ANY system is ultimately inevitable. Best bet is to play with it while it still works.
  • by jake_the_blue_spruce ( 64738 ) on Tuesday June 18, 2002 @07:06AM (#3721135) Homepage Journal
    Collaborative recommendation, such as Amazon.com uses (or Eigentaste [berkeley.edu] or RecTree [cs.sfu.ca] in academia), finally has algorithms fast enough for an average PC to perform the operations. A decentralized version would not only foil spoofing and spamming, but would let you discover new things beyond the industry marketing machine. Does anyone have information on such work?
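    For reference, the core of collaborative recommendation is small enough to sketch (generic user-user similarity in Python; this is not Eigentaste's actual algorithm, and the ratings are invented):

    ```python
    from math import sqrt

    def cosine(a, b):
        """Cosine similarity between two users' song ratings (dicts)."""
        shared = set(a) & set(b)
        num = sum(a[s] * b[s] for s in shared)
        den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
        return num / den if den else 0.0

    me = {"songA": 5, "songB": 4}
    peer = {"songA": 4, "songB": 5, "songC": 5}
    # Recommend songs from sufficiently similar peers that I haven't heard.
    if cosine(me, peer) > 0.7:
        print([s for s in peer if s not in me])  # ['songC']
    ```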
  • by CoderByBirth ( 585951 ) on Tuesday June 18, 2002 @07:08AM (#3721138)
    I agree that it would probably be quite easy to kill any P2P network; imagine one of the nodes in a Gnutella-type network sending faked information all over the place, or some kind of malignant Direct Connect client.

    But let's say the music industry (or whoever) did this - would it be legal just because P2P networks are "possibly used" for distributing copyrighted material?

    I don't see the difference between sinking someone's Direct Connect hub and launching a DoS attack against a webserver.
    • I don't see the difference between sinking someone's Direct Connect hub and launching a DoS attack against a webserver.
      So do you have a problem with /. linking to webservers that are likely to go down under the load?

      Unlike traditional web site DoS attacks, which are based on sending malformed messages (provable intent), DoS attacks on P2P can look like normal requests from normal clients that just come in really fast. IANAL, but much of criminal law hinges on intent, and the web DoSers (or bounce DoSers) clearly have intent. P2P networks just have a high-overhead protocol.

      I don't think, in the end, that you can rely on laws to stop such problems. If you design a flooding mechanism into a protocol, you had better rate-limit it somehow... Maybe make people do some amount of work to perform a query (though precomputation becomes problematic, because you want it to some extent, but not too extreme an extent).
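      Per-peer rate limiting can be as simple as a token bucket (a generic Python sketch, not taken from any actual client):

      ```python
      import time

      class TokenBucket:
          """Allow `rate` queries/sec per peer, with bursts up to `burst`."""

          def __init__(self, rate=2.0, burst=10.0):
              self.rate, self.burst = rate, burst
              self.tokens, self.last = burst, time.monotonic()

          def allow(self):
              now = time.monotonic()
              # Refill for elapsed time, capped at the burst size.
              self.tokens = min(self.burst,
                                self.tokens + (now - self.last) * self.rate)
              self.last = now
              if self.tokens >= 1.0:
                  self.tokens -= 1.0
                  return True
              return False

      buckets = {}  # peer address -> TokenBucket

      def on_query(peer, query):
          if buckets.setdefault(peer, TokenBucket()).allow():
              pass  # relay the query to neighbours as usual
          # else: drop it silently; a flooder starves itself, not the network
      ```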
  • Er, what? (Score:4, Insightful)

    by Rogerborg ( 306625 ) on Tuesday June 18, 2002 @07:08AM (#3721139) Homepage

    This is hardly news. I can't remember the last time I downloaded a music file from gnutella that was correctly named, labelled, untruncated and not a radio edit (mea non culpa: the first thing I do is fix the damn things before making them available for re-sharing).

    For exes, it's even worse. There seems to be deliberate misnaming of some files, e.g. GTA2 labelled as GTA3, or in some bizarre cases files named "game Foo (actually game Bar)". What on earth is the point of that? If you're warning that there are misnamed versions out there with this filesize, then say that; otherwise just name it correctly and be done with it.

    Porn is the worst of all. I've lost count of the number of god damn bangbus promos (or worse, trailers that spawn popups) that I've downloaded and ditched, and I'm now so sick of it that I won't download anything under 5MB (most of the trailers are smaller than that).

    What I can't understand in all this is that I'm getting these from other gnutella users. Sure, they are injected through malice (or avarice), but what is wrong in the heads of users that they don't understand that this is our network, and it is our responsibility to clean up the file naming? Nobody is going to step in and do it for us. It's only going to get worse over time, and I'd rather download three different but accurately named versions of the same file than one misnamed version that turns out to be another badly encoded asian lipstick lesbian popup-spawning commercial.

    Repeat the mantra: our network, our responsibility.

    • (or worse, trailers that spawn popups)

      Yeah, but that only happens with MS's wonderful ASF format. I got annoyed with that too and wrote a simple util that strips out all the ASF "Script Command Objects". No more popups.

      but what is wrong in the heads of users that they don't understand that this is our network

      Why do people piss in the pool? Why do punks tag bridges? Same thing.

      --

    • What I can't understand in all this is that I'm getting these from other gnutella users. Sure, they are injected through malice (or avarice), but what is wrong in the heads of users that they don't understand that this is our network, and it is our responsibility to clean up the file naming? Nobody is going to step in and do it for us. It's only going to get worse over time, and I'd rather download three different but accurately named versions of the same file than one misnamed version that turns out to be another badly encoded asian lipstick lesbian popup-spawning commercial.

      I think the problem is that people are pretty lazy, and with big fat hard drives now pretty standard, what's the use of bothering to clean stuff up (other than keeping your porn downloads from others using your computer)? It's easier to queue up a bunch of downloads and forget about the crap ones when you're done than it is to clean up the file names of the good ones and get rid of the bogus ones.

      Maybe once it gets difficult to get even a few good downloads, people will start becoming more responsible with their sharing, but I doubt it. They will just give up and say it just doesn't work anymore.
    • Sounds like a client-side problem. I don't get popups in my browser. I can only wonder what messed-up program would put popups in video files. MPlayer sure doesn't have that problem :-D
  • from the Salon article:

    "All it takes is an intern in a room."

    Isn't that how a President was brought down? ;-)
  • by Anonymous Coward
    The Circle is a fully scalable p2p system (unlike gnutella, kazaa, morpheus, etc., which do NOT scale). It is based on a decentralized hashtable. Files have md5sums to avoid fakes, and it uses a trust network.

    see http://www.csse.monash.edu.au/~pfh/circle/
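    The core trick of a decentralized hashtable, very roughly (a generic consistent-hashing sketch in Python, not the Circle's actual protocol):

    ```python
    import hashlib

    def key(data):
        """Hash a filename (or file) into the shared key space."""
        return int(hashlib.md5(data).hexdigest(), 16)

    def responsible_node(k, node_ids):
        """Consistent hashing: the first node at or after the key owns it."""
        candidates = [n for n in sorted(node_ids) if n >= k]
        return candidates[0] if candidates else min(node_ids)

    # Every node computes the same answer locally, so no central server
    # is needed to find which peer indexes a given file.
    nodes = [key(f"node{i}".encode()) for i in range(8)]
    print(responsible_node(key(b"some_song.mp3"), nodes))
    ```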

  • Filling the network with corrupt files might have some short-term effect but eventually those files get filtered out [openp2p.com] as users find them useless and start deleting them.
  • Sub-network. (Score:3, Interesting)

    by Grendel Drago ( 41496 ) on Tuesday June 18, 2002 @07:59AM (#3721293) Homepage
    Hmm; I wonder if limiting access to the network will help with this. I'm going to be setting up an on-campus Gnucleus-LAN network this fall, and it'll be IP restricted to on-campus addresses. (This is mostly to deal with restrictions on off-campus bandwidth.) Maybe this will offer better control... but then again, maybe people will be just as lazy about it.

    I wonder if this has been considered: limiting the network to a certain class of user (and thus excluding exploit boxes bought by AOL Disney Time Fox sitting on a DSL line somewhere in east Peoria) might be a solution to a few of these woes. Of course, for the legitimate user on some DSL line in east Peoria, that's little consolation.

    I could send little SMB alerts to people who shared broken files... but that'd probably just annoy them mightily, and the lazy bastards wouldn't properly maintain their files anyway.

    --grendel drago
  • Modern life should, we are told, lead to the formation of geographically independent social groups.

    /. is a decent example, but is too generic. Better examples are 'class of 91, X Uni' or 'ex-Cisco c.2001'.

    I'll bet a lot of these smaller groups operate pretty effective low-tech P2P networks - i.e. 'anyone got a pic of murray vomiting?' gets mailed out, and some responses come back.

    The problem with GENERIC P2P is that you need roughly the whole population of the world to take part to make it work effectively. And even then you start needing to ask more specific questions - i.e. which murray?

    Surely small-group P2P is the way forward. So a framework for niche P2P networks, which then collects the results in a single 'ultraP2P', could operate well. Each niche P2P is autonomous, and can be voted out of the ultraP2P by algorithm or human, depending on its ratio of good to bad behaviour.

    Each user is a member of one or more niche P2Ps, and shares directly with these. Even if the ultraP2P sucks, the niche P2Ps can individually suck or rock depending on the group.

    This is how the web was won! And Usenet before it. No reason why it shouldn't work for P2P.
    • That's a really interesting idea... Gnucleus-LAN already offers some sort of clustering (though it is indeed by geographical area); now all we need is some kind of gateway interface to attach the LAN network to the greater Internet network.

      (Of course, in my case, all of the traffic would go through the gateway, not just the protocol data. The whole reason I'm setting up Gnucleus-LAN is to keep from using off-campus bandwidth.)

      --grendel drago
  • http://slashdot.org/comments.pl?sid=28940&threshold=1&commentsort=0&tid=99&mode=thread&cid=3108069

    The idea is not a new one, and works surprisingly well.
  • A noted idea was to spread around false files. It's similar to how software companies flood local Asian piracy markets with blank CDs: after spending a few dollars on discs that turn out blank or broken, a buyer is meant to conclude that pirated copies are a waste of money. P2P networks are different, though. Usually on these networks you are downloading from people who are, technology-wise, near you; you are rarely downloading from someone several countries away, and usually only for a rarer item. So what happens when you get a dud file? Normally you delete it and don't share it with other users. (This is also why we tend to see 96k MP3s getting the curb while 128k+ and other more acceptable copies remain prevalent.) There is a natural drive to purify our file collections, deleting the unacceptable and keeping the cherished. In short: as time goes on, the real version is always the most prevalent, and users will normally dig until they find it, purifying the virtual file base of a P2P network.
  • by iiii ( 541004 ) on Tuesday June 18, 2002 @09:37AM (#3721803) Homepage
    The comparison of P2P use and animal populations is fascinating, and although the parallels are limited, the comparison might yield some useful ideas.

    The most interesting parallel animal model has got to be the experiment [slashdot.org] designed to reduce (or eliminate) tsetse fly [museums.org.za] (and other insect [new-agri.co.uk]) populations by releasing large numbers of sterilized males into the natural population.

    The process of P2P sharing would correspond to mating, since you have to have two participants. A successful mating would correspond to a user getting the file they wanted, and therefore being more likely to use the service in the future. Getting a dud file is like a wild female mating with a sterilized male: it yields no offspring, and the user is less likely to continue using the service. One or two cases of sterile matings have no impact, but when they become a significant percentage, the population will decline. I'm sure the parallel with P2P holds.

    The author seems focused on studying the best way to eliminate P2P, though, so he's probably hoping to get research grant money from RIAA.

  • Most downloads you try on any p2p network fail: the user is too stupid to open up their firewall, or they intentionally closed it so it looks like they are sharing files when they really aren't, or they fill their share with bogus files, etc...

    I have had to resort to a harvester approach on gnutella: set the bot looking for X and Y without Z, and snag everything that matches until I stop it. Yes, I end up downloading that song 35 times, but out of those 35 files maybe 2 are acceptable... so deleting I go.

    It's getting worse, driven by laziness, pure stupidity, and pure greed. (I downloaded an mp3 that was nothing but a porn site advertisement... EVERYTHING from this one user was that same porn advert, and he had at least 60 files, all named after popular bands/songs.)

    Unregulated P2P sucks and is getting worse. Most of us old-timers are reverting back to IRC and private ftp stashes (20-30 friends all dropping files there, retrieving, etc...).

  • Aren't the users of these networks already doing this all on their own? I've seen songs attributed to bands that were dead by the time the song named in the title was actually written, Beethoven symphonies attributed to nearly everyone else, etc. 99% of any group of users know crap, and they seek to prove it at every turn, and yet these networks haven't killed themselves off from the inside yet.
  • While P2P networks may be similar to flesh-and-blood animals, the biggest difference is that evolution in P2P networking software occurs on timescales a biological system could not hope to match.

    Given a threat to its existence, a P2P network can adapt in a matter of hours at best, weeks or months at worst. Changing the behavior, defenses, etc. of a biological animal would take thousands of years at best. The flip side is that new threats are developed almost as fast. But the bottom line is that the signal-to-noise ratio on a P2P system can eventually be tuned enough to let the signal get through, no matter what problems might plague it.

    Worst-case scenario: you have a voting system that allows *very* different users to vote on certain file-share hosts, and the ones with the most votes are generally going to be valid sources of files. While this presents a higher-profile target for the major corporations, if you have 10,000 of these high-vote hosts, going after them becomes financially problematic.

    Even if you have one or two (or 50) cases of ballot-box stuffing among the high-vote hosts, an authorized admin of some sort could flag the particular host as bogus.

    There are many, many spin-offs of this concept that would make it next to impossible for any single entity to compromise the P2P network into non-existence. It may be cumbersome, but it would work.
  • Here's a wild-ass idea.

    How do real life societies of humans and animals protect their communities from invasion?
    I will assume that the number of "legitimate" users vastly outnumbers the invaders.

    Could it be possible to mark or remember hosts who pass around bogus files, and then pass that information to other users on the network?

    For example, I download a file from a user or group of users. When the download completes, I naturally check it. The P2P client then pops up a window asking me whether the file was valid or not. If not, I hit "no". This "no" is then recorded in a metafile that includes the IP address and other identifying information about the host, and this metafile can be shared with all other users on the network.
    Like a virus, my metafile could be merged with the metafiles of other users on the system.
    On subsequent searches, the client checks the host results list against my metafile and warns me who the probable invaders are. I could also set filters that automatically exclude hosts from uploading and downloading if they have more than, say, 5 black marks against them, effectively blackballing them from the network. (A toy version of this appears below.)

    I realise that the invaders could easily change their IP address, but after passing 5 bad files they'd be off the network again.
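    A toy version of the metafile idea (plain Python; the 5-mark threshold and the field names are just this post's proposal):

    ```python
    from collections import Counter

    THRESHOLD = 5  # black marks before a host is blackballed

    def merge(mine, theirs):
        """Merge another user's bad-host metafile into our own."""
        return mine + theirs  # Counter addition sums marks per host

    def filter_results(results, marks):
        """Drop search results served by blackballed hosts."""
        return [r for r in results if marks[r["host"]] < THRESHOLD]

    mine = Counter({"12.34.56.78": 3})
    theirs = Counter({"12.34.56.78": 4, "98.76.54.32": 1})
    marks = merge(mine, theirs)
    hits = [{"host": "12.34.56.78", "file": "song.mp3"},
            {"host": "98.76.54.32", "file": "song.mp3"}]
    print(filter_results(hits, marks))  # only the second hit survives
    ```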
    • Then they just distribute bad metafiles that claim thousands of users are spreading bad files. I like the "web of trust" idea, where I keep a buddy list, and my search results are ranked by the number of degrees of separation via buddy lists.
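      Degrees of separation are just a breadth-first search over buddy lists (a Python sketch; the graph here is invented for illustration):

      ```python
      from collections import deque

      def degrees(buddies, me):
          """Hops from `me` to every reachable user via buddy lists."""
          dist, queue = {me: 0}, deque([me])
          while queue:
              user = queue.popleft()
              for friend in buddies.get(user, ()):
                  if friend not in dist:
                      dist[friend] = dist[user] + 1
                      queue.append(friend)
          return dist

      buddies = {"me": ["alice", "bob"], "alice": ["carol"], "carol": ["dave"]}
      d = degrees(buddies, "me")
      results = ["dave", "carol", "alice", "stranger"]
      # Closer buddies rank first; unknown hosts sink to the bottom.
      print(sorted(results, key=lambda host: d.get(host, float("inf"))))
      ```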

"The only way for a reporter to look at a politician is down." -- H.L. Mencken

Working...