Collapsing P2P Networks
Andrew writes "I'm an undergraduate at the University of Washington, and after seeing this article on Salon, I dusted off a paper I had written last year. I examined P2P networks under a model usually used in describing animal populations, and found that it may be possible to cause a collapse in the network based on the intrinsic nature of the technology. Just as in animal populations, P2P networks require a sizable "critical mass" of users, and overharvesting can cause a systemic collapse - what if this were done on purpose? Quite ominously, my second recommendation on disruption was carrying damaged or incorrectly named files. You can read the abstract and the actual paper"
Aiding the enemy, huh? (Score:2)
Re:Aiding the enemy, huh? (Score:2, Insightful)
So let's go through the author's points and refute them.
1. Incorrectly named files. Been done. Been done on purpose. People keep on d/ling. For some reason this example comes to mind: Busy Child from the Chemical Brothers sounds exactly the same as Busy Child from the Crystal Method. And I have the Crystal Method CD. BTW, before the Napster fiasco ended, it was possible to encrypt filenames.
Also, if you name a file incorrectly, the search engines on p2p clients will probably never hit it anyway. And if the filename is misleading, other sources can be checked; if nothing else, the filesize and modification dates can be compared.
2. Broken files. It happens very often. But there are multiple sources on a p2p network so even assuming that clients get baited they will delete the useless/damaged file and re-d/l it from another source. Comparing filesizes before downloading is also a good strategy.
3. As for overharvesting, it's just a way to block some traffic, and I doubt that this would be legal. If an organisation put resources toward this end, many surfers would get slower service than they expect, causing a backlash. If I got a group of people together and we blocked a street intersection, the cops would surely intervene. Essentially what he is saying is: let's spam the p2p network to hell until it's abandoned. Too bad we all know what we think about spam.
Die llama spammers.
Re:Aiding the enemy, huh? (Score:2)
You'll have to excuse my P2P ignorance, I've never used it, but couldn't a rating system like eBay's, or even Karma on Slashdot, be used to label bad hosts and good hosts for the desired files? Then the spoofing done by different members could be identified.
Rating system. (Score:2)
Such a metasystem, in a P2P environment, would need to be decentralized and yet trustworthy. (It must not be as easy for a spoofing client to say "I'm trustworthy" as it is for them to say "I have files to share! Download my pustulent VBS payload!") This is a complex research question, to which there's no one simple answer. A lot of people are trying, though... see some of the threads on this story for good links on the subject.
--grendel drago
Re:Aiding the enemy, huh? (Score:3, Insightful)
Only if you believe in security through obscurity.
If these weaknesses exist, then sooner or later the RIAA & MPAA will find them. The RIAA will probably hire some "experts" and pay them big wads of cash for "consulting" to find such weaknesses. I wouldn't expect them to monitor Slashdot for research relevant to the P2P battles -- they are far too arrogant for that. Consider their CSS encryption scheme and misguided attempts to use watermarking, which were derided as buffoonery here. This is a battle that will be won by the side that has the better scientific analysis, and I believe that open discussion is a better paradigm for scientific analysis.
I think that ultimately, the weaknesses this author discusses must be addressed through some kind of peer review/rating system. A desirable attribute of a P2P system would be robustness to "attack". The internet has posed tremendously interesting problems in "signal-to-noise" improvement, and making networked systems filter noise better is a very desirable feature with important societal implications. Analysis like this can only spur the drive for solutions. If that drive is stronger on the P2P side than the publishers' side, then P2P will perpetually be ahead.
An open forum might be able to achieve a state of "innovation dominance" over a "proprietary" opponent if a critical mass is achieved such that the opponent's practical capability is maximized only if they spend all of their time trying to "keep up" with innovations available in the open forum. Knowledge is power, so the more knowledge that enters the fray via an open forum, the closer that forum is to innovation dominance.
Re:Aiding the enemy, huh? (Score:2)
I have no doubt that such disruptive practices occur.
What the publishers probably don't yet understand is how to predict the results of a given level of attack. For example, I'm willing to bet that they distribute their attacks over the various P2P networks. It seems obvious from this paper that a focused attack using all available resources on one P2P network is probably more effective than splitting resources over all the networks. The curve of disruption versus attack resources is clearly non-linear. In some sense, an attack that doesn't kill the network is useless when viewed as a long-term battle.
Interesting document, any realworld links? (Score:2)
Re:Interesting document, any realworld links? (Score:1, Informative)
Re:Interesting document, any realworld links? (Score:2)
Start of a bad trend (Score:2, Interesting)
Seems like a plausible solution, with some negative side effects.
Re:Start of a bad trend (Score:5, Interesting)
"I was trying to download an illegal copy of their copyrighted music and it was damaged!"
I think this is one case where they could simply set up some distributed PCs (different IPs in different class C's) and just have P2P clients serving 'bad' versions of their own copyrighted music. Set up a little consortium of several different record companies, and it becomes DAMN hard to apply an effective filter.
You might counter by setting up a central key list of 'correct' MD5 checksums, but then THAT list becomes a target of litigation from the RIAA.
I don't like it, but it is an elegant solution. Use the power of P2P against itself. Anonymity works both ways.
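For what it's worth, the client-side check against such a list is trivial. A rough Python sketch (the function names and the idea of a published size+MD5 pair are illustrative only, not any particular client's format):

import hashlib
import os

def md5_of_file(path, chunk_size=64 * 1024):
    # hash the file in chunks so huge downloads don't need to fit in memory
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def looks_genuine(path, known_md5, known_size):
    # cheap size comparison first, then the full hash
    if os.path.getsize(path) != known_size:
        return False
    return md5_of_file(path) == known_md5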
Re:Start of a bad trend (Score:3, Informative)
Somebody is already doing this
Searches on gnutella (for just about anything) bring up hits with file names like "your search terms.MPG"
Ah! (Score:2)
I wonder if those results are virii or something. I usually just filter them out by requiring filesizes above 100k...
Have you noticed the "[searchterms] free bangbus passes.htm" and
--grendel drago
Re:Start of a bad trend (Score:2, Interesting)
Time to build the undernet.
The issue with the internet today is that everyone is welcome, as it should be. But it also means that when devising open-ended software systems, any user can receive and make use of those tools, and by the same token, any user can misuse them.
The solution would be an undernet. Existing alongside the current internet, it would rely on some extensions to the protocol that are not made widely available. Software could then be written that would function only for members of the undernet. Now, change the phrasing slightly, to undernets. Append a group identifier to all packet headers sent by undernet members to other undernet members. If abstracted widely enough, it could even allow different members to remain connected while cycling through spoofed IPs.
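Just to make the packet-tagging idea concrete, here is a toy Python sketch; the GROUP_ID value and the 8-byte header layout are invented for illustration, and by themselves they obviously give no real security, just a way to drop traffic from non-members:

import struct

GROUP_ID = 0x1A2B3C4D          # shared identifier for this undernet (made-up value)
HEADER = struct.Struct("!II")  # network byte order: group id, payload length

def wrap(payload: bytes) -> bytes:
    # prepend the group header to an outgoing payload
    return HEADER.pack(GROUP_ID, len(payload)) + payload

def unwrap(packet: bytes):
    # drop anything not tagged with our group id
    group, length = HEADER.unpack_from(packet)
    if group != GROUP_ID:
        return None
    return packet[HEADER.size:HEADER.size + length]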
This is most clearly desirable when the group that supports the undernet is working toward common goals or ideas. Then, if members begin to pollute the data pool with broken files, picking out and removing the offender becomes both easier and more effective.
I'm sure one of the IP wizards could come up with something more graceful and effective, so don't judge the superficiality of the proposed solution so much as the concept of the closed group with regulated, but anonymous, access.
-GiH
cheap music please (Score:1)
- music downloaded can be wrong or low quality
- music is often illegal
- download speed is really slow
What labels should do is let users download music for a small fee. For example by buying 100 songs for 100 bucks. Songs to be chosen by the user at any time.
I think a service like this could be really successful. The labels do not need to be afraid of piracy because of the crappy quality or low survival rate of these programs.
Re:cheap music please (Score:3, Interesting)
Good point. I actually very much approve of these tactics being used to hinder people freeloading, despite being shocked at how expensive music and films are to buy.
However, I am very much for record companies distributing music via the internet. By cutting out the end retailer, who typically takes 50% of the final price of the CD, and removing the cost of media manufacture, there is no reason why these goods shouldn't be available for those who want to download them. There'll always be the hardcore fans who want the boxed editions (check out special edition box sets, etc.), but a lot of people are only concerned about the actual music. In fact, it could probably even be argued that if music companies sold the music in MP3 format, the die-hard music aficionados would still buy the real CDs just for the quality difference.
But back to the P2P issue. You get what you pay for. If you expect to download things for free, you can hardly complain when those things aren't what you expected. If you use a warez search engine, chances are you'll spend the next 10 minutes closing all the popup windows, even if you never actually downloaded anything! You don't see many people up in arms about that.
And if you think the record companies don't deserve their profits, think again... Why do you think there are always scores of new bands signing up with these labels? Because the record companies invest heavily in lots of bands, many of whom will flop dismally. They invest in advertising, gigs, promotional CDs, PR parties, you name it. If they end up making 10 times the profit you think is fair on a particular band, bear in mind that there were probably four other bands they promoted that got the chance but didn't make it.
Re:cheap music please (Score:3, Interesting)
With the warez sites, the ads are there because these guys can't find anyone else to host them, so they need the ad money to stay up. The ads are not being put up by Bungie, or Blizzard, or EA, or any of the other companies.
As for the p2p networks, however, these files are being placed with the intent of misleading the consumer. Unfortunately for the people trying this tactic, moderation works on p2p much the way it works on Slashdot. If a file is a crappy sound loop, no one (or very few people) will keep the file. They will simply go back out until they find the right file. Then, once they have it, they'll keep it. So picture it like this.
The company distributes 100 sound-loop files. After a month or so, the number of sound-loop files is probably still 100, give or take (and with certain programs like LimeWire, identical files are grouped). Now, as soon as one person buys the CD, there is a legit copy (legit meaning real). One person downloads his copy; now there are 2. One person downloads from each of them: 4. One download from each of them: 8. 16, 32, 64, 128. Etc. In the meantime, the sound loop is still at 100.
Sure, the sound-loop tactic might be effective for the first few weeks, but after that it's more a waste of money.
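A quick back-of-the-envelope simulation in Python makes the point; the assumption that every real copy gets copied once per week is mine, purely for illustration:

# decoys stay fixed at 100 while real copies spread from a single seed
decoys, real = 100, 1
for week in range(1, 11):
    real *= 2                          # assume each real copy is copied once a week
    share = real / (real + decoys)
    print(f"week {week:2d}: real copies={real:5d}, chance a random hit is real={share:.0%}")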
Re:cheap music please (Score:2)
Re:cheap music please (Score:2)
Music industry strikes back? (Score:2, Insightful)
sinister motive? (Score:4, Funny)
This is actually the next step in the Taliban's fight against capitalism. They are continuing their religious war, attempting to reduce our morale by preventing us listening to music, except in short frustrating bursts of the same 10 seconds.
Their aim is to reduce us, to bring us down from within by sabotaging our right to Good Music In MP3 Format.
We Will NOT give in.
Uh, wait. Why did they start with 'No Doubt'?
Re:sinister motive? (Score:2, Funny)
Re:sinister motive? (Score:2)
"Yeah, that is true, but I think it's *supposed* to suck"
Re:sinister motive? (Score:1)
Animal Zoo (Score:1)
User moderate shared files (Score:3, Insightful)
Re:User moderate shared files (Score:1)
Re:User moderate shared files (Score:2)
Re:User moderate shared files (Score:2, Insightful)
Anyway, these things tend to get solved by smart people who have way too much spare time. I am truly amazed at the amount of work dedicated to cracking and warez..
/J
Re:User moderate shared files (Score:3, Insightful)
what's needed, then, is a /distributed/ moderation system
And how do you intend to stop them from 'spamming' this distributed system with fake moderations?
Re:User moderate shared files (Score:2)
To me, a web of trust [slashdot.org] makes more sense. It would take some time to get "into the web", but it would take even longer for an organization to build up enough trust to effectively distribute bogus files. As soon as they start, their trust is ruined, and everyone knows not to download from that person.
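As a rough sketch of what I mean (all the numbers and names below are invented), trust in a stranger could simply be your trust in whoever vouches for them, discounted along the chain, and one bad file wipes a host's standing out:

direct_trust = {"alice": 0.9, "bob": 0.6}            # peers I've verified good files from
recommendations = {"alice": {"carol": 0.8},          # who my peers vouch for, and how strongly
                   "bob":   {"spoofer": 0.9}}

def trust(host):
    if host in direct_trust:
        return direct_trust[host]
    # best single-hop recommendation, discounted by how much I trust the recommender
    return max((direct_trust[peer] * vouched.get(host, 0.0)
                for peer, vouched in recommendations.items()), default=0.0)

def report_bad_file(host):
    direct_trust[host] = 0.0            # one confirmed bogus file and the host is done

print(trust("carol"))       # 0.72: inherited trust, weaker than direct trust
report_bad_file("alice")
print(trust("carol"))       # 0.0: burning a recommender also burns everyone they vouched for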
Re:User moderate shared files (Score:2)
Although a single network may collapse... (Score:3, Insightful)
Networks come and go, and encounter obstacles as the number of people using the network increases, but as one reaches "critical mass" another is born, because the first became too unstable.
There are a large number of p2p networks at the moment; some are more successful than others, but importantly they use very different technologies, some of which are less affected by increasing numbers of users.
The FastTrack model appears quite comfortable with several million users, while the original gnutella protocol couldn't cope with that number (IIRC).
I'm sure that a number of p2p protocol designers will attempt to use the ideas in the paper to avoid the various pitfalls.
Well, atleast we know who skipped maths lessons (Score:5, Funny)
Pythonesque... (Score:1, Funny)
Re:Well, atleast we know who skipped maths lessons (Score:2)
choices 1 and 4 are different... (Score:2)
Hollywood would get better results by shutting down the illegal DVD manufacturers of Spiderman in Korea (or wherever they are) rather than someone who makes a copy for his friends.
Choice 1 gives everyone the same chance of being targeted, so small-time distributors/downloaders will be hit a higher percentage of the time and not have as great an effect on the overall level of content available.
Re:choices 1 and 4 are different... (Score:2)
Re:Well, atleast we know who skipped maths lessons (Score:5, Insightful)
countermeasure: encryption + the bad press that randomly suing upstanding citizens would bring.
2. Creating fake users that carry (incorrectly named or damaged files)
countermeasure: webs of trust & md5 hashes.
3. Broadcasting fake queries in order to degrade network performance
countermeasure: evolve to shun the DoS nodes (again, webs of trust & a 'witness system' needed).
4. Selectively targeting litigation against the small percentage of users that carry the majority of the files
countermeasure: This being the most effective [scare] tactic of the four, the best way to deflect it would be hiding your identity, or somehow spreading everything available very thin (freenet style) for plausible deniability, or serving from offshore, or rotating selections...
--
Re:Well, atleast we know who skipped maths lessons (Score:5, Interesting)
If the labels can force p2p networks into a more complex model, it culls the less technically able users. I think if the p2p music sharing networks evolved into systems requiring md5 hash lookups, trust networks and other countermeasures, Joe Schmoe wouldn't be bothered using them. He wants something he can just hook up to, grab stuff, and leave.
Music piracy has always happened. It's just booming now. They just want to stop the boom, not eradicate it entirely.
Re:Well, atleast we know who skipped maths lessons (Score:2)
That really depends how complex the user experience becomes. Napster was far more technically complex than the traditional "download from a website" model, but it still attracted millions of regular users. That's because all of that complexity was hidden behind a cute little easy-to-install UI. Kazaa is even more complex than Napster, but the user experience is almost exactly the same (excepting the spyware, of course.)
Many of the countermeasures suggested above would be fairly easy to integrate in a transparent way, and I imagine they will be. In the long term, I think this is a losing game for the record companies. The cost of maintaining the "war" on p2p systems is going to be far, far higher (by many orders of magnitude) than the cost of building a smarter p2p network (and for many p2p coders, it isn't even about cost.) Also, the more successful the labels are, the tougher and more resistant the p2p networks will become.
In the short term, on the other hand, it might make sense. If the labels make a strong effort to pull people into cheap, legal music download services now, this sort of disruption will serve them well. But I'm not particularly confident that the labels have their act together on this. (And even if they do, the battle will still go on over video downloads.)
Re:Well, atleast we know who skipped maths lessons (Score:2)
countermeasure: webs of trust & md5 hashes.
Hmmm. My understanding is you can't compute an MD5 hash until you've got the whole file. So if the malicious host lies about the MD5 sum, you can't know until after you've downloaded the file.
A workaround would be to publish checksums for 1/4 of the file, and 1/2 of the file, and 3/4 of the file, etc. If the MD5 sum fails to match, you abort further downloading. Perhaps the victim publishes a notification that a damaged file was found. (But then you have to worry about invalid, forged warnings.)
This doesn't even solve the problem, it only limits the time wasted. Malicious hosts can create files that are accurate for the first 50%, and get the user to waste 50% of their time. Half a song is a lot less than half as valuable as a full song. Perhaps you add a "resume" function like FTP so that the user can try to download only the remainder of the song elsewhere, again comparing intermediate checksums along the way.
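Something like this Python sketch would be enough on the receiving side; the quarter-size checkpoints and the helper names are illustrative only, not any real protocol:

import hashlib

def quarter_checksums(data: bytes):
    # what the publisher would post: MD5 of the first 1/4, 1/2, 3/4 and of the whole file
    marks = [len(data) * i // 4 for i in (1, 2, 3, 4)]
    return [(m, hashlib.md5(data[:m]).hexdigest()) for m in marks]

def check_prefix(received: bytes, checkpoints):
    # verify every checkpoint we already have enough bytes for; abort on the first mismatch
    # (re-hashes the prefix from scratch for simplicity; a real client would hash incrementally)
    for offset, expected in checkpoints:
        if len(received) >= offset and hashlib.md5(received[:offset]).hexdigest() != expected:
            return False
    return True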
Re:Well, atleast we know who skipped maths lessons (Score:2)
It's a simplistic approach, but it helps avoid the "unknown" on a first download from that host. It gives the user a course of action for future interaction with that host.
Contradiction.. (Score:2)
Within so few sentences...
countermeasure: webs of trust & md5 hashes
and
the best way to deflect it would be hiding your identity
Put simply, you cannot hide your identity and be a trusted partner in a transaction. And the cost of setting up "trust" mechanisms should not be understated; I can generate my own CA and my own X.509 certs with openssl. How will you know which CA to trust? Commercial ones cost money.
Re:Well, atleast we know who skipped maths lessons (Score:2)
Ask a silly question... (Score:4, Insightful)
> This paper aims to address the following
> questions:
> 1. How must the depensation model be modified
> in order to account for conceptual
> differences between P2P networks and animal
> populations?
> 2. What are the conditions necessary to cause
> catastrophes in P2P networks?
> 3. What does the model imply about other ways
> to limit or stop P2P networks?
> 4. What is the most effective method to stop
> P2P networks?
I bet if you'd set out to answer a more interesting question, you'd have obtained a more interesting answer.
Natural populations are well known for their ability to adapt to their environment; to mutate or change their behaviour in response to stimuli (threats) in their surroundings. If you truly wish to study P2P networks as if they were ecosystems or populations, there are plenty of more productive ecological questions to be asked.
This paper reads like a biologist saying "given, say, fish - how can we go about killing them?"
Nice to see *some* scientific analysis of this subject, however misdirected.
Re:Ask a silly question... (Score:4, Informative)
Many populations have a critical population level, and if they fall below that level they have a low probability of rebounding. For example, fruit fly maggots feed more efficiently in groups and cannot survive if too few eggs are laid on the same fruit.
By the way if you pick up an ecology journal you are likely to find at least one paper on this subject. Trying to understand the Allee effect is an important aspect of understanding an organism and how it interacts with its environment.
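If you want to see the dynamics the paper borrows, here is a toy depensation model in Python; the parameters are arbitrary illustrative numbers, not fitted to anything. Growth is negative below the critical level A, so a harvest heavier than the peak growth rate drags the population under the threshold and it collapses, rather than settling at a lower equilibrium:

r, K, A = 0.2, 1000.0, 200.0          # growth rate, carrying capacity, Allee threshold

def step(n, harvest):
    growth = r * n * (1 - n / K) * (n / A - 1)   # negative whenever n < A
    return max(n + growth - harvest, 0.0)

for harvest in (50.0, 120.0):          # modest vs. heavy constant "overharvest"
    n = 600.0
    for _ in range(200):
        n = step(n, harvest)
    print(f"harvest {harvest:.0f}/step -> population after 200 steps: {n:.0f}")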
Re:Ask a silly question... (Score:2)
Okay, fine. At least the major P2P networks (Gnutella, Napster) have a fair amount of diversity, providing resistance against a community-wide single attack wiping out the community.
Furthermore, all of them can rebound quickly. A patch can be distributed across the network very quickly -- in a week or so, updates can be installed by the majority of users. That's *fast* compensation, allowing easy rebounding.
Re:Ask a silly question... (Score:2)
gnutella has already been dos'd.. (Score:1, Interesting)
Why don't they.... (Score:1, Interesting)
Definition? (Score:3, Insightful)
In fact, it was really a client-server application: only when actually downloading a file did it make any connection with another user.
True P2P has no server and needs no server. Napster had and needed such a thing to work.
Personally I wouldn't call it peer-to-peer at all, but if I was forced to, I'd far rather call it a hybrid P2P and Client/Server solution.
I have to say this, so sorry... (Score:2, Informative)
Sharereactor/Edonkey cannot be flooded with damaged or renamed files and neither can any other network/client that relies on hashes of the downloads to ensure the file is the same.
As for using loads of bandwidth by doing loads of useless searches in an automated way, it would be very interesting to see how the different networks coped with this, especially the "next gen" edonkey, which is called "flock" and is in beta, and is supposed to use no servers...
graspee
Re:I have to say this, so sorry... (Score:1)
Re:I have to say this, so sorry... (Score:2, Informative)
What a guy.. (Score:1)
Further questions about the proposed intern scheme were referred by Stacey Herron to her associate Mr William Clinton, recently put in charge of seeing the 'preferred files replicate and populate'
shuuuuut uuuuup! (Score:1)
Corporate Strategy Revealed! (Score:1)
I'm surprised the record labels let Britney Spears have any time at all in that case - hell, think of the teenage boy market if all they knew was what she looked like *grin*
Time to re-vamp P2P specs (Score:1, Troll)
Re:Time to re-vamp P2P specs (Score:2)
Like it or not, P2P is here to stay. It's a system with legal and illegal uses. The legal nature is what keeps the creation of a complete ban impossible, and the illegal nature is what keeps the system evolving.
MD5, etc. (Score:2)
A system that permits sharing of copyrighted material is hardly going to provide a simple way back to resolve the real originator of the material. It is difficult to prove but probably likely that many bad files come from persons connected with the production and distribution of the original material.
There are several sites now that publish checksums and sizes of P2P files. If you trust the site, then you have a way of validating files.
The main issue that remains is so-called leeching, that is, those who take but do not give. This may be out of fear, or out of selfishness, or it may even be just that the user is new. The community response seems to be to allow small downloads to anyone but to restrict larger downloads to those who share themselves. I believe there are even some automated tools that will perform this check.
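The check itself is only a few lines of logic. A hypothetical Python sketch (the thresholds and the shared-file count are invented for illustration):

SMALL_FILE_LIMIT = 1 * 1024 * 1024    # always serve files up to 1 MB, even to strangers
MIN_SHARED_FILES = 10                 # required before serving anything bigger

def allow_upload(file_size: int, requester_shared_files: int) -> bool:
    if file_size <= SMALL_FILE_LIMIT:
        return True                   # be generous with small transfers and new users
    return requester_shared_files >= MIN_SHARED_FILES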
Re:MD5, etc. (Score:4, Interesting)
While I agree entirely with the fact that leeching is a problem, you should consider these facts:
Yet, I download! Most of the time pr0n, and from time to time music (usually when I heard a good song on the radio).
It wouldn't be the first time a P2P client advertising T1 performance aborts me and I find that very frustrating. Probably people using the tools you mentioned, and considering me a leech. Nice...
Oh, and one thing about the whole P2P thing I don't like is the insanely long filenames filled with idiot keywords. Keywords in filenames... tsss... Better would be a kind of database that associates keywords with files you choose on your hard disk. At least that way your files could have halfway decent-length filenames. Of course maintaining that would be a bit of work, but maintaining a filesystem full of junk filenames isn't any better.
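Even something as dumb as this Python sketch would beat keyword-stuffed filenames (purely illustrative; I'm not claiming any client stores its metadata this way):

from collections import defaultdict

index = defaultdict(set)              # keyword -> set of file paths

def tag(path, *keywords):
    for kw in keywords:
        index[kw.lower()].add(path)

def search(*keywords):
    # files matching every keyword
    sets = [index.get(kw.lower(), set()) for kw in keywords]
    return set.intersection(*sets) if sets else set()

tag("music/busy_child.mp3", "crystal method", "electronic", "1997")
print(search("electronic", "crystal method"))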
Finally, a little question for the P2P junkies out there: many people claim they get to learn about new kinds of music through P2P sharing. I won't say it isn't true, but how? You still need a handle to search for new stuff. Do you just type in random keywords, or what? Just curious, because I'd like to broaden my musical horizons a bit.
Re:MD5, etc. (Score:2)
The best way is simply having people with eclectic tastes recommend random shit to you -- either IRL, IM, on message boards, etc. Another way, which I like, is using Amazon's recommendation system.
Also, some file sharing apps have a "Browse User" option, and this is very handy for queueing up bands you've never heard of from a user with possibly similar tastes.
Not everyone likes being spoonfed engineered culture...
--
Finding new styles (Score:2)
There are three things I do to find new music:
1) Type in random keywords. This may seem silly, but it can yield interesting results.
2) Search for a genre. You would be surprised at the amount of music that people categorize/name by genre. Pick a genre that you don't know very well (IDM, dub, afrobeat) and search for it. You will get a seemingly random selection of music. Download these, listen to them, and if you like them, search for the artist and/or the stuff in the ID3 tag. You will find more of their stuff, plus usually stuff they did with their friends.
3) Listen to KEXP [kexp.org]. KEXP is possibly the best radio station in the world. They stream cd quality over the web. They are a public station (I'm a member) from Seattle. Check their time schedule (it's Pacific America time) and check out DJ Riz. This guy is the most inventive, relaxing, best DJ around.
Re:MD5, etc. (Score:2)
Your point about file names is a given. FastTrack might be scumware, but it allows you to meta-tag files, which is useful. This is what I miss most when I use Gnutella.
OTOH, I allow inbounds for P2P on my firewall and I have no problems sharing files.
Re:MD5, etc. (Score:2)
This will map an inbound connection to a particular IP address on your local network. You can only configure one port/protocol combo to a particular local address, but it works fine, whether you have a local Web server or a P2P application on your side of the firewall.
Re:MD5, etc. (Score:2)
What I do is somewhat cruder because one system serves files (it has a shared file area) and runs a p2p client. Other systems use the common file area for downloads. Without private filesystems for p2p, it makes it kind of difficult for my son to grab pr0n.
However, I had heard something about a Gnutella proxy on their site but I have no idea what is happening about it.
Re:Attack of the Giant Leeches! (Score:2)
I do. I live in Germany where we have a home recording tax on all recordable media including CD-R.
Impotent by Nature (Score:1)
Solution: Decentralized Collaborative Filters (Score:3, Interesting)
Disrupting P2P networks - legal? (Score:3, Insightful)
But let's say that the music industry/whoever did this, would it be legal just because P2P networks are "possibly used" for distributing copyrighted material?
I don't see the difference between sinking someone's Direct Connect hub and launching a DoS attack against a webserver.
Slashdot linking - legal? (Score:2)
Unlike traditional web site DoS attacks, based on sending malformed messages (provable intent), DoS attacks in P2P can look like normal requests from normal clients that just come in really fast. IANAL, but much criminal law arises from intent, and the web DoSers (or bounce DoSers) clearly have intent. P2P networks just have a high-overhead protocol.
I don't think, in the end, that you can rely on laws to stop such problems. If you design a flooding mechanism into a protocol, you better be sure to rate-limit somehow... Maybe make people do some amount of work to perform a flood (though precomputation becomes problematic, because you want it to some extent, but not too extreme an extent).
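For the "make them do some work" idea, a hashcash-style puzzle bound to the query text would be one way; here is a rough Python sketch (the difficulty and the way the nonce is attached are invented, and binding the puzzle to the query itself is what limits precomputation):

import hashlib
import os

DIFFICULTY = 20                        # leading zero bits required; roughly a million hashes per query

def solve(query: bytes) -> bytes:
    # find a nonce whose hash together with the query clears the difficulty target
    target = 1 << (256 - DIFFICULTY)
    while True:
        nonce = os.urandom(8)
        if int.from_bytes(hashlib.sha256(query + nonce).digest(), "big") < target:
            return nonce

def verify(query: bytes, nonce: bytes) -> bool:
    # verification is a single hash, so honest nodes pay almost nothing
    digest = int.from_bytes(hashlib.sha256(query + nonce).digest(), "big")
    return digest < 1 << (256 - DIFFICULTY)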
Re:"Black Ops". (Score:2)
Er, what? (Score:4, Insightful)
This is hardly news. I can't remember the last time that I shared a music file from gnutella that was correctly named, labelled, untruncated and not a radio edit (mea non culpa, the first thing that I do is to fix the damn things before making them available for re-sharing).
For exes, it's even worse. There seems to be a deliberate misnaming of some files, e.g. GTA2 labelled as GTA3, or in some bizarre cases files named as "game Foo (actually game Bar)". What on earth is the point of that? If you're warning that there are misnamed versions out there with this filesize, then say that; otherwise just name it correctly and be done with it.
Porn is the worst of all. I've lost count of the number of god damn bangbus promos (or worse, trailers that spawn popups) that I've shared and ditched, and I'm now so sick of it that I won't download anything under 5MB (most of the trailers are smaller than that).
What I can't understand in all this is that I'm sharing these from other gnutella users. Sure, they are injected through malice (or avarice), but what is wrong in the heads of users that they don't understand that this is our network, and our responsibility to clean up the file naming? Nobody is going to step in and do it for us. It's only going to get worse over time, and I'd rather download three different but accurately named versions of the same file than one misnamed version that turns out to be another badly encoded asian lipstick lesbian popup spawning commercial.
Repeat the mantra: our network, our responsibility.
Re:Er, what? (Score:2)
Yeah, but that only happens with MS's wonderful ASF format. I got annoyed with that too and wrote a simple util that strips out all the ASF "Script_Command_Object's". No more popups.
but what is wrong in the heads of users that they don't understand that this is our network
Why do people piss in the pool? Why do punks tag bridges? Same thing.
--
Re:Er, what? (Score:2)
I think the problem is that people are pretty lazy, and with big fat hard drives now pretty standard, what's the use of bothering to clean stuff up (other than keeping your porn downloads from others using your computer)? It's easier just to queue up a bunch of downloads and forget about the crap ones when you are done than it is to clean up the file names of the good ones and get rid of the bogus ones.
Maybe once it gets difficult to get even a few good downloads, people will start becoming more responsible with their sharing, but I doubt it. They will just give up and say it just doesn't work anymore.
Popups from video files? (Score:2)
Re:Popups from video files? (Score:2)
it's the Vast Right Wing Conspiracy... (Score:1)
"All it takes is an intern in a room."
Isn't that how a President was brought down?
have you tried The Circle? (Score:1, Interesting)
(unlike gnutella, kazaa, morpheus, etc.
which do NOT scale)
it is based on a decentralized hashtable.
files have md5sums to avoid fake files.
and it uses a trust network.
see http://www.csse.monash.edu.au/~pfh/circle/
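for the flavour of the hashtable part, a toy python sketch (illustrative only, not how the circle actually routes): node ids and file keys are hashed into the same space, and every peer independently computes the same owner for a given key.

import hashlib
from bisect import bisect_left

def h(value: str) -> int:
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

class ToyDHT:
    def __init__(self, node_names):
        # nodes placed on a ring by the hash of their name
        self.ring = sorted((h(name), name) for name in node_names)

    def node_for(self, key: str) -> str:
        # first node clockwise from the key's position on the ring
        i = bisect_left(self.ring, (h(key), ""))
        return self.ring[i % len(self.ring)][1]

dht = ToyDHT(["node-a", "node-b", "node-c", "node-d"])
print(dht.node_for("some_song.mp3"))   # every peer gets the same answer without asking a server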
Re:have you tried The Circle? (Score:2)
Re:have you tried The Circle? (Score:2)
Re:have you tried The Circle? (Score:2)
Regards,
levine
Users tend to delete corrupt files (Score:1)
Sub-network. (Score:3, Interesting)
I wonder if this has been considered---limiting the network to a certain class of user (and thus excluding exploit boxes bought by AOL Disney Time Fox sitting on a DSL line somewhere in east Peoria) might be a solution to a few of these woes. Of course, for the user on some DSL line in east Peoria, that's little consolation.
I could send little SMB alerts to people who shared broken files... but that'd probably just annoy them mightily, and the lazy bastards wouldn't properly maintain their files anyway.
--grendel drago
Sustainable Population? (Score:2)
/. is a decent example, but is too generic. Better examples are 'class of 91, X Uni' or 'ex-Cisco c.2001'.
I'll bet a lot of these smaller groups operate pretty effective low-tech P2P networks - i.e. 'anyone got a pic of murray vomiting?' gets mailed out - some responses come back.
The problem with GENERIC P2P is that you need roughly the whole population of the world to take part to make it work effectively. And even then - you start needing to ask more specific questions - i.e. which murray?
Surely small group P2P is the way forward. So a framework for niche-P2P, which then collects the results in a single 'ultraP2P' could operate well. Each niche P2P is autonomous - and can be voted out of ultraP2P by algorithm or human depending on ratio of good to bad behaviour.
Each user is a member of one or more niche P2P, and shares directly with these. Even if ultraP2P sucks, the nicheP2P's individually can suck or rock depending on the group.
This is how the web was won! And Usenet before it. No reason why it shouldn't work for P2P.
Good idea! (Score:2)
(Of course, in my case, all of the traffic would go through the gateway, not just the protocol data. The whole reason I'm setting up Gnucleus-LAN is to keep from using off-campus bandwidth.)
--grendel drago
Been there, done that, wrote the book... (Score:2)
The idea is not a new one, and works surprisingly well.
A Note On Using Damaged/Incomplete/Promo Files (Score:2, Insightful)
Compare to the Tsetse fly approach (Score:4, Interesting)
The most interesting parallel animal model has got to be the experiment [slashdot.org] designed to reduce (or eliminate) Tsetse fly [museums.org.za] (and other insects [new-agri.co.uk] ) populations by releasing large numbers of sterilized males into the natural population.
The process of P2P sharing would correspond to mating, since you have to have two participants. A successful mating would correspond to a user getting the file they wanted, and therefore being more likely to use the service in the future. Getting a dud file is like a wild female mating with a sterilized male: it yields no offspring, and the user is less likely to continue using the service. One or two cases of sterile matings have no impact, but when they are a significant percentage the population will decline. I'm sure the parallel with P2P holds.
The author seems focused on studying the best way to eliminate P2P, though, so he's probably hoping to get research grant money from RIAA.
it is already happening (Score:2)
I have had to resort to a harvester approach on gnutella: set the bot looking for XY without Z and snag everything that matches until I stop it. Yes, I end up downloading that song 35 times, but out of those 35 files maybe 2 are acceptable... so deleting I go.
It's getting worse thanks to laziness and pure stupidity, plus pure greed. (I downloaded an mp3 that was nothing but a porn site advertisement... EVERYTHING from this one user was that same porn advert, and he had at least 60 files all named after popular bands/songs.)
Unregulated P2P sucks and is getting worse. Most of us old-timers are reverting back to IRC and private ftp stashes (20-30 friends all dropping files there, retrieving, etc...)
Leave it to the users (Score:2)
Difference between Animals and P2P (Score:2, Insightful)
Given a threat to its existence, a P2P network can adapt in a matter of hours at best, weeks or months at worst. To change the behavior, defenses, etc. of a biological animal would take thousands of years at best. The flip side is that new threats are developed almost as fast. But the bottom line is that eventually the signal-to-noise ratio on a P2P system can be tuned enough to allow a signal to get through, no matter what problems might plague it.
Worst-case scenario: you have a voting system that allows *very* different users to vote on certain file-share hosts, and the ones with the most votes are generally going to be a valid source of the files. While this presents a higher-profile target for the major corporations, if you have 10,000 of these high-vote hosts, it's going to be financially problematic for them.
Even if you have one or two, (or 50) cases of ballot box stuffing when it comes to high vote hosts, an authorized admin of some sort could flag that particular host as being bogus.
There are many, many spin-offs of this concept that would make it next to impossible for any single entity to compromise the P2P network into non-existence. It may be cumbersome, but it would work.
Bees? (Score:2)
How do real life societies of humans and animals protect their communities from invasion?
I will assume that the number of "legitimate" users vastly outnumbers the invaders.
Could it be possible to mark or remember hosts who pass around bogus files, and then pass that information to other users on the network?
For example, I download a file from a user or group of users. When the download completes, I naturally check it. The P2P client then pops up a window asking me whether the file was valid or not. If not, I hit "no". This "no" could then be recorded in some sort of metafile that includes the IP address and other identifying information about the host, and this metafile can be shared with all other users on the network.
Like a virus, I could merge my metafile with the metafiles of other users on the system.
On subsequent searches, the client will check the host results list against my metafile and warn me who the probable invaders are. I could also set filters that automatically exclude hosts from uploading and downloading if they have more than say, 5, black marks against them, effectively blackballing them from the network.
I realise that the invaders could easily change their IP address, but after passing 5 bad files they'd be off the network again.
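A crude Python sketch of the merge-and-blackball step (the structure and the 5-strike threshold are just illustrative):

from collections import Counter

BLACKBALL_AT = 5

def merge(mine: Counter, theirs: Counter) -> Counter:
    merged = Counter(mine)
    merged.update(theirs)              # add their black marks to mine
    return merged

def is_blackballed(marks: Counter, host: str) -> bool:
    return marks[host] >= BLACKBALL_AT

my_marks = Counter({"10.0.0.7": 2})
friend_marks = Counter({"10.0.0.7": 4, "10.0.0.9": 1})
print(is_blackballed(merge(my_marks, friend_marks), "10.0.0.7"))   # True: 6 reports in total

Of course, naively merging metafiles means an invader could just as easily flood the network with fake black marks against honest hosts, so you would still need something like the web-of-trust ideas above to decide whose metafiles you accept.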
Re:Bees? (Score:2)
Re:errors (Score:1)
"Peer to peer networks have generated significant attention in the recent past, especially file trading networks such as Gnutella commonly found and Morpheusin ecological models of fish and birds. If the mode."
still an interesting read, though
Re:errors (Score:1, Interesting)
The paper is a year old.
I wonder what the review of it was or if the prof or assistant even caught it.
Re:animal population requires food (Score:1, Interesting)
Well, the information is the food. As in the real world, food is abundant and replenishable. You could say that information is the same. Creative people (artists, musicians, etc.) grow the information in much the same way as a farmer.
The thing you have to remember here is that information is only needed once. For example, you only need to download your favorite song once. What good are two copies of the same thing? This works for software too. Why have multiple programs that do exactly the same thing? Plus, if information is something that is learnable, then once you have learned it, it becomes useless to you. You can't learn it again (barring any mental disorders).
Let's consider the overhunting issue. With so many users sharing information, you won't have to look far to find what you want. Meaning, you will be able to d/l everything you want. With such access, you would have a pretty big store of information yourself, just by d/l'ing what you look for. So, the more you have, the less you will need.
Sure there will always be more stuff to download. But, you would need to download much less once you reach that saturation point.
Re:P2P is dying!* (Score:2)
MOD PARENT UP! (Score:2)