Gnutella2 Specs - Part 1
Mr Fodder writes "The first part of the Gnutella2 specs are finally up." Our previous Gnutella2 story has a little more information.
QoS based on material type? (Score:4, Funny)
Re:QoS based on material type? (Score:2)
Hey, it's the Internet. There are seldom any other sorts of packets on the network.
Pr0n QoS issues (Score:3, Informative)
Improvements to your pr0n viewing experience are well underway in many *clients*, rather than in the protocol itself -- protocol changes would produce compatibility issues. A number of the proposed improvements involve changes to the routing system to use dictionary-based priority changing. Query and result packets containing entries with phrases such as "tits", "ass", or "CowboyNeal nude" will be given an elevated priority in sending, improving latency for those users who really need pr0n. There has been some debate over whether this is entirely appropriate -- it reduces fairness -- but when it comes right down to it, pr0n-obtaining is a task with hard realtime constraints. The Gnutella developers and GDF members recognize that the goal of P2P software should be to best serve the community as a whole -- and so some unfairness will be allowed.
Preliminary dictionaries for the new routing prioritization may be downloaded from various GDF developers -- links have been posted in the relevant discussion on the GDF board. The proposed dictionary format allows granting of variable priority -- a packet matching "tits" might have its routing class increased by 1, but "teen lesbian slut" would increase it by 3, giving it priority over merely "tits" packets.
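A toy sketch of what that dictionary scoring might look like (the format here is my own guess -- the real dictionaries live on the GDF board):

    # Hypothetical dictionary-based routing prioritization. The phrases and
    # bump values come from the examples above; everything else is made up.
    PRIORITY_DICT = {
        "tits": 1,
        "teen lesbian slut": 3,
    }

    def routing_class(packet_text, base_class=0):
        """Raise a packet's routing class by the bump of every phrase it matches."""
        text = packet_text.lower()
        return base_class + sum(
            bump for phrase, bump in PRIORITY_DICT.items() if phrase in text
        )

    # routing_class("teen lesbian slut on the beach") -> 3, beating a plain "tits" query.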
There is some concern over abuse -- that users hungry for low latency may simply include terms like these in their filenames. Indeed, a few users have already begun doing so. A second solution, perhaps blacklisting, may have to be used later on if this becomes a severe issue.
Because of the real-time nature of Gnutella, there are limits on how much latency alteration can be made. Queues are never massive at a single node, since most clients allow only relatively small send buffers. Early tests show 10% to 40% improvement on high-priority pr0n-containing packets. This is somewhat variable depending upon the network traffic -- if background traffic is composed mostly of smaller "ping" packets (instead of result packets), latency improvements tend toward the higher end of that range.
There are a few other improvements on the table. Those of you that follow my work know that I'm interested in distributed trust used to rank the users. This trust network can be adapted to rank users based on the quality of the pr0n they serve, and give higher priority to users that give really top-notch pr0n -- unfortunately, this requires a client UI (and effort on the part of the user) to do manual ranking.
One more controversial proposal includes a Freenet-like network-wide caching system. Pr0n that is being frequently downloaded will be mirrored to as many systems as possible. This will improve download speed (and end-user experience, as people will be able to view locally-cached pr0n, and thus be introduced to new and interesting forms of pr0n). The p-cache (as it's already being called) even goes Freenet one better -- it is being designed to support speculative caching based on past searches. If your client catches even a hint, based on the searches that you've done, that you might be remotely interested in "cuties in French maid outfits on the beach", say, it will search and download all the related files it can find on the Gnutellanet. Aside from the massive cache of pr0n that builds up (if the user chooses to browse it), this is mostly user-transparent, yielding only low latency, high-availability pr0n searches tailored to your tastes.
Also notable is the support (for pr0n only, I'm afraid -- scarce network resources must be conserved) for multicast, introduced with the new UDP support. When you request a download of a large file, a remote server can give you a time offset until the file will be sent -- usually, an hour or two -- and will establish you as a "subscriber" of this file. When the time expires, the server will multicast-stream the pr0n file to all people that have subscribed to the broadcast. Now, an "hour or two" may seem like a long time, but it's far better than simply waiting in a unicast queue, possibly for days. You will need to be on the MBONE for this -- some college users or business users with videoconferencing may already have this handled, but the rest will need to request "MBONE support" from their ISP.
I may be wrong (Score:3, Flamebait)
Sort of (Score:2, Informative)
It's really closer to the jump between HTML and XML. The new specification is more extensible, and has more optimizations than Gnutella. It's essentially the trends in Gnutella today taken to something of a logical conclusion. Plus, instead of just lumping in Gnutella developments like file hashing, it builds them into the base design of the network. And at the same time, G2 hubs and leaves can play nicely with G1 peers.
So, to answer your question, for the moment it's more of a complement to Gnutella than a replacement. However, as time goes on and more clients adopt the new protocol, it may eventually replace it. Your original question was a bit too harsh.
Re:I may be wrong (Score:5, Informative)
Michael Stokes (Shareaza [shareaza.com] developer) thought that the GDF (Gnutella Developers Forum) was too slow at fixing Gnutella's problems (unscalable, too much unused bandwidth, unorganized for future additions), so he went ahead himself (by himself) and wrote Gnutella2. He has done this before, when he wrote the spec for "Remote Queueing" (kind of like IRC). He wrote his spec first, developed it in his client, released it, then proposed the idea to the GDF. The GDF liked it, and now Limewire, Bearshare and Gnucleus all support it.
The GDF is pissed that Mike went ahead and "updated Gnutella" without asking them first. Granted, they have a right to be. The GDF is meant to be a consensus, a forum for all developers. And an "assumed" condition of that is to let the other developers know *ahead of time* before going ahead and doing something this massive. And the fact that he called it "Gnutella2" (using the Gnutella brand) and advertised it as the "next revolution in P2P" (which it actually, IMO, is) pisses them off even more.
However, if you notice, it seems only the developers with corporate ties are pissed. Other clients such as GTK (Linux), Gnucleus, etc. all seem interested in the protocol; I believe GTK already said they'd implement it. Limewire and BearShare still seem upset. (It's like owning an oil company, then someone comes out with electricity - sucks.)
Anyways, Mike likes the Gnutella ideals - that it is open and free. So he called it "Gnutella2", partly to "refresh" Gnutella and repair the bad image it has with the general user (which it has achieved, IMO) and to show users it's the "second generation" of Gnutella.
The protocol is being released now. This is part one; the next one will go over the new packet encapsulation and whatnot.
A revolution in P2P? I don't think so (Score:4, Interesting)
Calling Gnutella2 the "next revolution in P2P" would be like calling the latest model in horse-pulled carriages the "next revolution in transportation" years after the advent of the motor car.
Re:A revolution in P2P? I don't think so (Score:4, Insightful)
Oh, and if you want to be taken seriously, perhaps you should have the courage to put your name to your utterances.
It isn't a new searching mechanism at all, it is still using a brain-dead broadcast search. Link compression, partial file sharing, and the other features you mention are just putting lipstick on a pig.
Oh really, so you must know how many people are using Freenet? What? Have you ever even used it? I have, and it works fine for me. Utter bullcrap. You have obviously never tried to use Freenet.
What spec? I don't run Windows. Sounds like you need to take your own advice.
Re:I may be wrong (Score:3, Insightful)
there is quite a bit of danger to your reasoning. an official gnutella2 standard should be adopted, instead of one client calling its new protocol the next gnutella protocol.
Re:I may be wrong (Score:2)
Oops... P2P caring about intelectual property law, knew there'd be a flaw in that one!
Re:I may be wrong (Score:3, Insightful)
I dunno - if it maintains backward compatibility and doesn't break the network, then what's the harm? If it's a better protocol and people migrate to it, then we have progress. Design by committee is all well and good, except that it's really slow and can get political, especially when commercial interests get involved. Sometimes the best decisions are not made. Survival of the fittest is the best approach.
Re:I may be wrong (Score:3, Interesting)
The original program was not, as many people mistakenly assumed (due to the name), free software; it was closed-source Windows software that other people had to reverse engineer. The design was fairly shoddy (because, as aforementioned, it was more or less a proof of concept), leaving far too much bandwidth spent on catering to people who use firewalls (and much of the time, those push requests simply get lost). Then, the countless different clone implementations tend to not quite fit the same specifications (as each other, let alone the original), causing no end of problems.
I did use 2 or 3 different versions of Gnutella, for quite some time, probably over a year. I got lots of stuff. Of course, most of that is only half there, because of not always finding the same files again, or not being able to connect to a servent (most of the time), or stupid screw-ups where you resume a download and find that the remote server is actually sending you something else, so you have to stop it and muck about with dd or similar to carefully cut the file back to what it was originally. Then, after a while, I noticed not only people who were putting up lots of copies of the exact same file (e.g. an advert for some site) under numerous different (and totally misleading) filenames (and also files that appeared to be proper files, even having pretty large file sizes, which turned out to be just windoze URL files padded with vast amounts of space), but even hacked-up servents that would return numerous different responses to any query you could come up with. E.g., search for FOO BAR and they would return exactly FOO_BAR.mpg, FOO_BAR.htm, FOO_BAR.jpg, FOO_BAR.mp3, FOO_BAR.exe, and eventually, more cunning variations. I'm sure there were other similar things done by crackers and spammers, et al, but I can't remember them all.
BUT, ultimately, the thing that makes the Gnutella network broken most of all, to my mind, is the sheer LEGIONS of such utter CRETINS who have not the slightest idea what they are doing, flooding the network with queries that are almost GUARANTEED to return absolutely f**k all, because they simply do not get how the queries are matched. If they did, they would probably still not be able to get it right.
A few months ago, I pretty much stopped using Gnutella, as the network seemed to be getting progressively worse, and I seemed to be able to get less and less (and yes, I did use to share some files - ones that people seemed to want, too). I tried looking into various other P2P-type networks, like GiFT and Freenet, and GNUnet, but felt badly let down by what I found. After a while, I tried having another look at Gnutella, and it was so screwed up it made me feel sick. The flood of bad queries (now including torrents of empty queries, about 80%, I'd say) was even worse, and trying to search for anything yielded almost exclusively the spam responses. I kind of got the feeling that maybe groups like the RIAA/MPAA could have been deliberately creating the noise and spam themselves, to try to make the network worthless. After about ten or fifteen minutes of searching, I gave up in despair. As far as I'm concerned, Gnutella is dead.
Well, if someone proposes a new version, and it addresses most of these problems, IMO it would really be best if it broke compatibility. Nice clean break. Well, there's my 42 pence worth, flame away, all.
Re:I may be wrong (Score:3, Insightful)
All of them are different, but let's take a look:
FastTrack -- the protocol is barely an improvement over the original gnutella, and with some additions from the LimeWire people, there are no improvements. It's also closed.
DC -- totally different, and from a technical perspective, much less impressive. Little more than IRC+DCC with a non-idiotic interface.
Re:I may be wrong (Score:4, Insightful)
FastTrack -- the protocol is barely an improvement over the original gnutella, and with some additions from the LimeWire people, there are no improvements. It's also closed.
The FastTrack protocol is vastly superior to Gnutella, especially the original Gnutella. It is, quite simply, one of the best, if not THE best, P2P protocol out there.
DC -- totally different, and from a technical perspective, much less impressive. Little more than IRC+DCC with a non-idiotic interface.
DirectConnect interface non-idiotic?!? DC has the stupidest interface of any P2P app I've ever seen. People keep bitching about how hard eDonkey or WinMX are to use (for example), but if anything, those are WAY simpler than DC. Just point and click your way to downloads. I have yet to download a single file via DC. The thing flies in the face of everything users expect of P2P apps, and even Windows apps in general. It's a disgrace.
Just my opinion though, move on, nothing to see here.
I am curious.. (Score:3, Interesting)
By "monitoring" requests in limewire or by putting in ambiguous search terms, I estimate that well, well, well over 99% of files available through gnutella-based p2p services are copyrighted.
Oh yes, we all have heard the usual arguments. Technology doesn't break the law, people do. Aka, the Pontius Pilate / Eichmann defense.
Re:I am curious.. (Score:2, Informative)
Re:I am curious.. (Score:3, Informative)
Re:I am curious.. (Score:2)
Getting Linux over FTP is much more reliable because a published server is likely to have more bandwidth, be placed "closer" in network hops to you, and is more trustworthy.
Imagine if some anti-Linux organization posted trojan-containing distributions and started sending them out over P2P... all it takes is a few people too lazy to check their hashes and it will become impossible to audit back who released all the exploits into the wild.
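Checking those hashes is not hard, for what it's worth -- a quick sketch in Python (the filename and expected digest below are placeholders, not real values):

    import hashlib

    def sha1_of(path, chunk_size=1 << 20):
        """Hash a file in 1 MB chunks so a big ISO never has to fit in memory."""
        h = hashlib.sha1()
        with open(path, "rb") as f:
            while True:
                chunk = f.read(chunk_size)
                if not chunk:
                    break
                h.update(chunk)
        return h.hexdigest()

    # Compare against the checksum published on the distributor's own site.
    expected = "0123456789abcdef0123456789abcdef01234567"  # placeholder digest
    if sha1_of("some-distro.iso") != expected:
        raise SystemExit("Checksum mismatch -- don't trust this image!")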
P2P has some possible legal uses, but for all the legal things P2P could do, the traditional protocols are better at doing them. The only motivating reason for P2P being developed is that people want a tool that makes it harder to trace copyright violations.
Re:I am curious.. (Score:2, Interesting)
More bandwidth? More reliable? It's the same thing; you just happen to be putting your trust in an FTP server. Be placed "closer" in network hops? What does this have to do with anything? If you are pulling from 20 people at 20k/s, it's faster than pulling from an FTP server at 60k/s.
Imagine if some anti-Linux organization posted trojan-containing distributions and started sending them out over P2P... all it takes is a few people too lazy to check their hashes and it will become impossible to audit back who released all the exploits into the wild.
Imagine if the same anti-Linux organization posted trojan-containing distributions and started sending them out over FTP or HTTP... all it takes is a few people too lazy to check their hashes and it will become impossible to audit back who released all the exploits into the wild.
P2P has some possible legal uses, but for all the legal things P2P could do, the traditional protocols are better at doing them. The only motivating reason for P2P being developed is that people want a tool that makes it harder to trace copyright violations.
Like what? FTP, HTTP, Gopher? I mean, you can transfer files over HTTP, but does that make it less efficient at doing so? That last piece is a joke; you make broad generalizations based probably on what your friends or you yourself do. However, there are people who use P2P for things that have nothing to do with violating someone's copyright. Especially if you work in the ad business and, instead of having an FTP server, you have a P2P client where people can transfer clips etc. using the existing network. There are just so many uses for P2P besides violating copyright.
Re:I am curious.. (Score:3, Insightful)
Nope. It'd come out that the University of Middle-of-Nowhere's FTP mirror got hacked, so anybody who downloaded from them should check to see what they got. If you decide to download your Linux from an FTP server that's known to be owned by Microsoft, you need professional help. Being able to trace back who you downloaded from in real-identity rather than username form gives you a much better trail for reporting untrustworthy servers.
Especially if you work in the ad business and, instead of having an FTP server, you have a P2P client where people can transfer clips etc. using the existing network. There are just so many uses for P2P besides violating copyright.
Nah, FTP is still better for that use. An FTP server isn't that hard to set up; there are plenty of open source packages to pick from. Almost every web browser is capable of downloading over FTP, so your client likely already has the software they need, rather than being asked to download a special (often spyware-laced... boy is that unprofessional) client.
Moreover, wouldn't this kind of communication be something only your client should see, and not something left out for other people to grab? Sure, you could sit and cancel every other user who tries to grab that file while you wait for the client to take the file, but that means you have to be there the whole time the file is up. Compare that to FTP, where you can set it up so only somebody who knows the right username/password can get at the file; none of the P2P programs let you do that.
I'm not disputing that there are uses for P2P that don't violate copyright... I'm saying that P2P sucks compared to the mainstream protocols such as FTP, HTTP, SMTP/POP3 e-mail, etc.
Re:I am curious.. (Score:3, Informative)
P2P networks like Edonkey and Freenet have the property that the more a piece of information is downloaded, the easier it becomes to download and the more likely it is to be close to you -- rather than the reverse with a centralized server.
Paying for bandwidth to host large digital content is not always feasible for some information distributors. A group that I work with that produces freely redistributable media is considering how to make full-resolution video available. Sometimes, even for the low-res video we now make available, we have peaks over 40 Mbps when a piece of info is popular. If we can't find a donor for a substantial amount of bandwidth then we'll probably use a P2P network.
It would be more efficient bandwidth-wise if ISPs implemented P2P nodes for their customers, rather than the customers doing it themselves. They recognized that this was the case a long time ago with newsgroups and more recently with Akamai. Maybe when there's more freely redistributable content available they will do so.
Digital signatures take care of the security concerns you raised. You can download them from an authoritative website and check the file after you've downloaded it. Freenet and Edonkey use digital signatures natively.
Re:I am curious.. (Score:2)
Re:I am curious.. (Score:2)
-9mm-
Re:I am curious.. (Score:2)
I meant, rather:
Yes, but that wasn't my point. My point was that the grandparent poster did not assume that there are some copyrighted works that are legal to distribute over p2p networks. This also applies to music from some bands, and some videos.
Re:I am curious.. (Score:5, Funny)
Re:I am curious.. (Score:2)
If they have a 'license' but the original copy is unusable (broken/stolen or whatever), then again, downloading another copy should be ok.
In the EU I can 'sell' my copy or license, that's fine, but allowing someone else who has a license to make a copy of my copy... hmmm..... probably ok too.
Re:I am curious.. (Score:2)
Re:I am curious.. (Score:2)
Re:I am curious.. (Score:2)
The best question of all: Would the labels make more money offering their songs inexpensively over the internet in high quality mp3/ogg formats, rather than pissing off their customers and TRYING to thwart open digital formats? (I stress "TRYING.")
One day they will wake up. Until then, I couldn't care less how much copyrighted material is traded online... the legalities of which are only clear to the RIAA (i.e. "it's illegal").
BeSonic (Score:2)
BeSonic no more RIAA
Most Common Files Downloaded From Me This Week (Score:2, Informative)
1. Dungeon Siege Demo
2. Day of Defeat Patch (Halflife Mod)
3. Alias Season 2 episode 1
Note that two of these are definitely freely distributable. The third one is not available anywhere else - and I have yet to hear of a big* hubbub concerning TV shows.
*there is a small hubbub, but nobody REALLY seems to care - I'll let you speculate as to why that is
Re:Most Common Files Downloaded From Me This Week (Score:2)
Re:I am curious.. (Score:2)
I'm not saying it's right, but it's a fact.
Re:I am curious.. (Score:2)
Yep, here's an Example (Score:5, Insightful)
It's now on Gnutella2.
magnet:?xt=urn:bitprint:S5Q756FJ7326XXDGA7KZBF2
I get 15 sources in seconds. (G2 required - good luck on G1)
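For anyone wondering what's inside one of those links: a magnet URI is basically a query string whose xt parameter carries the content hash, so pulling it out takes a few lines (a sketch using Python's standard library):

    from urllib.parse import urlparse, parse_qs

    def magnet_urns(link):
        """Return the exact-topic (xt) URNs from a magnet link."""
        query = urlparse(link).query          # everything after "magnet:?"
        return parse_qs(query).get("xt", [])

    # e.g. magnet_urns("magnet:?xt=urn:bitprint:...") -> ["urn:bitprint:..."]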
You used the wrong phrasing... (Score:4, Insightful)
In the United States (at least), everything made since 1923 was, has been, and still is copyrighted, even if it was never registered with the copyright office. So everything you see on a peer-to-peer network is indeed copyrighted.
A more appropriate question (as some of the responders have answered) is whether anyone has used a peer-to-peer network for a legitimate purpose. The problem here is that the issues are quite grey. If I have Game X, or Game System version 1.1, can I download copies of the games/BIOS/etc. online for use with emulators/replacements for broken discs/etc.? If an online broadcaster, paying royalty fees, uses ABAcast [abacast.com] or Peercast [peercast.org] to distribute their works, do I in turn have to pay royalty fees since I am rebroadcasting them?
Unfortunately, there is a major gap between what people think they can do under copyright law and what they actually can do. While I have not extensively researched the above (IANAL), technically, all the above commonly considered legitimate things are *illegal* unless you have worked out some deal to repay the copyright holders.
The question you really should be asking is whether anyone uses P2P networks to deliberately distribute their own copyrighted works, either as a primary or secondary channel. A few minor bands likely do. The next question is whether you'll ever find them on Slashdot. And I do not know the answer to that.
Note I personally have *never* used Napster, Gnutella, Kazaa, or any of the other networks, mainly because being caught doing so may jeopardize my ability to be hired in certain areas. I used to be one of those nasty college network administrators trying to keep your P2P usage down because it was overloading our bandwidth, and we could not order a significantly bigger pipe because our local phone switch could not handle it. Feel free to flame me for my ignorance as you will.
Re:You used the wrong phrasing... (Score:2)
Actually, there are a few restrictions. Up until 1980-something, unless you put "Copyright 1993 Foobar" or "[copyright symbol] 1993 Foobar" on a work, it didn't get copyright protection.
Yep, several things off p2p in general (Score:2)
Replacement songs for scratched / broken CDs..
Copies of songs for work that I own (disc at home)..
HTML texts (a la Gutenberg)..
Clipart for a presentation.. (should be legal anyway)
Expanded Uses of Gnutella (Score:2)
A 'real world' example: an Art History department may want to share digital photos of art with faculty and students but not have to maintain a dedicated server. They could utilize the power of p2p if they could easily form a private network, one that would leverage the CPU and bandwidth of all participating users.
Currently it is hard to *only* connect certain nodes or only *allow* certain nodes to connect. We are working on a complete solution at LimeWire. The first iteration will allow a tech admin to easily set up a private network. The second iteration should incorporate privacy and security to keep out unwanted guests (supported by a username / password infrastructure).
Obviously, private secure networks can be utilized by criminals and terrorists to exchange potentially illegal information. Nevertheless, the same can be done with PGP, secure telephones, etc. already. There is a lot of truth in the statement that p2p networks are defined by the users, not the developers.
Re:I am curious.. (Score:2)
My question is: what's your point? We like warez and porn.
Re:I am curious.. (Score:3, Insightful)
"I am curious to hear stories of anybody who has at any point used gnutella to do anything but transmit copyrighted material in any substantial way."
I can't help noticing a similarity between copyrighted material on p2p, and porn on home video. Just as porn drove home video technology into becoming an industry and commodity, copyrighted stuff seems to be driving file-sharing network technology toward becoming a viable distribution method. Right now, p2p seems to be approaching an adolescent stage of development, as it begins to address scalability issues and alternative applications like efficient radio broadcasts. This technology is becoming more useful, and as it does, I expect it will be used to solve more problems than just swapping MP3s. In other words, don't assume that just because you see copyright infringement now, the tech won't be something we all rely on for legal activity in the future.
Re:I am curious.. (Score:2)
The underlying claim from your argument is that P2P solves a technological problem - namely, bandwidth limitation. This was echoed a few posts above by somebody (quoting a limewire press release?) giving the example of where an "art history department could share its works with limewire rather than by having a dedicated server." Bandwidth limitations (the art history example arguably uses MORE bandwidth in P2P form) will be solved by people developing more wires, and other technologies are far more suited and adapted to "efficient radio broadcasts" over the web than anything relating to the porn-eminem-dvd-rip-warez-a-thon that is current p2p.
You are trying to justify a technology by mating it to a perceived, likely non-existent problem or future benefit of indeterminate nature.
Just because VCRs spawned a video industry doesn't mean that P2P will spawn any sort of money maker (and, to counter the patently asinine claim of somebody further up that rightsholders need to adjust their technologies because new technologies have come into being, I don't see anybody arguing that we should all grow bulletproof skin because of the development of handguns). Even if, in the case of videos, the MPAA (or whoever) initially protested against what would ultimately be in their own self-interest, that doesn't mean they are necessarily in the same position now.
Re:I am curious.. (Score:2)
"The underlying claim from your argument is that P2P solves a technological problem - namely, bandwidth limitation."
Please don't put words in my mouth. Even if that's almost what I wrote, it is not exactly what I wrote.
"You are trying to justify a technology by mating it to a perceived, likely non-existent problem or future benefit of indeterminate nature."
Next time you want to make such an absolute statement, you might want to do the research first.
The problem exists. I will give one example here: Epiphany Radio. [epiphanycorp.com] This is a shoutcast station I used to listen to, until I ran into a 12 user limit [shoutcast.com] imposed because the broadcasters can't afford the bandwidth to support many users. However, thanks to peer to peer technology [peercast.org], I can once again listen to this station, via their peercast stream [peercast.org].
This is an example of p2p being used to solve a real problem, without copyright infringement. It is a fact, whether or not you were aware of it or want to acknowledge it. It is quite possible that we will see more examples as time and technology progress.
I am not trying to justify anything. I am simply pointing out an observation, and a possible eventuality.
I wish I could download SuSE ISOs with it. (Score:2)
Re:I am curious.. (Score:2)
Laws have a way of changing with the times.
For nearly a century it was legal in the US to buy human slaves, and to treat them however you wanted. You could whip them, beat them, and rape their wives without any fear of recrimination whatsoever. It took almost 100 years of this before America came to its moral senses and took a stand for the rights of all people, regardless of color, creed, etc.
It's a little known fact that initially the south supported this change. They repented their moral wrongdoing, and the majority of southerners agreed to cooperate. The only provision was that the US government would compensate the south for their losses, i.e. the country as a whole would pay for the regrettable history of slavery by putting tax dollars towards weaning the south away from slavery... and into a new business model which didn't require slavery. The northerners took the very selfish stance of claiming that it was the south's fault for using slaves in the first place, and that they didn't deserve any kind of financial support to help them make the transition.
This of course left the southerners only 2 alternatives:
1) Relinquish slavery in all its forms and become penniless (if you think unemployment is bad now...).
2) Fight for their way of life (their right to eat, and stay clothed, etc.)
Obviously there really was no choice for them. American history, like all histories, has a way of demonizing the losing side... but in fact the north was very cowardly and selfish in refusing to bear their share of the legacy of slavery.
What's the point to all this?
I would argue that we're in a very similar situation now as relates to intellectual property and copyrights. Businesses who rely on intellectual property to support their "way of life" are terribly threatened by things like file-sharing. They are human beings too, and they shouldn't have to give up their standard of living. But at the same time, they are slaveholders.
Who are the slaves? We are. The history, and indeed the very culture of our generation is steeped in books, movies, television, music, video/computer games, etc. The merits/de-merits of this aside...it's largely true that a great deal of "ourselves" is derived from these things. The problem of course is that our culture is completely subsidized. We don't have the right to re-visit our culture unless we can afford it.
I'm not sure, but it seems as if this is the first time in history where this has happened. For thousands of years culture was passed down by word of mouth, festivals, plays, music, poetry, etc., the vast majority of which was free... even taken for granted. People, even the poor, had a right to themselves and their culture. Today we have to stand in line before an iron gate, and pay tribute to the keyholders if we want to remember who we are. A terribly dehumanizing prospect.
The controversy is obvious, and much like the days before the civil war, both sides are right. Unfortunately it appears as if history will repeat itself, and that no one will make the sacrifices necessary to avoid a conflict. Someone is going to get burned at the expense of someone else... hopefully this will be the last time, however, because in a post-scarcity society it's a win-win situation.
Re:I am curious.. (Score:2)
Interesting... any sources (esp. online) where I can find out more about this?
Also, do you happen to be a southerner? You sure seem to understand the issues and people's sympathies. Perhaps you can explain why there's so much feeling for the days of the Confederacy? What inspires the depth of feeling? It can sure seem like the grass-is-greener to an outsider...
Re:I am curious.. (Score:2)
Unfortunately I couldn't find any material on-line. As I said though, history tends to be highly revisionist in ways that benefit/support the side of the victor. This information isn't easy to find in historical text.
Also, do you happen to be a southerner?
I was born in Michigan, and spent the majority of my life living in the north. I am now living in Atlanta, and have been here for 8 years. So I do have some exposure to both sides of the story.
Re:I am curious.. (Score:2)
Re:It's better than... (Score:2)
Wooky - Wookie [starwars.com] (p.s. he's about 6 feet tall, not 8 - and Ewoks are 3 feet tall [starwars.com])
Yes, I'm a SW geek.
Because this is a "karma whore" (Score:2)
"Karma police, arrest this man, he talks in maths, he buzzes like a fridge he's like a detuned radio..." -- Radiohead
Shareaza's gnutella? (Score:5, Informative)
Re:Shareaza's gnutella? (Score:2)
If other clients don't adopt Gnutella2, or something better/as good then they'll be pushed out of the market, simple as that.
Look at it yourself. Do you think it was progressing well?
Re:Pushed from the market (Score:2)
Gnutella2 - Empire Strikes Back (Score:2, Funny)
Awesome (Score:4, Funny)
Re:Awesome (Score:2)
And I will be able to get previews of the movies I plan to rent, using the TV-out on my graphics card!
We'll see about that. (Score:2)
Go go Slashdot....
Ranking system (Score:5, Interesting)
I wonder how many people simply don't share anything, or have a firewall and don't open any ports for Gnutella.
Re:Ranking system (Score:3, Informative)
Re:Ranking system (Score:2)
Re:Ranking system (Score:2, Informative)
Re:Ranking system (Score:2)
Re:Ranking system (Score:2, Insightful)
It would also be a bit tricky to implement when you can't trust any node on the network. And how do you know some client isn't lying about their Karma rating? About the integrity of the shared content? About how many files they've uploaded?
Re:Ranking system (Score:2)
Re:Ranking system (Score:2)
that way you still help a _little_. it might help when somebody is missing 0.01 MB of something to finish downloading it.
i recommend emule fully for sharing purposes, though.
Re:Ranking system (Score:3, Informative)
Who defines gnutella2? (Score:4, Interesting)
I was actually pretty into the protocol and all that, gosh, two years ago. I even got a partial implementation going in Java. (I could create a node that would pass along messages and view search requests. By the way, I have to say Gnutella was about the most fucked up protocol I have ever implemented.)
Anyway, even then (the summer of 2000) there was all kinds of talk about "GnutellaNG", i.e. Next Generation. But since there was no central authority, no one really cared, and other implementors went off to create other kinds of networks.
I guess what I'm asking here is, how does this differ from any of the other GnutellaNG ideas floating around? Or did some random person just make an announcement and sucker Slashdot into posting about it?
Re:Who defines gnutella2? (Score:3, Interesting)
How about the fact that it isn't vaporware anymore? There is a damn client [shareaza.com] that already supports it. And there is now a protocol spec being proposed.
People can talk talk talk, but can you do the walk walk walk? Apparently somebody finally can.
Rsync type transfers? (Score:3, Insightful)
I wonder... (Score:4, Interesting)
I've already noticed some fracturing in the network, in subtle ways - for instance Bearshare implements a queueing function that others do not (it seems). Essentially, when I use Bearshare, other non-Bearshare clients cannot download from me since the queue is full of Bearshare clients.
Does anyone know more about what's going on?
Re:I wonder... (Score:2)
I've never seen a non-Bearshare ultrapeer when my client has been an ultrapeer. I do see some, less than 1%, non-Bearshare clients to download from, though.
So far nothing new (Score:4, Interesting)
Every single feature was either implemented in another client before Shareaza or has been discussed on the GDF. OTOH, Shareaza does have some hard data on how the ideas work now, so they have at least contributed something.
Re:So far nothing new (Score:3, Informative)
DoS Attacks (Score:3, Interesting)
The main focus at the beginning of the article was on reducing the number of hubs. This is convenient for someone (the RIAA) who wishes to take down the network: now they have fewer hubs to packet.
network of gnutella developers?!? (Score:3, Interesting)
Re:network of gnutella developers?!? (Score:2)
I'm glad someone else agrees with me on this. Gnutella is perhaps the worst-performing P2P system I've ever used. I do hope someday they manage to make it good, but in the meantime I'm completely stumped as to why people trumpet it like it's the best thing since sliced bread.
Re:network of gnutella developers?!? (Score:2)
Jeez, have you even tried it?
I'd be interested... (Score:2, Interesting)
Not Gnutella2! Shareaza1!!! (Score:3, Interesting)
The truth is that Gnutella2 is on the way, but not from Shareaza. Gnutella2 is a loose collection of various enhancements to the Gnutella network that have been implemented over the last year or so by SEVERAL COOPERATING Gnutella vendors. The latest enhancement is GUESS, which was introduced before Shareaza's new searching methodology and seems to be Shareaza's inspiration.
The Shareaza people continue to attempt to preempt Gnutella as THEIR protocol, when in fact they are pretty much branching off from the network. Shareaza should feel free to leave the OPEN Gnutella network, but please don't try to steal a name that belongs to the users and developers of Gnutella.
Re:Not Gnutella2! Shareaza1!!! (Score:5, Interesting)
Then let's hear from you. I haven't seen any replies in the GDF from Limewire on the spec yet. You find the specs being released by Shareaza laughable, but what about when Limewire proposes their GUESS proposal? Or "CHORD" proposal? Didn't "one development team" work on those? Sure, you released it before actually implementing it, but still... the rest of the GDF just asked questions about it; you were really the only development team. People don't say "Gnutella's GUESS proposal", they say "Limewire's GUESS proposal".
Can Limewire stop saying that? It's totally and utterly untrue! Mike was working on G2 long before you sent him your spec on GUESS. He told you in a private e-mail that he was working on it beforehand, and that he would probably release his with GUESS.
And the specs released today are **VERY DIFFERENT** from your damn GUESS proposal.
Oh, and what about your Remote Queueing feature? Shareaza came up with that, and it's included in Limewire. Mike wants to cooperate, but you're not giving him a chance.
"Shareaza People"? There is only one developer for Shareaza, Mike. Shareaza supports G1 and G2, it supports "Gnutella".
Re:Not Gnutella2! Shareaza1!!! (Score:2)
Visit the GDF. You'll find info on various components of Gnutella2 - Ultrapeers, Remote Queueing, HUGE, Meta-Data Support, and GUESS. Unlike Shareaza's features, all these enhancements were cooperatively developed and open upon introduction. Shareaza has only published the IDEOLOGY behind their protocol; it is still not open - the lack of detailed specifications is startling.
In case the original gets slashdotted... (Score:2, Interesting)
Net Impact -- Doesn't go far enough (Score:3, Informative)
This *does* include UDP, as many routers/firewalls/packet shapers do apply flow-based rules to UDP conversations as well.
We've seen a relatively small link full of Bubster traffic bring a medium-end firewall to its knees by causing far too many conversation setups/teardowns. Gnutella should try to construct a network of long-lived inter-hub connections such that a query is never sent over a *new* connection more than a few hundred times. Fortunately, the new design is at least progress.
the real gnutella (Score:5, Informative)
I've been playing around with the Limewire source for a while; it is well documented and there is no spyware in the open source version. I love how people complain about Limewire and spyware when it is open source. Anyone can take the GPLed Limewire source and package it without spyware, without having to reverse engineer it like closed-source KaZaa.
Re:the real gnutella (Score:2, Insightful)
I don't blame you for spouting such FUD, as I assumed that Shareaza was trying to hijack Gnutella as well. But by actually using it, I've realized that it's not. Nor is it spyware-laden like KaZaa or LimeWire ("repackaging" aside). It's actually the best Gnutella client I've ever used.
I would like a copyright protected P2P network! (Score:4, Insightful)
Maybe I'm the only one in the whole world who doesn't like to pirate, I dunno.
creative commons (Score:2)
The file sharing networks themselves are agnostic on the matter of how the owner of a work intends for it to be distributed. The software just sees files and shares them; it can't tell the difference.
Try Furthur (Score:3, Informative)
Legitimate P2P sharing of live music.
neighbours (Score:2, Informative)
How do they stop circular requests? "Don't send the request back to the one that requested it" is simple enough, but what about "multiple inheritance"?
Hub A knows hub B and hub C, but not D. Both B and C know D. D gets the same request twice?
  A
 / \
B   C
 \ /
  D
Can't A tell B and C to talk to each other once in a while and temporarily remove D from either B or C's neighbour list to prevent wasting bandwidth? As soon as B or C goes down, D can then automatically be re-added to the neighbour list of the hub that is still up.
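For reference, classic Gnutella deals with the diamond case not by pruning neighbour lists but by stamping every query with a GUID; each node remembers the GUIDs it has recently routed and silently drops repeats, so D forwards the query once no matter how many paths deliver it. A rough sketch (the class and names here are mine, not from any spec):

    import time
    from collections import OrderedDict

    class SeenQueries:
        """Remember recently routed query GUIDs so duplicates are dropped."""

        def __init__(self, max_age=600.0):
            self.max_age = max_age
            self.seen = OrderedDict()   # guid -> time first seen

        def should_forward(self, guid):
            now = time.time()
            # Expire old entries so the table can't grow without bound.
            while self.seen and next(iter(self.seen.values())) < now - self.max_age:
                self.seen.popitem(last=False)
            if guid in self.seen:
                return False            # second copy (e.g. via C after B): drop it
            self.seen[guid] = now
            return True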
Legitimate P2P (Score:3, Interesting)
Essentially, the idea is this: while you are downloading a file, once you receive a packet of data you have that packet, and there's no reason you can't immediately share it.
So, people downloading something from a Bit-Torrent capable site are themselves distributing the content... as it is being downloaded!
The end result is that a huge number of clients can download content (iso images, etc) from a site without increasing the total bandwidth usage of the site by much at all.
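The bookkeeping behind that is simple enough to sketch (a toy model of the idea, not BitTorrent's actual wire protocol; all the names are made up):

    class SwarmFile:
        """Track downloaded pieces; any piece we already hold can be served to
        other peers while the rest of the file is still arriving."""

        def __init__(self, num_pieces):
            self.num_pieces = num_pieces
            self.pieces = {}                  # piece index -> bytes

        def on_piece_received(self, index, data):
            self.pieces[index] = data         # held locally -> instantly shareable

        def serve_piece(self, index):
            return self.pieces.get(index)     # another peer can fetch it right away

        def complete(self):
            return len(self.pieces) == self.num_pieces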
Check it out - it's pretty amazing!
Re:Legitimate P2P (Score:2)
It's a combination of Partial File Sharing, Remote Queueing, Download Mesh (Alternative Sources) and Swarming. Works very well, actually.
Re:blah (Score:4, Interesting)
[snip]
Actually, what I personally find more frustrating is that when you actually do find what you want, the download fails because either the host drops offline or refuses to accept the connection. Another little irritant is the large number of files out there that are deliberately misnamed, so that when you download and open them, you find yourself dropped into someone's personal porn site, regardless of what you're looking for. I used to look for cool stuff like the blooper videos and whatnot, but I got one too many that was deceptively named. Not worth the effort, really... I uninstalled the damn thing and quit trying.
Feedback rating? (Score:4, Interesting)
J.
Re:Feedback rating? (Score:2)
Re:Feedback rating? (Score:2)
Re:Feedback rating? (Score:2)
That said, there are other ways of discovering the real file. Currently, fakers don't bother to show the correct filesize, so at a glance they can be seen as different. If they bound the claimed size to the download size, no one could fake sizes, or else you'd get a broken download that even a porn spammer couldn't use to redirect.
Re:Official? (Score:3, Informative)
Yes, the original Gnutella developers worked for Nullsoft, then a division of Netscape, a division of AOL, now a division of AOL/TW.
No, it's not official in that sense. It's not even official in the sense that other gnutella-client (such as limewire, bearshare, gtk-gnutella, qtella, gnucleus, etc..) developers have adopted, or agreed anything of shareaza's new protocol.
Of course, I hope this new protocol works well, but it's wrong to attach the Gnutella name to it. It doesn't have much to do with it at all. Next thing that'll happen is someone else will come up with a totally different protocol and call it "gnutellav3" or something. Bad precedent.
Re:Microsoft Word 10 (Score:3, Insightful)
I just checked and yep OfficeX for mac outputs html with the same headers as this html doc.
Re:Tiger hashing? Oh my. (Score:2)
I am not entirely disagreeing with your point - I understand where you are coming from, but it is good to acknowledge the flip side of the coin. If we trust to one standard, we lose the opportunity to validate alternatives.
This argument by example presumes that Tiger is *reasonably* well defined and tested.
Cheers
MOD PARENT DOWN (Score:5, Informative)
TTH (Tiger Tree Hashing) is used to validate/verify chunks or segments of files as they download. SHA1 is still used in Shareaza, and is still a standard on Gnutella (it has the lowest collision rate compared to MD4 and MD5).
For example, say you're downloading an 800MB Linux distro. Some script kiddie fakes their porn video as a Linux distro (really hard to do, but for the sake of discussion...). Shareaza will download a segment from that node, but it will use Tiger Tree Hashing to check whether that chunk of the file is correct or not. Of course, it won't be. So it will delete it, ban that node from the download transfer circle, and re-download that chunk from a different node. Without TTH, it would only have caught the invalid data after the file had completed (after you downloaded that whole damn 800MB file, and it's corrupt?!)
So basically, TTH verifies segments of files (great for swarming, i.e. downloading from multiple sources). So, theoretically, you will never get an invalid file when downloading on Shareaza from other Shareaza nodes (Shareaza is currently the only client that supports TTH).
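To make that concrete, here's a rough sketch of the per-segment check (Python's hashlib has no Tiger, so SHA-1 stands in for the leaf hash, and the segment size is made up; real TTH builds a full Merkle tree of Tiger hashes over the file):

    import hashlib

    SEGMENT_SIZE = 1 << 20   # 1 MB segments, purely illustrative

    def leaf_hash(segment):
        # Stand-in for a Tiger tree leaf hash.
        return hashlib.sha1(segment).hexdigest()

    def verify_segment(index, segment, expected_leaves):
        """Check one downloaded chunk against the tree's leaf hash for that chunk."""
        return leaf_hash(segment) == expected_leaves[index]

    # Download loop, roughly: if verify_segment() fails, throw the chunk away,
    # drop that source from the transfer circle, and re-request the same chunk
    # from another node -- instead of discovering the corruption only after the
    # whole 800MB file has finished.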