P2P in 2001
nihilist_1137 writes: "ZDNet is reporting that P2P is seeing increasing use in business. "It's now over two years since a few underground song-swapping services put peer-to-peer technology firmly at the forefront of the IT agenda... A look back at some of the more significant P2P stories of 2001 shows that -- although not a new concept -- P2P is starting to assume a very important role in the corporate space, as tech giants scramble to succeed in this new market."" Hard to believe that the Napster battles have been going on for two years now.
Re:I use P2P often at work (Score:1)
"What the f___? You think we pay you to download old Simpson's episodes?!?" -My Boss
About time... (Score:1)
These programs can be invaluable for companies that work on big projects and need to pull together information from a wide variety of sources. The Kazaa network has shown just how vast that information can be...
Anyway, that's my $.02. Back to downloading movies...
Re:About time... (Score:3, Interesting)
Even if authentication and logging are there, management (at least the managers I have run into) like the idea of a central, impenetrable bastion of information, with big pretty accounting graphs. It is a big firewall against bringing about change in anything other than a purely technology-oriented business.
AWG
4 out of 5 doctors think that the 5th one smells.
Re:About time... (Score:1)
Intel's P2P library (Score:5, Informative)
Re:Intel's P2P library (Score:1)
Re:Intel's P2P library (Score:1)
This is the first P2P application that I've seen with encryption built into it.
Re:Intel's P2P library (Score:1)
Actually, it's an extension of OpenSSL. So, this is the first time someone has added P2P to an encryption library.
Re:Intel's P2P library (Score:3, Informative)
Besides, Intel's library isn't an application.
Burris
Re:Intel's P2P library (Score:2, Informative)
ZONORK IM Server (Score:1)
Yup... (Score:1)
Finally, Dragonball GT episodes (which won't ever be available in my country in the near future -- say, the next fifty years). Way better than Napster, which I tried once after hearing about it.
How come old technology keeps making headlines? (Score:2, Offtopic)
Sorry, but does anybody remember CORBA? DCOM? Or any of the zillion other frameworks for writing distributed applications that've been around for over a decade? A whole freaking lot of corporate applications ALREADY DEPLOYED are distributed applications that, in some way or another, are "p2p" applications. The one I'm personally most familiar with is Tivoli, which was a distributed app with installed clients interoperating via a distributed framework as far back as 1992. Does that make us Tivoli people futuristic super-geniuses? No, it doesn't -- because distributed apps have been on people's minds since networking was born. I mean, duhh. But hack together something that lets people swap ripped songs, and *poof* it's a "new wave".
And does anybody else feel like we've been hearing about soldiers wired together for years and years (and years)?
Re:How come old technology keeps making headlines? (Score:3, Insightful)
So now distributed computing has this neat new "p2p" hax0r acronym, and the fact that you can write distributed applications if you've got networked computers is news.
Well said!
Watch now as the corporate giants wake up, co-opt the methodology, recast it as their own innovation, and file patent suits against any and all they perceive as transgressing their IP.
Watch the partnerships a la Groove Networks foment: http://www.microsoft.com/presspass/press/2001/oct01/10-10GroovePR.asp [microsoft.com]
and then watch as any work-alike initiatives are crushed in the courtrooms of America.
Re:How come old technology keeps making headlines? (Score:2, Interesting)
God bless 'em for what they are, and for what they have allowed me to see and hear. The reason this type of application is newsworthy is that it is the absolute fastest, easiest, and most reliable way for me to access content on the net that I can't find through other channels. These apps make big news because they fell into the laps of everyday citizens and opened up a whole new world for them.
pointym5, do you see what I mean? No, wait -- I don't care. You're too elite for these things; I don't know why I bother.
Re:How come old technology keeps making headlines? (Score:1)
That's certainly one application, but it's by no means the only meaning of "distributed computing". I think the basic idea is that the application code exists around the network on the machines that want/need/request services.
A network of "simple" point-to-point file transfer agents is not really that simple.
Re:How come old technology keeps making headlines? (Score:1)
The technical term refers to one application using the clock time of two or more processors in two or more physically separate machines.
That being said, while I disagree with your usage of the term, it's a free country, and I don't want to impose any view held by any kind of consortium on your vocabulary...
About the term(s) and its meaning I digress, but as this story was about P2P, I have to agree with the moderation.
Re:How come old technology keeps making headlines? (Score:2)
No, it doesn't. For example, I don't consider a simple FTP client connecting to an FTP server based on a user-supplied address to be a distributed application. But an automated file downloader that operates off a local preference database and that locates its "servers" by using some search algorithm it runs itself, well that's in the gray area. Of course if the app is able to serve as well as act as a pure client, there's no discussion.
Re:How come old technology keeps making headlines? (Score:1)
*Real* P2P (Score:2)
yeah (Score:3, Funny)
It goes to show you that without P2P software we wouldn't have as many online businesses.
Re:yeah (Score:1)
Re:yeah (Score:1)
P2P's future (Score:2)
The HTTP protocol and push facilities in Gnutella are great for the firewall-ridden. With a search engine on the web (find it yourself), we can download shared Gnutella data even with a plain old browser. This feature of the Gnutella protocol (backward compatibility) allows it to bypass the toughest restrictions in corporate gateways.
I, for one, have used gtk-gnutella. That stuff just rocks! Win32 users also have a free (again, as in speech) client in Gnucleus. All this leads to one small point: P2P is here to stay.
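The reason a plain browser works is that Gnutella's download half is ordinary HTTP: servents answer GET requests of the form /get/<index>/<filename>. A minimal sketch of such a fetch; the peer host, port, file index, and filename below are made-up placeholders:

    import http.client

    # Gnutella file transfers are plain HTTP GETs of the form
    # /get/<file index>/<filename>, so any HTTP client can issue them.
    # Host, port, index, and filename are illustrative placeholders.
    conn = http.client.HTTPConnection("peer.example.net", 6346)  # 6346: Gnutella's customary port
    conn.request("GET", "/get/123/example.mp3")
    resp = conn.getresponse()
    if resp.status == 200:
        with open("example.mp3", "wb") as out:
            out.write(resp.read())
    conn.close()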
IM usually != P2P (Score:2, Interesting)
Yahoo! IM, ICQ, AIM, etc. are not P2P. They are pretty server-centric systems.
I think I'm going to go try Jabber [jabbercentral.org].
IM usually sometimes = P2P (Score:1)
AIM = P2P (Score:1)
napster (Score:1)
Re:napster (Score:1)
If Napster had hosted any songs on their system, it would have been taken down an awful lot faster!
here we go again (Score:5, Funny)
In the old days, our computers talked to each other. I send you a mail, and my VAX sends it to your Sun. Then, everybody put a PC on their desk, and everything was centralized. I send you an email, it goes to my mail server, then to your mail server, then to your computer.
Well, now we're back again! Imagine that! Bring out the VCs! Bring out the patents!
I predict that by 2005 we'll see a new form of P2P that uses a Central Peer for maximum performance. Get this, folks: we all know how great P2P is, but sometimes it can be inefficient. What if your peer is down? Why not forward your data to a Central Peer, which is a beefy computer that can handle lots of data, and let it worry about the details? So your computers are on the Edge, and the big computer is at the Center of a big conversation. In fact, the Edge computers don't even have to talk to one another; they can just communicate with the Central Peer.
I dub this exciting new invention: Center/Edge computing. I have a patent, and lawyers.
Yawn.
A really cool use of P2P (Score:4, Interesting)
Content Caching isn't the end-user's problem (Score:1)
Finally, OpenCola's economics also look a bit iffy. Who pays for Swarmcast? I don't think the end users will, since they don't get anything out of it. That leaves the website owners. And it seems like those who need it most, e.g. small university sites that get slashdotted, would be the least able to afford it.
Preposterous. (Score:5, Informative)
The notion that Napster (or any other file-sharing system) can lay claim to any part of the P2P phenomenon, aside from raising awareness, is absolutely ridiculous. The notion that P2P is just now starting to gain a foothold in businesses is fiendishly drug-induced.
The hype still continues. Ignorance pervades. What they really mean is "distributed", and even then most reporters are still talking out of their asses.
--jordan
The biggest flaw (Score:2)
This is relatively simple, too. Just measure hops. Find out where the backbone routers are, then separate out any servers found on the near side of those routers, and give priority to them.
-Restil
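Measuring true hop counts takes traceroute-style probing, so as a rough, portable stand-in, a client can time a TCP connect to each candidate and prefer the fastest. A sketch with a made-up peer list, substituting round-trip time for the hop count suggested above:

    import socket
    import time

    def connect_rtt(host, port, timeout=2.0):
        """Time a TCP connect as a crude proxy for network distance."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return time.monotonic() - start
        except OSError:
            return float("inf")  # unreachable peers sort last

    # Placeholder peer list; prefer the lowest-latency sources first.
    peers = [("peer-a.example.net", 6346), ("peer-b.example.net", 6346)]
    ranked = sorted(peers, key=lambda p: connect_rtt(*p))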
Re:The biggest flaw (Score:3, Interesting)
So, for example, we know Sprint peers with UUNET, so Sprint users would see other Sprint users first and UUNET users next. Doing it at the AS level is far easier than actually attempting to map the hop distance between every arbitrary pair of points on the Internet.
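A sketch of that AS-level ranking. The ASNs, the peering set, and the address-to-ASN table are illustrative stand-ins; a real implementation would derive them from BGP or routing-registry data:

    # Prefer peers in our own AS, then in ASes we peer with directly,
    # then everyone else. All numbers and addresses are placeholders.
    OUR_ASN = 1239                 # Sprint, in this example
    DIRECT_PEERS = {701}           # UUNET
    ASN_OF = {"10.1.2.3": 1239, "10.4.5.6": 701, "10.7.8.9": 3356}

    def locality_rank(addr):
        asn = ASN_OF.get(addr)
        if asn == OUR_ASN:
            return 0               # same AS: closest
        if asn in DIRECT_PEERS:
            return 1               # one AS hop away
        return 2                   # everything else

    ranked = sorted(ASN_OF, key=locality_rank)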
Another reason why copyrights must go (Score:2)
Every indication is that the next-generation internet is going to be P2P -- probably a Freenet-type model. If we have tough copyright enforcement, it will be at odds with this.
Re:Another reason why copyrights must go (Score:1)
P2P = catchall buzzword; lacks a killer app. (Score:3, Insightful)
On the other hand, if you define peer-to-peer in a more pure sense, where each node is a peer, doing its own thing and maybe using one or more directory servers or repeaters to find others, then Napster looks like the only winner I can think of, and it's clearly dead now that it's gone legit. Most IM apps look like client-server to me, although they have some P2P aspects such as file transfer... they're not any different from IRC + DCC, really.
I interviewed with a couple of local (SF) "P2P" companies (really internet-based distributed computing platform vendors) a year ago, and they were having trouble selling their concept even then. Yes, there are CPU-intensive tasks out there that companies would pay to accomplish, but they tend to operate on a lot of data, and that data tends to be sensitive/confidential. One company was refocusing on internal deployments only -- using corporate desktops inside the firewall to run distributed tasks at night. That mostly solves the bandwidth and sensitivity issues, although in a WAN environment you might not be able to use remote LANs if the pipes to them are too small for the amount of data being crunched.
It's hard to think of too many true P2P applications. P2P architectures that don't include central directory servers or reflectors tend not to scale -- think back to old LAN protocols that didn't scale well in a WAN context. It's the same problem at a higher level. The more scalable protocols use some form of central servers, or at least a group of more centralized peers (routers, PDCs, whatever), to find one another. Pure P2P doesn't scale due to network inefficiencies (think Gnutella without repeaters); pure client-server doesn't scale due to node scalability limits. A hybrid such as Napster or the WWW scales very well, though. (The whole web isn't on one big server...)
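A back-of-the-envelope sketch of why pure flooding falls over, assuming every node has the same degree and the flood never revisits a host (real overlaps shrink the reach without making the traffic any cheaper):

    # Gnutella-style flooding: each node forwards a query to (degree - 1)
    # neighbors until the TTL runs out, so messages grow exponentially.
    def flood_reach(degree, ttl):
        return degree * sum((degree - 1) ** i for i in range(ttl))

    for degree, ttl in [(4, 7), (8, 7)]:
        print(f"degree={degree}, ttl={ttl}: ~{flood_reach(degree, ttl):,} hosts")
    # degree=4 reaches ~4,372 hosts; degree=8 reaches ~1,098,056 -- and
    # roughly one message per reached host flows through the network per query.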
With appropriate signatures, open-source software distribution might be a good P2P application. Instead of hunting around for a fast mirror, why not grab it from a peer, provided the signature is valid? Only the signature has to come from the main server (or a mirror).
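A minimal sketch of that trust model, assuming the project's own server publishes a digest for each release: the bulk bytes can come from any peer, and only the small digest has to come from somewhere trusted. The plain SHA-256 comparison here is a stand-in for a full GPG signature check, and the function name is this sketch's invention:

    import hashlib

    def verify_download(path, trusted_hex_digest):
        """Accept a peer-supplied file only if its SHA-256 digest matches
        the one published on the project's own (trusted) server."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest() == trusted_hex_digest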
The problem with that is the same one everybody finds when using BearShare, Kazaa, etc. -- upstream bandwidth from peers is very limited. ADSL, "56K" modems, and cable modems all tend to be asymmetric, limiting a P2P network run over them to the collective upstream bandwidth. Imagine 10 people with DSL trying to swap 10 files -- no matter how you slice it, everybody might as well be downloading from one guy. A P2P file-sharing program called eDonkey2000 tried to avoid the single-source problem Napster and Gnutella face by requesting files by content hash rather than filename, so multiple peers can send you slices of the file even if the name differs, and even if some of them drop out over time. It's nice for big files because you will eventually get all the parts from somebody, but it's still slow.
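The content-addressing trick is simple to sketch: identify a file by a hash of its bytes rather than its name, and hash fixed-size slices so slice i can be fetched from any peer advertising the same content and verified on arrival. eDonkey actually used MD4 over parts of roughly 9.28 MB; the MD5 and the part-size constant below are stand-ins that keep the sketch to Python's standard library:

    import hashlib

    CHUNK = 9_728_000  # eDonkey-style part size; the exact figure is illustrative

    def content_id(path):
        """Return (whole-file hash, per-slice hashes). Peers advertising
        the same whole-file hash hold the same bytes, whatever they named
        the file, so each slice can be requested from any of them and
        checked against its slice hash."""
        whole, slices = hashlib.md5(), []
        with open(path, "rb") as f:
            while True:
                data = f.read(CHUNK)
                if not data:
                    break
                whole.update(data)
                slices.append(hashlib.md5(data).hexdigest())
        return whole.hexdigest(), slices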
I think that perhaps multicasting is the only solution to this upstream-bandwidth problem. P2P plus multicasting would eliminate the problem of popular servers being swamped by requests.