Software

RSS & BT Together? (161 comments)

AntiPasto writes "According to this Yahoo! News article, RSS and BitTorrent could be set to join in a best-of-both-worlds content management system for the net. Possible?" Update: 03/17 21:39 GMT by T : Thanks to Steve Gillmor, here's the original story on eWeek to replace the now-dead Yahoo! link.
  • by Anonymous Coward on Tuesday December 16, 2003 @11:11AM (#7734719)
    Also, if you make your feeds static files, rather than dynamic, a modern server is going to have no problems serving it hundreds (or thousands) of times a minute if necessary.
  • I highly doubt it. (Score:3, Insightful)

    by junkymailbox ( 731309 ) * on Tuesday December 16, 2003 @11:16AM (#7734773)
    The article's idea is simply to make the web (or at least the RSS) distributed, and then query the distributed servers so the refresh interval can drop from 30 minutes to something faster. But the distributed servers need to be updated too. It may be cheaper and more efficient just to run more servers.
  • by clifgriffin ( 676199 ) on Tuesday December 16, 2003 @11:16AM (#7734781) Homepage
    ...practical ways. It's a nice program; I've used it on occasion, but it does have its share of bugs.

    And setting up a server isn't exactly easy.

    It really could be a lot better with some work.

  • by scrytch ( 9198 ) <chuck@myrealbox.com> on Tuesday December 16, 2003 @11:24AM (#7734856)
    Of course it hasn't caused any problems. It's a couple folks every half hour. Try a few thousand folks every minute (imagine it's a metaserver for some online game, or a blog during a major news event).

    Still, I'm not seeing anything beyond the "duh" factor here. All that needs to happen is for browsers to handle torrent links. Not some souped-up Napster app, but a browser, so that I can type in a torrent link and get any web page (or other MIME doc) for the browser to handle. Change the RSS to use the new URL scheme, and there you go. You could also do it as a proxy, but you run into worse cache coherency issues than with direct support of the protocol; who's to say who has the correct mapping of the content URL to the torrent URL?

    Good luck, mind you, on getting anything but blogs, download sites, and perhaps hobby news sites to jump on board. This issue has been beaten to death in the IETF and many other circles, and it all boils down to content control -- the NY Times simply doesn't want its content mirrored like that.
  • by penguin7of9 ( 697383 ) on Tuesday December 16, 2003 @11:31AM (#7734939)
    Sorry, guys, but you are basically reinventing USENET over TCP/IP.
  • by SWroclawski ( 95770 ) <serge@wrocLIONlawski.org minus cat> on Tuesday December 16, 2003 @11:40AM (#7735036) Homepage
    Bandwidth bills on a static page are also trivial.

    A well-behaved program won't do a GET on every RSS page; it will do a HEAD, compare it to what it already has, and decide from there whether or not to fetch the new page (see the sketch below).

    A HEAD request is very small, and unless you're doing millions of them, this shouldn't be an issue.

    - Serge
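
    A minimal sketch of that HEAD-then-GET pattern, assuming the Python "requests" library; feed_url and the stored last_modified value are placeholders the caller would supply and persist between polls:

```python
import requests

def fetch_if_changed(feed_url, last_modified):
    """HEAD the feed first; only GET the body if Last-Modified moved."""
    head = requests.head(feed_url)
    current = head.headers.get("Last-Modified")
    if current == last_modified:
        return None, last_modified          # unchanged: skip the full GET
    body = requests.get(feed_url).text      # changed (or header missing): fetch
    return body, current
```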
  • by Gadzinka ( 256729 ) <rrw@hell.pl> on Tuesday December 16, 2003 @11:55AM (#7735182) Journal
    What moron modded parent as insightful?

    Does your usenet reader serve news articles to other users?

    No, you need a costly Usenet server architecture. Not only machines, but also huge bandwidth. Today's Usenet servers that want to carry a large portion of the world hierarchies can only get it via dedicated satellite Usenet-only feeds.

    RSS+BT, on the other hand, is a poor server plus rich clients that exchange articles among themselves over a p2p network, supervised only by a BT tracker.

    Robert
  • by PierceLabs ( 549351 ) on Tuesday December 16, 2003 @12:11PM (#7735404)
    There are too many steps involved. What's needed is the ability to put content into a deploy directory where things just get torrented and distributed.

    The other problem is the relative difficulty of actually finding those 'random' websites that contain links to the things you'd actually want to download.
  • WebTorrent (Score:4, Insightful)

    by seldolivaw ( 179178 ) * <me&seldo,com> on Tuesday December 16, 2003 @12:27PM (#7735578) Homepage
    I blogged about the possibilities of using BitTorrent to deliver web content [seldo.com] back in April, but I didn't consider RSS. The idea worked out between myself and some friends was a network of transparent proxies as a way of dealing with Slashdot-style "flash crowds". When you request content, your proxy fetches it for you and simultaneously broadcasts the request to nearby machines. If any of those machines have already downloaded the content (some form of timestamp and hash is necessary to ensure it's the correct and authentic version of that URL), they send that content to you. This also allows servers already under, or expecting, heavy load to push out a new HTTP status message, "use torrent", supplying a (much smaller) torrent file (sketched below). Web servers can thus scale much better under flash-crowd conditions.

    The drawback of the WebTorrent idea is that you need some way to group all the images, text and stylesheets together; otherwise you have to make an inefficient P2P request for each one. RSS is a great way of doing that.

    There aren't many details online at the moment of the work we did on the WebTorrent idea; it was mainly an e-mail thread -- get in touch if you'd like details. The project page [seldo.com] is available, but I stopped updating it so it doesn't have all the work that was eventually done.
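
    A rough sketch of the "use torrent" fallback described above, approximating the proposed new status message with an ordinary 302 redirect to a torrent file; the load threshold and measured_load stub are made up for illustration:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

LOAD_THRESHOLD = 100        # hypothetical requests/sec cutoff

def measured_load():
    return 150              # stub; a real server would track its request rate

class TorrentFallbackHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if measured_load() > LOAD_THRESHOLD:
            # Flash crowd: answer with a pointer to a (much smaller)
            # torrent file instead of the content itself.
            self.send_response(302)
            self.send_header("Location", self.path + ".torrent")
            self.end_headers()
        else:
            # Normal load: serve the content directly.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html>normal response</html>")

if __name__ == "__main__":
    HTTPServer(("", 8080), TorrentFallbackHandler).serve_forever()
```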
  • by welsh git ( 705097 ) on Tuesday December 16, 2003 @01:22PM (#7736129) Homepage
    > A well-behaved program won't do a GET on every RSS page; it will do a HEAD,
    > compare it to what it already has, and decide from there
    > whether or not to fetch the new page.

    An even more behaved program will issue a GET with the "If-Modified-Since:" header, which means the server will return a "304 Not Modified" if the file hasn't changed, or the actual file if it has. That does in one operation what a combined HEAD and follow-up GET would take two to do.
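
    The earlier sketch reworked for that conditional GET, again assuming the Python "requests" library; the If-Modified-Since header collapses the two round trips into one:

```python
import requests

def fetch_if_changed(feed_url, last_modified):
    """One conditional GET replaces the HEAD-plus-GET pair above."""
    headers = {"If-Modified-Since": last_modified} if last_modified else {}
    r = requests.get(feed_url, headers=headers)
    if r.status_code == 304:
        return None, last_modified                 # not modified: no body sent
    return r.text, r.headers.get("Last-Modified")  # new body plus new stamp
```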
  • Re:WebTorrent (Score:3, Insightful)

    by seldolivaw ( 179178 ) * <me&seldo,com> on Tuesday December 16, 2003 @01:34PM (#7736256) Homepage
    Even better, why not let the format of the manifest be XML, and let the data compression be handled by HTTP gzip compression? In which case, your JAR files become RSS feeds...
  • modtorrent (Score:2, Insightful)

    by Isbiten ( 597220 ) <isbiten@gmail. c o m> on Tuesday December 16, 2003 @02:32PM (#7736957) Homepage
    What I would like to see is a mod_torrent for Apache, where you could specify that files larger than 20MB get sent as a .torrent instead. It wouldn't require you to make a .torrent manually; instead it would create one when a file was first requested, and put it in a directory so it was ready to serve the next time someone wanted it. This would work great if you want to host large files such as movies and demos on your site (see the sketch below).
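
    A sketch of what that logic might look like, written here in Python rather than as an Apache module; the tracker URL and the 256 KiB piece size are arbitrary assumptions:

```python
import hashlib
import os

PIECE_LEN = 256 * 1024                    # assumed piece size
THRESHOLD = 20 * 1024 * 1024              # the 20MB cutoff from above

def bencode(obj):
    """Minimal bencoder covering the types a single-file torrent needs."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, str):
        obj = obj.encode()
    if isinstance(obj, bytes):
        return b"%d:%b" % (len(obj), obj)
    if isinstance(obj, dict):             # keys must be sorted, per the spec
        return b"d" + b"".join(bencode(k) + bencode(obj[k])
                               for k in sorted(obj)) + b"e"
    raise TypeError(type(obj))

def make_torrent(path, announce):
    """Hash the file piece by piece and build the metainfo dictionary."""
    pieces = b""
    with open(path, "rb") as f:
        while chunk := f.read(PIECE_LEN):
            pieces += hashlib.sha1(chunk).digest()
    return bencode({
        "announce": announce,
        "info": {
            "name": os.path.basename(path),
            "length": os.path.getsize(path),
            "piece length": PIECE_LEN,
            "pieces": pieces,
        },
    })

def path_to_serve(path, tracker="http://tracker.example.com/announce"):
    """Small files are served as-is; big ones get a cached .torrent."""
    if os.path.getsize(path) <= THRESHOLD:
        return path
    torrent = path + ".torrent"
    if not os.path.exists(torrent):       # create on first request, reuse after
        with open(torrent, "wb") as f:
            f.write(make_torrent(path, tracker))
    return torrent
```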
  • by penguin7of9 ( 697383 ) on Tuesday December 16, 2003 @02:37PM (#7737023)
    Does your usenet reader serve news articles to other users?

    Yes: the way people traditionally read USENET news is by becoming a USENET node, downloading articles to the directory hierarchy of the local machine, and then redistributing them to neighboring sites. Reading news by connecting to centralized news servers via a network client happened many years later.

    No, you need a costly Usenet server architecture.

    There is nothing intrinsically "costly" about it: it's something a PDP-11 used to handle and that regularly ran over dial-up.

    Not only machines, but also huge bandwidth. Today's Usenet servers that want to carry a large portion of the world hierarchies can only get it via dedicated satellite Usenet-only feeds.

    Just like a BT solution, you only redistribute those articles that you yourself are interested in.

    The reason why we got a USENET infrastructure with a small number of backbone sites (compared to the readership) that carried everything is simply that a bunch of sites took on that role and carried everything. There is nothing in the protocol or design of USENET that requires it.

    RSS+BT, on the other hand, is a poor server plus rich clients that exchange articles among themselves over a p2p network, supervised only by a BT tracker.

    And you believe that BT and the BT tracker scales up to many billions of files on millions of nodes by sheer magic? BT would probably need a lot of work to scale up. And at least USENET doesn't need any supervision by anything--it's completely asynchronous and unsupervised.

    Note that I did not claim that USENET would work any better than RSS+BT--I have no idea whether it would--simply that people are basically reinventing USENET when they combine RSS and BT.

    I actually suspect that there are intrinsic properties of large peer-to-peer news networks that people don't like; that would explain why USENET became more and more centralized over the years.

    What moron modded parent as insightful?

    That's what I would ask about your posting. In fact, I would ask what moron wrote it.
  • by ikewillis ( 586793 ) on Tuesday December 16, 2003 @04:04PM (#7738118) Homepage
    The problem with attempting to cobble BitTorrent onto an RSS feed system is that BitTorrent would still utilize a "pull" model for distributing the syndication data, but instead of directly fetching the XML document syndicators would grab a .torrent file. While this may decrease the bandwidth used, it only solves half of the problem. What really needs to be addressed is the "pull" model being used to fetch the RSS document in the first place.

    A better solution would be to eliminate the need for syndicators to constantly poll for RSS updates by using IP multicasting to notify them when the content of a particular RSS feed has changed. Multicast protocols that provide such announcements already exist, such as the Session Announcement Protocol [faqs.org], which could be used to notify anyone interested that a feed has been updated. A URL to the updated feed would be provided, and afterwards whatever file transfer protocol you wish could be used to fetch the feed itself, even BitTorrent (see the sketch below).
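
    A toy version of that announce/listen split, using a made-up multicast group and a plain-text payload rather than real SAP packets (SAP proper, per RFC 2974, uses 224.2.127.254:9875 and its own binary header):

```python
import socket

GROUP, PORT = "239.255.42.42", 9875       # arbitrary group/port for this sketch

def announce(feed_url):
    """Publisher: one datagram tells every listener the feed changed."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    sock.sendto(("updated " + feed_url).encode(), (GROUP, PORT))

def updates():
    """Subscriber: join the group and yield feed URLs as they arrive."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = socket.inet_aton(GROUP) + socket.inet_aton("0.0.0.0")
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, _ = sock.recvfrom(4096)
        yield data.decode().split(" ", 1)[1]   # then fetch via HTTP or BT
```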
