Roll Your Own Television Network Using Bittorrent 252

Cryofan writes "Mark Pesce, lecturer at the Australian Film Television and Radio School (AFTRS), writes here and here about using P2P networks, specifically BitTorrent, to create a grassroots television network. He cites the BBC's "Flexible TV" internet broadcasting model as an example, using it as the core of a "new sort of television network, one which could harness the power of P2P distribution to create a global television network." Producers of video entertainment and news would feed a single copy of a program into the network of P2P clients, and the peers would distribute the content among themselves. The result would be a virtual 'newswiki' where content is distributed over BitTorrent, with some sort of 'trusted peer' or moderator system serving as a filtering/evaluation mechanism. So what is stopping anyone from doing this now? Awareness of the concept, perhaps? Lack of broadband connections? Lack of business models for content producers?"
This discussion has been archived. No new comments can be posted.

  • by thenightisdark ( 738700 ) on Tuesday October 05, 2004 @07:35PM (#10445895) Homepage
    All someone would need to run a station is an RSS feed. Everyone would download .torrents based on the RSS, and boom, instant 'station'. Hell, I might pay someone to access their RSS feed for this purpose.
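
(To make the RSS idea above concrete, here is a minimal sketch of such a feed poller, assuming Python 3 and a hypothetical feed URL; each feed item is assumed to carry its .torrent as an enclosure, and a watch-folder BitTorrent client would pick up whatever lands in the save directory.)

    # Sketch: poll a "station" RSS feed and save any .torrent enclosures it lists.
    # The feed URL and save directory are hypothetical placeholders.
    import os
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://example.org/station.rss"
    SAVE_DIR = "torrents"

    os.makedirs(SAVE_DIR, exist_ok=True)
    with urllib.request.urlopen(FEED_URL) as response:
        feed = ET.parse(response)

    for item in feed.iter("item"):
        enclosure = item.find("enclosure")
        if enclosure is None:
            continue
        url = enclosure.get("url", "")
        if not url.endswith(".torrent"):
            continue
        destination = os.path.join(SAVE_DIR, os.path.basename(url))
        if not os.path.exists(destination):   # skip torrents already fetched
            urllib.request.urlretrieve(url, destination)
            print("queued", destination)
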
  • Huh? (Score:1, Interesting)

    by Anonymous Coward on Tuesday October 05, 2004 @07:37PM (#10445917)
    Haven't people used bit torrent to download TV shows for as long as it has existed?

    Ohhh, you mean legitimately!
  • by im_thatoneguy ( 819432 ) on Tuesday October 05, 2004 @07:38PM (#10445924)
    Whenever a new episode of Stargate comes out, a bittorrent streams it live as it is created... I'm not sure exactly how they're doing it, but they're doing it. The reason nobody is doing it legally is that the distributors, i.e. the local broadcasters and satellite/cable companies, pay them for usage. It's an extra dollar they wouldn't make. Actually, it's an extra million dollars they wouldn't make.
  • by TiggertheMad ( 556308 ) on Tuesday October 05, 2004 @07:44PM (#10445978) Journal
    There is a public access cable station where I live, so my first thought was: why bother? Do we really need that funny guy who lives by the old slaughterhouse broadcasting his theories about alien brain implants worldwide?

    From the standpoint of news broadcasting, though, this could be really big. Set up a /. type site with a moderation system, and let people submit their own footage of local news stories. You would get excellent coverage (as with OSS, many eyes are a good thing), and it would be hard to censor stories. Localization/translation might be tricky, though...
  • P2P (Score:3, Interesting)

    by dwight0 ( 513303 ) on Tuesday October 05, 2004 @07:46PM (#10445999) Homepage
    Has anyone thought of using a P2P network such as Gnutella or eDonkey/eMule for this? What if the provider's webpage had a link containing the file hash, which eMule would then find and download automatically? The content is secure because it's very difficult to generate a forged file matching a given hash, so a 'trusted peer' moderator wouldn't be needed. eMule is very good at redistributing content across its entire network even if you're not actively downloading it yourself; it spreads rare files across the network to ensure that all content stays accessible. Any comments on this? This would also be useful for general file sharing.
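
(A quick sketch of the hash-as-trust-anchor point above: if the producer publishes a digest alongside the link, any client can check the finished download against it. SHA-1 is used here only for illustration; eDonkey's real ed2k hashes are MD4-based, and the filename and digest below are placeholders.)

    # Sketch: verify a finished download against a digest published by the producer.
    # The filename and expected digest are placeholder values.
    import hashlib

    FILENAME = "episode01.avi"
    EXPECTED_SHA1 = "da39a3ee5e6b4b0d3255bfef95601890afd80709"  # placeholder

    digest = hashlib.sha1()
    with open(FILENAME, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read 1 MiB at a time
            digest.update(chunk)

    if digest.hexdigest() == EXPECTED_SHA1:
        print("content matches the published hash")
    else:
        print("mismatch: discard this copy")
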
  • by Brigadier ( 12956 ) on Tuesday October 05, 2004 @07:50PM (#10446035)

    A friend and I produce a little half-hour news talk show which we broadcast on local cable channel three. Now we are looking to get it onto our local PBS station. Costs are negligible. My friend, who is a tech freak, has the latest G5 with a DV card and a high-end Sony cam (about $5000 in hardware). Studio time is free thanks to cable regulations. (If you're not aware, the FCC requires cable operators to provide free service and equipment to local users.) For us this included a studio with 3 mounted cameras, an editing room and post-production equipment. The scarcest resource is time, but for someone who is dedicated, that's the price we pay.
  • Torrentocracy (Score:4, Interesting)

    by lerhaupt ( 231905 ) on Tuesday October 05, 2004 @07:51PM (#10446037) Homepage
    Check out Torrentocracy [torrentocracy.com] for a way to download bittorrented content from RSS feeds straight to your TV. As for content, that's the major stumbling block. There need to be more people willing to license under Creative Commons. To that end, I'm also currently hosting [torrentocracy.com] interviews from Robert Greenwald's last two movies, Outfoxed and Uncovered.
  • Multicast (Score:3, Interesting)

    by Doc Ruby ( 173196 ) on Tuesday October 05, 2004 @07:51PM (#10446043) Homepage Journal
    This kind of app makes BitTorrent into a P2P multicasting network. Finally, URIs (Uniform Resource Identifiers) for media objects aren't limited to URLs (Uniform Resource Locators), constrained by network topologies like bandwidth and persistence. Where's the streaming version for media play that doesn't need saving, with buffering and caching for a truly distributed media cloud? All the multicast experimenters, from MBONE to Internet2 and beyond, should jump on this platform, finally meeting rubber with road on the infobahn.
  • by YOU LIKEWISE FAIL IT ( 651184 ) on Tuesday October 05, 2004 @07:53PM (#10446061) Homepage Journal

    I attended this talk at the National Student Media Conference last weekend (for any other attendees, I was the NSMC volunteer managing the digital projectors), and it was interesting to see the ideas mooted here percolating out into the other panels that took place over the rest of the conference. I think the independent media needs to continue to forge closer ties with the tech community to allow things like this to come to fruition.

    One thing that didn't get brought up was whether this will compete with or complement Indymedia's upcoming IVDN video distribution framework. I was hoping to chase Mark up on this after the conference, but lost his email address - thanks submitter!

    YLFI

    P.S., Mark, if you're reading this, I crashed in your suite on Sunday night - thanks for the keys. :-P

  • by Black Acid ( 219707 ) on Tuesday October 05, 2004 @08:06PM (#10446151)
    Whenever a new episode of Stargate comes out a bittorrent streams it live as it is created...

    Is this possible with BT, considering that it sends out blocks in non-sequential order and the .torrent file contains SHA-1 hashes of the blocks? eDonkey sends out blocks in random order as well, in order to optimize against the rare-missing-block problem. I think this is a good optimization to make, especially on file distribution networks, but it sacrifices the ability to stream (as far as I know). Anyone know more about this?
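
(A rough illustration of the constraint raised above: the .torrent lists one SHA-1 digest per fixed-size piece, so pieces can be verified in any order, but playback can only advance through the contiguous prefix of verified pieces. This is a simplified toy, not any real client's logic.)

    # Sketch: per-piece verification and the "playable prefix" problem for streaming.
    # In a real client the digests come from the .torrent metainfo; here three tiny
    # fake pieces stand in for the file.
    import hashlib

    def verify_piece(data, expected_sha1):
        # A piece becomes usable only once its SHA-1 matches the published digest.
        return hashlib.sha1(data).hexdigest() == expected_sha1

    def playable_prefix(verified_indices):
        # Playback can advance only through the contiguous run of verified pieces,
        # no matter how many later pieces have already arrived.
        n = 0
        while n in verified_indices:
            n += 1
        return n

    pieces = [b"piece-0", b"piece-1", b"piece-2"]
    digests = [hashlib.sha1(p).hexdigest() for p in pieces]

    verified = set()
    for index in (2, 0):                     # pieces arriving out of order
        if verify_piece(pieces[index], digests[index]):
            verified.add(index)

    print("playable up to piece", playable_prefix(verified))  # -> 1 (piece 1 missing)
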

  • by reality-bytes ( 119275 ) on Tuesday October 05, 2004 @08:07PM (#10446166) Homepage
    Peercast [peercast.org] already allows for P2P video streams in most popular formats.

    I've had a go with it and it's not too shabby.

    With clients for Mac, Linux and Windows, availability is good. Unfortunately, Peercast doesn't advertise itself too well, which means there aren't many video streams available yet (typically 5-15 video streams and 100 or so audio streams).
  • Slashdot TV (SDTV)? (Score:1, Interesting)

    by Anonymous Coward on Tuesday October 05, 2004 @08:15PM (#10446221)

    How about Slashdot TV! 24 hour Nerd News.

    Slashdot effect the world!

    ~-~
    Anonymous Coward - The one and only
  • by dubiousmike ( 558126 ) on Tuesday October 05, 2004 @08:58PM (#10446533) Homepage Journal
    How will you measure the effectiveness of your efforts?
  • BBC... (Score:4, Interesting)

    by Insipid Trunculance ( 526362 ) on Tuesday October 05, 2004 @09:06PM (#10446589) Homepage
    Can someone shed some light on the BBC's Flexible TV?

  • by glengyron ( 452198 ) on Tuesday October 05, 2004 @09:22PM (#10446702)
    OK, so bit-torrent is the technology to move the data, but where is the content going to come from?

    The obvious answer to my mind is bloggers.

    Imagine getting your news not from CNN or Fox, but from someone on the ground, living in an apartment in Baghdad while it's being bombed.

    Get news reports on SCO vs. Everyone not just from the media and court filings, but actually see images of the court building where it's all happening, with bloggers telling us how they think the proceedings are going at the moment.

    Blogging is the news network of tomorrow, and this is how it will be done.

  • by dubiousmike ( 558126 ) on Tuesday October 05, 2004 @10:00PM (#10446903) Homepage Journal
    I mean really, the lousiest thing about P2P after Napster and the other centrally located server-peer-peer services got wiped out is where it leaves you: distributing content with no statistics gathered. I don't care where you sit on the ladder of content builders; if you can't back up your claims about your share of the market, you eventually lose all funding. Even grant givers want to give to someone who will likely be successful, so that their name comes out somewhere. It's still a business, and the ones who will win will be the ones who provide a usable experience. There are services that are close.
  • by hyc ( 241590 ) on Tuesday October 05, 2004 @11:02PM (#10447217) Homepage Journal
    Think about how the network bandwidth is being used in BitTorrent - I open up connections to as many providers as I can find, and download data from them. Other clients do the same, and try to download data from me. The exact same data will go back and forth across my connection multiple times. And, across the entire network, there are N nodes connecting to as many of each other as possible, a mesh of size NxN, and each of those connections is carrying essentially the same data. As N grows, the amount of resources required to maintain those NxN connections grows quadratically. You cannot sustain that kind of growth rate; the physical network will collapse when the system gets popular enough.

    There are also other, more immediate practical limits. Many users now are connecting via broadband, which is great, but the transfer speeds you get are asymmetric. So even though you have a very fast potential download rate, your upload rate is very limited. In a bittorrent setup where every peer's download rate is proportional to their upload rate, this means you are inherently unable to utilize your full download capability, because your upload rate cannot match it.

    Also, a lot of users are attached to the same network providers. The routers in those networks are transmitting the same data over and over to each of the individual users. That's an unnecessary waste of bandwidth.

    A sane approach would be to have a program guide (like TV Guide) published at a well-known URL that tells you what content will be available in what multicast group at what times. Your client software will join the multicast groups of interest at about those times. This informs the routers upstream from you that you're interested in a particular channel. When the multicast begins, the sender just needs to send one copy of the data to its local network, and all of the routers on that network will fan out one copy of the data per target network. This immediately reduces the network resource load from the NxN nightmare to a near-constant value based on the size of the network, as opposed to the number of clients. And in the common case where you have a bunch of broadband subscribers all connected to the same router, the data only traverses that network link *once*, no matter how many subscribers there are. No more wasting bandwidth with N copies of the identical data. And, everyone gets the data at their full download speed.

    You may think that bittorrent/suprnova are successful *right now* but they can never hope to reach an audience of millions, the way a TV network does. The internet would melt down long before they could do so.

    Any design whose network resource consumption scales based on the number of users is doomed to be a victim of its own success. But the approach I've described will scale efficiently because it's only dependent on the number of routers in the network, not on the number of listeners trying to receive the content.
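
(A minimal sketch of the receiver side of the multicast approach described in the comment above, assuming the program guide has advertised a group address and port; the address, port, and payload handling are made-up examples, and the routers along the path would still need to support IP multicast.)

    # Sketch: join an IP multicast group advertised by a program guide and read it.
    # The group address and port are hypothetical values such a guide might list.
    import socket
    import struct

    GROUP = "239.1.2.3"   # administratively scoped multicast address (example)
    PORT = 5004

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Joining the group tells the local router we want this channel; upstream
    # routers then forward a single copy toward this network, however many
    # listeners join behind it.
    membership = struct.pack("4s4s", socket.inet_aton(GROUP),
                             socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

    while True:
        packet, sender = sock.recvfrom(65535)
        # A real client would hand the packet to a decoder; this just counts bytes.
        print(len(packet), "bytes from", sender)
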
  • by br00tus ( 528477 ) on Tuesday October 05, 2004 @11:51PM (#10447543)
    v2v [v2v.cc], a grassroots news network associated with Indymedia, is currently doing this.
  • Think about how the network bandwidth is being used in BitTorrent - I open up connections to as many providers as I can find, and download data from them. Other clients do the same, and try to download data from me. The exact same data will go back and forth across my connection multiple times. And, across the entire network, there are N nodes connecting to as many of each other as possible, a mesh of size NxN, and each of those connections is carrying essentially the same data. As N grows, the amount of resources required to maintain those NxN connections grows geometrically. You cannot sustain that kind of growth rate, the physical network will collapse when the system gets popular enough.
    I will admit to only skimming the whitepaper at bitconjurer, but my impression of BT's operation is different from what you describe. Notably, the tracker is doing things somewhat more intelligently. A downloader does not open as many connections as possible to other nodes; instead, a downloader is instructed by the tracker as to specifically which node(s) it shall connect to at a given moment, with the tracker handling the logic to optimize distribution in terms of the most efficient use of resources (e.g. making use of high bandwidth when apparently available in the up/down directions, minimizing risks associated with nodes dropping out unexpectedly, etc.). If my notion of BT operation is accurate, then proving that BT can scale is probably beyond me, but making a solid case that it can't scale would require more than the NxN model.
  • by Otto ( 17870 ) on Wednesday October 06, 2004 @01:00AM (#10447874) Homepage Journal
    The tracker can be more efficient, but in order to reach anywhere near that kind of real efficiency, it would need more information than it actually has.

    Firstly, it can only make educated guesses at the available bandwidth of the nodes. Nodes will lie/cheat/steal in order to get more packets, and you can't trust the clients. They're greedy.

    Secondly, it doesn't really know the network topology. Again, you're only able to make educated guesses. If my neighbor and I are on the same torrent, then ideally the tracker would be able to tell us about each other; we'd connect and share at very high speeds, being that we're both close to each other and on the same subnet and such. That case might be easy to recognize sometimes, not so easy other times. Without full knowledge of the whole network, it's impossible to do perfectly in any case.

    Third, even with the most efficent possible tracker, the grandparent is right. You have X users downloading, and they all are downloading Y bits of data. All data transfer is point to point, meaning that X*Y bits of data must be sent out for everybody to get the complete file. For every byte downloaded, there's a byte uploaded. You can make that fast by maximizing your throughput and managing it all into small sub-networks, but it still doesn't scale to everybody in the world.

    A multicast setup does scale, even if it is a pain in the ass to do right now. One byte sent out from the source gets duplicated for each branch in the routing tree, and all users receive it. Upload rate is constant. If you ignore new users joining and old users leaving, traffic along each branch in the tree is only one copy of the stream, all the way until it reaches the endpoints (the viewers).

    The problem with multicast is that it's confusing as hell, because it requires the cooperation of all the routers to handle the multicast traffic appropriately. But for any single source to many receivers, it's easily the most efficient way to do things.

    And let's not forget that while torrent trackers *could* be more efficient, they are quite simply not that efficient. The torrent network is often highly connected instead of sparsely connected, especially on larger files. A sparser network would be more optimal (read as: faster) in extremely large torrents, but that is rarely the case currently.
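
(To put rough, made-up numbers on the X*Y argument in this thread: with a million viewers and a 350 MB episode, the swarm as a whole must upload on the order of 350 TB, whereas an ideal multicast source sends the episode once.)

    # Back-of-envelope comparison with made-up figures (not measurements).
    viewers = 1_000_000          # X: number of downloaders
    episode_mb = 350             # Y: size of one episode, in megabytes

    # Point-to-point: every byte downloaded must be uploaded by some peer or seed,
    # so the swarm collectively uploads roughly X * Y megabytes.
    swarm_upload_tb = viewers * episode_mb / 1_000_000
    print(f"swarm upload: ~{swarm_upload_tb:.0f} TB")

    # Ideal multicast: the source sends one copy and routers duplicate it per branch.
    print(f"multicast source upload: ~{episode_mb} MB")
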
  • Good Content Idea (Score:4, Interesting)

    by ImaLamer ( 260199 ) <john@lamar.gmail@com> on Wednesday October 06, 2004 @02:38AM (#10448249) Homepage Journal
    I've mentioned this [slashdot.org] on Slashdot before, a few times, but this type of thing is a good candidate for educational programming, not the news.

    If someone (PBS?) could release all of their educational content under a non-restrictive license, I'd happily pay for the dedicated servers to host and track the torrents. Math, history and science programs would get even the adults involved, and would be a great resource for home-schoolers or parents who want to keep their children occupied when home sick from school.

    I don't know why we Americans have not done this already. I suspect that bandwidth is an issue, but that is somewhat silly, as it is otherwise wasted on illegal downloads and that sort of thing.

    There should be a public education page that acts as an entry point for materials for students and teachers alike. Think "cable in the classroom" turned into "internet in the classroom". Why haven't a few public school teachers already gotten together and made this a reality? 30-minute shows aren't that hard to make. Take your lesson plan and turn it into a script. Read it, or hire someone to, and voilà.
  • The BBC and iMP (Score:2, Interesting)

    by PhillC ( 84728 ) on Wednesday October 06, 2004 @05:12AM (#10448673) Homepage Journal
    This is pretty much exactly what the BBC is currently trialling with their own product called iMP [bbc.co.uk] or interactive Media Player.

    Their own webpages are a little light on content and mostly aimed at helping out the Beta testers, but more useful information can be found on various [digital-lifestyles.info] sites. [theregister.co.uk]

    iMP is a P2P client that allows distribution of BBC programmes. There is a DRM component that stops a programme from being watched more than 7 days after downloading. iMP is a great idea for the BBC, as it has the potential to significantly reduce infrastructure costs in terms of streaming and network bandwidth. A big question for me, though, is how robust their DRM technology will prove to be.
