Cringely on P2P vs Streaming Data Centers 179

Anonymous Coward writes "Robert X. Cringely is postulating today that as bandwidth-hungry applications grow, data centers will never be ready to serve 30 million concurrent streams of data. Akamai, with its tens of thousands of servers spread in an intelligent topology, still can't serve more than 150,000 concurrent streams, which is never going to impress a TV network exec used to audiences in the millions. Cringely argues that secure P2P is the solution for delivering not just high-quality video but audiences that scale into the millions. BitTorrent seems to have worn out its welcome with the MPAA recently, so maybe the future holds P2P networks owned and managed by Hollywood?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Change the paradigm (Score:3, Interesting)

    by cos(0) ( 455098 ) <pmw+slashdot@qnan.org> on Saturday February 25, 2006 @07:26PM (#14801870) Homepage
    Sure, currently 150,000 copies of the data put a large strain on the servers... but what about one copy broadcast via multicast, much akin to the airwaves?
    • by afidel ( 530433 )
      Yep, but there has to be a serious profit motive for the network providers because they will have to do a LOT of work to get multicast working reliably across their entire network.
    • Don't you mean "go back to the old paradigm"? Isn't the whole appeal of IP based content distribution to get away from that model? Content on demand, yada yada yada?
      • How about if, for everything that's being downloaded, the server multicasts packets in a loop? I mean, so that if you don't catch the first five minutes, your computer will just keep downloading until it loops back to the start again, and downloads the first five minutes it missed.

        Obviously this won't work for live streaming, but a similar method could be employed: in particular, it could start multicasting a movie every 5 minutes, so that you'll never be more than 5 minutes away from the start of a movie, and ser
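The staggered-start idea above has a simple cost model: restarting the broadcast every few minutes multiplies the number of concurrent multicast copies, but the total is fixed regardless of audience size. A rough sketch (the 18 Mb/s stream rate is an assumed HDTV figure, not from the thread):

```python
import math

def carousel_cost(movie_minutes, stagger_minutes, stream_mbps):
    """Number of simultaneous multicast copies needed when a new
    showing starts every `stagger_minutes`, plus total bandwidth.
    Viewers never wait more than `stagger_minutes` for a start."""
    copies = math.ceil(movie_minutes / stagger_minutes)
    return copies, copies * stream_mbps

# A 120-minute movie restarted every 5 minutes needs 24 concurrent
# multicast streams -- 432 Mb/s total, however many people watch.
print(carousel_cost(120, 5, 18))
```

The key property: cost scales with movie length and stagger interval, never with audience size, which is exactly what makes it attractive next to unicast streaming.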
    • Multicasting will deal with the challenge of distributing a single live event. However, TV networks are moving into Video On Demand as quickly as they can. They will probably have to invest in two distribution bases.

      1) Multicast for "Regularly scheduled programming"
      2) P2P for day after and future VOD distribution.
    • Then how do you control it? It's the same problem with radio. At least with radio you make the majority of profit from sponsors and advertising, so there's no need to control distribution (not to mention the fact that it's relatively cheap to set up a radio station). So it's 'ok' if you have no control over who hears the content. (More ears = more audience = more sponsors)

      But when you put it online (multicasting, BitTorrent, whatever) how do you tell what's your audience? You can't track them; hackers would go insane and tear the tracking code out.

      • You use encryption and locked-down client software/hardware like iTunes or Akimbo. (Of course, anything can be cracked, so your system only has to be more secure than DVDs.)
        • Bingo. What, exactly, is the difference between multicast on the 'net and DirecTV? Both broadcast to everyone, both are only supposed to be used by paying customers. DirecTV does it successfully, so does Dish Network. And there are satellite TV companies in other countries as well.

          So why can't they do it with Multicast?

          As for figuring out how many people are watching, another reply has it right: we don't know now, so worst case scenario, what changes there?

      • But when you put it online (multicasting, BitTorrent, whatever) how do you tell what's your audience? You can't track them; hackers would go insane and tear the tracking code out.

        I know! Imagine if television signals were broadcast over the air, to cathode ray tube based devices with little to no digital components at all, and no way for viewing data to be sent back to the broadcaster?

        Oh wait, that's the way it's worked for over 50 years. And there's a multibillion dollar ratings collection company
      • How exactly do you count the number of television watchers or radio listeners? It's easy... you don't...

        The TV companies have no idea how many people are watching them; they believe the poll results given to them by polling companies that are in close connection with them and are therefore not objective... There are automatic meters that can be placed between the TV and the antenna, but those only measure the viewing statistics of people who actually have the meter installed (in my country there is a
        • Radio licensing is only expensive because there is demand, and the govt sees it as a free cash cow, charging
          $0000000000000's worth so the station will have a hard time recovering the cost.

          Hardware-wise it's peanuts. Hell, it's probably cheaper to pay $10m to make a sat and $20m to launch it from Russia than to
          pay the local govt $80m for a damn licence. And go broadcast from geostationary orbit.

          Imagine if the govt suddenly made a 'website licence' and charged people $1000/yr. Or a streaming media licence for
          $10/gig/year or some
        • You know, there is a science called statistics. And statistics can tell you how many households you need to get a certain margin of error for your measurement. As long as the households are randomly selected (not too hard to do), it's accurate to a percent or so even with a smallish sample.
          • Not when the sample is self-selected, as is the case in the Nielsens. That *always* skews the data, as you would know if you actually had some experience working with statistics.

            Max
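On the sampling dispute above, the standard margin-of-error formula makes the size/accuracy tradeoff concrete (this is the textbook normal approximation, not anything from the thread; z = 1.96 is the 95%-confidence multiplier):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for estimating a viewing share p from a
    simple random sample of n households (normal approximation).
    p = 0.5 is the worst case, giving the widest interval."""
    return z * math.sqrt(p * (1 - p) / n)

# Roughly 1,000 randomly selected households already pin a rating
# down to about +/-3 percentage points.
print(round(margin_of_error(1000) * 100, 1))
```

As the reply notes, this only holds for a genuinely random sample; self-selection (households that agree to host a meter) is a bias the formula cannot see.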
      • The iTunes Store does fairly well as a centralized system, but even Apple has admitted that its profits from it are virtually a joke in terms of actual cash.

        iTunes is not wildly profitable because the Record Companies said "give us X% or we won't give you access to our catalogues"

        Apple got their foot in the door and is laughing all the way to the bank. They could lose money on iTunes and still be laughing, all because the iPod is making a killing.

        Now, Apple has enough muscle to tell the **AA to go pound s [wordorigins.org]

      • Wait a sec, there's no tracking code in analog radio or television. They do it by sample surveys. Why can't they do the same for internet multicast?

        Cheers.
      • Piracy is tough to stop, but really, do you need to? You can make it free with ads in it. Sure, the ads may be worth less, but prolly enough to cover costs. And you can track it, and charge advertisers by download or torrent success.

        And hackers want to be counted. Honestly, they'd try harder, because we watch stuff like Firefly that we love and want to do well, but doesn't pull in numbers. So I'd be worried about the opposite effect: People downloading 2-3 times in order to boost a show's numbers.
    • P2P narrowcasting so that everyone can watch their favorite show any time of day _they choose_ is like telling everyone in a crowded hot tub to move to the other side simultaneously.

      p2p Broadcasting a single feed is like having everyone shift over one seat.

      You get to sit next to the jet the same amount of time. But you may not get to sit there when you choose.
  • Hollywood hasn't soured on BitTorrent itself, only a bunch of w4r3z tracking sites.
  • by Osrin ( 599427 ) * on Saturday February 25, 2006 @07:30PM (#14801882) Homepage
    What happened to all the multicast and proxy technology that we have spent the last 10+ years working on to solve this problem?
    • by Russ Nelson ( 33911 ) <slashdot@russnelson.com> on Saturday February 25, 2006 @08:16PM (#14802039) Homepage
      Whatever happened to the MBONE? I see that a book on the subject is now posted to the web and freely copyable because it's gone out of print. The MBONE FAQ dates from 1993. That's like (/me whips out his HP-41C calculator) 13 years old. Apparently the IETF has a group for MBONE Deployment, but it hasn't been updated since last September, and even then it was a year late for its final milestone.
      • IPv4 multicast across the Internet will never happen.

        The reason is the complexity involved in deployment (multiple protocols, MBGP, MSDP, etc.) and the 'third-party problem': both transmitters and receivers have to rely on a third party for a rendezvous point.

        Scalable Internet-wide multicast deployment *might* happen with IPv6, because some of the issues have been solved (using, for example, embeddable rendezvous points, negating the need for third parties). However, if you look at how IS
      • The people selling bandwidth don't want to deploy it.

        At least that's what I've heard and it makes sense. Maybe the market pressures that cause power companies to give you rebates on EnergyStar gear could come into play.

        Or maybe a media enterprise will gobble up a tier one provider and make it happen to make multicast TV happen for their customers.
    • by Danathar ( 267989 ) on Saturday February 25, 2006 @08:57PM (#14802142) Journal
      Multicast has been deployed on Internet2 for some time now. I've watched 720p streams multicasted from Europe with no problem.

      The problem with deploying it on the commercial Internet is political. Backbone commercial Internet providers have had multicast on for a LONG time. The ISPs that give you your home broadband connection, which are mostly cable TV operators and companies like Verizon, don't want to provide a cost-effective way for content providers on the net to deliver video. They would rather charge you for their "middleman" service. It's not like they don't know how to enable it; all they need to do is enable it on their switches and routers.

      Most cable operators use multicast already to stream the channels through their set top boxes.

      In Britain, the BBC is working with ISPs to multicast to broadband connections. It would REALLY be nice if something similar happened here (in the U.S.).

      http://www.bbc.co.uk/multicast/ [bbc.co.uk]
      • The problem with deploying it on the commercial Internet is political. Backbone commercial Internet providers have had multicast on for a LONG time.

        That's not true. Having multicast turned on to support OSPF is not the same thing as multicast routing, which is what's necessary to support multicast feeds.

        The major problem with deploying multicast Internet-wide is management and security. ISPs would have to accept multicast routing information from their neighbors and trust they know what they're doing,

        • Not debating your technical points, but why has this not happened on I2, or GEANT, or any of the other large-scale research networks with hundreds of thousands of combined nodes? For all the doom and gloom predicted if multicast were deployed widely, there have not been many large attacks on multicast on those networks, and arguably if there were to be some hacking/experimentation done you'd find it there first (just a theory).
    • > What happened to all the multicast and proxy technology that we have spent the last 10+ years working on to solve this problem?

      The same as happened with IPv6? The technology is right here, but currently almost nobody cares to use it...
  • Figures (Score:5, Insightful)

    by Kawahee ( 901497 ) on Saturday February 25, 2006 @07:32PM (#14801888) Homepage Journal
    "Akamai, with its tens of thousands of servers spread in an intelligent topology, still can't serve more than 150,000 concurrent streams"

    Assuming Akamai has only 10,000 servers, that's 15 streams per server. C'mon now, we're not that stupid.
    • by artemis67 ( 93453 ) on Saturday February 25, 2006 @07:51PM (#14801957)
      The Akamai figures are the embellishment of the submitter... Cringely doesn't mention Akamai anywhere in the article.
      • by Anonymous Coward
        We are serving around 120k concurrent streams on Akamai every day, though we throw their model for a loop. Akamai's network is geared for global broadcast; we are radio stations, so we have many individual broadcasts and the demand on them is local to the broadcast point.

        In some datacenters Akamai has only a few servers, so the logic of picking the closest server to the listener can backfire if that datacenter has limited capacity.
    • Assuming Akamai has only 10,000 servers, that's 15 streams per server. C'mon now, we're not that stupid.

      Maybe they're just short on bandwidth? 150,000 HDTV video streams is a hell of a lot of bits per second. Actually, it's about a third of a terabyte per second.

      I'm willing to bet that Akamai is more focused on sending large numbers of people 10k files periodically than on sending 18 Mb/s video streams.
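The "third of a terabyte per second" figure above checks out; the arithmetic, using the 18 Mb/s HDTV rate the same comment assumes:

```python
# Sanity check: 150,000 HDTV streams at 18 Mb/s each (the rate
# assumed in the comment above), expressed in decimal terabytes/sec.
streams = 150_000
mbps_per_stream = 18
total_bits_per_sec = streams * mbps_per_stream * 1_000_000
total_tbytes_per_sec = total_bits_per_sec / 8 / 1e12
print(total_tbytes_per_sec)   # ~0.3375, i.e. about a third of a TB/s
```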
  • The future is peer. (Score:3, Informative)

    by soupdevil ( 587476 ) on Saturday February 25, 2006 @07:34PM (#14801895)
    Content creators and content consumers are becoming one and the same. You can see this every day on sites like Jamendo [jamendo.com] and Flickr [flickr.com].
    • No, they're not. 'Content consumers turned content creators' is nothing new; they just have a platform to distribute their work more easily now. This in no way suggests quality of work, it merely lowers the signal-to-noise ratio.
      • The distribution is the new part. Distribution by a major corporation doesn't suggest quality of work either. Filters are necessary, but monolithic corporations are only one kind of filter. Tags, ratings and reviews are alternatives, and more will be on the way, I'm sure.
      • What's so "high-quality" about "Desperate Housewives" or "Lost"? Most of the photos on Flickr are infinitely more interesting than the crappy TV that's forcing us to have DRM-ed everything.
  • I recently attended a talk that was part of a PhD student's oral defense. He detailed a really nice streaming video system called CoDiO [stanford.edu] that is congestion-optimized instead of rate-optimized. I asked him how long he thinks it would take to bring to market, but I think he said that they're still working out the kinks in the practical application. So yeah, the technology is definitely there to stream video over P2P, but I don't know about DRM. Then again... regular terrestrial TV broadcasts aren't hampered with copy protection as far as I know...
    • > Then again... regular terrestrial TV broadcasts aren't hampered
      > with copy protection as far as I know...

      HDTV will be.
    • What we really should be doing is finding home-grown HDTV applications. The HDTV specs are wonderful... alternate channels, digital sound, time & date codes, data feeds, captions, ratings, all thrown into a digital signal if you want to do something else really cool. We need to start community/internet-based HD efforts, because it's obvious big media isn't going to do it willingly. With PVRs and the internet, we could simply stream the HD content off the waves and watch it whenever we wanted.... wh
  • Predictions (Score:2, Interesting)

    Great. Another prediction on what technology will or will not be able to do in the near future.

    We all know how accurate these are.

    Also: There is a difference between serving the exact same fucking content, at the same time to 1 million people and generating custom pages on-demand for 1 million people.

  • so maybe the future holds P2P networks owned and managed by Hollywood?

    No way. I'm glad to support the legal P2P community; I frequently leave Knoppix or other Linux distros running for weeks on end on a spare system here and make my modest upstream bandwidth available. And I can understand that some may want to use their bandwidth to share material that might anger the MPAA or RIAA (and particularly in the case of the RIAA I don't have very negative feelings about that). But that's a far cry from ever t

  • Why exactly would anyone want to donate their bandwidth to movie distributors? What benefit would you get out of it? Restricted viewing rights through DRM doesn't sound like a benefit to me. I don't see how they'd square this circle; it's not a reasonable trade-off.
    • The stupid thing of it is that the bandwidth donated by P2P servers is pure waste anyways. A packet sent from a leaf node of the Internet to another leaf node makes TWO trips - one up to the backbone, and one back down. A packet served from a data center right on the backbone only has to make the trip down. So P2P just wastes bandwidth. As for server horsepower, I'm not worried about it at all. Serving up static content (like a movie, which isn't tailored to each recipient) is super easy.
      • The stupid thing of it is that people don't understand the technology. Yes, if a packet came in from NY to your house in LA, and you shipped it back out to DC, it would make two trips across the country on the backbone. If, however, you're in LA watching a movie, and your neighbor down the street is watching it too, but a few minutes behind your own start time, then your retransmitted packet may only have to go up to the neighborhood router and back down again. In short, there's a lot of local bandwidth that a c
    • "Why exactly would anyone want to donate their bandwidth to movie distributors? What benefit would you get out of it? "

      Because it might, for example, just make true video-on-demand - any movie or TV show, anytime you want it - feasible? Because people might want such a service? Because otherwise it might be too damned expensive to be economically viable? Because that bandwidth for which you're already paying a fixed monthly fee is probably sitting there unused 99% of the time anyway?

  • I wouldn't worry about that.

    The computer and computing industry isn't standing still. Processor and signal transmission speeds increase exponentially. There will be quite enough bandwidth and processing power for everybody.
  • Cringely talked about a company called Grid Networks and their killer P2P app that may change TV distribution. They seem to have an interesting idea, but I wanted to look into it further. Owing to the genericness of their name, however, I haven't been able to devise a Google search that finds their website.

    Does anybody have any info on Grid Networks, or are they vaporware?
  • by Opportunist ( 166417 ) on Saturday February 25, 2006 @08:03PM (#14802001)
    And thus I don't really think they will switch to this model. Simply put: Their "servers" would not be under their control. If we were to provide them with "servers", we could at least partly control what is shown.

    Of course we would not get a say what we distribute. But that's not the point. You cannot rely on a P2P Server to provide real time content. Suddenly it's gone, because I switch the box off. Even if you have a few fallback "servers" on the list it's nothing you can build a reliable service on. And people do get angry if their favorite soap suddenly skips right after the words "I kept silent 'til now, but now I have to say it. I am..."

    Not to mention the danger of tampering with the content. Yes, they will encrypt it, yes, they will make it near impossible to inject anything, but there is still the danger that in the middle of a Disney Movie you suddenly get to see ... use your imagination.
    • ... but there is still the danger that in the middle of a Disney Movie you suddenly get to see ... George Carlin!

      "Fuck Mickey Mouse! Fuck him in the ass with a big rubber dick! And then break it off and beat him with it!"
    • And people do get angry if their favorite soap suddenly skips right after the words "I kept silent 'til now, but now I have to say it. I am..."


      Yes? Yes? You are what? What are you? The suspense is killing me!!

      Trillian

    • Not to mention the danger of tampering with the content. Yes, they will encrypt it, yes, they will make it near impossible to inject anything, but there is still the danger that in the middle of a Disney Movie you suddenly get to see ... use your imagination.

      You're not kidding. Years ago I went searching for Finding Nemo on Kazaa (yes, it's a quaint story ;-), and found 7 other movies, one of which was a neat Swedish porn.

      Now, if my kids had found that while innocently looking for Finding Nemo, I'd

    • Of course we would not get a say what we distribute. But that's not the point. You cannot rely on a P2P Server to provide real time content. Suddenly it's gone, because I switch the box off. Even if you have a few fallback "servers" on the list it's nothing you can build a reliable service on.

      The servers would never go down; I imagine they'll run the "tracker" function centrally. The question is rather whether you can get enough peers to upload, and why? P2P is quite reliable enough for soft real-time (buffered)
  • they aren't paying for my cable modem, and my cable modem has a maximum upstream speed of about 45 kilobytes per second. That isn't going to help anyone really. Not to mention, I wouldn't be all that keen on maxing out my upstream just so I could watch American Idol.

    Also, shouldn't they be paying ME to use MY bandwidth?
    • A P2P TV and movie network should be free to its viewers, as over-the-air television is. The reason being that we all end up paying for the bandwidth. So I don't want just a one-dollar discount on movies in exchange for my bandwidth; I instead want the product for free.

      If you want me to watch your television and your commercials, while you profit in the millions of dollars AND use my bandwidth?!... You're giving it to me free!

      Game on, you DRM motherfuckers :) Citizens need to play hardball.
  • maybe the future holds P2P networks owned and managed by Hollywood?


    That seems unlikely to me... people would have to be willing to trade away their spare bandwidth for... what, exactly? Being able to watch movies/TV on their computer? They can do that now if they want, without having to run any "industry-approved" p2p clients (and all that that implies).

  • Plenty of P2P CDN's (Score:3, Informative)

    by ozzee ( 612196 ) on Saturday February 25, 2006 @08:49PM (#14802122)


    Chaincast
    NetCableTV
    Red Swoosh
    Kontiki

    Just to name a few.

    Some of these have been in production for many years. Chaincast was, at one time, the leader in radio streaming.

    There are more advantages to P2P streaming/downloads than meet the eye. You also get better sharing of data on the local network: i.e., you're at Starbucks, you see someone watching something you want too - start the download and you get it at full speed from one laptop directly to the next. Also, from an infrastructure perspective, it's automatically fault-tolerant.

    It's big.

  • If ISPs were required to enable multicast all the way to the home, all these video delivery problems would be MUCH easier.

    You want to see cable and DSL operators go nuts with foaming mouths? Get your congressman to introduce a bill requiring multicast to be enabled on all routers and switches, and add a provision punishing ISPs who knowingly degrade UDP.

    Many people think that multicast is a failure and does not work; fact of the matter is, it's deployed WORLD WIDE on the backbones of both Internet and Inte
  • Revenue Streams (Score:5, Informative)

    by Doc Ruby ( 173196 ) on Saturday February 25, 2006 @09:17PM (#14802191) Homepage Journal
    Cringely doesn't mention Akamai. Where does this 150K max-users figure come from? If "tens of thousands" of servers is only 10K servers, then 150K streams is only 15 streams:server.

    But even a $2K P4/4.3GHz can serve over 1750 simultaneous 500Kbps video streams (from my own benchmarks), for 875Mbps. Since Gbps fiberoptics cost <$5000:mo, or under $3:stream:mo, 10K servers should serve at least 17 million simultaneous users; 58K servers serve over 100 million simultaneous streams.

    Use more efficient servers, like SANs coupled more directly to routers, and you're talking about <$3:stream:mo for maybe 100K servers serving over 1 billion people, for a $100M investment that can be amortized over a few years. Years which can bring maybe $1-100:mo profit on 1-10 billion consumers, or 10-10,000x ROI.

    Such a network is much more efficient and economical than P2P or multicast. But even the raw numbers sound very profitable. That's why Akamai is making so much money, even though their market is still so small.
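The back-of-envelope in the comment above can be written out; the per-server figures (1,750 streams at 500 Kbps) are the commenter's own benchmark claims, not independently verified:

```python
def fleet_capacity(servers, streams_per_server=1750, kbps_per_stream=500):
    """Scale up the comment's claimed per-server benchmark: total
    concurrent streams for a fleet, and aggregate bandwidth in Gb/s.
    Defaults are the figures asserted in the comment, assumed true."""
    streams = servers * streams_per_server
    gbps = streams * kbps_per_stream / 1e6
    return streams, gbps

# 10K servers -> 17.5M concurrent streams; 58K -> over 100M,
# matching the comment's arithmetic.
print(fleet_capacity(10_000))
print(fleet_capacity(58_000))
```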
    • Exactly how are you going to get a P4 3GHz to serve up 110 MB/sec sustained transfer from ordinary hard drive(s)? And consider: a backup system for 10,000 $2,000 servers; network administrators for said facility; rental/building costs, office space, utilities, security systems, security guards, etc.; electricity per year (what kind of PSU are you using, and what is the average wattage load on these servers?); air conditioning, special catastrophe protection, off-site backup systems, etc. Just to throw one number out... assuming 110 wat
      • I already postulated a better server architecture than actual P4s: a SAN more directly coupled to the routers, with more dedicated video hardware. The P4 costs are just a basis for multiplying scale - they're not the actual hardware to use. And 60K "server units" would be distributed around the Net, with lower electric costs in many places, especially in bulk, with greater efficiencies in the actual denser HW installations. Even at $10K per server, that's not much compared to $5K:mo for 1Gbps bandwidth to t
    • http://finance.yahoo.com/q/is?s=AKAM&annual [yahoo.com] "That's why Akamai is making so much money, even though their market is still so small." Revenue, maybe. Granted the '05 financials aren't there, but ... they have a ton of liabilities and a history of considerable net losses.
      • I dunno, they earned $48.967M on $210.015M in 2004, after losing ($11.155M) on $161.259M in 2003 and ($182.536M) on $144.976M in 2002. 2005 could be about $100M net on $250M, which would mean they'd taken in 2/3 of a $billion, and spent just a little more, before their market is even arrived. Which is about how much my estimates say they'd spend to serve a billion people in 2006. Seems like the model is about correct, though it's taken them several years of investment to get there. I guess they better hope
  • BitTorrent is peer-to-peer. Having said that, BitTorrent as it stands is drastically unsuitable for use in a streaming environment. It is designed to transfer files, not stream them in real time; we can start with the requirement that the server have the entire file to generate the .torrent file from (try that on a live video stream, for example), and continue with the lack of a guaranteed arrival order or time. Oh, and that .torrent file - still going to be hard for a few million users to grab at once
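The arrival-order objection above is the crux: stock BitTorrent picks the rarest piece in the swarm, which is great for file transfer and useless for playback. A toy sketch of how a streaming variant changes the picker (this is an illustrative policy, not BitTorrent's actual implementation):

```python
def pick_piece(have, availability, playhead, window=16):
    """Toy piece picker. Plain BitTorrent's rarest-first ignores
    playback order entirely; this streaming variant restricts the
    choice to a small window just ahead of the playhead so pieces
    tend to arrive in time, falling back to rarest-first across the
    whole file only when the window is already complete."""
    wanted = [i for i in range(playhead, min(playhead + window, len(availability)))
              if i not in have]
    if not wanted:                      # window done: help the swarm
        wanted = [i for i in range(len(availability)) if i not in have]
    if not wanted:
        return None                     # nothing left to fetch
    return min(wanted, key=lambda i: availability[i])  # rarest in set

# availability[i] = how many peers hold piece i
print(pick_piece(have={0}, availability=[5, 1, 4, 2, 3], playhead=0, window=3))
```

The tradeoff is real: constraining choice to the playback window reduces piece diversity in the swarm, which is exactly why rarest-first exists in the first place.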
  • BitTorrent seems to have worn out its welcome with the MPAA recently, so maybe the future holds P2P networks owned and managed by Hollywood?"

    I'll be happy to join their "P2P" network, buy the content for a reasonable price, and share pieces of the files I download with other users that want the same thing. However, their litigious and money-grubbing attitude makes me NOT want to share any of my bandwidth with them for free. They would have to offer me a monetary incentive to consider using my bandwidth to

  • "BitTorrent seems to have worn out its welcome with the MPAA recently, so maybe the future holds P2P networks owned and managed by Hollywood?"

    That's assuming that Hollywood hasn't worn out its welcome with users, which it has in spades. I think the future holds P2P networks owned and managed by users, who will watch content owned and created by users, and BitTorrent is a great distribution method for it.

  • This problem has been well known forever. It was a key factor in the failure of internet set top boxes in the mid-nineties (when everyone was trying to make one, like Apple or Oracle/Liberate).
  • multicasting

    oh killer MBONE apps, where art thou
  • Point 1: What is the point of streaming in the first place? The idea is STUPID and NOT what I as a (ab)user want. I want to be able to have the file, copy it to the device I want to view it on, pause when I feel like taking a break, start playing when I want to and so on. I want a +-700 MB avi divx file.

    Point 2: BitTorrent allows you to add seeds to the torrent as you feel like. When I read "data centers will never be ready to serve 30 million concurrent streams of data." I ask the simple question: Really
  • A hybrid streaming/P2P application where BitTorrent attempts to give you the bits in the correct order, but when it fails, a centralized server gives you what you need? It wouldn't be perfect, but it would take a lot of the load off the server...
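The hybrid idea above reduces to a per-piece deadline policy: let the swarm serve anything it can deliver in time, and have the central server carry only the misses. A minimal sketch of that decision (the function and its parameters are hypothetical, for illustration only):

```python
def choose_source(piece_deadline_s, swarm_eta_s, server_eta_s):
    """Hybrid policy sketch: serve a piece from the swarm when it can
    arrive before its playback deadline; otherwise fall back to the
    central server, which then only carries the swarm's misses."""
    if swarm_eta_s <= piece_deadline_s:
        return "swarm"
    if server_eta_s <= piece_deadline_s:
        return "server"
    return "stall"   # rebuffer: neither source can make the deadline

print(choose_source(5.0, 3.0, 1.0))   # swarm is fast enough
print(choose_source(5.0, 8.0, 1.0))   # swarm too slow, server fills in
```

Under this policy the server's load is roughly the swarm's miss rate times the stream rate, which is the "take a lot of the load off the server" claim made precise.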

  • Cringely States The Obvious.

    The industry has been bleating about P2P for on-demand for years. It's the perfect solution for cable operators who have networks designed around INTERNAL traffic and pushing data around to subscribers. If the subscribers share the networking and you can have a city block feeding itself..
