
Linux Foundation Exec Believes Edge Computing Will Be More Important Than Cloud Computing (zdnet.com)

An anonymous reader shares a report: Once upon a time, back when we all had mainframes and then servers in our offices, we had edge computing. Our compute power was literally down the hall. Then, along came the cloud, and all that changed. Computers were hundreds of miles but milliseconds away. Now, with the rise of IoT, 5G, and our never-satisfied need for speed, edge computing is coming back with a vengeance. Indeed, in his keynote at the Open Networking Summit in Belgium, Arpit Joshipura, The Linux Foundation's general manager of networking, said "edge computing will overtake cloud computing" by 2025.

When Joshipura talks about edge computing, he means compute and storage resources that are five to 20 milliseconds away. He also means edge computing should be an open, interoperable framework, independent of hardware, silicon, cloud, or operating system. Open-edge computing should also work with any edge-computing use case: an Internet of Things (IoT) edge, a telecom edge, a cloud edge, or an enterprise edge. "Our goal here is to unify all of these." This is being done via LF Edge. This Linux Foundation organization seeks to bring all edge-computing players under one umbrella with one technology. Its purpose is to create a software stack that unifies a fragmented edge market around a common, open vision for the future of the industry. To make this happen, Joshipura announced two more projects were being incorporated into LF Edge: Baetyl and Fledge.



  • Not necessarily open (Score:5, Interesting)

    by duckintheface ( 710137 ) on Monday September 23, 2019 @01:35PM (#59227436)

    Numerous Linux commentators have discussed the takeover of the Linux Foundation by large non-open corporations. It's good to keep in mind, when you read pronouncements by the LF, that Microsoft and others are involved. Open source means nothing if it does not refer to the General Public License (GPL) or a similar GPL-compatible license.

    • by lactose99 ( 71132 ) on Monday September 23, 2019 @02:48PM (#59227796)

      Open source means nothing if it does not refer to the General Public License (GPL) or a similar GPL-compatible license.

      Then you're referring to Free Software, not necessarily Open Source.

      • Open Source refers to the license. All it means is that you can make changes to the software, and even resell it, but that you have to extend the same ability to anyone who purchases your software. "Open Source" means that the source code is readable by anyone.

        • You hold a common misconception. Open Source means that the entity who has the rights to run the executable also has the rights to source code access. It does NOT mean that if I license my code to Boeing, for example, I also have to make the source available to your grandmother. That includes if she happens to work at Boeing, BTW.
      • Comment removed based on user account deletion
        • It sounds like you think free means "as in beer," not libre. The GPL certainly allows one to sell software. The only requirement is that if you sell the executable to someone, you must make the source code available to that someone. You can prohibit redistribution and still be GPL compatible. It dictates how redistribution must be handled in cases where it is allowed. It does NOT force the original vendor to give the code, rights to use, or rights to redistribute away gratis.
      • St Richard has been purged. Free Software is dead.

    • There was no takeover; the Linux Foundation always was a club for giant evil corporations that critically depend on Linux and need it to prosper. That doesn't mean they have a clue in any way, shape, or form. Jim Zemlin is the poster child for arrogant fools. Loves to boast about writing Linus's paycheck. As if Linus needs the money, or cares what the fuck Jim Zemlin boasts about.

      Linux Foundation is just a bunch of self serving corporate droid wankers who pay the bills for Linux conferences and paychecks to some k

    • This is BS advertising. From TFA:

      To make this happen, Joshipura announced two more projects were being incorporated into LF Edge: Baetyl and Fledge. Formerly known as Baidu OpenEdge, Baetyl is meant to seamlessly extend cloud computing, data, and services to edge devices, thus enabling developers to build light, secure, and scalable edge applications. Its target audience is IoT edge device developers who need cloud computing, data, and services.

      From Wikipedia:

      Baidu, Inc. is a Chinese multinational

  • Marketing (Score:5, Insightful)

    by Topwiz ( 1470979 ) on Monday September 23, 2019 @01:44PM (#59227484)

    Edge computing... Just a new marketing term to use with know-nothing purchasing managers.

    • It's true. We're dealing with some of this, and the concept of bypassing your heavy-duty hardware and doing computing on extremely slow devices instead is baffling. Maybe I'm biased, but my assumption for the nodes I'm working on is that they should be smaller and use less power, while edge computing is the opposite. But... "Edge Computing" makes the marketing people ecstatic, and it does seem to sell devices.

  • by fahrbot-bot ( 874524 ) on Monday September 23, 2019 @01:44PM (#59227486)

    Edge doesn't even run on Linux.

  • by LynnwoodRooster ( 966895 ) on Monday September 23, 2019 @01:48PM (#59227514) Journal
    Smart client -> dumb client -> smart client -> dumb client -> smart client...
    • by kiviQr ( 3443687 )
      Translation: $$$ -> $$$ -> $$$ -> $$$ -> $$$ -> $$$ ...
      • I think some people are finally figuring out that "cloud computing" is just going to result in AWS and Microsoft owning 80%, with the remaining 20% split between on-premises, other cloud platforms, and whatever.

        Inventing a new "near premise" computing creates some new opportunity for middle ground that can't be owned by AWS or Microsoft (yet..), but also allows for $$$ opportunities from people who still own hardware.

        Sometimes it seems like there's a way to achieve Pareto optimal computing that balances price,

        • Exactly. Young people who don't know about the dumb-terminal era think they've invented something new.

          • That dumb X terminal was actually rather smart. Tho back in the day, curses ruled; today that client could just as easily be enriched: VNC, remote X....

            Attach some local storage that syncs frequently and the sky's the limit :)

    • In relative terms, yes. In absolute terms, a refrigerator today likely has more computing power than a personal computer of the '80s. So while it might seem like we're just revisiting the past, more accurately, client and server are both ratcheting up more or less in parallel.
    • It is normally a race between the cost and performance of networking technology and the performance of the computers themselves.

      Early computers had no networking, as they were programmed by feeding in cards, so access to the physical computer was necessary.
      Then over time computers became powerful enough, with enough RAM, to handle more than one user at once, and with serial communication we could spread a single computer across many users. Then we had the PC generation, where cheap fully functional computers became common, where we decided to run

      • my home Laptop is roughly 1000x more powerful than my Desktop (32gigs RAM vs 32Meg of RAM, 8gig vs 4meg of video

        Your desktop sucks. Your laptop is roughly specced like my 3-year-old mid-range desktop, except we both know the laptop sucks in many ways too, including insufficient ports, thermal throttling, and severely limited expansion, if any. Oh, crappy keyboard, did we mention crappy keyboard? My next year's desktop will be about 10 times more powerful than your laptop on paper, more in practice, and that will be far from top of the line.

        Get with the program.

    • Yep, everything old is new again.

    • by Tablizer ( 95088 )

      It's called Waldo Computing: "Where the hell is my server today?"

    • Ob. Homer Simpson quote

      https://getyarn.io/yarn-clip/c... [getyarn.io]

  • by Anonymous Coward

    ... they cut both ways.

    So... what's 'edge' and what's 'cloud'? 20ms? At light speed, a 20ms round trip reaches roughly 3,000 km out (less in fiber). I put a datacenter in St. Louis, MO, USA - I've covered almost all of the United States!

    Sounds like some sort of crap that a pointy-haired marketroid came up with to sell you the 'next thing'.

    "Cloud" makes sense - it's just a bigger, more modern, version of time sharing. Edge just sounds like 2-tier, 3-tier, n-tier all over again.

    I'll pass.

  • 5G will make all clouds edges... or edged clouds... cloudy edges... something like that.
    But then once we get Mars populated, all of Earth's cloud servers will be edges for Mars...
  • Distance (Score:4, Informative)

    by darkain ( 749283 ) on Monday September 23, 2019 @02:01PM (#59227576) Homepage

    "five to 20 milliseconds away" I'm currently sitting at lower latency than this to AWS, Azure, and Google... so, eh!? What's the diff?

  • by scubamage ( 727538 ) on Monday September 23, 2019 @02:02PM (#59227578)
    This echoes the findings of Bell Labs research (discussed in the book "The Future X Network: A Bell Labs Perspective"). Basically, more and more technologies emphasize low latency more than raw speed. As it stands, in many areas speed is plentiful. We can always go faster, but given current workloads we'll get diminishing returns. Latency, on the other hand, is a hard boundary created by physics. Electrons flow at the speed of light, the speed of light is 186,000 mi/sec, or put another way, 186 mi per millisecond. That means if you need to send a packet 360 miles, you are never going to beat 2ms of latency as an absolute minimum, and that's before any asic/compute/dsp processing time occurs. Newer technologies have ever-more-demanding latency requirements for round-trip delay. For instance, VR requires sub-11ms response times or you can cause nausea in users. So instead of having huge core networks, it makes more sense to build resources in places like central offices (COs) and regional datacenters that are closer to customers, to minimize that latency. (A quick sanity-check script follows this thread.)
    • it makes more sense to build resources in places like central offices (COs) and regional datacenters that are closer to customers, to minimize that latency.

      Hell, remember the initiative where businesses were going to be able to lease out smallish bits of their workcenters to have a small cloud datacenter put in, where the heat from the servers could help offset keeping their building heated?

      Now, from what I remember cooling is actually a much bigger concern in most workcenters not in the far north, but there you go.

      And while bandwidth is cheap, I'd keep in mind that it still costs a relatively large amount to provision something like a 100 Gbit connection that

    • Are these marketing electrons? The ones I learned about in physics have mass, and travel at less than the speed of light.

    • by Papaspud ( 2562773 ) on Monday September 23, 2019 @04:17PM (#59228168)
      No, electrons don't travel at the speed of light, even light only goes that fast in a vacuum.
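
A quick sanity check on the numbers in this thread (a Python sketch; the fiber propagation speed of roughly two-thirds of c is an assumption about typical optical fiber, and the parent's 2ms figure for 360 miles is one-way at vacuum light speed):

```python
# Best-case latency over a given distance. Signals in fiber propagate
# at roughly 2/3 of c, so these are hard lower bounds before any
# router, ASIC, or server processing time is added.

C_VACUUM_KM_PER_MS = 299_792.458 / 1000          # ~300 km per ms
C_FIBER_KM_PER_MS = C_VACUUM_KM_PER_MS * 2 / 3   # ~200 km per ms

def min_latency_ms(distance_km: float, round_trip: bool = True,
                   speed: float = C_FIBER_KM_PER_MS) -> float:
    """Physics-imposed minimum latency for a path of distance_km."""
    trips = 2 if round_trip else 1
    return trips * distance_km / speed

print(min_latency_ms(360 * 1.609, round_trip=False,
                     speed=C_VACUUM_KM_PER_MS))  # ~1.9 ms, the parent's figure
print(min_latency_ms(1000))                      # ~10 ms round trip in fiber
```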
  • Give users 100% total control. And tell the cloud to fuck off! https://librehome.com/doc/deve... [librehome.com]
  • As long as the data driven business model remains dominant, there is relatively little incentive to move services completely to the edge.

    The full shift won't come until:
    - Alternative business models become easier to operate. For example, if micro payments enter the web browser, and people start paying for things like journalism again.
    - The value of privacy is even better understood by the wider public. For example by disconnecting the smart home from the cloud as much as possible.

    Although there is a growi

  • Might as well just call it "Compu-Global-Hyper-Mega-Net Computing" and be done with it, once and for all.

  • Cheap-to-administer computing.

  • by sdinfoserv ( 1793266 ) on Monday September 23, 2019 @03:56PM (#59228108)
    Sure, let's move all mission-critical apps and secure systems to IoT endpoints for which no patches ever exist, and we get the privilege of paying monthly 5G line-of-sight data fees just to run spreadsheets on your toaster. What could possibly make more sense?
    • by bn-7bc ( 909819 )
      Connecting your toaster to your unmetered (not in the US) home WiFi, maybe. Home WiFi is usually connected via cable/FTTH/FTTC/FTTP (to the basement of the apartment building and Cat5e/Cat6 the rest of the way) or DSL (horror if you are not close to the CO).
  • Comment removed based on user account deletion
  • So how many ms away is "cloud computing"? A guy near the bottom says he already has lower latency than 5-20ms to multiple cloud providers. For all I know, he happens to live near a hydroelectric dam, with multiple data centres down the block, of course. One imagines that those who live close (in wiring & router terms) to a cloud datacentre have "edge computing" for sure, and it's more that it isn't *guaranteed* if you live in Glasgow, MT. (It's 5 hours from any city and lacking in hydro.)

    It all

  • by bobstreo ( 1320787 ) on Monday September 23, 2019 @04:33PM (#59228230)

    My main storage is 0.284ms away.

    The limiting factor is probably the Ethernet port, which, while advertised as 1Gb on the laptop, won't do more than 100Mb...

  • by brainchill ( 611679 ) on Monday September 23, 2019 @04:48PM (#59228266)
    My head hurts... Whoever would have thought that moving all of your information and processes to someone else's computer far away would create headaches moving large amounts of data quickly, and would create latency issues for end users... Wait, we all said that when everyone said let's move everything to the cloud.
    • Pretty sure it has less to do with locality or distance and more to do with being unable to care for the data you own. The cost of reliable data security and backups when good talent is hard to find

  • Electricity travels at roughly the speed of light, which is very roughly a foot per nanosecond; in terms of raw distance, 5 milliseconds is a little less than a thousand miles. But you lose a millisecond or two for every router you pass through, and you have to account for server response times. I suspect that the big argument here is not for physical distance, but for using a plethora of servers that don't build up a long queue of transactions waiting for service. (A rough budget sketch follows.)
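
Following that comment's reasoning, a rough latency budget might look like the sketch below (the per-hop penalty, hop counts, and server time are illustrative assumptions, not measured values):

```python
# Rough latency budget: propagation delay plus per-router overhead plus
# server response time. All per-hop and server figures are guesses.

MILES_PER_MS = 186  # vacuum light speed, per the comment above

def latency_budget_ms(distance_miles: float, router_hops: int,
                      per_hop_ms: float = 1.5, server_ms: float = 2.0) -> float:
    propagation = distance_miles / MILES_PER_MS  # one-way
    return propagation + router_hops * per_hop_ms + server_ms

# A "5 ms away" edge node leaves room for almost no hops:
print(latency_budget_ms(200, router_hops=1))   # ~4.6 ms
# A cloud region several states away blows the budget:
print(latency_budget_ms(900, router_hops=6))   # ~15.8 ms
```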
  • I think one factor favoring edge over cloud, at least for AI and ML, is that once you train the network, you just need to run it to get the output, e.g. facial recognition. Inference needs less compute and storage than the training machine, so you don't need to run it in a cloud, just on a closer computer, for the million users using your app. (A minimal sketch follows.)
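
That comment describes the common train-in-the-cloud, infer-at-the-edge split. A minimal sketch of the inference half (the tiny network and its weights are made-up placeholders; a real deployment would export a trained, often quantized, model to the device):

```python
import numpy as np

# Inference-only forward pass for a small pre-trained classifier.
# Training (the expensive part) happened elsewhere; the edge device
# only needs a few matrix multiplies per request.

rng = np.random.default_rng(0)
# Placeholder weights standing in for a model trained in the cloud.
W1, b1 = rng.standard_normal((128, 64)), np.zeros(64)
W2, b2 = rng.standard_normal((64, 10)), np.zeros(10)

def predict(x: np.ndarray) -> int:
    """Return the predicted class label for a 128-dim feature vector."""
    h = np.maximum(x @ W1 + b1, 0.0)  # ReLU hidden layer
    logits = h @ W2 + b2
    return int(np.argmax(logits))

print(predict(rng.standard_normal(128)))  # runs in microseconds on-device
```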
