The World Wide Computer, Monopolies and Control

Ian Lamont writes "Nick Carr has generated a lot of discussion following his recent comments about the IT department fading away, but there are several other points he is trying to make about the rise of utility computing. He believes that the Web has evolved into a massive, programmable computer (the "World Wide Computer") that essentially lets any person or organization customize it to meet their needs. This relates to another trend he sees — a shift toward centralization. Carr draws interesting parallels to the rise of electricity suppliers during the Industrial Revolution. He says in a book excerpt printed on his blog that while decentralized technologies — the PC, Internet, etc. — can empower individuals, institutions have proven to be quite skilled at reestablishing control. 'Even though the Internet still has no center, technically speaking, control can now be wielded, through software code, from anywhere. What's different, in comparison to the physical world, is that acts of control become harder to detect and those wielding control more difficult to discern.'"
  • by primadd ( 1215814 ) on Thursday January 17, 2008 @08:27PM (#22088504) Homepage
    I'm dreading the day when all you have at home is a thin client to some virtual machine inside some big server farm. You buy CPU time, like in the old mainframe days, and get billed by the cycle.

    No need for anti-piracy features; you don't get to see the executables or source anyway, it's all tucked away from your prying eyes.

    --
    Bookmark me [primadd.net]
    • I don't think I'd like to be billed by cycle, I live out in the sticks & that would mean an awful lot of pedalling. :-)
    • by huckda ( 398277 )
      oh you mean like using your PC... buying no software, running nothing but an OS (although remote boot'n over the web would be kewl, and slow)

      and using things like Google Apps... and the plethora of all the other 'Web 2.0' software out there? ;)
      wait...that already happens...

      there is very little that you can NOT do via 'the web' without owning any software yourself...
      and I'm not talking about using OSS on your personal computer at all.
      • Assuming everyone has a high-speed connection it might be feasible; otherwise it would be too slow. Either way, consumers would have to give up control over their desktop apps, which probably won't happen. Plus, what web-based apps can compete feature-wise with those on the desktop? There might be a few, but I can't think of any.
        • It's not just speed that counts, it's reliability and uptime. What is considered "reliable" for your average cable or DSL connection is not what would be considered reliable for running substantial applications.

          In ten to fifteen years, the guy may have a point, but at the moment, as I said in another post, the web is a terrible application platform. Quite frankly, I think the model for distributed apps was paved a couple of decades ago by X Windows. The X protocol is horrible and insecure, of course, but
          • by MrKaos ( 858439 )
            And of course the latency. Users have enough of a hard time with an application's latency on a local machine; let's chuck a network into it as well and really piss them off.

            Carr talks from the perspective of a user, not a technologist, so when I see an article by someone qualified to make such predictions I'll pay more attention. He talks about distributed applications like Google Apps, which have their place for casual users who realise they can get in trouble for copying proprietary software. I don't

      • by MightyMartian ( 840721 ) on Thursday January 17, 2008 @09:23PM (#22088918) Journal
        There may be little you can't do, but there's a lot you can't do well, reliably or securely. As cool as Google Apps may be, you're essentially trusting your data integrity and security to an outside company.

        The web, as it currently exists, is a really shitty software platform. Web 2.0, if it meaningfully exists at all, is built on some rather horrible hacks that break down the server-client wall, and for certain kinds of limited applications that's fine, but building substantial applications, like accounting and financial software, in AJAX would be an unbelievably difficult job, and a rather hard one to justify.

        I think this guy is, as with his last great proclamation, overstating his case. Yes, in certain arenas, like home and small business email, apps like GMail certainly can play a role, but I can tell you right now that the business I am in, which deals with confidential information, will be waiting a long time before farming out this sort of thing.
        • Privacy Laws (Score:5, Insightful)

          by Roger W Moore ( 538166 ) on Thursday January 17, 2008 @10:31PM (#22089406) Journal
          As cool as Google Apps may be, you're essentially trusting your data integrity and security to an outside company.

          Just to drive home your point further, what can be even more important is that, as trustworthy as Google may be, they are subject to US law. This is a huge problem in places like Canada that have privacy laws, since using, for example, Gmail means that your organization can end up breaking Canadian law: the US government has free access to any data in your email, which you may be legally responsible for protecting.
        • Web 2.0, if it meaningfully exists at all, is built on some rather horrible hacks that break down the server-client wall

          I won't deny horrible hacks, but "server-client wall"?

          building substantial applications, like accounting and financial software, in AJAX would be an unbelievably difficult job, and a rather hard one to justify.

          I don't see how it would be either particularly difficult (there are plenty of good libraries out there now) or particularly hard to justify (Business Guy can now print his report

          • There are other technologies like RDP that allow him to do this already, and in a far more secure and robust manner. Using web browsers to run software is like using a brick to hammer 3 inch 10d nails. Yeah, you can drive it in, but it's ugly, takes a lot longer and leaves you at a lot more risk of injury.
            • There are other technologies like RDP that allow him to do this already, and in a far more secure and robust manner.

              I fail to see how RDP is either. Certainly, the Web app could be much more efficient at bandwidth usage.

              (Yes, I know RDP is more efficient than VNC. But it's still less efficient than a custom protocol, even if it is based on gzipped XML/JSON/YAML.)
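
              Just to illustrate the payload-size point (a minimal sketch; the JSON structure below is made up for illustration, not taken from any real protocol):

              import gzip
              import json

              # Hypothetical screen update sent as structured data rather than pixels.
              update = {"widget": "invoice_table",
                        "rows": [{"id": i, "total": i * 1.5} for i in range(200)]}

              raw = json.dumps(update).encode("utf-8")
              packed = gzip.compress(raw)

              print(len(raw), "bytes raw")         # uncompressed JSON payload
              print(len(packed), "bytes gzipped")  # typically a small fraction of the raw size

              The comparison for a remote-display protocol would be the size of the screen updates needed to show the same data, which is where an app-level payload can come out ahead on bandwidth.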

          • I don't see how it would be either particularly difficult

            It isn't hard, it's impossible. You would have to figure out how to distribute the app without any data. Can't do that, and the company won't let you distribute their data.

            • It isn't hard, it's impossible. You would have to figure out how to distribute the app without any data. Can't do that, and the company won't let you distribute their data.

              Oh, I see. You've confused "web" with "internet".

              What, exactly, is the problem with an intranet app, available, at most, over a VPN? Or even SSL-secured web app?

              They need to run without net access? Net access is like basic utilities, but fine, give them Apache and MySQL -- or whatever else you used. Most of the decent open source web a

        • Even for confidential info, you might be farming it out sooner than you think. Just take the example of credit cards. Given the choice I'd rather pay authorize.net a monthly fee to store my customers' credit card numbers, because they are better at securing them and I can leverage them to obtain safe harbor from Visa by being PCI compliant. Another example is the web-based database lightspoke.com. Some customers have been known to store healthcare info because it's easier to pay lightspoke.com than it is to make su
        • Web 2.0, if it meaningfully exists at all, is built on some rather horrible hacks that break down the server-client wall, and for certain kinds of limited applications that's fine, but building substantial applications, like accounting and financial software, in AJAX would be an unbelievably difficult job, and a rather hard one to justify.

          AJAX is getting pretty damn easy (have you looked at Prototype? Dojo?) But AJAX is just one very small part of this..

          Today, if I'm a company, and want to build a multi
    • by RuBLed ( 995686 )
      So when that time comes, only ninjas/pirates/outlaws will have their own personal computer and independent OS? Shiver me timbers! I'd join their ranks without a second thought...
    • Somehow I don't believe that will happen. To some extent, yes, but I think people will start to value their privacy. But maybe I'm only naive.
      Then you have companies that have secrets they can't trust a third party with; already today we are talking about nations using their intelligence services to give their own companies an edge over other nations' companies.

      I get the feeling that people who write this sort of thing do not understand the technology. But maybe it is I who doesn't "get it".
    • by PPH ( 736903 ) on Thursday January 17, 2008 @09:37PM (#22089018)
      And then some enterprising guys, working in their own garage, will develop a machine that you can own, can program yourself and maintain complete control over.

      It will be the 1970s all over again (except without disco).

    • by kwerle ( 39371 ) <kurt@CircleW.org> on Friday January 18, 2008 @12:04AM (#22089942) Homepage Journal
      I'm dreading the day when all you have at home is a thin client to some virtual machine inside some big server farm. You buy CPU time, like in the old mainframe days, and get billed by the cycle.

      Look around. There are no thin clients. The iPhone is 100x more powerful than my first computer. The MacBook Air is 1000x more powerful than my first computer.

      Imagine 21 years from now. Imagine computers 128x more powerful than they are today. That means that the iPhone of 21 years from now will be 10x more powerful than "the lightest laptop available today."

      You're talking about "thin clients". But a really powerful computer will be the size of a thick piece of paper.

      Yeah, I'm dreaming - but how else do you expect to keep up!? In my professional career (say 18 years), computers have become 100x more powerful, and fit in an envelope.

      The only reason for "thin clients" is because the client wants and agrees to be thin.
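
      For what it's worth, the arithmetic behind that projection looks like this (a rough sketch; the 128x figure amounts to assuming performance doubles about every three years, and the multipliers are my estimates above, not measurements):

      # My estimates from above, not measurements.
      iphone_today = 100        # x my first computer
      macbook_air_today = 1000  # x my first computer

      growth = 2 ** (21 / 3)    # doubling every ~3 years for 21 years, ~128x

      iphone_in_21_years = iphone_today * growth
      print(iphone_in_21_years / macbook_air_today)  # ~12.8, i.e. roughly 10x today's lightest laptop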
      • by dodobh ( 65811 ) on Friday January 18, 2008 @09:12AM (#22092402) Homepage
        The browser is the software version of the thin client.
        • by kwerle ( 39371 )
          The browser is barely software - not in an interesting way, anyway. It's just that: a browser. Data/information will be stored somewhere else. It's someone else's information, after all. They created it, you go to them.

          If you don't like that, buy yourself some DVDs from Encyclopedia Britannica and ignore the news.

          But that is the browser's purpose/choice - to be a client.
      • You're talking about "thin clients". But a really powerful computer will be the size of a thick piece of paper.

        Yes, but requirements for computing power will have increased right along with them (insert joke about "Windows Vista 203X Cybernetic Edition" here). Your future paper computer will have power beyond anything available today, but it will still be a tiny fraction of the processing power of a pizza-box-sized computer mounted in a rack in a server room.
        • by kwerle ( 39371 )
          Yes, but requirements for computing power will have increased right along with them (insert joke about "Windows Vista 203X Cybernetic Edition" here). Your future paper computer will have power beyond anything available today, but it will still be a tiny fraction of the processing power of a pizza-box-sized computer mounted in a rack in a server room.

          Wow. I had about a full page written about:
          * How inexpensive Walmart PCs are, and how they do everything Joe-6pack wants
          * How 100-1000x more powerful is not ju
  • google for it....
  • by gringer ( 252588 ) on Thursday January 17, 2008 @08:48PM (#22088652)
    Otherwise known as a botnet
  • by swschrad ( 312009 ) on Thursday January 17, 2008 @08:48PM (#22088656) Homepage Journal
    10 stop war
    20 fix domestic problems
    30 printf "Woo!"
    40 goto 10

    hmm, doesn't seem to be working. harebrained theory, anyway.

    it would probably take 80kb to do that in visual C.
  • The internet, PCs, etc. permit low-cost, large-scale anyone-to-anyone communication and influence. That any-to-any influence can be (and is being) used in a decentralized fashion. Or, because any = {one, many, everyone}, it can be used as a one-to-everyone scheme for control. (As an aside, one could argue that the Slashdot effect, DDoS, or internet vigilante effect is an "everyone-to-one" phenomenon that overwhelms the target "one".)

    That said, the past was dominated by one-to-many mechanisms for influence.
  • by farkus888 ( 1103903 ) on Thursday January 17, 2008 @08:58PM (#22088746)
    I am thinking these centralized computers would be maintained by professionals, assuring they will be virus free. [Don't laugh too hard yet... the joke's not over.] If that is the case, I think the telcos would love the reduced bandwidth requirements of *only* having to pass every byte of every app I decide to use down the "tubes" instead of all that botnet traffic they need to deal with now.
    • by dbIII ( 701233 ) on Thursday January 17, 2008 @09:12PM (#22088838)

      would be maintained by professionals, assuring they will be virus free

      Oddly enough that currently defines the difference between the professional-level operating systems (some of which are free) and a hobby system that was pushed into the workplace (which you have to pay for). The wide range of malware is currently a single-platform problem and is almost all the fault of poor design of two applications - Internet Explorer and Outlook.

      • almost all the fault of poor design of two applications - Internet Explorer and Outlook

        Hey, don't forget to mock their network stack!
      • by TheThiefMaster ( 992038 ) on Friday January 18, 2008 @04:22AM (#22090866)
        That was true in the past, but nowadays malware is mostly spread by the good old "User wants free porn" method.

        A.k.a social engineering.

        I don't remember encountering any malware since before 2000, at least, that could spread itself without relying on the user to infect their own machine. I've had several pieces of malware try to email or even MSN file transfer themselves to me from an infected PC, though.
  • by NetSettler ( 460623 ) <kent-slashdot@nhplace.com> on Thursday January 17, 2008 @09:01PM (#22088766) Homepage Journal

    Internet still has no center, technically speaking, control can now be wielded, through software code, from anywhere. What's different, in comparison to the physical world, is that acts of control become harder to detect and those wielding control more difficult to discern.

    Or from nowhere. The risk of a bad guy taking over is serious, but the risk that no one is at the helm is much more likely to lead us to death by Global Warming, for example.

    You have to look no further than the US Congress to see a worked example. If you idealize every single member of Congress as intelligent, and I think a similar analogy can be made for people on the net or for companies on the net (where you still have to question intelligence sometimes, but let's not and say we did), it's pretty clear that the problem isn't just someone sinister taking hold of total power. It's also that it's easy to cause behavior that no one can take responsibility for, and that isn't in the best interest of individuals. The Internet is no different, but not because we didn't have examples of this before. Just because we didn't heed them.

    • Re: (Score:1, Funny)

      by Anonymous Coward
      The entire world combined is less intelligent than the average person.

      While you may have a hard time convincing one person that the overproduction of popcorn is causing tsunamis stronger than ever before, you would find it surprisingly easier for a group of 10 people to convince 1 person of the same "fact".
  • by monopole ( 44023 ) on Thursday January 17, 2008 @09:02PM (#22088770)
    The definition of a real utility computing environment is one where somebody can hold a coup d'état in it and make it stick in the real world.
    • That leads to the interesting questions of whether a 24-year-old Norwegian hacker who likes allowing people to share information freely would make a better leader than any politician likely to achieve high office this year, and whether even 17-year-old Russian script kiddies could do a better job of promoting good international relations than the likes of Brown and Putin.

      Oh, sorry, did you mean a coup d'état via cyberspace would be a bad thing? :-)

  • Ahem.... (Score:5, Insightful)

    by GaryOlson ( 737642 ) <slashdot@NOSPam.garyolson.org> on Thursday January 17, 2008 @09:02PM (#22088778) Journal
    "The tighter your grip, the more star systems will slip thru your fingers." Princess Leia of Alderaan

    This guy obviously has no sense of history....real or fictional.

  • The IT cycle? (Score:5, Insightful)

    by jase001 ( 1171037 ) on Thursday January 17, 2008 @09:07PM (#22088808)
    Isn't this just the IT cycle? Everything gets centralized, and short-term costs are saved; 10 years later it's decentralized, and long-term costs are saved vs. short term.
  • There is no news. There is only the truth of the signal. What I see. And, there's the puppet theater the Parliament jesters foist on the somnambulant public.

    Mr. Universe
  • by Tancred ( 3904 ) on Thursday January 17, 2008 @09:30PM (#22088968)
    Here's a classic sci-fi (extremely) short story on the topic of an immense computer. Fredric Brown's "Answer":

    http://www.alteich.com/oldsite/answer.htm [alteich.com]
    • Comment removed based on user account deletion
    • Problem: Technically infeasible.

      It's talking about wireless power and faster-than-light power/information control.

      Doesn't make it a less compelling idea, though. The Singularity is the modern evolution of this concept. And there have been others.
  • by Gideon Fubar ( 833343 ) on Thursday January 17, 2008 @09:35PM (#22089010) Journal
    Both Nick Carr and Alexander Galloway seem to be missing something..

    perhaps it's that they assume the user and authority groups are mutually exclusive... or perhaps it's the 'programming as control' inference that collapses the argument... I'm not sure, but I really don't see this outcome occurring.
    • I don't see it either. Where I work we have (at least) two computers for every person: one for normal email and stuff (no secret stuff) and one for every other system. The one for email and non-secret stuff is handled by another company, but the other ones are handled internally. Basically, you can't trust everything to another party.
      • That pretty well encapsulates the problem with his last article, eh?

        Sure you can outsource the generic business stuff, but there are some things that you won't find a host for, some things that are clearly cheaper and better to keep in house, and some things you'd have to be insane to outsource..
  • by Duncan3 ( 10537 ) on Thursday January 17, 2008 @09:38PM (#22089030) Homepage
    So we're back to the point in the cycle where centralized mainframes you rent time on rule the world again. Can you guess what happens next? Privacy problems, reliability problems, outages, and we all go back to personal systems again.

    Old is new again.
    • by dch24 ( 904899 )
      I totally agree with you. But food for thought: ham radio operators were the distributed network during the mainframe days 40 years ago.

      Maybe it's about the population of users in each camp. I mean, mainframes, HPCs, and imaginary Beowulf clusters haven't gone away. It's not an either/or proposition.

      But good luck convincing your boss to take a mixed approach. When Microsoft enters the mainframe market, surely we're all doomed.
  • by Dun Malg ( 230075 ) on Thursday January 17, 2008 @09:39PM (#22089036) Homepage

    Carr draws interesting parallels to the rise of electricity suppliers during the Industrial Revolution.
    Interesting comparisons? More like spurious comparisons. I read the linked interview and, as someone who has read quite a bit about the rise of industry and its relationship to the availability of power (basically, the history of power generation), I can say he's a typical unrealistic abstractionist. He handwaves away the fact that the purpose and nature of electric power generation and electronic communication are similar solely in topography by claiming that they are both "general purpose technology" and are analogous economically. Of course, his entire line of reasoning is balanced upon a precarious and highly questionable assumption: that people will find off-site centralization easier than in-house. Really, it's the same old crap we've heard for years. How long have we been hearing that "real soon now" thin clients will be all people need? It's ludicrous. Just think about how much lower latency and greater reliability would be required before people would be willing to offload any significant percentage of their storage and computational needs. We're not there yet. We're not anywhere near there. I'd say you'd be lucky to get 2 nines of reliability out of such a system, much less the 4 or 5 nines you'd need to make it what this nutter predicts. Really, the parallel between remote IT service and electric power is nil. All power requires for reliability is a good run of copper wire and a generator.
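
    For reference, the "nines" translate into downtime like this (a quick back-of-the-envelope; just the standard availability arithmetic, nothing from Carr):

    # Downtime per year implied by N nines of availability.
    MINUTES_PER_YEAR = 365 * 24 * 60

    for nines in (2, 3, 4, 5):
        availability = 1 - 10 ** -nines            # 2 nines = 99%, 5 nines = 99.999%
        downtime_min = MINUTES_PER_YEAR * (1 - availability)
        print(f"{nines} nines: ~{downtime_min:,.0f} minutes of downtime per year")

    Two nines is roughly three and a half days of outage a year; five nines is about five minutes.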
    • by Gorobei ( 127755 )
      Well, I'll take the other side...

      I use a lot of compute (multiple megawatts, globally distributed.) Even with an in-house support team, I don't see even 99% uptime: in the last two years, I've lost compute twice due to natural disasters, and several more times due to operator error or hardware misconfiguration.

      I agree we aren't there yet, but I'll switch to an external compute provider as soon as their perceived reliability and scaling exceeds what I have in-house. I expect that will happen in 2009.

      Googl
    • I remember when I first got broadband I spent a lot of time downloading videos from YouTube and video.google because I wanted to watch them. Now I don't bother -- it's easier just to dial in and watch them.
    • It's a bit much to call him a nutter when most of what he's pointing to are trends in the consumer space that potentially may be disruptive (in the Clayton Christensen vein). Disruptive innovations rarely look adequate for the broader market, but they have a way of taking over....

      As for "off site centralization", perhaps I'm a bit of a weirdo: I don't store my money in a sock under my bed, I use an off-site centralization service called a "bank". I also don't cut my own hair, I get a salon to do that
  • welcome our subversive effusive control asserting paradigm shifting overlords
  • So just start a solar/wind/hydro/? powered wireless world wide net.

    The People's Net

    Using off-the-shelf hardware (solar), it would be a one-time cost of (US) $500.00 - $1000.00 to set up a self-powered node.

    I'm shooting from the hip on the costs here, but I used to install solar/hydro, so I'm prolly close.

    And the deep cycle batteries would have to be replaced after 5 - 8 years (with good maintenance, if wet cells).

    But that would be a truly non centralized network.

    Amateur Packet Radio works in a similar wa

    • Amateur packet was also really big on 2m and 440 MHz last I checked (admittedly about five years ago), which is well within your privileges as a Tech. Also, with the code requirement having been nixed, what's stopping you from reaching for Extra?
      • "what's stopping you from reaching for Extra?"

        Interest...

        I have one good friend who's a Ham also, but the rest of 'em around here in Mendoland are so narrowly focused on emergency comm and militaristic regimen that we can't relate at all.

        Oh, I'll be there in an emergency (already have been), but otherwise, no, I have other things to do, like tinker with my new Eee PC.

    • What I wonder about is, how will a truly decentralized network work?

      The Internet is truly one of the wonders of the world, but think about it. Why do you trust google.com to actually refer to Google? Because a centralized authority dictates DNS from 13 servers. Why do you trust that 12.38.253.8 really is 12.38.253.8? Because centralized government-controlled authorities dictate which numbers go where. By comparison, a search on the epitome of decentralized networks, p2p, brings up a shitflood of spam, fak
    • Use the new cheap printable solar cells, and for almost infinite battery life use supercapacitors instead.

      The supercap would cost more, but they have something like a million-cycle life.

      Assuming 1 cycle per day, it could last, in theory, 3,000 years.
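
      The lifetime figure is just the cycle count over the usage rate (taking the million-cycle spec at face value):

      rated_cycles = 1_000_000   # the "million cycle life" figure, taken at face value
      cycles_per_day = 1

      years = rated_cycles / cycles_per_day / 365.25
      print(round(years))        # ~2738, i.e. on the order of 3,000 years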

      Nanosolar which is invested in by google:

      http://www.nanosolar.com/history.htm [nanosolar.com]

      UltraCapacitors at present:

      http://en.wikipedia.org/wiki/Ultracapacitor [wikipedia.org]

      High Altitude Balloons for Relayers:

      http://www.21stcenturyairships.com/HighAlt [21stcenturyairships.com]

      The 65,000 ft. variety is still in development bu
    • Where would you get the hardware if the general-purpose computer went away? You'd be fine for a while using stockpiles of existing hardware, of course, but I really don't see many of today's computers lasting more than a couple of decades without suffering problems from capacitors drying out, hard drives seizing, tin whiskers, plastic parts degrading, and things like that.
  • by Vellmont ( 569020 ) on Thursday January 17, 2008 @10:10PM (#22089264) Homepage
    There are some technologies that everyone wants, and there's a solution that'll fit 90% of the populace.

    Examples would be hosted email, contact management, and calendaring. A central provider can simply do a better job at providing all these things than an IT department does, and the requirements are all extremely generic. Users seem to want infinite amounts of email storage, and the ability to find an email at a moment's notice. That's difficult to manage unless you want to dedicate someone to JUST knowing the email systems.

    The thing I disagree with is that the IT department is going away. Simply not true. The difference from other utilities is that the IT department doesn't provide a single, simple resource like electricity. IT provides automation and tools that increase productivity, many of which are going to be way too specialized to centralize.

    IT departments may evolve, like they've been evolving for the last 50 years. I've heard that many years ago (before my time, at least) there were people dedicated just to swapping tapes around. We don't have that anymore, of course.
      IT departments may evolve, like they've been evolving for the last 50 years. I've heard that many years ago (before my time, at least) there were people dedicated just to swapping tapes around. We don't have that anymore, of course.

      Of course they'll evolve, but the idea that the entire field will shrink to a tiny fraction of its current size is ludicrous. This idea that hardware and software in the future will somehow just magically work, and that what little is left will be handled by little IT elves that come in the

    • by Shotgun ( 30919 )
      There are some technologies that everyone wants, and there's a solution that'll fit 90% of the populace. Examples would be hosted email, contact management, and calendaring.

      To reinforce your point, the attribute that ties all of your examples together is that they all need to make use of the network to be useful in the first place. Latency and reliability are already tied to the network. Having to have the network operational to play my flight sim, write a thesis, update the accounts receivable or do a CA
  • Uh, yeah. (Score:3, Funny)

    by MadMorf ( 118601 ) on Thursday January 17, 2008 @10:12PM (#22089282) Homepage Journal
    This is the same kind of abstract extrapolation that predicted we'd all be riding around in flying cars.

    So, the real question is...

    Where the fuck is my flying car?
  • Kinda. Sorta. Not yet, but soon.

    For businesses, especially small ones, utility computing makes a lot of sense. I work for a 70-person company, and six of our employees (including me) are dedicated to the IT function. We could probably cut that number in half and still get more revenue-generating projects tackled if we were able to outsource things like backup and recovery, user account maintenance (why this isn't an HR function has always befuddled me; they control the hire/fire function, but don't determine system access at most companies, including mine), software rollouts, machine cloning, etc. I've been evaluating Google Apps, and I tell you, it's almost to the point where I can see myself making the business case to deploy it company-wide. I close my eyes, imagine a world where I never have to think about email servers and spam blocking again, and I cry a little. Saving my company $150K+/year in the process is just a bonus.
    • Re: (Score:3, Interesting)

      And when Google's document store gets hacked, and all your documents and private communications are compromised, and someone asks you "What do you mean, you didn't know how Google handled backups and security?", I hope to be there to watch as you melt.
      • Right, because it happens to Google *way* more than your own servers, with their bulletproof procedures, obfuscated passwords (and change policy), and well-managed backups.

        ROFL
  • by caramuru ( 600877 ) on Thursday January 17, 2008 @10:38PM (#22089456)
    Carr wrote the May 2003 Harvard Business Review article "IT Doesn't Matter." His argument (grossly simplified) was that IT is a "utility" and businesses should not invest in IT because IT cannot differentiate one firm from another. In a well-known (to the business community, but apparently not to /.) rebuttal to Carr's article (Smith & Fingar's "IT Doesn't Matter, Business Processes Do", Meghan-Kiffer Press, 2003), it is argued (again, grossly simplified) that IT is critical to optimizing business processes - the true source of enterprise value. A business that optimizes its processes differentiates itself (positively) from its competitors. In fact, Business Process Management Systems (BPMS) is a new layer on the enterprise software stack. For those of you coming from the SOA space, BPMS is the choreography layer.

    Carr's current article's argument that IT functions should be taken over by functional units only perpetuates the silo thinking of most organizations. Budgeting IT resources on a departmental basis perpetuates islands of automation, redundant/conflicting rules, ridiculous internal interfaces, etc. Outsourcing some or all IT functions may be reasonable in some cases, but turning control of IT over to the various functional units in an organization is insane.

    • by Tarwn ( 458323 ) on Friday January 18, 2008 @06:50AM (#22091438) Homepage
      I'm not sure the idea of splitting the IT responsibilities into other departments is insane (bear with me a moment). Consider the current situation of an IT department that is a separate department, usually with its own goals, budget, etc. This department is notable for not always getting new PCs as fast as they are wanted, for not implementing software changes immediately when requested, and for demanding additional money when deploying technologies like video conferencing so they can upgrade the internet connection or some other foolishness. Oh, and they always act like they are busy, but we all know the systems hardly have problems.

      Unfortunately the suggested solution, of splitting IT into the surrounding departments, is going to look like a good idea to many director-level people. It will (in their minds) ensure immediate service for new equipment, allow a higher level of control over the purchase of items they think are unrelated, and allow them to have changes made to software at a higher level of priority. Outside managers or directors generally only see what we are not supplying, not what we are. If we are good at our jobs, but have poor systems, they don't generally realize just how bad things are, because we are keeping the system limping along. A lot of our expenditures are due to reasons they just don't understand. If we bought a 48-port managed switch with fibre but were rolled under one of these departments, it could very easily turn into a refurbed 48-port hub off eBay, since they both have lots of connections and that's all you really need.

      What about change control? They don't see it. Time for testing? That will get reduced further. Developing in test environments? But those are good machines, they should be used for something important. Oh, and why do you need fancy development tools? Joe down the way made an application to do that in 45 minutes using MS Access, but it takes days in this fancy technology, we'll just use MS Access instead.

      The whole idea of splitting IT up into several departments is like a startup company (non-tech) in reverse. Money will go to IT-related resources last; it will be in no one's interest to spend the time, resources, or money to ensure there is a strong infrastructure capable of growth; and in-house software development will be on-the-fly and likely based on technologies like MS Access. On top of that, larger initiatives like data warehousing and global data management will be left to whoever wants to pay for the whiz-bang consultant to come in and do it their way. Backups, email, directory services - all of this will end up on someone's plate, and that someone will forever be trying to drop it off on someone else.

      I realize that the author of that article was likely thinking that IT resources would not need to deal with most of these things in the future, and from that I can assume he has not worked in an IT environment in quite a while. While technologies are available to streamline our jobs and allow us to grow the department(s) more slowly than in the past, splitting the department so that no one has these responsibilities is going to have one positive thing going for it: the consultants who come in to clean up the mess after the takeover are going to be set for a good long time.
    • In fact, Business Process Management Systems (BPMS) is a new layer on the enterprise software stack. For those of you coming from the SOA space, BPMS is the choreography layer.

      BPMS are *not* choreography engines, in the sense that Fingar was referring to. He likes to claim that this stuff is all based on sound theory (pi calculus), but in reality it's not; just the modeling language is. It's unlikely choreography will really take off for a long time, frankly, because it's a bit too beyond where peop
  • by presidenteloco ( 659168 ) on Friday January 18, 2008 @12:11AM (#22089978)
    Availability of secure P2P protocols, and creation of a location-free, fragmented, encrypted, redundant, moving storage virtual layer on top of lower-level net protocols, could retain freedom from monopoly control of information and services.

    But watch for the predictable attempts to get legislation against such "nebulous dark-matter middle-nets". Watch for fear arguments to be used as justification. Watch for increasingly asymmetric ISP plans (download good, upload bad), and protocol-based throttling or filtering, by the pipe providers.

    These are all very predictable reactions by "the man". They must, it goes without saying, be resisted, in law and political discourse and economic boycott, or circumvented by all ingenious tricky means necessary.

    P.S. I've been predicting this inversion of the intranet to where it is the "extranet", and inversion of where we would trust our data (What, you kept your data on your own servers, and not on the massively redundant global storage net? Are you insane??), for a long time now, but nobody listens to me. (Brain the size of a planet, and they've got me parking cars...)
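
    As a rough sketch of what that fragmented, encrypted, redundant layer could look like at the application level (purely illustrative; it assumes the third-party Python "cryptography" package, and the peer names are imaginary rather than any real P2P protocol):

    # Encrypt the data, split the ciphertext into fragments, and assign each
    # fragment to more than one (imaginary) peer, so no single host stores,
    # or can read, the whole thing.
    from cryptography.fernet import Fernet

    def fragment(data, key, chunk_size=1024):
        ciphertext = Fernet(key).encrypt(data)
        return [ciphertext[i:i + chunk_size]
                for i in range(0, len(ciphertext), chunk_size)]

    key = Fernet.generate_key()              # stays with the owner, never with the peers
    fragments = fragment(b"my data" * 500, key)

    peers = ["peer-a", "peer-b", "peer-c"]   # imaginary peers
    placements = {i: [peers[(i + j) % len(peers)] for j in range(2)]  # 2 replicas each
                  for i in range(len(fragments))}
    print(len(fragments), "fragments,",
          sum(len(v) for v in placements.values()), "placements")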
    • Watch for increasingly asymmetric ISP plans (download good, upload bad), and protocol-based throttling or filtering, by the pipe providers.

      That started with 56k for dialup and was an optimization based on common usage -- most people download more than they upload. If you're going to claim that there is some sinister intent behind optimizing for download, you ought to provide some evidence.

      • Fundamentally, asymmetric bandwidth says "You want to hear from us (consume from us) more than we want to hear from you."

        Slight asymmetry may be justified as an optimal use of bandwidth in cases where it is in fact technically inter-constrained bidirectionally, but large and increasing asymmetry would be a self-fulfilling prophecy. See, these mere ordinary consumers have nothing to say. Nothing to offer. Nothing to store or process for us.

        It's not really conspiracy, but rather the inexorable creep of business
  • Princeton University held a panel this week on "Computing in the Cloud" that discussed many of these issues. A couple of relevant excerpts:

    From Data Center Knowledge [datacenterknowledge.com]:

    Some cloud-based services could become so vital that they become candidates for government regulation, according to panelists at the event ... "Everyone who is trying to get into utility computing is getting big fast," said Jesse Robbins (of O'Reilly Radar). "They're all trying to get as big as they can as fast as they can to win the platform

  • Consider that companies like Time Warner are attempting to severely limit the amount of data users can transfer. [nytimes.com]
    We'll see what happens once FiOS is implemented everywhere, but from the way the major ISPs have been behaving lately, your run-of-the-mill end-user services will be punishing people who exceed (what I would consider to be) a really silly bandwidth cap of about 40GB/mo... and that's only if you pay for premium services.
  • I am inclined to think that it's not just 'The worldwide computer', it is the 'New emerging species - The Almighty machine'.
    I think it is here, it controls us, it's just that we have a different definition of 'control'.

    Consider this:

    - People don't make machines, machines do.
    Well, it's not exactly true, since we *do* design the chips and circuits. But that's about all we do in order to create an evolved machine replica. The chips we design today would be impossible to design without computers and computers d
  • How about this one: "in the future all IT - world-wide - will be performed by six people in India, and all home computing will be done on cell-phones, and those cell phones will be smaller than a dime." Just look at the current trend!! Now please buy my book. Thank you.

  • This idea of his doesn't make much more sense than his idea of how the IT dept will 'go away', seeing as how this idea is built on his earlier flawed idea. [slashdot.org]

"...a most excellent barbarian ... Genghis Kahn!" -- _Bill And Ted's Excellent Adventure_

Working...