The Rise and Fall of CORBA

ChelleChelle writes "Chief scientist of ZeroC, Michi Henning, has an interesting look at the story behind CORBA, a once-promising distributed computing technology. Henning provides more than a brief history: he pinpoints several reasons why CORBA fell short, focusing specifically on the OMG's technology adoption process itself. Most interesting is the final discussion of what we can learn from CORBA's decline, particularly in reference to web services."
  • Let's do SOAs based on WebServices. Right now, right here.

    The only web service standard that's currently finalized and widely accepted is the WS-I Basic Profile. So, no standard on...

    - authentication (no, dear MS people, HTTP basic is _NOT_ sufficient for the IBM MQ guys)
    - transaction management, transport, and control (please say proprietary SOAP headers)
    - encryption (there IS a standard for XML encryption, but it's unclear how to use it within SOAP)
    - naming services (UDDI is so dead it's already starting to smell; go find a public UDDI registry that's more than a web page and can actually be queried via SOAP. IBM is developing a WebSphere naming service instead. Superb!)
    - ... and so on, and so on...

    Stuff that CORBA has been offering for nearly a decade! So why are web services popular? Because of the technology? No way! They're freaking slow (our Java RMI services are nearly 50 times faster than those implemented with Apache Axis 1.4 here, and Axis is pretty good). No, it's purely because of the tools!

    Go build a web service with NetBeans and a client with VS.NET 2005 and you'll only have to write two or three lines of code yourself (see the sketch at the end of this comment)... That would have been possible with CORBA, too! The fall of CORBA is just a matter of tools: the technology is clearly better, offers more features, and is very performant.

    But coding these days requires click and run...

    Sad.
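
    For what it's worth, here is roughly what that "two or three lines of code" looks like on the Java side with JAX-WS annotations, the style of code a tool like NetBeans generates for you. This is only a minimal sketch: the HelloService class name, the greet method, and the endpoint URL are made up for illustration.

        import javax.jws.WebMethod;
        import javax.jws.WebService;
        import javax.xml.ws.Endpoint;

        // A complete SOAP service: the annotations drive WSDL generation and the SOAP binding.
        @WebService
        public class HelloService {
            @WebMethod
            public String greet(String name) {
                return "Hello, " + name;
            }

            public static void main(String[] args) {
                // Publish on an assumed local URL; the generated WSDL appears at ...?wsdl
                Endpoint.publish("http://localhost:8080/hello", new HelloService());
            }
        }

    The hand-written part is a couple of annotations and one publish call; everything else is tooling.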
  • by mshurpik ( 198339 ) on Tuesday June 20, 2006 @01:51AM (#15566917)
    It's not surprising that an early middleware technology like CORBA failed, because it was being developed at a time when the concept of middleware was just a buzzword. The article mentions the "fracturing" of the middleware market and the over-reliance on "screen scraping" technologies like HTTP+CGI. In other words, it wasn't until the standardization of the web platform (Apache+PHP+MySQL or IIS+ASP+SQL Server) that people even knew what middleware was supposed to look like, and that standardization didn't actually happen until after the dot-com boom was over.

  • by wysiwia ( 932559 ) on Tuesday June 20, 2006 @01:57AM (#15566939) Homepage
    Why should anybody create a distributed application when a simple library API is almost always sufficient? Why make something more complex than necessary? In most cases component APIs are rather stable once all the missing pieces from early beta versions are sorted out. Even when they do change, most cases can be handled without any intermediate interface definition.

    Prof. Wirth always said: "Make everything as simple as possible, but not simpler." While Prof. Wirth has made many things too simple (to prove his statement true), any component system is still much too complex for local tasks, and often even for distributed ones. I'm still waiting for a component system that is as easy to use as a simple library API.

    O. Wyss
  • by Ricdude ( 4163 ) on Tuesday June 20, 2006 @02:11AM (#15566977) Homepage
    Um, there are instances where you *want* to distribute an application across several machines and not have to worry about the details of implementing a robust inter-process communication layer yourself. Once you get past the boilerplate code of creating an object, publishing its reference, and locating that object, CORBA breaks down into simple function calls (see the sketch below).

    I just wish they'd create a C++ mapping that allowed for STL-compliant sequences and std::string compatibility...
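
    For readers who haven't used CORBA, the "boilerplate first, then plain function calls" shape looks roughly like this in the standard Java IDL mapping (the commenter is talking about the C++ mapping, but the pattern is the same). A minimal sketch only: Hello, HelloHelper, and sayHello would be generated from a hypothetical IDL interface, and a running name service is assumed.

        import org.omg.CORBA.ORB;
        import org.omg.CosNaming.NamingContextExt;
        import org.omg.CosNaming.NamingContextExtHelper;

        public class HelloClient {
            public static void main(String[] args) throws Exception {
                // Boilerplate: bootstrap the ORB and locate the published object reference.
                ORB orb = ORB.init(args, null);
                org.omg.CORBA.Object nsRef = orb.resolve_initial_references("NameService");
                NamingContextExt naming = NamingContextExtHelper.narrow(nsRef);
                Hello hello = HelloHelper.narrow(naming.resolve_str("Hello")); // hypothetical IDL-generated types

                // From here on, remote invocations read like ordinary method calls.
                System.out.println(hello.sayHello());
            }
        }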
  • by Animats ( 122034 ) on Tuesday June 20, 2006 @02:21AM (#15566999) Homepage

    The intercommunication system within OpenOffice, the mechanism that allows embedding spreadsheets and drawings in other documents, is CORBA-based. Sort of. Actually, it's something called UNO, which started life as CORBA but went off in an XML direction.

    GNOME also uses CORBA internally. But its CORBA isn't compatible with the one from OpenOffice.

    The UNIX/Linux world has never really had a good way for applications on the same machine to intercommunicate in a subroutine-like way. Microsoft has OLE/COM/DCOM/ActiveX, which is clunky but always available. In the Linux/Unix world, there's nothing you can assume is always there. There's OpenRPC, there's CORBA (in about five incompatible forms), there's Java RMI, and there are various kludges built out of pipes. But there's been no convergence in two decades.

  • Scary... (Score:4, Insightful)

    by Gorimek ( 61128 ) on Tuesday June 20, 2006 @02:21AM (#15567001) Homepage
    I worked with CORBA around 1997-98. It was one of several new technologies in our project, and we never really got it to work properly. Everything was just really complex and error-prone. The company was shut down for many reasons, but our stuff didn't help things much.

    Recently I've done a lot of XML Web Services work. This can actually be made to work, but it feels a lot more like filling out your tax returns than programming. Everything is really verbose, and you have to tell the system the same thing over and over.

    I never really connected the dots until I read this article, but the uneasy feeling I have about this is pretty much the same one I had about CORBA. And the article even explains how they're similar!

    Not that that has to mean they're destined for identical paths, or that I'm a visionary who can sense the fate of a technology years in advance, but it does make me a bit happier that I quit that job last week.
  • by Anonymous Coward on Tuesday June 20, 2006 @02:28AM (#15567020)

    Some of the technical problems of CORBA went beyond a misunderstanding of what it actually needed to do. The biggest failure, from my point of view, was unneeded extra state that had to be synchronized. The hardest part of distributed systems is synchronizing views, so anything that makes this harder makes the entire system more brittle. Using CORBA always started out easy enough and then got nastier and nastier as development went on. Good riddance to bad rubbish, really.

  • by Anonymous Coward on Tuesday June 20, 2006 @02:40AM (#15567053)

    The fall of CORBA is just a matter of tools: the technology is clearly better, offers more features, and is very performant.

    Well, one reason CORBA tools sucked was that CORBA was over-engineered: intended to solve world hunger, clean up the environment, produce a fresh and pleasing scent, and tuck you into bed at night, oh yeah, and be the glue for distributed systems too. Web-service-oriented protocols are simpler because they try to do less. Simpler protocols mean that tools are easier to produce, which means tools actually get produced.

  • by The Pim ( 140414 ) on Tuesday June 20, 2006 @02:40AM (#15567054)
    Go build a web service with NetBeans and a client with VS.NET 2005 and you'll only have to write two or three lines of code yourself
    There may be some truth to this (I've never used those tools), but that's not saying much. You've just built a self-contained "hello world" client and service, accepting all the defaults. Now take a complicated, pre-existing service, using some non-default binding and non-trivial schemas, and integrate a client for it into a large existing application. I tried this recently using Apache Axis2, and it was a world of pain. Now, it may be that Axis2 sucks and there are better tools, but I've read many of the web service standards, and it's clear that there are complexity and interoperability issues (why do these standards have to offer implementors so many options??) that you can't easily paper over. Ultimately, I agree with the author (and hope!) that web services will fail for technical reasons.
  • by bigmouth_strikes ( 224629 ) on Tuesday June 20, 2006 @02:44AM (#15567064) Journal
    Different people need different things. For many companies, being productive, i.e. having appropriate tools, is everything. In many cases it doesn't matter if you have to roll your own or use a non-standards-compliant protocol, as long as you can get the functionality out the door in a matter of months, not years.

    Like the article said, CORBA is a niche product for those who absolutely need it.
  • by Frankie70 ( 803801 ) on Tuesday June 20, 2006 @03:05AM (#15567111)
    CORBA was an open standard.
    COM was a proprietary standard.

    But still, COM succeeded much more than CORBA.

    Don Box's book "Essential COM" has a foreword by Charlie Kindel (one of Microsoft's COM/ATL developers) which discusses this.

  • by Anonymous Coward on Tuesday June 20, 2006 @03:13AM (#15567129)
    He is right. (In my opinion.)

    I am not him, but the problem is the same: CORBA is overly complex.
    His post illuminates the fact that this is not about Sun at all anymore, if it ever was. On the contrary, I saw CORBA as quite popular among the OSS crowd.

    I did the same thing as the guy in your parent post. I suspect that in groups of developers all over the world, one guy was the CORBA guy, with the task of just making it work. And we all had the same solution (I imagine): write the ultimate IDL, then a flexible abstraction layer in code, and then never touch it again.

    Nowadays Java, .NET, and various libraries do what CORBA promised to do, but far, far more simply (see the sketch at the end of this comment).

    I'm not sure where Sun fits into this picture at all. Why would they want to resurrect CORBA, with Java EE in its place?
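
    As a concrete point of comparison, here is a minimal sketch of what "far simpler" can look like with plain Java RMI. The Hello/HelloImpl names are made up for illustration; the remote interface takes the place of a separate IDL file.

        import java.rmi.Remote;
        import java.rmi.RemoteException;
        import java.rmi.registry.LocateRegistry;
        import java.rmi.registry.Registry;
        import java.rmi.server.UnicastRemoteObject;

        // The remote interface replaces the IDL file; no separate IDL compiler pass is needed.
        interface Hello extends Remote {
            String sayHello(String name) throws RemoteException;
        }

        class HelloImpl implements Hello {
            public String sayHello(String name) { return "Hello, " + name; }
        }

        public class Server {
            public static void main(String[] args) throws Exception {
                Hello stub = (Hello) UnicastRemoteObject.exportObject(new HelloImpl(), 0);
                Registry registry = LocateRegistry.createRegistry(1099); // plays the role of the CORBA naming service
                registry.rebind("Hello", stub);
                System.out.println("Hello service bound and ready");
            }
        }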
  • by rubies ( 962985 ) on Tuesday June 20, 2006 @03:22AM (#15567142)
    Anybody who had to deal with the woeful implementation of naming services in CORBA, or who stupidly subjected themselves to cross-platform/cross-language system implementations (try Visual C++ on NT talking to Smalltalk systems on Sun == headaches and midnight support calls every day), will tell you CORBA was a crock. Anybody stupid enough to listen to Microsoft when they said they would fix the DCOM dropout/timeout issues, where the system would stop talking to other DCOM clients (requiring server reboots), will tell you DCOM was a crock. The old RPC stuff was hard to use, but at least it worked. Give me a minimal raw-socket solution any day of the week.
  • by Anonymous Coward on Tuesday June 20, 2006 @04:25AM (#15567374)
    This is definitely a case where Microsoft's way of doing things was vastly superior. Whereas CORBA was designed by a committee of competitors who all wanted to sell object request brokers, COM was designed by a single company that wanted to sell software that could interoperate.

    MS saw a need, designed something to fill that need, tweaked it until it worked, then released it. The OMG designed something to fill that need and more or less immediately released it. As a result, no two implementations of CORBA could talk to each other without buying another piece of software. Come to think of it, I don't really know why they didn't just copy COM -- at least it worked.

    What they should have done was have a body of experts collaborate on a standard, implement it, tweak it until it worked, and then release it. This is why standards like IEEE 754 (floating point) and JPEG are so successful, while standards like CORBA and VRML2 are such crap. Once the dust settles on ODF, I'm afraid that ODF is going to end up like CORBA, with every program implementing it differently enough that adoption is hindered and it only gets used in-house by companies that mandate it.

    dom
  • by bytesex ( 112972 ) on Tuesday June 20, 2006 @05:08AM (#15567499) Homepage
    CORBA tried to do too many things at once: not only was it supposed to be some kind of OO RPC, it also specced a declaration language (IDL), which was replaced at some stage by an XML variant that it also wanted to use to incorporate data and other resources, plus some kind of 'discovery' broadcasting protocol that you could use to find (distributed, of course) CORBA servers around you, oh, and object serialization in weird strings. On top of all that, you had to leave your blood, soul, and first-born child at the OMG website in order to obtain documentation, because, as these guys saw it, they had gold in their hands and they wanted to cash in on it good.

    And it _was_ good, but because of this built-in obscurity (in turn caused by its complexity and the OMG's secretiveness), nobody could really tell where CORBA stood. Was it some kind of transaction system à la IBM mainframes? Was it OO RPC (but then, why not use RPC)? Then all kinds of competing tech started to overtake it (Java RMI, XML-RPC-based tech), to which the OMG only vaguely responded (the XML declaration thingy) but couldn't really, because of the moloch that CORBA had become. And thus they faded into obscurity.

    The moral of the story? When you want to sell a protocol or a language, be as simple as you can (modularize) and be as open as you can (throw it around, even). Otherwise, if you're not IBM or Sun or Oracle, you will not make it.
  • by Anonymous Coward on Tuesday June 20, 2006 @05:09AM (#15567507)
    From an OOP point of view, CORBA's core architecture is basically very, very good.

    They screwed up the details. Mr. Henning's point of view is much more informed than mine, but I want to emphasise this: they got a lot right.

    Look at SOAP and the "Web Services" trend: like the language du jour, "everyone" seems to think it is something new, something great. Why do we need another MS-DOS every five to nine years? Perceived convenience is the usual answer.

    At my company, a major retailer in our market, we're now taking a legacy codebase that is far from perfect but nonetheless innovative and moving it all to silver-bullet platforms. You know the ones I'm talking about. The full-meal deal: all our problems solved.

    Even though the language we're "moving up to" is basically 1957 technology billed as cutting-edge, with a bloated description system being used to transfer what amount to fixed-field data records, and all of that piped through another bloated and basically silly but extremely trendy RPC mechanism, it's all supposedly for the best. Not.

    See the trend? Why are we always in a hurry, for everything? Sure, you can swing to a point where everything is crufty and convoluted (like CORBA), but you can get good things when you think about them for a while.

    That brings me to Ice.

    I have a workmanlike understanding of CORBA. From that base, I have gained an equal understanding of, and appreciation for, Ice in much less time. Of course, a lot of that is because the whole paradigm wasn't new (Ice /does/ keep a lot of the things that were VERY GOOD about CORBA!), but let me say that Ice is a dream in many ways. The C++ bindings are a perfect start. For casual use, blissful. (Great job, guys!)

    What's my point? Okay. First, understand Michi Henning's article. Now, with that understanding, pretend that you're in a meeting in which SOAP and certain other "modern technologies" are all being suggested. Take the time to absorb the scene. Is the "modern" technology really and truly innovative, or is it just (and this is /USUALLY/ the case) crap?

    -Anon.

    (I have no association with ZeroC, by the way.)

  • Re:Real reasons (Score:3, Insightful)

    by dan the person ( 93490 ) on Tuesday June 20, 2006 @05:27AM (#15567546) Homepage Journal
    CORBA always required holes in the firewall, was more complicated to set up (as mentioned in the article), and had poor or no load balancing / fault tolerance / maintainability

    Right, so SOAP gets around this problem by reusing port 80, which on insecure networks sometimes already has an unrestricted hole in the firewall. In secure environments, you still have to put a new server in the DMZ whether it's port 80 you're talking on or port 9001.

    As for load balancing and fault tolerance, WebSphere here load-balances CORBA clients over multiple machines in a cluster without any development effort. Just add more than one machine to your cluster and it does it.
  • CORBA rocks!! (Score:1, Insightful)

    by Anonymous Coward on Tuesday June 20, 2006 @05:52AM (#15567617)
    Well, I have not seen CORBA working for anyone else here... so I am sharing my experience working with CORBA and C++.

    I work for a telecom service provider, and we have been using CORBA for the past six years, and I can tell you that CORBA rocks, at least for us.

    We have converted some transactions to web service calls (because of a vendor requirement), but the response time really sucks: it went from 4 ms to approximately 400 ms.

    We serve 28 million+ customers, all of whose transactions go via our CORBA bus, and it rocks... :)

    tchau
  • - authentication (no, dear MS people, HTTP basic is _NOT_ sufficient for the IBM MQ guys)

    WS-Security [coverpages.org], an OASIS standard [oasis-open.org] (like OpenDocument Format), has been around since 2001. You may wish to update your SOAP knowledge.

    But coding these days requires click and run...

    No: it's all about the APIs and who's making them available. Got CORBA bindings for Google? How 'bout the National Weather Service? If nothing else, people are publishing SOAP APIs that we actually want to use. That alone makes it much more interesting than competing RPC protocols.

  • by Viol8 ( 599362 ) on Tuesday June 20, 2006 @06:38AM (#15567733) Homepage
    "mostly university based people "

    And therein lies the problem. A lot of these ivory tower academics have
    never worked in the real world where there are things known as deadlines,
    costs and "lots of work". Ie , people don't have a few weeks free to kick back
    and learn a highly complex API. They need to be able to learn it on the fly.
    And if the API is over complex and over engineered thats not going to happen.
The article mentioned CORBA's problems dealing with firewalls.

    Correct me if I'm way off base here, but it seems like the following happened with regards to ports:

    1. In the beginning, services were allocated to different ports, with HTTP going to port 80.
    2. Sysadmins decided to block everything but port 80.
    3. Stuff needed to be put through the firewall.
    4. Rather than developing procedures to get ports opened, everything but port 80 stayed blocked.
    5. Protocols were designed to use port 80 to get through firewalls.
    6. Selective port firewalling became useless because everything uses port 80.

    Clearly, blocking ports selectively is not a sufficient security measure, but being able to block services based on the ports they use is a handy tool to have in the armoury. Isn't it? And isn't it one we've thrown away because sysadmins have been too anally retentive?

  • by XMyth ( 266414 ) on Tuesday June 20, 2006 @09:42AM (#15568623) Homepage
    How about saying a paved road is better than a dirt bike? :)
  • by BattleTroll ( 561035 ) <battletroll2002@yahoo.com> on Tuesday June 20, 2006 @11:09AM (#15569390)
    All of these issues are unrelated to CORBA. It sounds like a bad implementation; one which would have been equally bad had they used a roll-your-own RPC mechanism.
     
  • by aminorex ( 141494 ) on Wednesday June 21, 2006 @07:22PM (#15579362) Homepage Journal
    I'm of the opinion that invested vendors want to raise the barrier to entry for competitors, so they inject as much complexity as they feel they can easily manage. This turns into a bidding war at the standards table, when multiple vendors all try to inject complexity in areas where they feel they have leverage. It's just like price-fixing, really. It's an anticompetitive practice.
