The Internet

Web Services (222 comments)

Erik Sliman writes "Why are all the IT companies suddenly interested in open standards with web services? An OpenStandards.net article explores the issues surrounding the somewhat vague term."
  • by Anonymous Coward
    At least Web Services don't force me to run binaries on my system to do stuff. But I'd rather have local applications running in a well-defined sandbox.
  • by telstar ( 236404 ) on Tuesday April 23, 2002 @01:11PM (#3395647)
    Those that lead have the most to gain, and those that follow stand to lose the most if they don't jump on board...
    The success or failure of the actual concept is secondary to how soon they joined the party.
    • Funny you mention that. Below is the text of an email I just received from O'Reilly. I guess you can't hop up on that wagon fast enough (too bad the stepstool is so bloody expensive).

      Planning for Web Services is a new report from O'Reilly Research, written by industry visionary Clay Shirky. This report guides CTOs and CIOs through the inflated claims, competing standards, and amalgam of acronyms to arrive at a realistic appraisal of the business impact of Web Services. Topics include how Web Services can replace EDI, who the major players are and what they really offer, as well as the hurdles to implementing Web Services today. A must-read for anyone developing a Web Services business strategy. $495 Save $100! Just use code # wsrelj when ordering by phone (800-998-9938 or 707-827-7000) or email (order@oreilly.com) and you can get this invaluable report for only $395. Offer expires May 10, 2002

    • Microsoft uses this pretty effectively. People will jump on bandwagons without knowing all the details, so Microsoft spreads enough half-truths to encourage people to jump on their bandwagon instead of someone else's. After being on the bandwagon for a while, the users realize that Microsoft's many bandwagons have rounded up millions of people and brought them back to the Microsoft ranch, and it's going to cost them extra if they want to get out.
  • by Telastyn ( 206146 ) on Tuesday April 23, 2002 @01:15PM (#3395681)
    It's becoming more and more common that the "Internet" is just Internet Explorer to most people. So some smart fellow thinks it'd be a grand idea if services could be served this way, to appease the lowest common denominator. PHB's get ahold of it, and wham! off it goes to the media, and in 2-3 years everyone (hopefully) realises what a bad idea it was.

    If you want a unified 'client' for all services, make one, don't kludge everything onto http. Please...
    • by wiredog ( 43288 )
      Well, sort of. The PHB's are realizing that web browsers aren't the best way to do web services/web applications and are looking for a better one.

      The problem is that everyone has a web browser. Anything that aims to replace it has to get high distribution at low cost. You want all your customers to have whatever client you use. And it has to be based on a standard so that even if the customer's client isn't exactly what you have, it's close enough.

      And this is in a world where it can be difficult to get IE and Mozilla to play nicely together.

    • by smagoun ( 546733 ) on Tuesday April 23, 2002 @01:27PM (#3395778) Homepage
      OTOH, HTTP is pervasive. So are HTTP clients. It's the "write once, run anywhere" model that Sun's been pushing with Java for so many years. You run the app in one place (on your server), and it's accessible to anyone with a computer and a modem. It even works on PDAs, phones, etc with a minimum of effort. I'll be the first to agree that HTTP isn't the best way of doing things for most apps, but the industry has never been about "best". It's about "good enough" and market penetration.

      Designing your own protocol takes time, and implementing it for each OS/hardware combo out there takes even more time. Why bother to do that, when you can leverage a protocol (HTTP) and client software (browsers) that are already everywhere?

      From management's point of view, web services are a no-brainer....

      • OTOH, HTTP is pervasive. So are HTTP clients. It's the "write once, run anywhere" model that Sun's been pushing with Java for so many years. You run the app in one place (on your server), and it's accessible to anyone with a computer and a modem. It even works on PDAs, phones, etc with a minimum of effort. I'll be the first to agree that HTTP isn't the best way of doing things for most apps, but the industry has never been about "best". It's about "good enough" and market penetration.

        You've just described the web as it was (and is, and will be) since 1995 or so. But the web is passe now; even the PHBs have realised that adding an i or e to the start of a brand just doesn't cut it any more. They need new buzzwords! new paradigms! new... bullshit! Fortunately, the world's marketing people have stepped up to the line with a fresh new truckload of ripe, steaming horseshit. Web services? It's client/server. It's nothing new. And the rest of the hype - vague ideas about websites communicating your preferences between themselves - can either be implemented on the client (Mozilla's wallet, form manager etc), or is only of use to a Microsoft-like conglomerate that wants to own everything, and ISN'T GOING TO HAPPEN. Anyone here remember "DNA"? "Digital dashboard"? Remember in 1996, how VRML was going to make the "flat" WWW redundant? And XML, which was going to make all the 30-year-old EDI systems obsolete overnight? Pur-lease. Give me a break. Some of us have been working in this industry for more than a couple of years, and you know what? Once you've seen a couple of bullshit marchitecture hype waves come rolling in, then swoosh back down the beach leaving nothing but some rotting seaweed and a couple of old shampoo bottles... it gets real hard to get excited about an amazing new technological breakthrough when there's no new technology (or indeed anything else of substance) there.

      • Designing your own protocol takes time, and implementing it for each OS/hardware combo out there takes even more time. Why bother to do that, when you can leverage a protocol (HTTP) and client software (browsers) that are already everywhere?

        So, sticking proprietary formats like .DOC and whatnot on the web is a good idea? Give me a break. This junk only works on M$ platforms, and not very well there.

        We use this trash at work and it sucks. It's all tied in with M$'s bloated, ever-changing formats and "standards". Email has gotten so thick from people mailing PowerPoint presentations and Word docs that the poorly performing servers are crapping out. Oh yeah, all that goofy mail carries a bandwidth cost. Sigh, when a few kilobytes of text will do, it gets sent as a Word doc. This might be because the default and only allowed mail client defaults to Word as an editor, and plain text gets all screwed up at the receiving end. Really, the software forces you that way. Oh, and the ASP bullshit? It's kind of like an evil VB crossed with Access, used to put blinky things on people's pages and prevent anything but an M$-encumbered computer from looking at it. The slob in the next cube is forced to deal with it, on his two huge monitors that attempt to make up for a lack of virtual desktops. It's the best M$ scripting ever! "Objects" and tabs and buttons, oh my! It does not always work, and when it does, it does not work well. A typical use is to hand you a Word doc through OLE and a link while pretending to be a "web page". The OLE is quirky enough that Word fails to work correctly, if Word has a correct mode of operation. Other uses I've seen are inferior tables that blink and hand you a couple of forms that may or may not collect information and send you a Word doc. So there you have it. Poor performance, high cost, incompatibility and formats that won't work in two years. What else can you want from a "service"?

        The person who recommended that junk is going to be fired. It's one thing to get suckered for a few insignificant desktop machines that replaced typewriters and secretaries. I can even forgive the poor joker who thought that M$ file sharing might be useful, and the other poor fool who bought copies of FrontPage. After all that, the rework, the broken formats, the masses of equipment that got obsoleted in four years, the broken promises and costs that exceeded mainframes by wide margins then doubled, there is no forgiveness. Further folly is willful ignorance.

        Now that's a story everyone knows, or will know if they work for a M$ "partner". So ends the rant.

        • Hehe, that's what will kill proprietary formats like Microsoft Word's DOC. Unless it's readable on Microsoft's Internet Explorer, Opera, Netscape, Galeon, Konqueror, a few varieties of PDA, or any of a few dozen no-name-brand browsers yet to be, it's going the way of the Dodo.
          If I can't read five years from now what I've written today, essentially regardless of what I'm using then, it's almost suicidal not to be looking for a format that will still be usable.
      • But browsers are not OSs.
        I've been saying this since I first heard Gates start pushing it. Web services is a concept only someone who is clueless or hoping to keep others clueless about computing fundamentals could love, like Gates.
        The argument of leveraging "what's out there" is totally misleading. OSs are already out there. Limiting yourself to a browser is essentially just taking the responsibility off the sysadmins by saying everything has to be squeezed into HTML so they don't have to worry about their security policy, but that's totally naive. In order for web services to be useful, they have to be powerful and you're back to the question of why you didn't just write a brown paper app with FTP, TFTP, SSH or whatever protocol you needed for the networking chores? How does squeezing this existing functionality into HTTP represent an improvement?
    • "It's becoming more and more common that the "Internet" is just Internet Explorer to most people."

      Very true! I wonder if that has anything to do with the fact that the first contact many people had with "The Internet" was Internet Explorer... which was conveniently disguised by that "The Internet" icon in early versions of Windows 9x.

      It's no wonder people use Internet Explorer and The Internet synonymously.

    • Whoa there...

      Web services != HTTP Web services

      While web services can be used over http, even microsoft is pushing to not use http, merely because http was not designed for this sort of thing.
      • even microsoft is pushing to not use http, merely because http was not designed for this sort of thing.

        No, microsoft is pushing to not use http because http is an open standard, which they can't control and use to lock in users. Microsoft's stand on http, referenced here [com.com], was, as usual, stone-bullshit, obvious to anyone with a little technical savvy.
  • by FurryFeet ( 562847 ) <joudanx.yahoo@com> on Tuesday April 23, 2002 @01:15PM (#3395685)
    In today's world, connectivity is key. You pick up a phone and expect to be able to call any other phone on earth (granted, it may be expensive or hard, but it is possible), no matter if it's in another country or company, or if it's a cellular or a satellite phone.
    That expectation carries over to the Net. If you're going to hire net services, you expect to have a unified system that will allow you to do anything with one interface, one bill, from anywhere.
    Now, I can only see two possibilities for that to happen. One is Microsoft, but fortunately I see a trend where fewer companies are willing to empower the BMFH (Bastard Monopoly From Hell). The other is open standards.
    And yes, this is a Good Thing (TM).
  • In short, this is a simplified XML version of COM, CORBA or EJB, only without the specific requirement of a "component", "object", or "bean", or anything except... well... a "web service".

    For those wishing to simplify CORBA or EJB in the privacy of their own homes, the secret is to make the Service an Object. Now you would never have thought of doing that if I hadn't told you, would you? That's why Web Services are different.

    Oh well, someone who puts 'simplified' and 'XML' in the same sentence is probably nuts anyway...
  • Not only is there the "on the bandwagon" effect, but open-source is one of those buzzwords that comes around and sticks in the public lexicon. Everyone sees/hears/reads that Microsoft is being sued because, amongst other things, they are not open-source. Open-source must be good then, if the courts are forcing MS to be that. Open-source gets good press; IT companies believe that they would have a favorable image if they could offer something they can point to and say "We use open-source code for that, and look how great it is." Also, open-source should be more economical to run/code/acquire than proprietary solutions. I guess the problem with that is, if you are an IT company, why don't you have/use your own solution?
  • Check this [ibm.com] out to see what IBM has going on..
  • We use web services (Score:2, Interesting)

    by Anonymous Coward
    At work we've been using web services to make eligibility requests to insurance company databases. The reason it's nice is

    1) no connectivity issues, it's just https over the Net, and

    2) no data format issues. In .Net at least, you write a web service just like a local procedure. It has parameters which can be arbitrarily complex objects. Hit a button, it exports WSDL. Send that to your partner, who hits another button, now they can call your web service just like they were calling the procedure locally. No muss, no fuss, and any VB programmer can pick it up in a day and start using it.
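    The "call it like a local procedure" workflow can be sketched with Python's standard-library XML-RPC modules (an older, simpler cousin of SOAP); the eligibility check and all its fields are invented for illustration:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Server side: an ordinary function, exposed as a web-callable procedure.
def check_eligibility(member_id, plan):
    # Hypothetical stand-in for a real insurance-database lookup.
    return {"member_id": member_id, "plan": plan, "eligible": len(member_id) == 9}

server = SimpleXMLRPCServer(("127.0.0.1", 8038), logRequests=False)
server.register_function(check_eligibility)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the remote procedure reads just like a local call.
proxy = ServerProxy("http://127.0.0.1:8038/")
result = proxy.check_eligibility("123456789", "PPO")
server.shutdown()
print(result)
```

    The WSDL step in .NET automates what the proxy does implicitly here: publishing the procedure's name and parameter types so the other side can marshal the call.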

    • xml maps are not easy. while complex types are neat, this is no easy task. xml can be confusing as hell, much more so than the objects it wraps.
      some stuff does work, and yeah, it's nice over https, but it's really a technology that's about 70% hype and the rest maybe some varying degree of usefulness.
      • But he's right. In .NET, importing a web service is just one button. And if you don't like .NET, use the GLUE plugin for JBuilder and again, it's just one button. I simply love web services, because they make things so easy and distributable. You don't have to care about the whole client/server communication stuff; just use it.
    • You're such a Microsoft goon... The only proof I need is to hear you say "SQL" and pronounce it "Sequel."
      Show your true userid!!!
    • You can only do this if both you and your partner are using the same message format.

      Microsoft .NET defaults to doc/literal message format. No other toolkits support this on the client side. So either you BOTH have to use the default (which is usually the case because VB programmers aren't known for their high IQs or programming ability) or you both have to know enough to change the message format to the more OPEN RPC/encoded message format.

      Think about it for a minute - it's easy, so you don't have to think about it. You do whatever Bill thinks is best, which is to use a proprietary message format and force a closed service because no one else but MS clients can get at it.

      But then again, maybe that's a good thing. I don't know that I'd trust a Web service written by a VB programmer.

  • Here's a related /. story [slashdot.org] regarding IBM and Microsoft's suggested security standard for web services.
  • They want a hip name that makes it sounds like they're actually doing something. They really don't know what it means either. They just want to look good and hi-tech.
  • I don't think Microsoft is as worried about a distributed computing world as the article suggests.

    There are transition points along the way to a truly distributed computing world, however, that it has been worried about. In a truly distributed computing environment, every client, every desktop is a server. Who owns the desktop world today? Microsoft. That is the end-game, and Microsoft is well positioned to capitalize on it.

    In the interim, however, before all the "standards" and "security" issues are worked out, server-based computing -- Larry Ellison's proclaimed NetPC -- will surface. Unfortunately, for Microsoft, this is where they are weak. Microsoft knows that there are already too many competing server platforms out there. So, it focused instead on making sure that open protocols were adopted so that any server-based products that are developed will always be compatible with its desktop client. To hedge its bets, it also pushed Passport so that even if server-based computing becomes established, it will have a piece of the pie.

    When a truly distributed computing world surfaces, unless the open-source community finds a way to penetrate the desktop client, it will be a Microsoft world all over again.
  • It's a recession. During boom times like the mid-'90s, companies were too busy dealing with sales and expanding like crazy to keep up with demand. Now that most of the competition has died down and no one expects them to post record profits, people have the chance to think about where to go next.

    The web is all very well, but HTTP et al. have some serious limitations and were never designed for most of the current technology. For example, a dial-up connection has the same bandwidth as a dedicated line in the 1970s, so ADSL/cable modems etc. were never considered.

    The reason for all the demand now is the scientific community and all the Grid projects around the world, just because there's a recession doesn't stop them and their data requirements make Google look like a small fry (20TB of data for Google vs 600TB for BaBar at SLAC [stanford.edu]).

    The other issue is business - they've all got on the bandwagon of internet sales as an extra sales channel so they can grow it, but it's not going to be the sudden revenue increase it was initially. Web Services offer companies the opportunity to increase productivity and efficiency, which is why the tech companies are investing in it now, so when the economy changes and the corporate clients come back they have something new to go on about.
    • Now that most of the competition has died down and no one expects them to post record profits, people have the chance to think about where to go next.

      We do that here, except now that it's so slow we have the time to think about planning for tomorrow instead of this afternoon, there's no goddamn money available to make it happen.

      It's starting to feel like the Staples shared-pen commercial.
  • Web Services- you take the crackable and exploitable service on port 'X' and advertise it on port 80 or 443. Just as bad, just as exploitable - but now the IT people can't firewall it.

    • I'm tired of this crap. Let's put the shoe on the other foot. If you were going to accomplish the same task (letting customers access an API published by your company), how would you do it?

      Here are the choices as I see it:

      1) Use CORBA. You have to punch a hole in the firewall. I don't know about you, but I would much rather trust an HTTP server than most CORBA ORBs I've seen. Grab the source to one and start poking. Look at the marshalling code in particular. There's also no provision for encryption in the IIOP standards, there are big problems for any NAT equipment, the list just goes on and on.

      2) Use DCOM -- Yech

      3) Use a custom protocol. Sorry, but most programmers mess up network programming pretty badly; not gonna trust this one.

      4) Use EDI -- if you are seriously considering this one, get out a baseball bat, bash yourself in the head, rinse, lather, repeat until it's all better.

      5) Build a private network. This is expensive and troublesome. Using HTTPS with authentication is probably a better solution.

      This crap about web services being inherently insecure is usually based on running them over port 80 or 443. If you really want to, you can run them over any port you like. HTTP communication endpoints are specified using what are essentially URLs, so http://www.example.org:8325/myservice uses port 8325. Now you can firewall all you want.
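      Concretely, the port is just part of the endpoint URL, and a firewall rule can key on it like any other service; a small sketch (the helper function is mine, not from any toolkit):

```python
from urllib.parse import urlparse

def endpoint_port(url):
    """Return the TCP port a web-service endpoint URL will actually use."""
    parts = urlparse(url)
    if parts.port is not None:
        return parts.port
    # No explicit port: fall back to the scheme's default.
    return 443 if parts.scheme == "https" else 80

print(endpoint_port("http://www.example.org:8325/myservice"))  # 8325
```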

      Other people think that web servers are getting exploited all the time, so you shouldn't use them. IIS aside, most of the popular web servers have become more secure as a result of the attacks. I don't know of a single ORB or custom protocol implementation that's withstood the types of attacks that web servers have. So I feel more comfortable putting something out there that's been battle-tested.

      I think I've covered most of the options. If you have others that you think are better, I'm certainly open to hearing them. Just remember, not letting users have access to the APIs is not an option.
  • Simply put... (Score:3, Informative)

    by gUmbi ( 95629 ) on Tuesday April 23, 2002 @01:28PM (#3395786)
    Simply put, Web services is SOAP and UDDI. SOAP is like RPC, UDDI is like LDAP. There is nothing really new here.

    The reason it is becoming popular is:
    A) it uses XML for procedure calls and it has a big-fat standard for type-mapping so it's not tied to a specific language or language-binding.
    B) It can piggy-back on HTTP so it works through firewalls.

    Web Services may have some issues when network/security administrators figure out people will be using RPC through the firewall.
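    Point A is really the whole trick: the procedure call is just an XML document. Here is a minimal sketch of a SOAP 1.1 request envelope built with Python's standard library; the service namespace and method name are invented for illustration:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace.
SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_request(method, params, ns="urn:example:quotes"):
    """Build a minimal SOAP 1.1 RPC request envelope (a sketch, not a toolkit)."""
    envelope = ET.Element("{%s}Envelope" % SOAP_ENV)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_ENV)
    call = ET.SubElement(body, "{%s}%s" % (ns, method))
    for name, value in params.items():
        arg = ET.SubElement(call, name)
        arg.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

request = soap_request("GetQuote", {"symbol": "IBM"})
print(request)
```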

    • Web Services may have some issues when network/security administrators figure out people will be using RPC through the firewall.

      Mmmm, yes. Especially when they realize that they can't discriminate based on the target Object (there ain't one).

      As we all remember from college, most protocols are layered, which allows encrypted bits to be layered inside routing / security bits, but XML doesn't layer nearly as cleanly (a complete XML document can't be embedded inside another without escaping or wrapping it).
    • Web Services may have some issues when network/security administrators figure out people will be using RPC through the firewall

      I hate this argument. SSL, PKI, servers on ports other than 80. There are plenty of solutions here.
  • Where I work, the only thing that the end user has on their desktop, apart from the standard tools for the job, is IE (no, we don't permit anyone to install Mozilla, basically because there's no point and we won't support it). So the more things we can pump over the standard http protocol to the end user, the better. It gives the end user a standard portal to all the services we provide them, and it's also easier to apply a template. And we don't have PHBs here; all these decisions are made by the people that have to implement them, and the people that use them.
    • Re:Because... (Score:2, Insightful)

      by ethereal ( 13958 )
      Where I work, the only thing that the end user has on their desktop, apart from the standard tools for the job, is IE (no, we don't permit anyone to install Mozilla, basically because there's no point and we won't support it).

      If you're not part of the solution...

      So the more things we can pump over the standard http protocol to the end user, the better.

      ...you're part of the problem.

      And we dont have PHBs here, all these decisions are made by the people that have to implement them, and the people that use them.

      I think you're missing the people that think about the consequences of those decisions - you know, things like "does running a service over port 80 magically make it secure?" and "hmmmm, so if we're going to do everything over port 80, what was the point of our firewall again?".

      On the other hand, having no PHBs means that you can theoretically turn on a dime and start improving things almost immediately. Good for you!

      • I think you're missing the people that think about the consequences of those decisions

        Hmmm, I see your point, but no, we aren't missing the people that think about the consequences, since we are also the people that have to deal with network security and such.

        - you know, things like "does running a service over port 80 magically make it secure?" and "hmmmm, so if we're going to do everything over port 80, what was the point of our firewall again?"

        I'm more talking about web services for internal users, our own internal employees and those on VPNs. Yes, you are right, just switching the port doesn't magically secure a service, but then are all services on that port going to be used as a secure transmission method? No. Any service that we run, we code in the appropriate level of security. And firewalls can be made to do a lot more than control access in or out of the network based solely on port number or service type :) Since we implementors are exactly the people who identified the problem, researched the solution and implemented a version of it, and we have direct long-term access to the end users and to the results our applications are having for them, I don't think we are in a tight spot at all.
      • As another poster mentioned, SOAP, XML-RPC, et al are not much different than using HTTP POST to pass messages to a CGI program. In fact, as far as I can tell, that is what they are. The only difference is that with web services, the messages are formatted as XML. So don't we already have a problem?

        Maybe there's something crucial that I'm missing; I have not worked extensively with any of this shiny new web services stuff myself.
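        Right - stripped of the toolkits, a web-service call really is an HTTP POST whose body happens to be XML. A sketch of the raw request bytes (host, path, and payload are invented for illustration):

```python
def build_post(host, path, xml_body):
    """Render a web-service call as the raw HTTP POST it really is."""
    body = xml_body.encode("utf-8")
    headers = (
        "POST %s HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Type: text/xml; charset=utf-8\r\n"
        "Content-Length: %d\r\n"
        "\r\n" % (path, host, len(body))
    )
    return headers.encode("ascii") + body

raw = build_post("www.example.org", "/cgi-bin/service",
                 "<GetQuote><symbol>IBM</symbol></GetQuote>")
print(raw.decode("utf-8"))
```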

  • I'm actually knee deep in the poop of webservices and 2++ things that come to mind on this subject:

    1. they're not actually useful. i mean, who's gonna publish their auction web service or requisition web service for someone else to use? it's nice that you can get order tracking for FedEx - i guess that's useful - or stock quotes. but, beyond that, there's no point.

    2. companies like bea (especially bea) and ibm need more revenue, and hyping web services to sell 'corporate developer' tools is their way to go. bea especially is learning from m$ with their whole 'all-in-one visually appealing code completion server starter package' server and gui tool, and this appeals to a large market of 'corporate developers'. by corporate developers, and i'm taking this straight from the horse's mouth, i mean developers that are not full-blown ejb/c/python, etc... people. basically newbies that can open an ide and connect to a db via some sort of control. definitely not vi or emacs people.

    however, if you use their tools (which i do, unfortunately, on a daily basis), i don't see how a corp dev is gonna understand hooking into ejb's and such. it's not as trivial as their canned sales demos make it.

    it's really all about the market making more money for itself. "oh, well, any app server can do ejb, but hey, can yours do web services?" check out xmethods, anything interesting there? hardly.

    there might come a day where these services are available for rent as components, and that might be useful the same way components are now, but until then, no use beyond the good ol' stock quote or google search. come to think of it, is the google search really useful at all?
    • by rylos ( 472268 ) on Tuesday April 23, 2002 @01:54PM (#3395961)
      I agree that the hype is heavy for Web Services. However, I do see benefits for using Web Services.

      Making Web Services work in a useful way sometimes takes some creativity. Take Google as an example. With the recent release of the Google API, I was able to use PHP and SOAP to access their search results. One of the methods offered through the service is spell checking. By integrating this spell checking with my company's internal search engine, I now have the ability to make search term suggestions to users. This functionality would be very difficult to provide if it had to be created from scratch.

      Web Services will NOT work for all things and in all situations, but they WILL work for some things and in some situations. Creativity is the key.
    • I can think of many things I'd love to see exposed via web services (or another RPC type protocol).

      What if all the functions of Slashdot were available via SOAP? Then anyone could easily write a Slashdot application that looks more like a news reader, or whatever you want. The value of such a program is debatable, but at least it would be an option. How about weather information, traffic reports, interest rates, currency exchange rates, etc.?

      Interest rates and exchange rates are an excellent example of how web services could be used. If you are writing a financial application, you could always have up-to-date rates that require no human input. A web service call could then be made to a bank to make the transaction. All automated from a single program. No human interaction needed.

      Web services make perfect sense when someone has information that can be useful to someone else that would otherwise not be available. It allows people to build applications that would not otherwise be possible due to lack of information. It gives computers, instead of just humans, access to the vast knowledge of the internet.

      Of course if no one creates any useful web services then all of this technology will go to waste.
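      The rate-then-transaction flow above can be sketched as two chained calls with no human in the loop; the functions here are local stand-ins for real web-service proxies, with a made-up exchange rate:

```python
# Hypothetical stand-ins for two web-service proxies.
def get_exchange_rate(base, quote):
    # In reality: a call to a rate provider's web service.
    rates = {("USD", "EUR"): 1.10}
    return rates[(base, quote)]

def transfer(account, amount):
    # In reality: a call to the bank's web service.
    return {"account": account, "amount": amount, "status": "ok"}

def pay_invoice(account, amount_usd):
    # No human in the loop: look up the rate, then make the transaction.
    rate = get_exchange_rate("USD", "EUR")
    return transfer(account, round(amount_usd * rate, 2))

receipt = pay_invoice("ACME-001", 100.0)
print(receipt)
```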
  • Anyone else see the irony in an article about standards that has so many grammatical errors?
  • by mmacdona86 ( 524915 ) on Tuesday April 23, 2002 @01:37PM (#3395847)
    Companies now routinely make information available on the Internet, and routinely make use of information that other companies provide. Unfortunately, a lot of the time this is more difficult than necessary, since all the information is formatted as pretty web pages for people to see.

    Web services just means that you are providing the same data in a format for other companies' programs to use. This is an excellent idea, particularly when you can charge for providing the data.

    This was always the idea behind CORBA, but I think people didn't get it because both ends of the communication were programs; it was too abstract. Now that people do these kinds of information exchanges every day with web servers and browsers, it's much clearer what the point was all along.

    Web services takes the CORBA idea and adds the web momentum. You leverage the communication infrastructure built for the web. SOAP is a hell of a lot less efficient than IIOP, though.
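    The efficiency gap is easy to see by comparing the bytes on the wire for one call's arguments: XML spells everything out as tagged text, while a CDR-style binary encoding (roughly sketched here with Python's struct module) packs the same values into a few bytes. The field names and values are invented:

```python
import struct

# One call's arguments: a user id and an account balance.
user_id, balance = 42, 1305.75

# XML spells out names and digits as text...
xml_payload = ("<getBalance><userId>%d</userId><balance>%.2f</balance></getBalance>"
               % (user_id, balance))

# ...while a packed binary encoding fits the same values in 12 bytes.
binary_payload = struct.pack("!id", user_id, balance)  # 4-byte int + 8-byte double

print(len(xml_payload), len(binary_payload))
```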
  • They're not (Score:3, Insightful)

    by Anonymous Brave Guy ( 457657 ) on Tuesday April 23, 2002 @01:40PM (#3395861)
    Why are all the IT companies suddenly interested in open standards with web services?

    They're not. The only people actually interested in "Web Services" are those who make large-scale business apps, those who are in niches where the technology might help, and those who thrive on marketing buzzwords. The remaining 90% or so of the IT world frankly couldn't care less.

  • by sleight ( 22003 ) on Tuesday April 23, 2002 @01:40PM (#3395863)
    Before I begin, I want to make clear that I'm an XML skeptic. To me, XML is nothing more than formatted text -- utterly devoid of value until two or more parties agree on a shared vocabulary (in the form of a DTD or Schema).

    Put simply, CORBA is entirely too complex. Until recently, even the offerings from Orbix (the lead vendor of the pack) have been extremely "flexible" in their degree of compliance with the CORBA spec; Orbix 2.x had CORBA 1.x and 2.x features side by side without any clear delineation of which feature was compliant with which spec.

    EJB is respectable if you're a CORBA or RMI shop.

    Now, let's be realistic. HTTP is already there. It works. Sure, it's not stateful, but people have been kludging statefulness in with cookies for years. XML isn't necessarily ideal, but if you want to be programming-language independent, you have to choose some sort of format. Why not formatted plain text? Sure, it's a little wasteful of bandwidth, but it's flexible.

    To the above mix, we just add UDDI in place of a JNDI or CosNaming and away you go.

    Sounds nice in principle but I have yet to see it in practice. ;)
    • "Sure, it's a little wasteful on the bandwidth but it's flexible."

      The bandwidth issues can be mitigated by compressing the http stream as per the HTTP 1.1 spec.
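      As a rough illustration of how much XML's repetition helps compression, here is a made-up SOAP-style payload squeezed with gzip. The envelope below is invented, and this only stands in for the HTTP body; headers would not be compressed:

```python
import gzip

# A repetitive SOAP-style payload; the envelope and item fields are invented.
payload = ("<soap:Envelope><soap:Body>" +
           "<item><name>widget</name><qty>1</qty></item>" * 50 +
           "</soap:Body></soap:Envelope>").encode()

# This is roughly what a Content-Encoding: gzip body would carry.
compressed = gzip.compress(payload)
print(len(payload), len(compressed))
```

      Verbose, repetitive markup like this typically shrinks by an order of magnitude or more, which is why the HTTP 1.1 content-coding mitigates (though does not eliminate) the bandwidth cost.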
    • Sheesh, why does everybody say CORBA is too complicated, the following few lines of code initialize our (Visibroker) CORBA service:

      // Initialize the ORB.
      orb = org.omg.CORBA.ORB.init( args, null );

      // Initialize the BOA.
      // Need to cast boa for java 1.2.x
      boa = ((com.inprise.vbroker.orb.ORB)orb).BOA_init();
      // Export the newly created object.

      // Wait for incoming requests

      and here's a typical method call:


      Operation: ::com::appdesigngroup::appsecurity::IBaseSecurity::authenticate

      #pragma prefix "com/appdesigngroup/appsecurity/IBaseSecurity"
      string authenticate(
          in string user,
          in string credentials) raises(::com::appdesigngroup::corba::util::UserException,
                                        ::com::appdesigngroup::corba::util::SystemException);

      public java.lang.String authenticate(java.lang.String user, java.lang.String password)
          throws com.appdesigngroup.corba.util.UserException,
                 com.appdesigngroup.corba.util.SystemException {
          return authenticate(user, password, enforceExpiration, false);
      }

      Yes, I know Visibroker has a non-standard registration service, but it has a full-blown naming service if you need it. If you're just calling your own methods, then you can simplify things a lot. And yes, you have to run the IDL compiler, but we just put that in our makefiles (yes we still use make), and it's taken care of automagically. New developers don't have to learn all the gritty details.

      I don't see how this is much more complex than RMI or SOAP, but the advanced stuff is there if you need it. And I don't see how parsing an XML tree every time you need to check a data element is speedy or efficient.
    • > EJB is respectable if you're a CORBA or RMI shop.

      Mind you, I'm developing a system for distributed learning; as the back-end I've got EJBs, and I expose the interfaces through SOAP. I think it's sweet to expose the power of EJB with SOAP.

    • Congratulations (Score:3, Informative)

      by CoreyG ( 208821 )
      Your understanding of XML, "...XML is nothing more than formatted text -- utterly devoid of value until two or more parties agree on a shared vocabulary (in the form of a DTD or Schema)", is exactly what XML is defined to be. See? [w3.org] Point #2 is probably the most appropriate here.
    • SOAP is supported by Microsoft and IBM, Sun, BEA, and so on. That observation alone seems to suggest that SOAP will go much further than either DCOM or CORBA.

      A few years back, I used to wonder what the world of distributed computing would be like if Microsoft decided to support CORBA. Maybe with SOAP, we will get a chance to find out.

      BTW, I think Microsoft has no choice but to play along with open standards in web services. If it chose instead to push its own proprietary web services "standards", they would probably be adopted no more widely than DCOM was.

      • I don't think I'll hold my breath to see how compatible MS SOAP is with everybody else's. There are already subtle differences between MS SOAP and Apache SOAP. Also, the MS parser does an embrace-and-extend by executing script embedded in the XML (which just sounds like a recipe for disaster to me).
    • I'm in agreement with your synopsis of XML. I have an article [eastcoast.co.za] [my ISP] on why XML doesn't meet its stated goals and, in general, sucks. But it's too long to post here.

      The problems with HTTP as a transport are: 1. it is heavy; 2. it isn't stateful (as you point out); and 3. it's intended as a security backdoor. SOAP stemmed from work on XML-RPC, and both explicitly point out that using HTTP gave them an easy way to circumvent firewalls.

      Heavy? Yes. There are several overhead fields on requests, and typically even more on responses (servers don't become terse just because you're asking for a web service). Twenty ints encoded as strings have insignificant overhead compared to one or two lines of HTTP header information. And we won't even get into SOAP packets...

      Compression (of which some have glibly spoken) is not an acceptable solution. Accepting or responding to a compressed SOAP message involves a series of filters or parsers: HTTP, gzip, XML, SOAP, field encoding. The processing overhead is tremendous: even on an otherwise idle system with Gb ethernet, SOAP cannot get near the performance that traditional (binary-encoded) RPC mechanisms achieve on slower networks. Not to mention that you STILL have the HTTP header overhead, because the headers are not compressed.

      The first question people should probably be asking is: why not ASN.1? It's also a standard, it has a far longer history than XML, and it is in widespread use. It is a terse and efficient binary encoding. And that's its perceived downfall: somewhere, someone decided (with little technical know-how or forethought, I might add) that human-readable protocols were a good idea for data communication between machines.
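      The size argument is easy to check: pack the same twenty ints as a terse binary record (in the spirit of ASN.1's binary encodings, though far simpler) and as an XML fragment. The tag names are invented:

```python
import struct
import xml.etree.ElementTree as ET

values = list(range(20))

# Terse binary encoding: 4 bytes per int, network byte order.
binary = struct.pack("!20i", *values)  # 80 bytes total

# The same twenty ints as an XML fragment (invented tag names).
root = ET.Element("values")
for v in values:
    ET.SubElement(root, "int").text = str(v)
xml_bytes = ET.tostring(root)

# The text encoding is several times larger before any HTTP
# or SOAP envelope overhead is even added.
print(len(binary), len(xml_bytes))
```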

      Why are companies jumping on the bandwagon? Because they stand to make a lot of money out of developing new technology, or out of selling it, or out of converting customer applications to use or support it; or they are customers whose suppliers (and internal MCSD intelligentsia) are telling them how wonderful and great and cool and really important it is that they break their fully working existing systems and reimplement them with a new protocol. Just because.

  • This whole thing has me just kind of lost.

    The mess of SOAP and RDDI and GESCOM and all these vaguely XML-related, something-to-do-with-port-80 acronyms doesn't leave me all that impressed; near as i can gather, they're nothing but platforms for people to build platforms on top of, and they won't be of much use until someone takes the foundation of tangled acronyms and builds a common client app that lets you actually use all of these things. I don't take all this seriously, because knowing the computer industry, i'm pretty sure that by the time "web services" becomes actual services you can use with programs you can download, each service will have a jury-rigged enough implementation of "web services technology" that you'll be unable to use it except with its specific client. There will be a huge incompatibility rift between MS-based and non-MS-based web services, and basically all of the nice compatibility-engineering abstractions that the W3C is trying to put together now will be thrown out the window, just because the current "web services" standards are so ridiculously complicated that no one will be able to come up with an implementation that really *uses* the full potential of the protocols.

    The thing is, though, i really don't care to understand "web services". I understand the following, and i really think it's all i need to know:
    I like XML-RPC, because it gives me a really neat, simple way to do simple message passing between programs over the internet, without any more overhead than is absolutely necessary; it's cross-platform and cross-language, and it isn't awkward to use in any of the programming languages i've used it in.
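    For anyone who hasn't tried it, the whole round trip fits in a few lines of Python's standard library. The method name below is made up, and any XML-RPC client in any language could call the same endpoint:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Tiny server exposing one method; "sample.add" is an invented name.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda a, b: a + b, "sample.add")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side: a proxy object whose attribute accesses become
# XML-RPC calls over HTTP.
proxy = ServerProxy(f"http://127.0.0.1:{port}/")
result = proxy.sample.add(2, 3)
print(result)  # 5
server.shutdown()
```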

    XSLT looks really, really awesome because it's useful and relatively simple, and i really hope we start seeing tools that can automatically generate some of the XSLT for you, since like all XML tech it's just really verbose.

    J2EE looks *interesting*; all i wish is that it could be interfaced more easily with other languages. I love Jython, but i don't know if i can embrace Java completely until it's possible to let Java communicate with arbitrary languages a bit more easily.

    Twisted [twistedmatrix.com] looks neat but i don't think i'll ever use it.

    I think CORBA would rock my world if it were a bit simpler, or if someone would find a way to integrate it with the compiler, or just cut out the complicated crap that surrounds using it. C# (whoo, someone's finally figured out that if you make a bunch of languages with the same features but different syntax and macro between them, people will think it's "language cross-compatibility"..) is not the correct way of doing this.
    I think "web services" these days comes down mostly to taking the problems with CORBA (it makes stuff simple! but you have to read a 1500-page book before you can start using it!) and putting <html brackets> around them.

    I think this article was very interesting, especially the claim that .NET is just Microsoft trying to take existing standards and take credit for them. (Although i found it funny that the article gives MS full credit for SOAP. Wasn't the guy who made XML-RPC on the SOAP creation team?)

    I would like to know when someone is going to find the balance between J2EE's "everything is nice and fits together and is simple and you just sit down and start doing object oriented programming, but you're chained to the java vm" and the .NET/'web services' "here's a bunch of complicated, bloated standards that take way more bandwidth than they need and that are so abstract you can use them from any language, but also make so many compromises you really don't want to use them unless you're using C# (or a special version of Python written for .NET, or a version of C++ that looks exactly like C#..)"

    You know, it would be really nice if we had *real*, good, turing complete macro languages built into the popular programming languages. Maybe then we wouldn't have to take the C# route of rewriting the compiler just because you want to make it possible to declare a method a "web service" using a single keyword.
  • Everyone is jumping on board because consuming Web Services is less expensive. If everyone offers services with an open standard protocol, developers consuming those services won't have to spend as much time learning how to integrate what those services offer.

    I don't understand why so many people are getting bent out of shape about Web Services. Many of you have spent years working with different protocols, so I can understand the frustration when there is so much hype around Web Services: you've spent so much time helping distributed computing concepts mature, so why does Microsoft get to come in and throw all that work away?

    I am curious to know from people who have experience with distributed applications: what are Web Services lacking? Is there a specific reason that Web Services will ultimately fail? I can fully appreciate your frustrations if you foresee everyone jumping on the Web Services ship only to realize it was extremely limited from the beginning and nobody saw its failure coming. However, if Web Services are powerful enough to bring the Internet to the next level, why are they so strongly criticized?
  • by terrymr ( 316118 ) <terrymr.gmail@com> on Tuesday April 23, 2002 @01:48PM (#3395921)
    Microsoft is blaming industry hype for the general lack of consumer interest in .net services. Their decision to delay the launch of My Services was apparently because of some kind of consumer backlash against over-hyped web services. read the register article [theregister.co.uk]
  • by cybermage ( 112274 ) on Tuesday April 23, 2002 @01:52PM (#3395945) Homepage Journal
    Hey Erik, nice ad:

    Joshua Branch
    Erik Sliman
    1449 Larchmont Ave., Dn
    Lakewood, OH 44107
    Phone: 216 228-7361
    Email: erik(at)joshuabranch.org

    Registrar Name: Register.com
    Registrar Whois: whois.register.com
    Registrar Homepage: http://www.register.com


    Created on: Fri, Dec 17, 1999
    Expires on: Sun, Dec 17, 2006
    Record last updated on: Wed, Mar 06, 2002

    Administrative Contact:
    Joshua Branch
    Erik Sliman
    1449 Larchmont Ave., Dn
    Lakewood, OH 44107
    Phone: 216 228-7361
    Email: erik(at)joshuabranch.org

    Technical Contact, Zone Contact:
    Domain Registrar
    575 8th Avenue - 11th Floor
    New York, NY 10018
    Phone: 902-749-2701
    Fax: 902-749-5429
    Email: domain-registrar(at)register.com

    Domain servers in listed order:

    • by Anonymous Coward
      haha "busted" i believe is the terminology

      shame on his pathetic attempt to get people on the bandwagon. or is this just another slashvertisement?

      nicely spotted, props

      yeah im Anon cause im not accepting cookies

    • Why do a WHOIS? Erik's name is on the byline.
      • Why do a WHOIS? Erik's name is on the byline.

        OpenStandards was /.ed at the time. Also, I think there are degrees of shamelessness. It's one thing to submit a story you've found someplace and think others might like. It's a bit shameless to submit a story about an article you wrote that another site posted. It's the epitome of shamelessness to submit a story you wrote that is posted on a site you own.

        I posted the whois information so that everyone could see just how little research /. needed to do to know that the submitted story was just an Ad.
    • by Anonymous Coward

      looks like Openstandards site is all about buzzwords

      Taken from the "about us" page

      "dedicated to increasing the synergy of international IT collaboration"

      "creating synergy"

      "opportunities to foster synergistic cooperation"

      "fostering proprietary standards"

      "the greater the demand for innovation leveraging it"

      maybe he should try plain english, even consumer TV adverts are laughing at this kind of "dotcom" speak
  • While there is a great deal of hype surrounding web services, this group of technologies is going to dominate how the internet is used in the next few years.

    It has been an ordeal to get web sites to interact usefully without an end-user clicking on a web page. One big problem is trust. Another is protocol: sites have so many different ways to get and submit information. Worse, site administrators have different ideas about how to make various forms of raw data available to others. Exactly where the data is to be found is but one stumbling point, much less how it is structured.

    With structured data in the form of web services readily available, and clear protocols for the use of a site's structured data, there will be a lot more interaction between sites and the developers of sites.

    Most importantly, web services will allow users and sites to become more alike and on more equal ground. This is a powerful change that is already upon us in the form of web sites like slashdot.org and early web services like Napster.
  • What's the difference between web services and COM, CORBA, and EJB? You are not likely to get a straight answer from the owners of the earlier technologies, because doing so would be tantamount to admitting that they should have agreed on open standards in the first place.

    Perhaps that's why the "owners" joined the OMG, and later Sun's JCP? However, one company refused to participate in these efforts - I wonder who? If you can guess, congratulate yourself that you're more qualified than the author to write the next Web Services column!
  • by oops ( 41598 ) on Tuesday April 23, 2002 @02:01PM (#3396002) Homepage
    There's a common assumption that SOAP is only transported via HTTP.

    From the Apache SOAP faq [apache.org]

    The writers of the SOAP 1.1 protocol [http://www.w3.org/TR/SOAP/] note that: 'SOAP can potentially be used in combination with a variety of other protocols; however, the only bindings defined in this document describe how to use SOAP in combination with HTTP and HTTP Extension Framework'.

    eg. you can transport SOAP via SMTP.
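    As a sketch of what a non-HTTP binding amounts to, here the same kind of envelope is wrapped in an ordinary MIME mail message using Python's standard library. The addresses and the (empty) envelope body are invented:

```python
from email.message import EmailMessage

# SOAP isn't tied to HTTP: the same XML envelope can ride on SMTP
# as the body of a mail message. Envelope and addresses are invented.
envelope = ('<soap:Envelope '
            'xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
            '<soap:Body/></soap:Envelope>')

msg = EmailMessage()
msg["To"] = "service@example.com"
msg["From"] = "client@example.com"
msg["Subject"] = "SOAP request"
msg.set_content(envelope, subtype="xml")  # text/xml payload

print(msg.get_content_type())  # text/xml
```

    Handing `msg` to an SMTP client would then deliver the request; the SOAP processing on the far end is the same as for the HTTP binding.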
  • It WILL happen (Score:4, Insightful)

    by Ars-Fartsica ( 166957 ) on Tuesday April 23, 2002 @02:01PM (#3396003)
    The web will at some point be home to more metadata than HTML. The web will at some point carry more bots and agents than documents.

    It's silly to presume the web will remain only a document archive with rudimentary data-exchange facilities.

    This is the first step to really exposing APIs over the network in a truly heterogeneous fashion. It will take time, there will be major failures, and there will be a lot of hype, but it will happen.

  • I think in IBM's mad rush to push web services onto the market, they are neglecting the fact that there is a limited market for this sort of thing right now. They are pushing web services as a way to build all web applications, even where it unnecessarily adds another layer of complexity.

    Web services are not a way to build applications that are never intended to be accessed by other applications directly.
  • Ostriches (Score:1, Insightful)

    by Anonymous Coward

    I am going to get modded as flamebait I am sure, but I think that most of the replies here indicate the prevailing attitude coming from IT workers (not "The Management"),

    • "Another protocol?"
    • "SOAP, don't you use that to wash up? Why would we need that?"
    • "I know X (CORBA,COM,JAVA,etc.) Why do I have to worry about Web services?"
    • "More Microsoft Hype.. We won't be doing that here"
    • "Non-web browsing on port 80?!?! Security risk!!!"
    • "I will just use a custom (ie home grown) interface for my apps, that way, if people want to use them, they have to know the protocol!!"

    This is what got us into trouble before... By before, I mean before the rest of the world moved to "IE 6" or "MSN Browser" or "AOL". And "The Management" was being wined, dined, and 69ed by the Microsoft Marketing machine (which is ALIVE AND WELL FOLKS!!) being convinced that Micro$oft software and implementations were the only solutions to your computing problems.

    The best way to prepare for these things is to KNOW about them, learn them, get your head around how they work, and their implications... by getting our heads out of the sand. That way, when "The Management" asks your opinion (They might, you never know!) you can speak with authority and confidence and be able to fight the good fight.

  • by Anonymous Coward
    Web Services provide a method for remote procedure calls over HTTP(S). Since they involve a remote call, they require that the remote server be up. They also require that a Web Services directory server (UDDI) be available. So in the simplest case, a Web Service requires that 3 systems be up simultaneously to function properly. Because of this, calling a Web Service cannot be as reliable as calling a procedure locally.

    No standards yet exist for Web Service security. Until such standards are promulgated, Web Services will be used only on intranets, if at all.

    Because Web Services require multiple HTTP request-responses across the Internet, they are inherently thousands of times slower than an API call on a local machine. They require more memory and CPU (on both the requesting client and the responding server), additional OS context switching, and additional network overhead and latency compared to a local procedure call. While the additional bandwidth and time required to process each Web Service request is music to network hardware vendors' ears, it would mean a drastic increase in Internet traffic.

    Because the various implementations of SOAP (Web Services' underlying protocol) differ, clients and servers on various vendors' machines will not currently interoperate.

    All this pales when one considers the effort involved in getting the IT groups of two cooperating corporations to agree on what a term such as "business partner" means and how it is to be represented in XML and/or a database. While this has nothing to do per se with Web Services, it is unfortunately required before one can begin to define any Web Service.

    Today, remote procedure calls are used on the Internet, but not nearly as often as local procedure calls, and certainly not nearly as often as Web Services Proponents would have you believe. A world of Web Services would attempt to distribute processing across the internet, and would fail miserably. Contrary to the premises of the Web Services architecture, the only viable future architectures are those that integrate and centralize processing and that minimize remote procedure calls.

  • I think the service model is an interesting one to watch over the next few years as the major players roll out their solutions. What I'm most interested in seeing is this technology used to deploy applications across an enterprise; I think this is really where the model can shine in the near future.

    Currently a lot of enterprises are moving their applications to web- and internet-based architectures because doing so can decrease deployment costs. We all know the Heavy Client vs. Web Client argument, and I think there are reasons for using both in certain situations. Now imagine corporate users having the ability to subscribe to their enterprise's applications as needed. Application management can be consolidated into central locations, and the cost of deployment can be decreased significantly.

    I think what we will need to focus on are tools to enable such deployments and the management of that software in a large, sometimes global, environment. Third-party developers can still use the same subscription licensing or develop new licenses for such distribution, and the company maintains control of its own information and doesn't have to rely on 24/7 uptime from an outside service. There are hurdles, of course, but the technology is here and we are heading in that general direction.

    I think this is where the first applications in this area will be built and used successfully. The same technology used to deploy applications using web services across enterprises can be used to distribute applications to consumers.

    My personal opinion is that service-based companies don't exactly have the best track record. Chances are pretty good that anyone reading this has had a bad experience at one point with a service provider such as the phone, electric, or cable company. And people like the idea of owning something; I myself feel like my whole life is a rental sometimes, and it bothers me. It's going to take a lot to push users in this direction, and the ones that can execute best will win. But there is no guarantee it will work. It takes more than just a push or shove to generate a new market, but it can be done.
  • by BroMan ( 203660 )
    As it stands now, Web services are just the next step in the evolution toward a platform-neutral, language-neutral distributed software environment. I see a lot of M$ bashing going on, which is somewhat misplaced, since some of the biggest backers of web services also happen to be M$'s biggest foes (read: IBM). M$ is, however, trying to pull off one of their famous hijacks with .NET. If they succeed, it will be at least partly because of their competitors' failure to take the initiative.

    Getting back to web services though, they can possibly fill a niche in enterprise computing - and that niche is the ever-present, never fully solved question of how to tie together disparate platforms and software applications in a common enterprise environment. CORBA is the oft-quoted answer, but it is expensive to implement, and hard to get right. Wells Fargo has implemented an interesting solution for distributed programming using something they call Model Driven Architecture [ebizq.net].

    Looking at getting systems working together from an IT manager's perspective, you're always looking at the Big Two: time and money. 'How can I get this system working with my current resources in the least amount of time?' The complement to that is 'How do we maintain and augment this solution once it goes into production without going through birthing pains?'

    The promise that Web Services makes to IT managers is that they will be able to lower their TCO and increase their ROI by cutting down on the number of changes they have to make to existing systems, while at the same time increasing their flexibility in adding new functionality. To others it makes the promise of providing services that can be metered and billed (wasn't that the promise of CORBA, EJBs, insert your favorite distributed model here?).

    Of course this is all a pipe dream until they solve some big issues, like security. Transaction management is not as important since web services can actually be implemented in any kind of language you want (read - Implement your own damn transactions).

    However I think most IT managers will go blue in the face the first time their Fund Transfer web service is hacked because of a weak 56 bit SSL connection ;)
  • MS Browser wars (Score:2, Interesting)

    by damien_kane ( 519267 )
    What became clear in the browser wars was that open standards was the only way to win any war on the Internet. One may be able to truthfully say that Microsoft has won the browser wars.

    Microsoft may have won the browser war with Internet Explorer, but it is not because of open standards.
    Any web developer will tell you that JavaScript et al., although founded on the same basic functions and routines, are quite different from one browser to another.
    Microsoft did not win the browser war through open standards, but by bundling it with Windows.

    The fact is, people are too ignorant and lazy to download a completely separate browser, even though it may be (and generally is) more secure than IE. Because of this, the majority of the global online community uses Internet Explorer. This has not changed because companies have realized it, and have thus developed their pages using proprietary MS JavaScript extensions.

    Why build a nuclear car when 99% of gas stations sell gasoline?

    "Computer games don't affect kids. If Pac-Man affected us as children, we would spend all of our time running around in darkened rooms, eating magic pills and listening to repetitive electronic music..."
    • Actually, you've pointed out the problem with pinning hopes on Standards: Even if Browser A and Browser B are both 100% compliant with Standard X, it's a virtual guarantee that they'll still be incompatible.

      I believe IE is 100% compliant now for DOM/JavaScript/ECMAScript. It's virtually complete for CSS-1 (I think it misses a few things, but I'm not sure on that); it's also the first browser to implement the P3P privacy standard.

      So standards are a good thing, but they've never been intended to be the silver bullet to merge everyone's divergent products into one.

      Let the flames begin... :)

  • ...and how long will the sheep put up with it?

    "News for nerds, stuff that matters"


    Now check this out [openstandards.net]

    "Web Services
    Revision 2
    March 5, 2002"


    In my timezone, it's April 23..

    And we're expected to pay for this "news"?


  • The problem with CORBA, RMI, and EJBs is that deploying them is a problem.

    What's nice about each of them is that they let you reuse code by tapping into it from a non-local box, and some of these technologies can be used from different languages. However, they each require a client stub.

    When you need to push a client stub around before you can access the system you have a deployment problem. With SOAP, you can consume those services right away without the need for a client stub.

    I see solving the deployment problem as SOAP's chief advantage.
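    To see why no stub is needed, note that a SOAP call is ultimately just a text document that any HTTP client can POST. Here a hand-built envelope is constructed and picked apart with nothing but an XML parser; the method name, namespace, and field are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A stub-free call is just text over HTTP: build the envelope by hand.
# Method name, namespace, and field below are invented.
envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getQuote xmlns="urn:example">
      <symbol>IBM</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

# Any generic HTTP client can POST this string to an endpoint;
# no IDL compiler or generated client classes are required.
root = ET.fromstring(envelope)
body = root.find("{http://schemas.xmlsoap.org/soap/envelope/}Body")
print(body[0].tag)  # {urn:example}getQuote
```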


    PS Now can anybody tell me why Gartner Group thinks that web services will take off inside the enterprise before it does across the web? Why use them internally when I don't have a deployment problem inside of my company?
    "Web services standards" are robbing network programming of its greatest advantages: the possibility of asynchronous processing, data transparency, and flexibility in the data models. Programmers don't need SOAP or RPC of any kind; they need a data encapsulation standard, and one that is not tied to poisoned technology such as XML, Unicode, or Java (yes, those are "open standards", but they are stuffed with the idiosyncrasies of their developers, and therefore promote all kinds of ideas (in my opinion, stupid ones) over other ideas that are superior for a lot of applications, yet excluded and made impossible to implement by narrow-minded standards).

    Imagine a standard that only allows you to address the service by sending an HTTP-like request, after which the connection transforms into an asynchronous bidirectional one, with the possibility of streaming data in both directions, with or without synchronization. Imagine a "document" that is just like XML, but without all the Unicode crap (data is transparent: if one wants to use Unicode, mark it as Unicode and make software aware of it; otherwise just don't), and without an end of document, so data can be streamed endlessly and become available as the tags/fields are parsed.

    This would be far superior, for any imaginable purpose, to all those little "standards" that Microsoft and Sun originate and the W3C rubber-stamps by the dozen; it would be a truly useful tool that improves network programming. However, I don't believe any software company will work on it now: no idiosyncrasy in the standard means no advantage, monetary or political, to its originator over everyone else. Only a truly free software/open source/open standards project would be able to accomplish this. I am just afraid that people won't realize this until it is too late.

Last yeer I kudn't spel Engineer. Now I are won.