Web Services 222
Erik Sliman writes "Why are all the IT companies suddenly interested in open standards with web services? An OpenStandards.net
article explores the issues surrounding the somewhat vague term."
"...a most excellent barbarian ... Genghis Kahn!" -- _Bill And Ted's Excellent Adventure_
Well (Score:1)
It's Like Most Bandwagons... (Score:4, Insightful)
The success or failure of the actual concept is secondary to how soon they joined the party.
Re:It's Like Most Bandwagons... (Score:2, Interesting)
Planning for Web Services is a new report from O'Reilly Research, written by industry visionary Clay Shirky. This report guides CTOs and CIOs through the inflated claims, competing standards, and amalgam of acronyms to arrive at a realistic appraisal of the business impact of Web Services. Topics include how Web Services can replace EDI, who the major players are and what they really offer, as well as the hurdles to implementing Web Services today. A must-read for anyone developing a Web Services business strategy. $495 Save $100! Just use code # wsrelj when ordering by phone (800-998-9938 or 707-827-7000) or email (order@oreilly.com) and you can get this invaluable report for only $395. Offer expires May 10, 2002
Re:It's Like Most Bandwagons... (Score:2)
Because of Stupidity of course (Score:4, Interesting)
If you want a unified 'client' for all services, make one, don't kludge everything onto http. Please...
That's what happening (Score:3, Interesting)
The problem is that everyone has a web browser. Anything that aims to replace it has to get high distribution at low cost. You want all your customers to have whatever client you use. And it has to be based on a standard so that even if the customer's client isn't exactly what you have, it's close enough.
And this is in a world where it can be difficult to get IE and Mozilla to play nicely together.
Re:Because of Stupidity of course (Score:5, Insightful)
Designing your own protocol takes time, and implementing it for each OS/hardware combo out there takes even more time. Why bother to do that, when you can leverage a protocol (HTTP) and client software (browsers) that are already everywhere?
From management's point of view, web services are a no-brainer....
Re:Because of Stupidity of course (Score:2)
You've just described the web as it was (and is, and will be) since 1995 or so. But the web is passe now; even the PHBs have realised that adding an i or e to the start of a brand just doesn't cut it any more. They need new buzzwords! new paradigms! new... bullshit! Fortunately, the world's marketing people have stepped up to the line with a fresh new truckload of ripe, steaming horseshit. Web services? It's client/server. It's nothing new. And the rest of the hype - vague ideas about websites communicating your preferences between themselves - can either be implemented on the client (Mozilla's wallet, form manager, etc.) or is only of use to a Microsoft-like conglomerate that wants to own
no-brainer, sounds like IT at work. (Score:2)
So, sticking proprietary formats like .DOC and whatnot on the web is a good idea? Give me a break. This junk only works on M$ platforms, and not very well there.
We use this trash at work and it sucks. It's all tied in with M$'s bloated, ever-changing formats and "standards". Email has gotten so thick from people mailing PowerPoint presentations and Word docs that the poorly performing servers are crapping out. Oh yeah, all that goofy mail carries a bandwidth cost. Sigh, when a few kilobytes of text will do, it gets sent as a Word doc. This might be because the default and only allowed mail client defaults to Word as an editor, and plain text gets all screwed up at the receiving end. Really, the software forces you that way. Oh, and the ASP bullshit? It's kind of like an evil VB crossed with Access, used to put blinky things on people's pages and prevent anything but an M$-encumbered computer from looking at it. The slob in the next cube is forced to deal with it, on his two huge monitors that attempt to make up for a lack of virtual desktops. It's the best M$ scripting ever! "Objects" and tabs and buttons, oh my! It does not always work, and when it does, it does not work well. A typical use is to hand you a Word DOC through OLE and a link while pretending to be a "web page". The OLE is quirky enough that Word fails to work correctly, if Word has a correct mode of operation. Other uses I've seen are inferior tables that blink and hand you a couple of forms that may or may not collect information and send you a Word DOC. So there you have it. Poor performance, high cost, incompatibility and formats that won't work in two years. What else can you want from a "service"?
The person who recommended that junk is going to be fired. It's one thing to get suckered for a few insignificant desktop machines that replaced typewriters and secretaries. I can even forgive the poor joker who thought that M$ file sharing might be useful, and the other poor fool who bought copies of FrontPage. After all that, the rework, the broken formats, the masses of equipment that got obsoleted in four years, the broken promises, and costs that exceeded mainframes by wide margins then doubled, there is no forgiveness. Further folly is willful ignorance.
Now that's a story everyone knows, or will know if they work for a M$ "partner". So ends the rant.
Re:no-brainer, sounds like IT at work. (Score:2)
If I can't read five years from now what I've written today, essentially regardless of what I'm using then, it's almost suicidal not to be looking for a format that will still be usable and readable at that point.
Re:Because of Stupidity of course (Score:2)
I've been saying this since I first heard Gates start pushing it. Web services is a concept only someone who is clueless or hoping to keep others clueless about computing fundamentals could love, like Gates.
The argument of leveraging "what's out there" is totally misleading. OSs are already out there. Limiting yourself to a browser is essentially just taking the responsibility off the sysadmins by saying everything has to be squeezed into HTML so they don't have to worry about their security policy, but that's totally naive. In order for web services to be useful, they have to be powerful and you're back to the question of why you didn't just write a brown paper app with FTP, TFTP, SSH or whatever protocol you needed for the networking chores? How does squeezing this existing functionality into HTTP represent an improvement?
Re:Because of Stupidity of course (Score:1)
Very true! I wonder if that has anything to do with the fact that the first contact many people had with "The Internet" was Internet Explorer... which was conveniently disguised by that "The Internet" icon in early versions of Windows 9x.
It's no wonder why people use Internet Explorer and The Internet synonymously.
Re:Because of Stupidity of course (Score:1)
Web services != HTTP Web services
While web services can be used over http, even microsoft is pushing to not use http, merely because http was not designed for this sort of thing.
Re:Because of Stupidity of course (Score:2)
No, microsoft is pushing to not use http because http is an open standard, which they can't control and use to lock in users. Microsoft's stand on http, referenced here [com.com], was, as usual, stone-bullshit, obvious to anyone with a little technical savvy.
Re:Because of Stupidity of course (Score:2, Insightful)
Think a little.
Re:Because of Stupidity of course (Score:5, Insightful)
So, you think you know security, but anything that's tunneled through HTTP/HTTPS is OK with you? You really don't understand security.
SOAP et al are a mistaken implementation for exactly that reason, in a typical Microsoft fashion: by running everything over HTTP, we can get things working quickly without wondering whether they are secure. Later on, there will be a ton of SOAP security holes and information leaks, but we won't be able to plug the hole properly since we can't cut off HTTP without strangling our businesses. I love innovation without cogitation.
An absolute godsend to good firewall administrators would be to have specific services on specific ports so that you could easily audit the use of such services separately and have a better handle on what's going in and out of your 'net. You could, for example, inspect SOAP packets for a particular service without having to slow down all traffic through your HTTP proxy. But since you're a lazy bastard, I bet you don't care :)
FYI: SOAP is not transport/port specific (Score:3, Informative)
http://www.pocketsoap.com/specs/smtpbinding/
http://mailman.jabber.org/pi
Just a few links but you can search www.google.com and get an idea of what SOAP really entails.
that's nice (Score:2)
So what does M$ $oap entail? I seem to remember reading about how their junk was going to do stupid stuff like let others arbitrarily run executables on your machine. With the current poor state of M$ user/permission setup, this is like making an email client that automatically runs attachments. Whoops, there they go again.
Re:FYI: SOAP is not transport/port specific (Score:2)
...and this goes to show just how serious the security issues of SOAP are. SOAP is meant to be a data format which is transport independent, and is also intended to activate services.
So, if you want a decent firewall protecting your network, you must now use a stateful firewall which is capable of checking for SOAP messages in every known (and unknown!) SOAP transport ... otherwise some complete arb can RPC to services on your internal network (possibly with the assistance of other complete morons inside the firewall wanting to expose services for reasons that don't serve the company).
Let's review:
Maybe this is a good thing. Everyone gets to communicate. Instead of hackers creating their own, limited distribution backdoor protocols, there is now one global standard backdoor protocol - at least the security experts can set their sights on a specific target!
stupidest argument ever (Score:3, Informative)
SOAP isn't any different from CGI. I'm posting this message in a web browser, and it is going to port 80. The horror! If slashdot ran a SOAP service, you could write other clients to do the same thing. Instead of posting to 'postcomment.pl?subject=stupidest+argument+ever&c
That's all SOAP is. You can just relax about the use of HTTP. I don't understand why people see something like this, and immediately react with hysterics.
-Mike
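A minimal sketch of that point, assuming a hypothetical postComment operation (the operation name, namespace and URL below are invented; Slashdot exposes no such SOAP service): the SOAP call is just an HTTP POST to port 80, exactly like the comment form.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class SoapIsJustAPost {
    public static void main(String[] args) throws Exception {
        // Hypothetical envelope for a "postComment" operation; the operation
        // name and namespace are made up for illustration only.
        String envelope =
            "<?xml version=\"1.0\"?>"
            + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body>"
            + "<postComment xmlns=\"urn:example-comments\">"
            + "<subject>stupidest argument ever</subject>"
            + "<body>SOAP isn't any different from CGI.</body>"
            + "</postComment>"
            + "</soap:Body></soap:Envelope>";

        // Same wire mechanics as submitting the HTML comment form:
        // an HTTP POST to port 80 with a request body.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://www.example.org/comments").openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "\"urn:example-comments#postComment\"");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes("UTF-8"));
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}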
Re:stupidest argument ever (Score:2)
Stupid question (Score:2)
Re:What's wrong with HTTP? (Score:2)
Send short request, get slightly longer short response back; end.
If applications actually used this, that'd be great, but in the real world they don't. Almost every application is better served by a persistent connection.
And just because firewalls only let those things through, doesn't mean they should.
Re:What's wrong with HTTP? (Score:2, Informative)
As an even more extreme case, consider the situation where you want to start a lengthy computation on a computational server. Your HTTP request starts the action and the HTTP response indicates that the computation has started successfully. However, when the computation finishes, perhaps hours later, HTTP may not work to report the completion event. Constant polling isn't a good idea, either. Sure, HTTP communication could happen the other way. But HTTP traversal through a firewall or NAT is usually asymmetric, so the reverse HTTP connection may not be a possibility.
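To make the awkwardness concrete, here is a rough sketch of the polling workaround described above; the /start and /status endpoints and their plain-text replies are hypothetical.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PollingClient {
    // Issue a GET and return the first line of the response body.
    static String get(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            return in.readLine();
        }
    }

    public static void main(String[] args) throws Exception {
        // One request/response starts the computation and returns a job ID...
        String jobId = get("http://compute.example.org/start?work=big-simulation");

        // ...but HTTP gives the server no way to call back hours later, so the
        // client is reduced to polling (or to holding a connection open).
        String state;
        do {
            Thread.sleep(60000);   // ask again every minute
            state = get("http://compute.example.org/status?job=" + jobId);
        } while (!"done".equals(state));
        System.out.println("Result ready for job " + jobId);
    }
}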
How could they not be? (Score:5, Insightful)
That expectation moves to the Net. If you're going to hire net services, you expect to have a unified system that will allow you to do anything with one interface, one bill, from anywhere.
Now, I can only see two possibilities for that to happen. One is Microsoft, but fortunately I see a trend where fewer companies are willing to empower the BMFH (Bastard Monopoly From Hell). The other is open standards.
And yes, this is a Good Thing (TM).
AKA RPC over HTTP (Score:2)
For those wishing to simplify CORBA or EJB in the privacy of their own homes, the secret is to make the Service an Object. Now you would never have thought of doing that if I hadn't told you, would you? That's why Web Services are different.
Oh well, someone who puts 'simplified' and 'XML' in the same sentence is probably nuts anyway...
Trendy (Score:2)
Another resource (Score:2)
Re:Another resource (Score:2)
You caught me in a good mood tonight.
We use web services (Score:2, Interesting)
1) no connectivity issues, it's just https over the Net, and
2) no data format issues. In .Net at least, you write a web service just like a local procedure. It has parameters which can be arbitrarily complex objects. Hit a button, it exports WSDL. Send that to your partner, who hits another button, now they can call your web service just like they were calling the procedure locally. No muss, no fuss, and any VB programmer can pick it up in a day and start using it.
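The parent is describing the .NET tooling; a rough Java analogue (JAX-WS, which arrived later) is sketched below. The service and operation names are invented, and this illustrates the "annotate a method, get WSDL for free" idea rather than the poster's actual setup.

import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Write an ordinary method, annotate it, and the toolkit generates the WSDL
// and the SOAP plumbing. Class and operation names are invented.
@WebService
public class OrderStatusService {

    @WebMethod
    public String checkOrder(int orderId) {
        // Ordinary local logic; remote callers see it as a web service operation.
        return "Order " + orderId + " has shipped";
    }

    public static void main(String[] args) {
        // Publishing the endpoint also exposes the generated WSDL,
        // typically at the endpoint URL with ?wsdl appended.
        Endpoint.publish("http://localhost:8080/orders", new OrderStatusService());
        System.out.println("WSDL at http://localhost:8080/orders?wsdl");
    }
}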
Hardly one button (Score:1)
some stuff does work and yah, it's nice over https, but it's really a technology about 70% hype and the rest maybe some varying degree of usefulness..
Re:Hardly one button (Score:1)
Re:We use web services (Score:1)
Show your true userid!!!
Re:We use web services (Score:2)
Microsoft
Think about it for a minute - it's easy, so you don't have to think about it. You do whatever Bill thinks is best, which is to use a proprietary message format and force a closed service because no one else but MS clients can get at it.
But then again, maybe that's a good thing. I don't know that I'd trust a Web service written by a VB programmer.
Related recent /. story (Score:2, Informative)
It's Default (Score:1)
Microsoft's business model in a web services world (Score:1)
There are transition points along the way to a truly distributed computing world, however, that it has been worried about. In a truly distributed computing environment, every client, every desktop is a server. Who owns the desktop world today? Microsoft. That is the end-game, and Microsoft is well positioned to capitalize on it.
In the interim, however, before all the "standards" and "security" issues are worked out, server-based computing -- Larry Ellison's proclaimed NetPC -- will surface. Unfortunately, for Microsoft, this is where they are weak. Microsoft knows that there are already too many competing server platforms out there. So, it focused instead on making sure that open protocols were adopted so that any server-based products that are developed will always be compatible with its desktop client. To hedge its bets, it also pushed Passport so that even if server-based computing becomes established, it will have a piece of the pie.
When a truly distributed computing world surfaces, unless the open-source community finds a way to penetrate the desktop client, it will be a Microsoft world all over again.
Its a good time to .... (Score:2, Interesting)
The web is all very well, but HTTP et al. have some serious limitations and were never designed for most of the current technology. For example, a dial-up connection has the same bandwidth as a dedicated line in the 1970's, so ADSL/cable modems etc. were never considered.
The reason for all the demand now is the scientific community and all the Grid projects around the world, just because there's a recession doesn't stop them and their data requirements make Google look like a small fry (20TB of data for Google vs 600TB for BaBar at SLAC [stanford.edu]).
The other issue is business - they've all got on the bandwagon of internet sales as an extra sales channel so they can grow this, but it's not going to be the sudden revenue increase it was initially. Web Services offer the opportunity for companies to increase productivity and efficiency, which is why the tech companies are investing in it now, so when the economy changes and the corporate clients come back they have something new to go on about.
Re:Its a good time to .... (Score:2)
We do that here, except now that it's so slow we have the time to think about planning for tomorrow instead of this afternoon, there's no goddamn money available to make it happen.
It's starting to feel like the Staples shared-pen commercial.
Web Services = Inherently Insecure (Score:2, Interesting)
Re:Web Services = Inherently Insecure (Score:3, Insightful)
Here are the choices as I see it:
1) Use CORBA. You have to bust a hole in the firewall. I don't know about you, but I would much rather trust an HTTP server than most CORBA ORBs I've seen. Grab the source to one and start poking. Look at the marshalling code in particular. There's also no provision for encryption in the IIOP standards, big problems for any NAT equipment, the list just goes on and on.
2) Use DCOM -- Yech
3) Use a custom protocol. Sorry, but most programmers mess up network programming pretty badly; not gonna trust this one.
4) Use EDI -- if you are seriously considering this one, get out a baseball bat, bash yourself in the head, rinse, lather, repeat until it's all better.
5) Build a private network. This is expensive and troublesome. Using HTTPS with authentication is probably a better solution.
This crap about web services being inherently insecure is usually based on running web services over port 80 or 443. If you really want to, you can run it over any port you like. HTTP communication endpoints are specified using what are essentially URLs, so http://www.example.org:8325/myservice uses port 8325. Now you can firewall all you want.
Other people think that web servers are getting exploited all the time so you shouldn't use them. IIS aside, most of the popular web servers have become more secure as a result of the attacks. I don't know of a single ORB or custom protocol implementation that's withstood the types of attacks that web servers have. So I feel more comfortable putting something out there that's been battle tested.
I think I've covered most of the options. If you have others that you think are better, I'm certainly open to hearing them. Just remember, not letting users have access to the APIs is not an option.
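As a sketch of the dedicated-port idea above (port 8325, matching the example), using the JDK's built-in com.sun.net.httpserver; the /myservice handler and its reply are placeholders.

import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;

public class DedicatedPortService {
    public static void main(String[] args) throws Exception {
        // Bind the service to its own port instead of hiding it behind the
        // general-purpose port 80/443 proxy; a firewall rule can then allow,
        // deny, or log this traffic on its own.
        HttpServer server = HttpServer.create(new InetSocketAddress(8325), 0);
        server.createContext("/myservice", exchange -> {
            byte[] reply = "<ok/>".getBytes("UTF-8");
            exchange.getResponseHeaders().set("Content-Type", "text/xml");
            exchange.sendResponseHeaders(200, reply.length);
            exchange.getResponseBody().write(reply);
            exchange.close();
        });
        server.start();
    }
}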
Thank you.. (Score:1)
Simply put... (Score:3, Informative)
The reason it is becoming popular is:
A) it uses XML for procedure calls and it has a big-fat standard for type-mapping so it's not tied to a specific language or language-binding.
B) It can piggy-back on HTTP so it works through firewalls.
Web Services may have some issues when network/security administrators figure out people will be using RPC through the firewall.
Jason.
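A rough illustration of point (A): in an rpc/encoded call, language types are labelled with XML Schema types that any platform can decode. The operation name and namespace below are invented.

public class TypeMappingSketch {
    public static void main(String[] args) {
        // A Java int is tagged xsd:int and a String xsd:string, so a Perl,
        // Python or VB stack can decode them without sharing any binary layout.
        int quantity = 3;
        String sku = "X-1234";
        String envelope =
            "<soap:Envelope"
            + " xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\""
            + " xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\""
            + " xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\">"
            + "<soap:Body>"
            + "<placeOrder xmlns=\"urn:example-orders\">"
            + "<sku xsi:type=\"xsd:string\">" + sku + "</sku>"
            + "<quantity xsi:type=\"xsd:int\">" + quantity + "</quantity>"
            + "</placeOrder>"
            + "</soap:Body></soap:Envelope>";
        System.out.println(envelope);
    }
}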
Nicely understated (Score:2)
Mmmm, yes. Especially when they realize that they can't discriminate based on the target Object (there ain't one).
As we all remember from college, most protocols are layered, which allows encrypted bits to be layered inside routing / security bits, but an XML document can't be layered (it can't contain other XML documents).
Re:Nicely understated (Score:2)
This is why HP came up with the hack of putting SOAP messages in multipart-MIME wrappers [w3.org] - this was necessary to pass XML docs as arguments in a SOAP doc. Your Jabber stream is using a similar wrapper - a wrapper which is not XML.
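For reference, the wrapper being described has roughly the shape sketched below (SOAP Messages with Attachments style); the boundary string, content IDs and payload are invented, and a real message carries more headers than shown.

public class SoapWithAttachmentSketch {
    public static void main(String[] args) {
        // The outer package is MIME, not XML, so a complete XML document can
        // ride along as a second part and be referenced from the SOAP part.
        String boundary = "MIME_boundary";
        String message =
            "MIME-Version: 1.0\r\n"
            + "Content-Type: Multipart/Related; boundary=" + boundary
            + "; type=text/xml; start=\"<soap-part@example.org>\"\r\n"
            + "\r\n"
            + "--" + boundary + "\r\n"
            + "Content-Type: text/xml; charset=UTF-8\r\n"
            + "Content-ID: <soap-part@example.org>\r\n"
            + "\r\n"
            + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body><submitDoc href=\"cid:doc-part@example.org\"/></soap:Body>"
            + "</soap:Envelope>\r\n"
            + "--" + boundary + "\r\n"
            + "Content-Type: text/xml; charset=UTF-8\r\n"
            + "Content-ID: <doc-part@example.org>\r\n"
            + "\r\n"
            + "<?xml version=\"1.0\"?><purchaseOrder>...</purchaseOrder>\r\n"
            + "--" + boundary + "--\r\n";
        System.out.println(message);
    }
}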
Re:Absolutely wrong (Score:2)
Re:Simply put... (Score:1)
Because... (Score:1)
Re:Because... (Score:2, Insightful)
If you're not part of the solution...
...you're part of the problem.
I think you're missing the people that think about the consequences of those decisions - you know, things like "does running a service over port 80 magically make it secure?" and "hmmmm, so if we're going to do everything over port 80, what was the point of our firewall again?".
On the other hand, having no PHBs means that you can theoretically turn on a dime and start improving things almost immediately. Good for you!
Re:Because... (Score:1)
Hmmm, I see your point, but no, we aren't missing the people that think about the consequences, since we are also the people that have to deal with network security and such.
- you know, things like "does running a service over port 80 magically make it secure?" and "hmmmm, so if we're going to do everything over port 80, what was the point of our firewall again?"
I'm more talking about web services for internal users, our own internal employees and those on VPNs. Yes, you are right, just switching the port doesn't magically secure a service, but then are all services on that port going to be used as a secure transmission method? No. Any service that we run, we code in the appropriate level of security. And firewalls can be made to do a lot more than control access in or out of the network based solely on port number or service type.
Re:Because... (Score:2)
Maybe there's something crucial that I'm missing; I have not worked extensively with any of this shiny new web services stuff myself.
Marketing Hype = More $$$ (Score:2, Insightful)
1. they're not actually useful. i mean, who's gonna publish their auction webservice or requisition web service for someone else to use? it's nice that u can get order tracking for fedex - i guess that's useful - or stock quotes. but, beyond that, there's no point.
2. companies like bea (esp bea) and ibm need more revenue and hyping web services to sell 'corporate developer' tools is their way to go. bea esp. is learning from m$ with their whole 'all in one visually appealing code completion server starting package' server and gui tool. and this appeals to a large market of 'corporate developers'. by corporate developers, and i'm taking this straight from the horses mouth, i mean developers that are not full blown ejb/c/python, etc... people. basically newbies that can open an ide and connect to a db via some sort of control. definitely not vi or emacs people.
however, if u use their tools (which i do, unfortunately on a daily basis), i don't see how a corp dev. is gonna understand hooking into ejb's and such. it's not as trivial as their canned sales demos make it.
it's really all about the market making more money for itself. "oh, well, any app server's can do ejb, but hey, can yours do web services?" check out xmethods, anything interesting there? hardly.
there might come a day where these services are avail for rent as components and that might be useful the same way components are now, but until then, no use beyond the good ol stock quote or google search. come to think of it, is the google search really useful at all?
Re:Marketing Hype = More $$$ (Score:4, Interesting)
Making Web Services work in a useful way sometimes takes some creativity. Take Google as an example. With the recent release of the Google API, I was able to use PHP and SOAP to access their search results. One of the methods offered through the service is spell checking. By integrating this spell checking with my company's internal search engine, I now have the ability to make search term suggestions to users. This functionality would be very difficult to provide if it had to be created from scratch.
Web Services will NOT work for all things and in all situations, but they WILL work for some things and in some situations. Creativity is the key.
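The parent used PHP; a rough Java rendering of the same kind of call is sketched below. The endpoint URL, namespace and parameter names are from memory of the era's GoogleSearch.wsdl and the license key is a placeholder, so treat the details as assumptions to be checked against the real WSDL.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class SpellingSuggestionClient {
    public static void main(String[] args) throws Exception {
        // Envelope for the doSpellingSuggestion operation (details as recalled
        // from the GoogleSearch.wsdl of the time; verify before relying on them).
        String envelope =
            "<?xml version=\"1.0\"?>"
            + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body>"
            + "<doSpellingSuggestion xmlns=\"urn:GoogleSearch\">"
            + "<key>YOUR-LICENSE-KEY</key>"
            + "<phrase>websevrices</phrase>"
            + "</doSpellingSuggestion>"
            + "</soap:Body></soap:Envelope>";

        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://api.google.com/search/beta2").openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "urn:GoogleSearchAction");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes("UTF-8"));
        }

        // The response carries the suggested spelling; a real client would
        // parse the XML instead of dumping it.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}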
Re:Marketing Hype = More $$$ (Score:2, Interesting)
What if all the functions of Slashdot were available via SOAP? Then anyone could easily write a Slashdot application that looks more like a news reader, or whatever you want. The value of such a program is debatable, but at least it would be an option. How about weather information, traffic reports, interest rates, currency exchange rates, etc.
Interest rates and exchange rates are an excellent example of how web services could be used. If you are writing a financial application you could then always have up-to-date rates which would require no human input. A web service call could then be made to a bank to make the transaction. All automated from a single program. No human interaction needed.
Web services make perfect sense when someone has information that can be useful to someone else that would otherwise not be available. They allow people to build applications that would not otherwise be possible due to lack of information. They give computers, and not just humans, access to the vast knowledge of the internet.
Of course if no one creates any useful web services then all of this technology will go to waste.
This article is not standards compliant!!! (Score:2)
Re:This article is not standards compliant!!! (Score:2)
Re:This article is not standards compliant!!! (Score:2)
Nobody knew what CORBA was for until the web (Score:5, Insightful)
Web services just means that you are providing the same data in a format for other companies' programs to use. This is an excellent idea, particularly when you can charge for providing the data.
This was always the idea behind CORBA, but I think people didn't get it because since both ends of the communication were to be programs, it was too abstract. Now that people do these kinds of information exchanges everyday with web servers and browsers, it's much clearer what the point was all along.
Web services takes the CORBA idea and adds the web momentum. You leverage the communication infrastructure built for the web. SOAP is a hell of a lot less efficient than IIOP, though.
They're not (Score:3, Insightful)
They're not. The only people actually interested in "Web Services" are those who make large-scale business apps, those who are in niches where the technology might help, and those who thrive on marketing buzzwords. The remaining 90% or so of the IT world frankly couldn't give a funny line.
CORBA is too heavy & EJB is too RMI/IIOP depen (Score:5, Insightful)
To put it simply, CORBA is entirely too complex. Until recently, even Orbix's (the lead vendor of the pack) offerings have been extremely "flexible" in their degree of compliance with the CORBA spec; Orbix 2.x had CORBA 1.x and 2.x features side by side without any clear delineation of which feature was compliant with which spec.
EJB is respectable if you're a CORBA or RMI shop.
Now, let's be realistic. HTTP is already there. It works. Sure, it's not stateful but, historically, people have been kluging statefulness in using cookies for years. XML isn't necessarily ideal but, if you want to be programming-language independent, then you have to choose some sort of format. Why not formatted plain text? Sure, it's a little wasteful on the bandwidth, but it's flexible.
To the above mix, we just add UDDI in place of JNDI or CosNaming and away you go.
Sounds nice in principle but I have yet to see it in practice.
Re:CORBA is too heavy & EJB is too RMI/IIOP de (Score:2)
The bandwidth issues can be mitigated by compressing the http stream as per the HTTP 1.1 spec.
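A sketch of that mitigation on the client side, using standard HTTP/1.1 content-coding negotiation; the endpoint is hypothetical and the server must of course agree to compress.

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.GZIPInputStream;

public class GzipResponseSketch {
    public static void main(String[] args) throws Exception {
        // Ask the (hypothetical) server to gzip the response entity body.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://www.example.org/soap/endpoint").openConnection();
        conn.setRequestProperty("Accept-Encoding", "gzip");

        InputStream body = conn.getInputStream();
        if ("gzip".equalsIgnoreCase(conn.getHeaderField("Content-Encoding"))) {
            body = new GZIPInputStream(body);   // transparently inflate
        }
        try (BufferedReader in = new BufferedReader(new InputStreamReader(body, "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
        // Note: only the entity body is compressed; the HTTP headers are not.
    }
}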
CORBA is NOT that complex (Score:1)
// Initialize the ORB.
orb = org.omg.CORBA.ORB.init( args, null );
// Initialize the BOA.
// Need to cast boa for java 1.2.x
boa = ((com.inprise.vbroker.orb.ORB)orb).BOA_init();
// Register the servant so it can receive requests.
boa.obj_is_ready(service);
// Wait for incoming requests
boa.impl_is_ready();
and here's a typical method call;
/**
 * Operation (from the IDL):
 *
 *   #pragma prefix "com/appdesigngroup/appsecurity/IBaseSecurity"
 *   string authenticate(
 *       in string user,
 *       in string credentials)
 *     raises(::com::appdesigngroup::corba::util::UserException,
 *            ::com::appdesigngroup::corba::util::SystemException);
 */
public java.lang.String authenticate(java.lang.String user, java.lang.String password)
throws com.appdesigngroup.corba.util.UserException, com.appdesigngroup.corba.util.SystemException
{
}
Yes, I know Visibroker has a non-standard registration service, but it has a full-blown naming service if you need it. If you're just calling your own methods, then you can simplify things a lot. And yes, you have to run the IDL compiler, but we just put that in our makefiles (yes we still use make), and it's taken care of automagically. New developers don't have to learn all the gritty details.
I don't see how this is much more complex than RMI or SOAP, but the advanced stuff is there if you need it. And I don't see how parsing an XML tree every time you need to check a data element is speedy or efficient.
Re:CORBA is too heavy & EJB is too RMI/IIOP de (Score:2)
Mind you, I'm developing a system for distributed learning, and as the back-end I've got EJBs and I expose the interfaces through SOAP. I think it's sweet to expose the power of EJB with SOAP.
Mikael
Congratulations (Score:3, Informative)
Re:CORBA is too heavy & EJB is too RMI/IIOP de (Score:2, Interesting)
A few years back, I used to wonder what the world of distributed computing would be like if Microsoft decided to support CORBA. Maybe with SOAP, we will get a chance to find out.
BTW, I think Microsoft has no choice but to play along with open standards in web services. If they were to choose otherwise and push their own proprietary web services "standards", those proprietary standards would probably be adopted no more widely than DCOM was.
Re:CORBA is too heavy & EJB is too RMI/IIOP de (Score:2)
Re:CORBA is too heavy & EJB is too RMI/IIOP de (Score:2)
I'm in agreement on your synopsis of XML. I have an article [eastcoast.co.za] [my ISP] on why XML doesn't meet its stated goals and, in general, sucks. But it's too long to post here.
The problems with HTTP as a transport are: 1. it is heavy; 2. it isn't stateful (as you point out); and 3. it's INTENDED as a security backdoor. SOAP stemmed from work on XML-RPC, and both explicitly point out that the use of HTTP gave them an easy way to circumvent firewalls.
Heavy? Yes. There are several overhead fields on requests, and typically even more on responses (since servers don't tend to be terse just because you're asking for a web service). 20 'int's encoded as strings are an insignificant overhead compared to one or two lines of HTTP header information. And we won't even get into SOAP packets...
Compression (of which some have glibly spoken) is not an acceptable solution. Accepting or responding to a compressed SOAP message involves a series of filters or parsers: HTTP, gzip, XML, SOAP, field encoding. The processing overhead is tremendous - even on an otherwise idle system with Gb ethernet, SOAP cannot get near the performance that traditional (binary-encoded) RPC mechanisms manage on slower networks. Not to mention that you STILL have the HTTP header overhead, because the headers are not compressed.
The first question people should probably be asking is: why not ASN.1? It's also a standard, it has a ridiculously longer history than XML, and it is in widespread use. It is a terse and efficient binary encoding. And that's its perceived downfall: somewhere, someone decided (with little technical know-how or forethought, I might add) that human-readable protocols were a good idea for data communication between machines.
Why are companies jumping on the bandwagon? Because they either stand to make a lot of money out of developing new technology, or they stand to make money out of selling new technology, or out of converting customer applications to use or support new technology, or they are customers whose suppliers (and internal MCSD intelligentsia) are telling them how wonderful and great and cool and really important it is that they break their fully working existing systems and reimplement them with a new protocol. Just because.
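A quick worked comparison of the terseness point above. The DER bytes are exact; the XML rendering is an illustrative rpc/encoded fragment with an invented element name.

public class EncodingSizeComparison {
    public static void main(String[] args) {
        // ASN.1 DER encoding of the INTEGER 42:
        // tag 0x02 (INTEGER), length 0x01, value 0x2A -- three bytes in total.
        byte[] der = { 0x02, 0x01, 0x2A };

        // A typical rpc/encoded SOAP rendering of the same value.
        String xml = "<quantity xsi:type=\"xsd:int\">42</quantity>";

        System.out.println("DER: " + der.length + " bytes");            // 3
        System.out.println("XML: " + xml.getBytes().length + " bytes"); // 42, before any HTTP headers
    }
}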
Re:CORBA is too heavy & EJB is too RMI/IIOP de (Score:2)
Re:CORBA is too heavy & EJB is too RMI/IIOP de (Score:2)
I've argued that for significant uses you don't even have to agree on the semantics of the data. The producer and consumer of the data can each form separate semantic meaning for the same data, as long as the meanings they form are useful to them.
You, for example, might produce a document that includes a field 'OrderID' which identifies the order number associated with the data in your database. Looking at the data, I might have no use or interest in 'OrderID', but still be able to draw out (for example) the name of the customer without any prior agreement as to the meaning of the fields you provided.
The distinction becomes significant when you consider the number and types of communications that can exist. If each one must be standardized or agreed to in advance of use, then XML becomes a much weaker tool for data interchange. Instead someone looking at the data can make an educated guess about the uses that they would like to make of the data you provide, and even if those meanings don't overlap with your meanings, just so long as they get utility from their interpretation the result is still useful.
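A sketch of that idea with an invented document shape: the consumer pulls out only the customer's name and never has to agree with the producer about what 'OrderID' means.

import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class LooseSemanticsConsumer {
    public static void main(String[] args) throws Exception {
        // A document produced by someone else, with their own meanings attached
        // (element names invented for this sketch).
        String doc =
            "<Order>"
            + "<OrderID>A-98031</OrderID>"                 // meaningful to the producer only
            + "<CustomerName>J. Random Buyer</CustomerName>"
            + "<Total currency=\"USD\">42.00</Total>"
            + "</Order>";

        Document parsed = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(doc)));

        // The consumer attaches its own, narrower meaning: it only wants the
        // customer's name and ignores the rest.
        String name = parsed.getElementsByTagName("CustomerName")
                            .item(0).getTextContent();
        System.out.println("Customer: " + name);
    }
}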
Mirror (Score:1)
Re:Mirror (Score:1)
Ummm... hm. Some random thoughts. (Score:2, Interesting)
The mess of SOAP and RDDI and GESCOM and all these vaguely XML-related, something-to-do-with-port-80 acronyms doesn't leave me all that impressed; near as i can gather, they're nothing but platforms for people to build platforms on top of, and they won't be of much use until someone takes the foundation of tangled acronyms and builds a common client app that lets you actually use all of these things. I don't take this all that seriously, because knowing the computer industry, i'm pretty sure that by the time "web services" becomes actual services you can use with programs you can download, these services will be using a specific, jury-rigged enough implementation of "web services technology" that you'll be unable to use a given service except with their specific client, and there will be a huge incompatibility rift between MS-based and non-MS-based web services, and basically all of the nice compatibility-engineering abstractions that the W3C is trying to put together now will be thrown out the window just because the current "web services" standards are so ridiculously complicated that no one will be able to come up with an implementation of those standards that really *uses* the full potential of the protocols.
The thing is, though, i really don't care to understand "web services". I understand the following, and i really think it's all i need to know: I think "web services" these days comes down mostly to taking the problems with CORBA (it makes stuff simple! but you have to read a 1500 page book before you can start using it!) and putting <html brackets> around them.
I think this article was very interesting, especially the claim that
I would like to know when someone is going to find the balance between J2EE's "everything is nice and fits together and is simple and you just sit down and start doing object oriented programming, but you're chained to the java vm" and the
You know, it would be really nice if we had *real*, good, turing complete macro languages built into the popular programming languages. Maybe then we wouldn't have to take the C# route of rewriting the compiler just because you want to make it possible to declare a method a "web service" using a single keyword.
Integration is less expensive (Score:2, Insightful)
Microsoft blames hype for .net woes (Score:3, Interesting)
Re:Microsoft blames hype for .net woes (Score:2)
Your interpretation would be akin to claiming nobody is interested in using the Web because boo.com failed.
Re:Microsoft blames hype for .net woes (Score:2)
Shameless Self-Promotion (Score:5, Interesting)
Organization:
Joshua Branch
Erik Sliman
1449 Larchmont Ave., Dn
Lakewood, OH 44107
US
Phone: 216 228-7361
Email: erik(at)joshuabranch.org
Registrar Name: Register.com
Registrar Whois: whois.register.com
Registrar Homepage: http://www.register.com
Domain Name: OPENSTANDARDS.NET
Created on: Fri, Dec 17, 1999
Expires on: Sun, Dec 17, 2006
Record last updated on: Wed, Mar 06, 2002
Administrative Contact:
Joshua Branch
Erik Sliman
1449 Larchmont Ave., Dn
Lakewood, OH 44107
US
Phone: 216 228-7361
Email: erik(at)joshuabranch.org
Technical Contact, Zone Contact:
Register.Com
Domain Registrar
575 8th Avenue - 11th Floor
New York, NY 10018
US
Phone: 902-749-2701
Fax: 902-749-5429
Email: domain-registrar(at)register.com
Domain servers in listed order:
DNS13.REGISTER.COM 209.67.50.208
DNS14.REGISTER.COM 209.67.50.209
haha "busted" (Score:1, Funny)
shame on his pathetic attempt to get people on the bandwagon... or is this just another slashvertisement?
nicely spotted, props
yeah im Anon cause im not accepting cookies
Re:Shameless Self-Promotion (Score:1)
Re:Shameless Self-Promotion (Score:3, Interesting)
OpenStandards was
I posted the whois information so that everyone could see just how little research
Whoaa Site Buzzword Alert !!! (Score:1, Informative)
looks like Openstandards site is all about buzzwords
Taken from the "about us" page
"dedicated to increasing the synergy of international IT collaboration"
"creating synergy"
"opportunities to foster synergistic cooperation"
"fostering proprietary standards"
"the greater the demand for innovation leveraging it"
maybe he should try plain english, even consumer TV adverts are laughing at this kind of "dotcom" speak
Web services really do matter (Score:2, Interesting)
It has been an ordeal to get web sites to interact usefully without an end-user clicking on a web page. One big problem is trust. Another is protocol. Sites have so many different ways to get information and to submit information. Worse, site administrators have different ideas about how to make various forms of raw data available to others. Exactly where it is to be found is but one stumbling point, much less how it is structured.
With structured data in the form of web services readily available, and clear protocols as to the use of a site's structured data, there will be a lot more interaction between sites and developers of sites.
Most importantly, web services will allow users and sites to become more alike and on more equal ground. This is a powerful change that is already upon us in the form of web sites like slashdot.org and early web services like Napster.
Re:... but are just a small step, not giant leap (Score:2)
Check out this recent W3C submission by HP called WSCL: http://www.w3.org/TR/wscl10/.
The XML Schema spec improved upon DTD in a big way (mainly by "xmlizing" the description and adding input and output data types). But it was not nearly enough. WSCL extends this API-describing capability by describing the entire architecture of an interactive web service.
Once services out there make use of WSCL things will get somewhat more interesting. Now I'm sounding like Linus in the pumpkin patch again...
- James
Forgotten the OMG already? (Score:2)
Perhaps that's why the "owners" joined the OMG, and later Sun's JCP? However, one company refused to participate in these efforts - I wonder who? If you can guess, congratulate yourself that you're more qualified than the author to write the next Web Services column!
SOAP != HTTP (necessarily) (Score:3, Informative)
From the Apache SOAP faq [apache.org]
The writers of the SOAP 1.1 protocol [http://www.w3.org/TR/SOAP/] note that: 'SOAP can potentially be used in combination with a variety of other protocols; however, the only bindings defined in this document describe how to use SOAP in combination with HTTP and HTTP Extension Framework'.
E.g., you can transport SOAP via SMTP.
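A sketch of the non-HTTP binding idea using the JavaMail API (an external library); the host and addresses are invented, and a real SMTP binding specifies correlation headers and reply handling that are omitted here.

import java.util.Properties;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class SoapOverSmtpSketch {
    public static void main(String[] args) throws Exception {
        // The same envelope that would travel over HTTP...
        String envelope =
            "<?xml version=\"1.0\"?>"
            + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body><ping xmlns=\"urn:example\"/></soap:Body></soap:Envelope>";

        // ...handed to an SMTP relay instead of a web server.
        Properties props = new Properties();
        props.put("mail.smtp.host", "mail.example.org");
        Session session = Session.getInstance(props);

        MimeMessage msg = new MimeMessage(session);
        msg.setFrom(new InternetAddress("client@example.org"));
        msg.setRecipient(Message.RecipientType.TO,
                new InternetAddress("soap-listener@example.org"));
        msg.setSubject("SOAP request");
        msg.setContent(envelope, "text/xml; charset=UTF-8");
        Transport.send(msg);
        // The listening service replies (if at all) with another mail message;
        // nothing about the envelope itself changes between transports.
    }
}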
It WILL happen (Score:4, Insightful)
It's silly to presume the web will remain only a document archive with rudimentary data exchange facilities.
This is the first step to really exposing APIs over the network in a truly heterogeneous fashion. It will take time, there will be major failures, and there will be a lot of hype, but it will happen.
web services not replacing something (Score:2, Insightful)
Web services is not a way to build applications which are never intended to be accessed by other applications directly.
Ostriches (Score:1, Insightful)
I am going to get modded as flamebait I am sure, but I think that most of the replies here indicate the prevailing attitude coming from IT workers (not "The Management"),
This is what got us into trouble before... By before, I mean before the rest of the world moved to "IE 6" or "MSN Browser" or "AOL". And "The Management" was being wined, dined, and 69ed by the Microsoft Marketing machine (which is ALIVE AND WELL FOLKS!!) being convinced that Micro$oft software and implementations were the only solutions to your computing problems.
The best way to prepare for these things is to KNOW about them, learn them, get your head around how they work, and their implications... by getting our heads out of the sand. That way, when "The Management" asks your opinion (They might, you never know!) you can speak with authority and confidence and be able to fight the good fight.
Web Services - Unreliable, Insecure, Slow, Buggy (Score:2, Insightful)
No standards yet exist for Web Service security. Until such standards are promulgated, Web Services will be used only on intranets, if at all.
Because Web Services require multiple HTTP request/response round trips across the Internet, they are inherently thousands of times slower than an API call on a local machine. They require more memory and CPU (on both the requesting client and the responding server), additional OS context switching, as well as additional network overhead and latency when compared to a local procedure call. While the additional bandwidth and time required to process each Web Service request is music to network hardware vendors' ears, it would mean a drastic increase in Internet traffic.
Because the various implementations of SOAP (Web Service's underlying protocol) differ, clients and server on various vendors' machines will not currently interoperate.
All this pales when one considers the effort involved in getting the IT groups of two cooperating corporations to agree on what a term such as "business partner" means and how it is to be represented in XML and/or a database. While this has nothing to do per se with Web Services, it is unfortunately required before one can begin to define any Web Service.
Today, remote procedure calls are used on the Internet, but not nearly as often as local procedure calls, and certainly not nearly as often as Web Services Proponents would have you believe. A world of Web Services would attempt to distribute processing across the internet, and would fail miserably. Contrary to the premises of the Web Services architecture, the only viable future architectures are those that integrate and centralize processing and that minimize remote procedure calls.
It will be interesting to watch (Score:2, Interesting)
I think this is where the first applications in this area will be built and used successfully. The same technology used to deploy applications using web services across enterprises can be used to distribute applications to consumers.
My personal opinion is that service-based companies don't exactly have the best track record. I bet chances are pretty good that anyone reading this has had a bad experience at one point with a service provider such as the phone, electric, or cable companies. And also people like the idea of owning something. I myself feel like my whole life is a rental sometimes and it bothers me. It's going to take a lot to push users in this direction and the ones that can execute the best will win. But there is no guarantee it will work. It takes more than just a push or shove to generate a new market, but it can be done.
50% Meat, 50 % Filler (Score:2, Interesting)
Getting back to web services though, they can possibly fill a niche in enterprise computing - and that niche is the ever-present, never fully solved question of how to tie together disparate platforms and software applications in a common enterprise environment. CORBA is the oft-quoted answer, but it is expensive to implement, and hard to get right. Wells Fargo has implemented an interesting solution for distributed programming using something they call Model Driven Architecture [ebizq.net].
Looking at getting systems working together from an IT manager's perspective, you're always looking at the Big Two - time and money. 'How can I get this system working with my current resources in the least amount of time?' The complement to that is 'How do we maintain and augment this solution once it goes into production without going through birthing pains?'
The promise that Web Services makes to IT managers is that they will be able to lower their TCO and increase their ROI by cutting down on the number of changes they have to make to existing systems, while at the same time increasing their flexibility in adding new functionality. To others it makes the promise of providing services that can be metered and billed (wasn't that the promise of CORBA, EJBs, insert favorite distributed model here?).
Of course this is all a pipe dream until they solve some big issues, like security. Transaction management is not as important since web services can actually be implemented in any kind of language you want (read - Implement your own damn transactions).
However, I think most IT managers will go blue in the face the first time their Fund Transfer web service is hacked because of a weak 56-bit SSL connection.
MS Browser wars (Score:2, Interesting)
Microsoft may have won the browser war with Internet Explorer, but it is not because of open standards.
Any web developer will tell you that JavaScript et al., although founded on the same basic functions and routines, are quite different from one browser to another.
Microsoft did not win the browser war through open standards, but by bundling it with Windows.
The fact is, people are too ignorant and lazy to download a completely separate browser, even though it may be (and generally is) more secure than IE. Because of this, the majority of the global online community uses Internet Explorer. The reason this has not changed is because companies realized this. They have thus developed their pages using proprietary MS javascript extensions.
Why build a nuclear car when 99% of gas stations sell gasoline?
"Computer games don't affect kids. If Pac-Man Affected us as children, we would spend all of our time running around in darkened rooms eating magic pills and listening to repetetive electronic music..."
Re:MS Browser wars (Score:1)
I believe IE is 100% compliant now for DOM/JavaScript/ECMAScript. It's virtually complete for CSS-1 (I think it misses a few things, but I'm not sure on that); it's also the first browser to implement the P3P privacy standard.
So standards are a good thing, but they've never been intended to be the silver bullet to merge everyone's divergent products into one.
Let the flames begin... :)
How long can this go on... (Score:2)
"News for nerds, stuff that matters"
OK.
Now check this out [openstandards.net]
"Web Services
Revision 2
March 5, 2002"
umm..
In my timezone, it's April 23..
And we're expected to pay for this "news"?
t_t_b
Web services solve a business problem (Score:2)
What's nice about each of them is that they enable you to reuse code by tapping into it from a non-local box. For some of those above-mentioned technologies, you can use them from different languages. However, they each require a client stub.
When you need to push a client stub around before you can access the system you have a deployment problem. With SOAP, you can consume those services right away without the need for a client stub.
I see solving the deployment problem as SOAP's chief advantage.
Vanguard
PS Now can anybody tell me why Gartner Group thinks that web services will take off inside the enterprise before it does across the web? Why use them internally when I don't have a deployment problem inside of my company?
Poisoned technology again (Score:2)
Imagine a standard that only allows the service to be addressed by sending an HTTP-like request, yet then the connection transforms into an asynchronous bidirectional one, with the possibility to stream data in both directions -- with or without synchronization. Imagine a "document" that is just like XML, but without all the Unicode crap (data is transparent -- if one wants to use Unicode, mark it as Unicode and make software aware of it, otherwise just don't), and without an end of document, so data can be streamed endlessly and become available as the tags/fields are parsed.
This would be far superior for any imaginable purpose to all those little "standards" that Microsoft and Sun originate, and the W3C rubberstamps by the dozen; that would be a truly useful tool that would improve network programming. However, I don't believe any software company will now work on it -- no idiosyncrasy in the standard means no advantage, monetary or political, to its originator over everyone else. Only a truly free software/open source/open standards project would be able to accomplish this. I am just afraid that people won't realize this until it is too late.
Re:Poisoned technology again (Score:2)
XML is transparent. It's text. Granted it's unicode text. If you don't like it, use UTF-8 and play like unicode doesn't exist.
That works only with ASCII -- text in other languages gives horrendous parsing errors if processed as UTF-8, so I would have to convert it to Unicode. The problem is, most applications aren't going to expose enough information to actually make the conversion; the text may be in some charset, and the charset is known on the upper level while communication happens on the lower one. And, of course, some data is just binary.
Just don't expect anybody to sympathize if your system breaks when a Japanese company tries to use it.
That's the problem -- I want to follow standards, not invent creative ways to break them to avoid ideas they are poisoned with.
Unicode isn't a big deal to use, really. Trust me.
I am Russian. I know EVERYTHING about this "not a big deal" bullshit.
Okay, what's the difference between SOAP and a "data encapsulation standard". You do realize that SOAP isn't just RPC. You can use any XML encoding you like inside the data payload not just RPC encoding.
Data encapsulation does not assume that data is being sent to something in chunks, with the sender then expecting a response or even completion. It just defines that if I want to send an array of integers, I format them in a certain way. But if I want to send an endless stream of structures that each contain, say, a date, a variable-length array of integers and three strings (which would be perfectly reasonable for, say, a telemetry system on some device -- it will continue working even if it's absolutely sure that no one is listening anymore), I should be able to do it without worrying that the variable-length array of integers may happen to be megabytes long and have to fit in memory for formatting or parsing.
Care for an example?
Unicode, SGML-isms in XML and RPC-ish processing.
Imagine a standard that only allows to address the service by sending HTTP-like request
I can. It's called HTTP.
No, it isn't. I have implemented HTTP, and one of the possible [ab]uses of my server allows an HTTP connection to be used for bidirectional communication after it's established. It was a blatant violation of the HTTP spec, and therefore I had to write a library that specifically disallows this if only its normal API is used.
yet then the connection transforms into an asynchronous bidirectional one, with the possibility to stream data in both directions -- with or without synchronization.
Granted this is difficult to do. Streaming data one direction at a time over http is easy. Streaming it both directions is not.
Both things are easy; it's just that one doesn't violate the standard, and the other one does. People would gain more from extending the standard for bidirectional connections than from all the "standards" crap that has come from Microsoft and Sun since the invention of HTTP.
Can you cite an application where you'd want to stream in both directions at the same time?
Remote console over an HTTP-ish proxy. X session over an HTTP-ish proxy. SSH session over an HTTP-ish proxy. Typical bidirectional protocols, and they would greatly benefit from URLs as possible destinations if firewalls would interpret those URLs as proxying requests, handle authentication (in the case of X, where authentication information must be passed), etc. Access to various equipment's semi-real-time reporting and control. TV-like access to a semi-time-dependent broadcast -- the user infrequently sends control commands, while a huge amount of streamed data is sent to him as it's produced by sources that he has defined in his last control request, which may remain active for many days/terabytes.
In my experience, most applications need to request something then receive back a response.
Then all the applications you have seen are "embarrassingly RPC-able" (an equivalent to "embarrassingly parallelizable").
I consider Unicode transparent. I have editors that work just fine with Unicode: Emacs on Unix and WordPad on Windows. If you don't want your documents to have non-ASCII characters in them, mark them as UTF-8 and just write them out as ASCII. Not hard. You will have problems with people trying to send you text that contains non-ASCII characters, but you have that problem with ASCII too. Do it like this: <?xml version="1.0" encoding="UTF-8"?>
_I_ am the person who will most likely send something that is not ASCII. I don't want to be forced into a kludge that excludes even my native language from being usable, so the only alternatives I have are to slavishly follow the Unicode requirement, or to declare that I would rather break the standard, remove the encoding from the header and consider the protocol byte-value-transparent, just like HTTP always was until some assholes started promoting their "Esperanto for computers". The second alternative is looking better and better with every new "standard" issued.
You can do this with XML and SOAP. There's absolutely no reason that you have to wait for the entire document to get at the data. If you're using an implementation that loads everything into a DOM before you get a whack at it, find another implementation. It's not a protocol problem.
1. The standard is specifically built on the assumption that things are "called" when the data arrives. Streaming never "calls" anything; it may be processed in part by calling a lot of things as data becomes available, and the structure of the stream may never have an end. Is an XML document with a list that is never closed compliant with the standard? And one of infinite length?
2. There are no implementations that do that. And the requirement to "verify" XML before it's used makes this impossible (BTW, what a moronic idea, to verify the format -- verify what, that the sender doesn't have bugs so horrendous they violate the standard? There is more probability that the data contains complete bullshit than that the sender formatted it wrong). But even if it's not verified, XML is made specifically to make writing compliant libraries extremely difficult, in the hope that no one will ever try to do that and will just use the implementations made by Sun and Microsoft (or expat, if they are feeling suicidal).
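For what it's worth, incremental processing of an open-ended XML stream is possible with a pull parser, as sketched below (StAX, in the JDK since Java 6; host, port and element names are invented). Whether that answers the objections above about validation and about documents that never end is left to the thread.

import java.io.InputStream;
import java.net.Socket;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class TelemetryStreamReader {
    public static void main(String[] args) throws Exception {
        // Read an open-ended stream of <sample> elements as they arrive,
        // without ever holding the whole document in memory.
        try (Socket socket = new Socket("telemetry.example.org", 9000)) {
            InputStream in = socket.getInputStream();
            XMLStreamReader reader = XMLInputFactory.newInstance().createXMLStreamReader(in);
            while (reader.hasNext()) {
                if (reader.next() == XMLStreamConstants.START_ELEMENT
                        && "sample".equals(reader.getLocalName())) {
                    // Each sample can be handled the moment its start tag is seen.
                    System.out.println("sample at " + reader.getAttributeValue(null, "time"));
                }
            }
        }
    }
}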
However I don't believe any software company will now work on it -- no idiosyncrasy in the standard means no advantage, monetary or political, to its originator over everyone else.
That may be true. It's hard to get someone interested in a protocol that isn't part of an existing standard.
Unless you are Sun or Microsoft.
Only a truly free software/open source/open standards project would be able to accomplish this. I am just afraid that people won't realize this until it is too late.
YOU have realized it! Now it's time to make it happen. If you truly have something better, post a link to it as a reply to this message. It's great advertisement for your project and could bring you a lot of new like-minded developers.
I will have to do that anyway -- my current project is cluster management, and I will have to design and implement a shitload of control/monitoring-related functionality. And no, SNMP isn't even close to being usable for that purpose; it's like, umm... what would be a good example?.. oh, found one: "IP over pigeons" for video streaming.
I'm dubious about your claims. That's just because I happen to like SOAP. It can be a little daunting to understand all the technologies involved in using SOAP but no more so than CORBA, DCE-RPC, SUN RPC, SMB or any other network protocol.
Understanding is easy, agreeing isn't. All the protocols that you have mentioned are heavily poisoned -- why don't you use something else as an example? Say, SSL. It's just as complex and has very few working implementations, yet it is completely poison-free; it's made to serve its purpose, not to promote someone's way of doing things over all competing ones. Or HTTP, a protocol that is un-poisoned in 1.0, and slightly poisoned in 1.1. Or MIME. Or the C language (ok, it's simpler than most of these things, but its use is more complex), or C++ without the "standard" libraries. Those things are complex enough, yet they are made for goals other than excluding competing minds. And then look at all the garbage that the ITU (more widely known by its former name, CCITT) produced -- the purest poison that one can find in the "noosphere" -- standards all tied to each other, with complex, mind-bending and cumbersome 30-50-year-old pieces pulled out of their graves to make implementations impossible, so only masochists will try to reimplement them and discover that existing implementations are buggy and incompatible with the standards. Then try to honestly answer where, on the scale between those things, XML and SOAP sit.
I'd love to see you step up and actually post a link to a technical document that explains what you're talking about. ----- I am. Both as a profession and a hobby.
I will do that if/when I decide to make a standard base for my control/monitoring applications. If I get anywhere near as much shit from the "standards makers" as I did when I tried to prevent the mandatory-Unicode-fication of FTP (yes, there was a draft about that; I have no idea if it ever made it into an RFC, but in either case no one ever tried to implement it), I will be content to keep the protocol "published-proprietary", a standard specific to the product. But what I am certainly not going to do is use SOAP or other XML-RPC-isms as an internal protocol.
Re:Poisoned technology again (Score:2)
As far as the messaging system goes, though, it's a messaging system, not a full communications package; a simple messaging system. If your problem isn't well suited to simple message passing, then you shouldn't use SOAP message passing. Use something else.
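For what it's worth, here is roughly what that simple message passing looks like on the wire -- a hand-built SOAP 1.1 envelope POSTed over HTTP. The endpoint URL, namespace and GetQuote operation are made up for the example; a real service would publish its own in its WSDL.

    # Build a SOAP 1.1 envelope by hand and POST it over plain HTTP.
    import urllib.request

    envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetQuote xmlns="http://example.com/stocks">
          <symbol>SUNW</symbol>
        </GetQuote>
      </soap:Body>
    </soap:Envelope>"""

    req = urllib.request.Request(
        "http://example.com/soap/endpoint",      # hypothetical endpoint
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": "http://example.com/stocks/GetQuote"},
    )
    # response = urllib.request.urlopen(req)     # one request, one reply
    # print(response.read().decode("utf-8"))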
I would have no problem with this statement if SOAP were designed to support some specific subset of tasks, or could be subdivided into standards that define access to methods, data formats, etc. But the problem is, the standard is "everything or nothing" -- either one subscribes to every piece of ideology it's stuffed with, or one doesn't use it at all. And it is presented as a solution to everything -- including problems it has nothing to do with and can be applied to only through ugly and inefficient kludges. If the standard had been separated from the very beginning into data formatting, object model(s), and text/non-text handling, it would be useful to everyone -- if not one part, then another. But no, the "standardizers" want to push the whole thing down people's throats, or declare implementations "nonstandard". This is selfish and unproductive.
If you need something more than what they can handle, just don't use them. Or use them in tandem with something more complex.
SOAP, being "everything or nothing" kind of standard can't be used in tandem with ANYTHING other than SOAP itself. It requires complete compliance and loyalty to the design on all levels. It dictates the basic protocol model, so the only way to use anything else "with" it is to support basically two distinct protocols with different data models. But if another protocol is more flexible, why would anyone use SOAP? Anything SOAP can do, can be done better by a subset of data-formatting protocol using SOAP-ish semantics over it.
web services aren't "robbing" you of anything; they're a least common denominator appropriate for *some* things.
They are certainly not the least common denominator of anything. They were arbitrarily designed to promote a particular way of using the network -- one that, once adopted, prevents the programmer from improving on the performance or design of the implementations promoted by the "standard's" originators, automatically making the crap produced by Sun and Microsoft the best of breed, because anyone who dares to improve either flexibility or performance hits the wall of what the "standard" allows. And beyond the "standard" there is nothing but uncharted space -- every attempt to create a data-formatting standard is met with "why reinvent something, we have SOAP".
SOAP could be better; what you suggest (a bidirectional whatever) would be really neat. But I doubt it would get as popular as SOAP, because it's more complicated. SOAP is popular *because* it's simple.
SOAP is NOT simple; it's merely implemented, shrink-wrapped, and promoted in glossy magazines as the only game in town. In fact, bidirectional asynchronous transfer is easier to implement; SOAP's creators specifically excluded it so the standard would promote their narrow-minded ideas by excluding everything else -- and then they can claim that since SOAP exists and is supported by the "major players" while everything else is at best in development, SOAP should be used for everything, despite all its limitations. The software industry is only starting to recover from the same game played with Windows and its "simple" and "wonderful" Win32 API.
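To be concrete about what I mean by bidirectional asynchronous transfer, here is a rough sketch: both peers read and write on the same connection whenever they like, instead of lock-step request/response. The host, port and newline-delimited framing are placeholders for the example, not a proposal.

    # Both directions run concurrently over one TCP connection.
    import asyncio

    async def talk(outgoing):
        reader, writer = await asyncio.open_connection("127.0.0.1", 9999)

        async def pump_out():
            for msg in outgoing:
                writer.write(msg.encode() + b"\n")   # send without waiting for a reply
                await writer.drain()

        async def pump_in():
            while line := await reader.readline():   # handle whatever arrives, whenever
                print("received:", line.decode().strip())

        await asyncio.gather(pump_out(), pump_in())

    # asyncio.run(talk(["hello", "status?", "bye"]))  # needs a peer on 127.0.0.1:9999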
I suggest you try BXXP [mundi.net]. I am not sure, but it sounds like it is EXACTLY what you want. It has failed to catch on where SOAP has caught on because it is more complicated. If I am incorrect about this, I apologize.
I have looked at the RFC, and it has absolutely nothing to do with what I am talking about. I disagree with the semantics of the messages and the internal format restrictions, not with their encapsulation in HTTP or TCP.
By the way, you still have not attempted to explain in any way why you dislike Unicode.
I use a non-ISO-8859-1 language in my everyday life, so I don't have to explain to everyone why I am qualified to decide whether Unicode is good or bad. A full explanation would take many pages, and I have posted it multiple times in Unicode-related discussions, only to find that others have no idea what I am talking about, because they happily use "Unicode" in the form of ASCII or slightly mangled ISO-8859-1 with their languages, while I am forced to use my language in a completely screwed-up way because it's not in the first 256 code points of Unicode. So here is the short answer -- I am Russian, and therefore automatically qualified to judge an "encoding for foreigners", and I have a shitload of trouble handling Russian text with it, as opposed to any other representation of text.
You also have not attempted to offer any *alternatives* to Unicode. Honestly, tell us, I am curious.
Declare byte-value transparency in the protocol and remove the mandatory encoding in the header. If someone really needs to pass a multi-language document over XML, allow a "charset" attribute everywhere "lang" is allowed, and in the absence of a charset declaration never attempt any language-dependent text processing. If "charset" is "UTF-8", Unicode lovers get their object of worship without forcing everyone else to follow suit. Most of the data that travels across the network is not meant to be viewed directly by humans, so if the creators of the software do not think charsets should be labeled for display at the time of data transfer, there should be no requirement to do the labeling or conversion at the protocol layer. XML was created on the assumption that the primary purpose of any kind of data is to be displayed directly in a pretty-looking page, and that assumption was used to justify mandatory Unicode. The problem is, declaring a charset for a tag or a substring is just as easy for display purposes, whereas when a program simply exchanges raw data it helps a lot if there are no arbitrary requirements on its format -- even if metadata about the charset exists somewhere, it may not be present, or known, at the time the data is transferred, and that design decision should be respected.
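To make that concrete, a tiny sketch of the per-field charset idea -- decode only when the sender labels the data, otherwise pass the bytes through untouched. The function name, the KOI8-R sample and the "no label means opaque" rule are my illustration, not part of any existing standard.

    # Byte-transparent by default; charset conversion only when labeled.
    def extract_text(raw, charset=None):
        if charset is None:
            return raw                 # opaque bytes: no guessing, no conversion
        return raw.decode(charset)     # decode only because the sender said how

    labeled = extract_text(b"\xd0\xd2\xc9\xd7\xc5\xd4", "koi8-r")  # Russian text, labeled
    unlabeled = extract_text(b"\x00\x01raw payload")               # left exactly as sent

    print(labeled)     # decoded to a string
    print(unlabeled)   # still bytes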
Re:Web Services is Hype (Score:1)
Re:Web Services is Hype (Score:4, Insightful)
No one is claiming that it isn't RPC, but it is an agreed-upon open standard for RPC across public networks using simple transport protocols. No one else is doing this, and CORBA is web services, so don't offer that up as a reply.