Dave Winer On Microsoft, SOAP, XML-RPC In NYT 229

Posted by timothy
from the report-from-the-embracement-front dept.
daveuserland writes: "Lots of activity in XML-over-HTTP. An article in today's NY Times about Microsoft, SOAP, UserLand and me. My comments. In the meantime XML-RPC keeps growing with solid interop. 29 implementations in the new XML-RPC directory. The politics are intense but everything's going well." It sounds like Dave understands this .Net thing; even after hearing about it for a few years, I've yet to hear a really lucid explanation of why I should want my apps and personal data floating in an amorphous cloud, but maybe that's just me.
  • A ***LOT*** of people in the mainstream (both home users and businesses, including a lot of my friends and relatives) don't want to know any more about their computers than they absolutely must. They want a nice, simple thing, like a monthly cable bill, and to see all their problems with Windows on a desktop go away. (Don't even suggest they convert to Linux--people who can't deal with Windows are NOT candidates to run Linux.)

    Will .NET deliver this (non-)nerdvana? Maybe, but my hunch is it will fall short in some critical areas for at least the first couple of releases. Remember: MS is relentless. Look how long it took them to get Windows jump-started. They hung on for years after it was first released and almost everyone thought it was totally useless. None of us should expect MS to drop this idea any time soon.

  • by Anonymous Coward
    I think that some people are missing the point. XML-RPC is not just a fancy way of formatting an HTTP GET. Also, XML-RPC is not just for web services - it can be used for easy communication between pieces of software which are separated by distance, language, or platform. For example, a piece of Java that monitors my server's CPU utilization:

    // Using the Apache XML-RPC library's WebServer class.
    // Assume this class has a public int getCpuUsage() method.
    WebServer xmlRpcServer = new WebServer(serverPort);
    xmlRpcServer.addHandler("SrvMon", this);
    xmlRpcServer.start();


    And a hunk of Perl that checks up on my remote servers, blah blah...

    $server = Frontier::Client->new( 'url' => 'http://whatever' );
    $result = $server->call('SrvMon.getCpuUsage');


    SOAP is more geared towards "web services" and, in my opinion, less cool.
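For comparison, the same round trip fits in a few lines of Python, whose standard library ships XML-RPC support (the SrvMon handler name is carried over from the snippets above; the CPU figure is faked for illustration):

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def getCpuUsage():
    # A real monitor would read /proc/stat or similar; fake it here.
    return 42

# Server side: expose the function under the same dotted name as above.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(getCpuUsage, "SrvMon.getCpuUsage")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the remote call reads like an ordinary method call.
proxy = ServerProxy("http://127.0.0.1:%d" % server.server_address[1])
print(proxy.SrvMon.getCpuUsage())  # prints 42
```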
  • by Anonymous Coward
    I'm pretty sure MS (and IBM) will take SOAP into directions that Dave didn't want,
    Microsoft [microsoft.com] won't be able to do anything harmful as long as Dave keeps both the SOAP spec and the Hostia compiler GPL'd.
  • by Anonymous Coward
    And timothy should be modded down for the same reason.

    XML-RPC and SOAP are remote procedure call protocols. What have they got to do with "owning apps"? Almost nothing at all.

    Oh wait, but .NET will use SOAP. Well, .NET will also use DNS but I don't see it coming up every time DNS is mentioned.

    SOAP and XML-RPC are enabling technologies. What Microsoft chooses to do with them is another issue entirely.

  • by Anonymous Coward
    ..understands this .Net thing; even after hearing about it for a few years, I've yet to hear a really lucid explanation of why
    Wow, you had insider info on Microsoft's .Net strategy three years ago! Seems strange that MS only just announced it, what...six months ago? It's really sad that Slashdot has gotten this bad. I find it odd that no one pointed out this glaring inaccuracy (or exaggeration?) above.

    Reminds me of an interview in 1997 where the interviewee claimed 7 years of Java experience, and we asked him how long he worked for Sun; his response: "Huh?"

    AC

  • The above post is terribly misguided.

    SOAP cannot possibly make \. (or any other site) more scalable. XML-RPC and SOAP require 5-30 times the bandwidth for a simple function call. Both are synchronous protocols. Both require the additional overhead of loading an XML parser, parsing the passed XML for each request, processing the request, and (finally) building the response in XML. IOW these protocols will be *slooowww*! How this could possibly be more scalable than an in-process function call is beyond understanding.

    Secondly, it is being claimed that "services" will reside on various servers. But a simple mathematical argument shows that distributed web services will be *much* less reliable than centralized servers:

    For the sake of argument, assume the probability that any single server is up at any time is Ps = 99.9%, that a program uses Ns = 5 services from 5 different servers (including itself), and that those 5 services do not rely on any other servers. Then the probability that the program can complete is at most P = (Ps)**Ns = (0.999)**5 ≈ 0.995

    So with servers running at 99.9% uptime, we have a 1-in-200 chance of program failure. Plug in other values to see how reliable the component servers must be to achieve a "five nines" level of program completion, something relatively easily achieved by a single server.

    The same mathematical rules that keep helicopters from making long flights without a critical failure will render *distributed* web services unusable unless extremely high (and costly) levels of server reliability are attained.

    IMO this is yet another reason that, if used at all, web services will be used primarily as a partitioning tool for software development. But whenever possible the services should be hosted on the same server.
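The arithmetic in the parent post is easy to check; this quick Python sketch replays the 99.9%/five-server assumption and then asks what per-server uptime a five-nines overall completion rate would require:

```python
# Probability the program completes, with Ns independent services
# each up with probability Ps (figures from the post above).
Ps, Ns = 0.999, 5
P = Ps ** Ns
print(round(P, 5))  # ~0.99501, i.e. roughly a 1-in-200 failure rate

# Per-server uptime needed for "five nines" of overall completion.
target = 0.99999
needed = target ** (1.0 / Ns)
print(needed)  # each server must itself be up well beyond 99.999% of the time
```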

  • I've yet to hear a really lucid explanation of why I should want my apps and personal data floating in an amorphous cloud, but maybe that's just me.

    Really, it's nothing conceptually different from having your home directory and local applications mounted over NFS, and using daemons like sendmail and httpd to provide protocol services, and using X11 to display on your workstation an application running on a big Unix box, and using CORBA or even rsh to control software on other machines. We all use at least a subset of these technologies every day.

    Why not take all those services that were developed independently and re-architect them to make them work within a unified framework? That's all .NET is, really.

  • Yes, but nobody said that you had to spend your money on word processors. Personally, I have lots of stuff that I would rather spend my cash on.

    People spent money before Microsoft existed, and they will continue to spend money even if Microsoft disappears. They just won't spend as much of it on software.

  • It's already happening. Sun's ultimate goal for Star Office is for it to be deployed by ASPs. The beauty of this is that StarOffice will not require that the ASP give a cut of the profits to someone else. This is good for the ASP because it lowers their costs, and it is good for Sun because it A) cuts off Microsoft's Office cash flow, and B) sells more beefy Sun hardware.

    The funny thing is that Microsoft is going to go through a ton of work forcing their customers into the ASP model, only to find increased competition for their software (and from software that will almost certainly be less expensive to own).

  • Well said. And this will be Microsoft's undoing. If the protocols are truly open, then what is the ASP's incentive to use Microsoft software? ISPs haven't been tempted by Microsoft's extensions to open protocols, and my guess is that ASPs aren't going to be too interested in paying the Microsoft tax either.

    They will, of course, build .Net compatible services (in order to tap into the large Microsoft using customer base), but my guess is that they will generally shun the Microsoft specific extensions.

    What most Slashdot readers are afraid of is a future in which Microsoft can say "All your Data belong to Us" (in much more polished English, of course), but this is becoming less and less likely every day as Microsoft is pushed by its customers toward open standards.

  • by jjohn (2991) on Monday April 09, 2001 @04:40PM (#303771) Homepage Journal

    what can i do with xml-rpc or soap? why is it so much better than just plain-old http post? just because it has the correct buzzword juju for today?

    Actually, XML-RPC and SOAP both fall under the rubric of web services. Web services allow a program to make a remote procedure call to another machine using some wire protocol (i.e., XML-RPC or SOAP). The neat part about web services is that they are language neutral. That is, a Perl script on Linux can make remote procedure calls to an NT server running ASP. When the Perl script gets the data from the RPC call, the data will be available as a standard Perl datatype. If one were to simply use web pages to do this, every script would have to parse HTML just for that particular call. With web services, the programmer never needs to deal with XML at all.

    Check out XML-RPC.com [xmlrpc.com] or IBM's developerWorks [ibm.com] for more information on web services.
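The "never deal with XML" claim is visible in the marshalling layer itself; for instance, Python's standard xmlrpc.client turns native structures into the wire format and back without the programmer touching a tag (the getInfo method name here is invented):

```python
import xmlrpc.client

# Marshal a native structure into an XML-RPC request body...
payload = xmlrpc.client.dumps(({"user": "Bob", "uids": [1, 2, 3]},), "getInfo")

# ...and unmarshal it straight back into native types.
params, method = xmlrpc.client.loads(payload)
print(method, params[0]["uids"])  # getInfo [1, 2, 3]
```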

  • Just use SOAP, WSDL, and UDDI locally. Part of Hailstorm is the definition of a service protocol, built around the OPEN standards of SOAP, WSDL, and UDDI (both IBM & Sun have said they will also support these, and IBM's SOAP solution already works with Microsoft's SOAP solution--cross-language and cross-platform compatibility).

    Nobody will force you to use Hailstorm's services. But the advantage of SOAP, WSDL, and UDDI is great. IBM realizes this. Sun realizes this. But slashdot's readers simply think that if Microsoft touches it, then it's implicitly bad. If you don't like Microsoft, just use IBM's implementation (they've given it over to Apache, in fact, so that it is truly open source).
  • I didn't mean to imply that Sun did own ebXML. However, they are one of the (I would say "The") dominant forces in the standardization process. The incorporation of SOAP into ebXML was done at Sun's request. Sun has played a bigger role, in terms of developers working on it, than any other vendor. But you are very right in saying that ebXML has support from a wide variety of vendors.
  • by The Mayor (6048) on Tuesday April 10, 2001 @01:00AM (#303780)
    People. Please get a grip on your Microsoft bashing. Here are some points from a *user* of SOAP (and I'm not even using Microsoft's implementation).

    -There are other versions of SOAP/WSDL/UDDI available. IBM has graciously given their implementation to Apache. It looks to be the choice for open-source advocates. Sun has also announced they'll be supporting SOAP/WSDL/UDDI in the SunONE platform. Sun is also making ebXML compatible with SOAP, so that ebXML services can be called using SOAP, and vice versa.

    -IBM/Apache's implementation is interoperable with Microsoft's. Previous versions have had some major problems. But this is less a case of "embrace and extend", and more of a case of "this is new technology, and we haven't got the bugs worked out". I've seen IBM/Apache's and Microsoft's SOAP solutions call one another, from/to different languages. There are still a few quirks, but each release brings both Microsoft's and IBM/Apache's solution in greater compliance with the standard (yes, IBM's solution had some compliance problems, too!).

    -Services are the next natural progression in software development. We've gone from monolithic to client/server to 3-tier to n-tier development. The problem is that even with n-tier development, each tier relies upon the tiers next to it, in the form of a non-standard, programmer-developed API for the interface. Services free us from this, as they standardize the API between "tiers". And, by breaking down the "tier" concept, services can help bring component architecture to what was once a tier. It makes it easier to develop these components, and easier to assemble the components into an application. And it makes it easier to integrate components from third parties (such as -ack!- Hailstorm--but more importantly from small software vendors--I'm certainly not going to trust my sensitive info to Microsoft).

    -XML-RPC is *not* SOAP. The only thing in common is that XML-RPC allows network programming using XML, as does SOAP. But SOAP packages each network component into a service. This service can then be described in a standard way using WSDL (Web Services Description Language), so that programs can discover the API for a service and call the service automatically. Then, these services can be advertised using UDDI (Universal Description, Discovery and Integration), a sort of yellow pages for web services. XML-RPC provides RPC functionality, transferring data using XML.

    -Web services do *not* have to use the Internet. This concept can work just as well on an Intranet. For large corporations, where there cannot be constant communication between the developers of services, using SOAP/WSDL/UDDI makes sense, as each development group then has a reliable and standard way of using services created by other groups.

    -SOAP != .NET. .NET uses SOAP, but SOAP is not .NET. If it were, do you think IBM & Sun would be using SOAP?

    Sorry for the long rant. I just hear way too many misconceptions. I get tired of it after a while.
  • As was finally said by someone else, SOAP and XML-RPC are the distributed-object scheme that will probably make the concept ubiquitous, since they provide a simple, elegant way to encapsulate requests and objects (XML) and any number of ways of transporting them over existing protocols (HTTP, SMTP/POP, FTP, whatever).

    Slashdot could benefit from it right now. One of Slash's weak points is its two-tier architecture. At present, there's a database layer without much logic, and a bunch of Perl executing on the webserver.

    One of many easy ways SOAP can make the Slash engine more scalable would be to move the heavier Perl functions to another server, and leave the outward-facing webserver free to spit out pages without thinking too hard. I suggest the SOAP::Lite [soaplite.com] Perl module. It's got an elegant API that's so simple to use, you'd barely have to change any of the existing codebase. Throw your heavier logic on a non-public HTTP server, point your shell code on the main HTTP server at it with a couple of lines to tell it where the SOAP server is, and voila! You're three-tiered, and your public webserver no longer needs a direct connection to your database server, so it can be safer too.

    Want to move one particularly nasty piece of logic to its own hardware? Easy. Want to handle some things like moderation asynchronously? Just make some of your SOAP calls via SMTP, and have a cron job pick them up and process them. Again, without changing your existing code much at all.

    RPC and distributed objects aren't really new. You could have done this with CORBA, DCOM, EJB, or any number of other technologies.. but this time around it's especially easy to use, easy to work across languages, and has the added advantage of being easy to make available across the Internet without a fight over exotic firewall rules. You don't have to work to tunnel it over HTTP.. it can be HTTP.
  • I used the example of Slashdot-split-into-3-tiers-with-SOAP to explain to young Timothy what SOAP and XML-RPC are and what they're used for, since he seemed to think they were some kind of useless buzzwords with no application in the real world... because he didn't know what they were.

    I'm quite aware that Slashdot's real bottleneck is in the database, and that this is pretty much unsolvable as long as (1) their code is MySQL-specific and (2) there still isn't a Free and open RDBMS that supports advanced replication and load balancing. But since the only tiered web application Timothy's ever seen is Slashdot, I had to use it as the example.
  • You are aware that you dont own your software right? Even if it is free software, unless you wrote it you dont own it.

    No, you own your copy (or copies) of the software. .Net and most EULAs are about removing or minimizing this, because of a pesky little detail called the "Right of First Sale". What copyright grants are rights over the production of copies, not possession of all the copies.

  • If you think about it, it's really not much different than keeping your mail on an ISP's mail server and just pulling it with imap on whatever machine you're going to read it from, except that the vision is more than mail -- it's digital pictures, digital music, contact info, free/busy info (aka, calendaring info), and more.

    Don't sell IMAP short. It's the protocol you want to use to keep those pictures, music, contact info, etc. in a central location that can be easily sync'd/replicated as needed on any client. IMAP is all about replicating MIME objects, after all.

    It's a shame more ISPs don't offer IMAP because then we could have distributed disks all around the net.

  • by Osty (16825) on Monday April 09, 2001 @04:52PM (#303790)

    • I've yet to hear a really lucid explanation of why I should want my apps and personal data floating in an amorphous cloud, but maybe that's just me.

    The whole point here is the "Any device, anywhere" view that Microsoft has been driving at for a while now (Auto PC, Pocket PC, Tablet PCs, Web TV, the upcoming Stinger cell phone, and so on). If you think about it, it's really not much different than keeping your mail on an ISP's mail server and just pulling it with imap on whatever machine you're going to read it from, except that the vision is more than mail -- it's digital pictures, digital music, contact info, free/busy info (aka, calendaring info), and more. The apps don't live in the cloud, only the data does (well, apps may keep a replicating copy in the cloud, but you don't run the app from the cloud -- it runs from whatever device it can run on). In fact, the only app necessary for most of this is a web browser -- the rich clients are about enriching the experience, not creating the experience.

    At the same time, you won't need a 24/7 internet connection to be able to work on your documents that live in the cloud. Local replication will make sure you have the latest copy of the data (as of the last time you were online) that you can work with and modify to your heart's content locally. Then the next time you connect to the net, it gets propagated to the cloud, where you can then access the revised information from anywhere (PDA, laptop, cell phone, auto pc, internet kiosk, wherever). This does bring up some interesting security issues, but then those same issues exist with the current model of ISP mail servers holding mail that you then retrieve with imap, just on a smaller scale.

  • by TWR (16835) on Monday April 09, 2001 @10:37PM (#303791)
    Do you know what I think about Dave Winer having his work basically stolen by Microsoft?

    He deserved it.

    Back in 1997 or so, Dave was bashing Apple left and right, running as fast as he could to port every bit of his code to Windows. He wrote DaveNet missives about how well MS understands and helps developers, how Apple just screwed its developers, and how Apple was on the verge of death.

    Now he gets to see what comes from trying to be a pilot fish in the jaws of the big shark. It couldn't have happened to a more deserving guy.

    -jon

  • Also, XML is somewhat self-documenting because the tag names hint at the purpose of each field. Which of these file formats would you rather try to figure out?

    CSV: "Bob", "756838", "124437"
    XML: <name>Bob</name> <phone>756838</phone> <fax>124437</fax>

    In the above case, one could use a "header" line in the CSV file:

    Name, Phone, Fax
    Bob, 756838, 124437
    Jerry, 34543, 5554
    ...

    It's easier to view a lot of CSV data at a glance than it is to view XML data (maybe not a biggie, but still). It also helps that you can load the file into a spreadsheet program (e.g., Excel) for editing.

    Structured data - now that is a pain in CSV. You have to carefully maintain relations between objects yourself; it's possible, but hardly convenient.
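To be fair to the CSV side, once the header line is there, generic tooling recovers field names much as XML tag names would; a quick check with Python's standard csv module, using the sample rows above:

```python
import csv
import io

data = """Name,Phone,Fax
Bob,756838,124437
Jerry,34543,5554
"""

# DictReader uses the header line the same way XML uses tag names.
rows = list(csv.DictReader(io.StringIO(data)))
print(rows[0]["Phone"])  # 756838
print(rows[1]["Name"])   # Jerry
```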

  • I wish i could remember where i heard this, and i'm paraphrasing, but i think it really sums things up: ".NET is Microsoft's initiative to rewrite the Internet the way they wanted it to be all along"


    --

  • .net is just an attempt by Microsoft to prove that they DIDN'T completely miss the boat when they first came up with MSN and dismissed the Internet (and TCP/IP) as a "toy for students and academics".

    2 years later they managed to (poorly) integrate BSD's stack into Windows (remember how long you were forced to use 3rd-party IP stacks simply because MS was too dumb to read RFCs?). By now, however, they have weaseled their way into the standards bodies that MADE the Internet what it is today. They have learned from their mistakes, and are ready to do MSN the "right way" - the "Microsoft way" - and damn anybody else who wants anything to actually interoperate.
  • SOAP and XML-RPC are the distributed-object scheme

    I don't see how XML-RPC or SOAP are a distributed object scheme. Sure, it's a communications protocol. It allows you to send data structures down a wire, or do remote procedure calls.

    But where is the encapsulation? Where are the methods? How is this equal to serialized component architectures?

    One of Slash's weak points is its two-tier architecture.

    I am sure. But what you are describing is not dependent on having a distributed object architecture.

  • Well, actually it was because the computer refused to let me install '97. Whenever I uninstalled 2000 and installed '97, it would tell me my license was invalid when I tried to run any '97 programs. I messed with it for a while, finally gave up, and installed 2000 on a few computers that needed to directly interoperate with that one computer. Then I got complaints from other groups about interoperability (even though it was rare that they shared documents, and they could have manually converted), and I got tired of them complaining and installed it on all the computers.
    N=20 actually, not a huge number. Anyway, shortly after doing so it occurred to me that I could probably have reinstalled Windows and been able to install '97 on the initial machine, but it was too late. And I felt pretty stupid about it :)
  • by J.J. (27067) on Monday April 09, 2001 @04:57PM (#303804)
    obligatory no login link [nytimes.com]
  • Even if the protocol is eventually made public, they can still force you to use their servers.

    I would really love to see a .localnet initiative. That is, your private server is the 'cloud'.
  • by Azza (35304) on Monday April 09, 2001 @04:59PM (#303812)
    If you're interested, he's talking mainly about MS and IBM adding WSDL to the SOAP spec. The original UserLand article is here: http://davenet.userland.com/2001/03/29/unstallingSoap [userland.com]
  • + Word 2000 file format compatibility

    This isn't really a feature. This is an anti-feature, designed to make it difficult for people to run any version of Word before 2000.

  • "Yeah sure Bill, I will trust you with my data!" says the guy who has a Hotmail address. I'm cracking up!
  • True, the stupid home user ends up subsidizing the schools and the corporations, not to mention all the people in the third world.

    P.S. For the not-too-awfully-stupid people, you can get education discounts via the internet pretty easily (of course you could just pirate the thing like the rest of the world does and stick the lusers with the bill).
  • Of course you are right. Windows users are profoundly stupid and could never grasp the complexity of using a computer to do anything nontrivial. It's for this reason that I always recommend Macs for first-time computer buyers and those Windows users who always tend to mess things up royally even though Windows is so easy and usable.

    Believe it or not, there are people in this world who get confused by Windows and mess it up, cause it to crash or lock up. These nincompoops do stupid things like lose their Start menu, destroy desktop settings, etc.

    There is hope on the horizon, though. Mac OS X is very promising and may become the one platform for the drooling retards that use Windows now and the geeks that like Unix. Time will tell.
  • "...yeah, it must have been dirty tricks... "Oh but they gave it away for free!" - you bastards!"

    Fucking communist pigs! Don't they realize that it's un-American to give things away for free?
  • Well, Netscape was better than IE until IE version 5 came out. Until then Netscape was a much better browser, and the only way MS could dislodge it from the top was by giving away IE while Netscape was trying to sell it. IE came preinstalled while NS was a huge download. Without this bundling IE could never have had the market penetration to dominate like it does today.
    Unfortunately for NS, they did not have a monopoly where they could cram their browser down people's throats whether they liked it or not.
  • It really depends on whether or not you want to be paid like an HTML programmer or a Java programmer.
  • Is there a widely used language without an ORB? Oh yeah, probably VB.

  • Not true. You'll still have to rewrite your app to take into account the new field. If you are going to rewrite it anyway then what have you gained?
  • It's called not having a moral compass.
  • XML does no such thing.
    Client 1 insists on sending you ADO XML-format datasets, client 2 runs Oracle, client 3 runs DB2, and client 4 uses WDDX. They all send you XML files with different attributes. Worse yet, one spells out states while another puts in an abbreviation. One puts "city of New york", another puts "new york city", and the other puts "new york".

    Oh yes, it's all XML, but so what? You still have to write a parser for each and every one of your clients, because they sure as hell ain't going to rewrite their app or change the data in their database. Not only that, but you have to translate the data itself for each and every client, because XML is unable to force them to put "New York" in their city field.

    All this talk about a beautiful world full of all singing all dancing interop apps is pure bullshit pitched at PHBs so they can part with shareholders money. It's a fucking lie.
  • By the time IE 5.0 came about, millions upon millions of homes had IE installed with Windows. Netscape was struggling to make money because every time they came out with a product, MS came out with the same product and gave it away for free. With NS running out of money, they could no longer afford to keep Netscape competitive and started trying to find other markets. But it was moot, because MS had an infinite amount of money and kept undercutting their business with giveaways (I guess communism is sometimes good).

    In the end all Netscape had was a portal, and MS was of course attacking that too. AOL bought it pretty much for the portal, and the rest is history. How ironic that now MS is facing the same threat, except from a bunch of communist, anti-American snot-nosed kids.
  • Once again, no advantage for XML. If I wrote my app using Perl or HTTP POST, it would ignore the extra field too.
  • "Netscape's browser was for all practical purposes free from the beginning. The only people paying for it were large companies."

    Are you saying that large companies paying for your product is a trivial source of revenue? A company needs every cent it can get.

    "Are you saying Netscape couldn't afford to put resources into further developing their web browser? I haven't heard that before."

    Any business person would realize that MS was going to win the browser war because they were giving away the product that Netscape was trying to sell. Same with web servers. It would have been suicide to budget money towards developing the browser if MS was going to bundle theirs with Windows and put it on every desktop in the world. They scrambled looking for markets where MS did not play, but every time they entered a market, MS entered the same market with a similar product and gave it away for free. It was a doomed task. When MS wants to kill a competitor, the competitor will die, because MS has what amounts to an infinite amount of money and can subsidize any of their products with their Office and Windows monopolies. Ask Real.
  • I don't know if it's true or not, but...

    When Freud was asked about communism he reportedly said "it won't work because people are not that good".
  • I can parse delimited data or fixed-length data with one line of code too; it really does not matter. Every client who sends me data produces a different object. I have to write my interfaces for each and every XML document my clients send me. Not only that, but somebody could send you <name>bob</name> and somebody else could send you <name value="bob"/> - both are legal XML, and both require different code. XML is only self-documenting to a human.
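The two encodings in the parent comment really do need different extraction code; a minimal demonstration with Python's standard ElementTree:

```python
import xml.etree.ElementTree as ET

# Same information, two equally legal encodings...
a = ET.fromstring("<name>bob</name>")
b = ET.fromstring('<name value="bob"/>')

# ...and two different pieces of code to get it back out.
print(a.text)          # bob
print(b.get("value"))  # bob
```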
  • "if MS and Oracle can't agree on their dataset format, that's not really a problem with the underlying technology. "

    I disagree vehemently. If the purpose of the technology is to ease interchange and the technology is not able to achieve that then it's a failure. XML is a silly standard because it specifies nothing at all.

    "but there's also got to be a better way. "

    Here is a better way. Have XML actually enforce relationships just like a database does. Specify tags that say this element is bound by a relationship to this other element (or URL). Have realistic constraints and lookups. Only then will it actually be useful.
    XML as it exists now is only a hair above useless. CSV does pretty much the same thing without the added bloat, and even Perl-style hashes are more useful and easier to parse. Any technology which leaves it up to competitors to agree on a standard is doomed to failure. MS and Oracle will never agree, and you will be writing translators for their XML formats.

    XML is a failure because it does not deliver what it promises.
  • XML schema is just this side of useless.
    You can specify that some element ought to be text, but you can't specify the context. Let's say you had an element called county. You have to make it text, right? Well, can somebody put "goatse.cx" into this county field? Will the XML parser say "sorry, that's not a county"? Of course not. Worse yet, the US Postal Service says there is a county called "Lake and Peninsula" in Alaska; does your XML parser reject "Lake & Peninsula"?
    Schemas only enforce trivial rules, not real-life business rules. There ought to be a schema element that says "this is a county as defined by this USPS URL; the element's value must exist at this URL". I suppose you could list all umpteen thousand counties in your schema and be screwed when the USPS creates a new one.
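Lacking such a schema element, that kind of check ends up living in application code; a minimal sketch, where the hard-coded set is a stand-in for the authoritative USPS list:

```python
# A schema can say "this element is text", but not "the text must
# appear in this list" - so the lookup lives in application code.
VALID_COUNTIES = {"Lake and Peninsula", "Cook", "King"}

def is_valid_county(value):
    return value in VALID_COUNTIES

print(is_valid_county("Lake and Peninsula"))  # True
print(is_valid_county("goatse.cx"))           # False
```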
  • by Malcontent (40834) on Monday April 09, 2001 @08:00PM (#303833)
    And then he woke up...
  • by Malcontent (40834) on Monday April 09, 2001 @10:16PM (#303834)
    There are already dozens of standards, including the humble comma-delimited file that everybody can produce in a hurry. I can't tell you how many interfaces I have built in my life, but I know for a fact that XML or SOAP or any other acronym will only mean more work. The single biggest success I have seen so far is HL7 (for the health care industry). They got together and created a massive standard based on fixed-length data which specified not only positions and lengths but datatypes and content as well. Only when you enforce data integrity and consistency will you actually have a usable interop tool.
    Tagging arbitrary data so that a generic parser can turn it into an object is pretty much useless.
    What's needed is not tagging but data integrity and enforcement. Something like constraint rules in a database except over the internet. That would be cool.
  • bah.. read the EULA, Microsoft can recall any piece of software they want and if you use it after the fact you are violating the license.
  • so because you can't control how you use the software, you are not willing to purchase it? Bravo! Now if only everyone would apply this to the software we have today, we wouldn't have such ridiculous things as the EULA. But no, most people are willing to just break the law and ignore the license. One day they just might come break down your door and put the cuffs on you for having an expired copy of win98. "This software has been recalled; it's people like you who are contributing to the downfall of society!!" as they drag you off to the Ministry of Love.
  • Dude, I'm talking about Linux as in the kernel. Everything else would have to be written. Some leet framebuffer GUI running Nautilus or something.
  • by QuantumG (50515) <qg@biodome.org> on Monday April 09, 2001 @04:42PM (#303843) Homepage Journal
    I don't understand. You are aware that you don't own your software, right? Even if it is free software, unless you wrote it you don't own it. Even if it is in the public domain it is debatable whether you "own" it, as per any understanding of the word property. Perhaps when app rental is a reality, the misconception that software is like that bike you got for Christmas will finally die.
  • by QuantumG (50515) <qg@biodome.org> on Monday April 09, 2001 @05:20PM (#303844) Homepage Journal
    Sun Microsystems sales staff circa 1994?
  • Not just .NET, though. For a while Corel was porting their office suite to Java to run online, and Sun was pushing the Network Computer idea. .NET itself may be recently announced, but the idea of renting applications and storage is not.
    treke
    Fame is a vapor; popularity an accident; the only earthly certainty is oblivion.
  • They are transport and encapsulation. This says nothing about what's being sent:

    A query like:

    is useless for interoperability unless you know WHAT the encapsulated data is and how to interpret it. All you get from XML-RPC is encapsulation, transport, and easy generation/parsing.

    These are good deeds and noble goals... But interoperability requires knowing the semantic meaning of what is sent.

  • Doh! Slashdot nuking tag-like entities:

    <query data="414324hg3j5hg34j5g3" command="4j2f345hj3g5hj3g5jh32" encrypted_parameter="4325435435254">

    Is not interpretable.
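    A sketch of the point above using Python's stdlib XML-RPC encoder: the opaque values from the example encode and decode perfectly, yet nothing in the protocol says what they mean. XML-RPC supplies encapsulation, not semantics.

    ```python
    # Round-trip opaque blobs through XML-RPC's wire format: the message
    # is well-formed and parses cleanly, but remains uninterpretable.
    import xmlrpc.client

    payload = xmlrpc.client.dumps(
        ("414324hg3j5hg34j5g3", "4j2f345hj3g5hj3g5jh32"), methodname="query"
    )
    params, method = xmlrpc.client.loads(payload)
    # Encoding, transport framing, parsing: all solved. Meaning: not.
    ```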

  • by Convergence (64135) on Monday April 09, 2001 @06:18PM (#303855) Homepage Journal
    Except that the servers will be run by Microsoft, and the protocols will either be essentially secret or be designed such that you *still* have to use Microsoft's servers.

    Even if the protocol is eventually made public, they can still force you to use their servers. (See the protocol-enforced centralization of ICQ/AIM protocols versus the DNS-based ones of (say) email or http.)

    Oh, and you'll probably pay by the minute or pay per access to use their servers, and there will be no real alternatives.

    I would expect this behavior from .NET, combined with the set of other usual techniques. (They own the client software, which doesn't interoperate well with 'other' servers... change the protocol every other year... public protocols for encapsulation, but proprietary specifications for the data being encapsulated, anyone?... outright proprietary protocols.)

    This is what I see .net as being: Basically inetd, except poorly designed. Oh, and like MSN from 1994, except cheaper for them cause they don't gotta pay for the modems or phone lines.

    While the real public standards would be fully public across the data, encapsulation, and transport. Probably have multiple vendors shipping servers and clients. Real competition: run your own server, or purchase from a selection of service providers who have bought the code... much like HTTP or SMTP... and the rest of the internet.

    There's no free lunch; you're obviously paying for what services you use either way. But which option will likely give you better service, more choice, higher stability, cheaper prices, and more control over your critical computing infrastructure?

    One way makes Microsoft a perpetual middleman, in control and siphoning money. The other way gives you a choice.

  • As I understand it (and it doesn't appear to have been mentioned here so far), MS has taken an open standard (SOAP) and started the "embrace and extend" approach.

    The fact that Dave is a coauthor of the spec (AFAIK; he seems to mention it a lot if you read DaveNet or scripting.com) makes it doubly frustrating for him.

    While I can understand Winer's POV, he often lets his emotions overcome him and pummels things into the ground. That demonstrates his dedication in one respect, but it often seems to overcome his objectivity and cloud his judgement.

    I'm pretty sure MS (and IBM) will take SOAP in directions that Dave didn't want, just as the Netscape folks decided to do with RSS. However, unlike Winer's constructive approach of working on RSS 0.92 (rather than 1.0), I haven't seen anything constructive yet. I'll give him the benefit of the doubt and believe he's coming up with some uber-plan for SOAP.

    Regardless of what happens, I am sure he'll continue his version/vision of SOAP in his Manila products. I really wish him well. It'd be a shame to not be able to use his products, or have them not work with MS products, merely out of spite from MS.
  • Actually, XML is simply a data format for structured data. End of sentence. XML parsers make it much easier to create objects with unique namespaces. It will always be up to the programs to cajole the data into a usable form. These tools are not, however, parsers; by the time they run, the parsing is complete. This is certainly not a PHB concept; they couldn't think of things so elegant. If you want another buzzword, it's called data-binding. You should seriously check out http://xmlc.enhydra.org [enhydra.org] and look at XMLC. It works beautifully!

    I will agree with your level of frustration with the buzzwordology and incorrect representations of what XML is actually useful for.


    --

  • Well, XML-RPC is basically taking the place of DCOM/COM/blah blah blah... and .NET, what a wonderful idea that is. Yeah, sure, Bill, I will trust you with my data!
  • Most people get the local geek to install software. Once installed, I doubt very many update their office software or web browser. As long as it works, they use it. Doesn't matter if there's a security hole; they don't know about it or probably don't care either. They just want it to work.
  • Are you starting to see the parallels with renting movies and renting software?

    Actually, no, I'm not. I don't pay a monthly fee whether I rent movies or not. And I usually use my computer for many more things than word processing and web browsing. But even those things I do quite frequently. I guess in your comparison, I'd want to watch the movie whenever I want, hence I'd buy that movie, not rent it.

    Well, no not ALL the processing. As I understand it, this type of technology wants some of the processing on your end -- such as the handling of user events, typing, clicking buttons, and what have you, but the data storage and processing to be done on the server end.

    What processing, exactly, would be quicker to do on a server than on my own machine? Clicks and whatnot are handled on my screen. Does bolding get done by the server, then? Table creation? Even things like image or sound processing would be pointless to have a server do, since the time it takes to upload the original and download the finished piece is much greater than the time it takes for my computer to do it. I would think most people don't want their personal data stored away on some distant server, just as I don't use the library to file my personal love letters.

    A typical example is a word processor. I write maybe three documents a year. I'd rather pay a small fee each time I create a document, rather than $50 for the whole word processor, which I may never use more than once before upgrading to the next version of Office.

    Actually, I'm betting people word process more than you think. But the only reason to upgrade currently is b/c you need to keep file formats compatible. If MS saved documents as something like HTML by default, you would never upgrade your word processor. IMHO, it has too many 'features' as it is. Nothing has really changed in Word for the past 5 years, I'd say.

    It also maintains flexibility. Use Word from any of your desktops, at home, at work, etc. Download it when you want it. (Kind of like Pay-Per-View, to keep the movie watching analogy.)

    Your analogy is flawed. I have cable here, but if my parents don't, I can't order pay-per-view now, can I? Even if I did, THEY get charged for it, not me. I also can't order PPV on any TV that doesn't have a cable box with it. I can already word process wherever there is a computer; that's a pretty standard thing. I may not be able to play the latest game I'm enjoying anywhere, but so what? I doubt many people are like 'damn, I wish I could use this software on anyone's computer.' And if there are, they typically bring the CDs with them...

    You're thinking about this all wrong ... these people don't want to sell you the use of their server resources for number crunching. They want to rent you lightweight software that can be built with web technology (DHTML, XML, Java, etc), where the programs are stored on the server, and possibly even your data. The idea that the server is going to be doing most of the work is just wrong.

    Sounds good, but I want 100% reliability. Whenever I need to use the applications, I want to be able to. But I doubt MS servers will never crash, or even that the best/only route to the MS servers will always be up. I remember a few short years ago one of WorldCom's fiber optic lines was cut, and it brought the internet in the eastern US to a halt, basically. It was almost impossible to get anywhere. I sure hope that doesn't happen if I need to type a letter and send it out by a certain date.

    That makes no sense. Either people are going to pay to rent Word (pay-per-use), or they aren't. If you don't use Word, then you don't pay for it, therefore what do you care if it's better than the last version?

    Actually, it makes perfect sense. This is what MS has been doing all along. A few of the companies I worked for have upgraded even though they didn't want to. They knew there was no point, but someone (client, supplier, whatever) upgraded and couldn't figure out how to save as an older version. So we upgraded, just to be able to read the new file format.

    Well you've got that choice as a consumer, but it seems kind of close-minded to make that decision before you have all the facts, let alone a trial of it...

    Just b/c he's decided not to try it at all does not mean he is being closed-minded. Maybe he just doesn't think the pitfalls (or potential pitfalls) are worth looking into further. Don't call him closed-minded b/c he's made a decision on the facts currently available.
  • read the EULA, Microsoft can recall any piece of software they want...

    In the absence of an evil piece of crap like UCITA (or a particularly corrupt judge), EULAs that don't show up until after you've paid for the software are usually considered meaningless. This is just one of the reasons why the fight against UCITA is so important. UCITA would make all the evil crap in EULAs legally binding. See www.4cite.org [4cite.org].

    --
  • Find a good article, put a link, and put your comments in the submission

    Well, this is a little different since this new york times article is about him (Dave Winer submitted the link). That makes his comments on the article pretty on topic...

  • I agree with the potential utility of subscription ware.

    What I fear, though, is that alongside the potentially great developments in technology (automatic upgrades, bug fixes, translations) and in business models (subscription service revenue) will be embedded the same gratuitous toll collectors, gathering money merely because they own the tollbooth on the main highway and not because of any intrinsic new value added. (See the other posting about Word 6.0 being essentially feature-complete.)

    I'll go out on a limb and put words in the mouths of Bill Gates and Scott McNealy...

    "Standards? Standards are great! I love standards! As a matter of fact, I own several!"
  • I've yet to hear a really lucid explanation of why I should want my apps and personal data floating in an amorphous cloud, but maybe that's just me.

    XML and its family of technologies such as SOAP, XSLT, et al. is all about making two "alien things" talk to each other at the data level WITHOUT putting any restriction on those two "alien things" (other than that they have to understand XML).

    Yes, this is being pitched as web services -- by the market. This is fine as long as you don't stop your thinking at this level, as XML is much more than enabling web services. For example, there is nothing preventing us from using XML to enable two applications (or components) to communicate with each other even when they are sitting on our local hard drive.

    Now, think about this. If ALL programs and applications had an XML-based API (SOAP/RPC), imagine how simple it would be to integrate and capitalize on them with your own application.

    Hell, I could create a brand new application by accessing the 100+ most often repeated pieces of functionality and code in my Linux applications. Now THIS is what I call Open Source (and code reuse) -- a system that I can re-use over and over WITHOUT having to do open-heart surgery on it.
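    A minimal sketch of that idea, using Python's stdlib XML-RPC server and client in a single process. The method name "shout" and its behavior are invented for illustration; the point is that once a function is registered, any XML-RPC client in any language can call it without surgery on the host program.

    ```python
    # Expose a function as an XML-RPC API and call it over HTTP.
    import threading
    from xmlrpc.server import SimpleXMLRPCServer
    from xmlrpc.client import ServerProxy

    # Bind to an ephemeral port; logRequests=False keeps stdout quiet.
    server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
    server.register_function(lambda text: text.upper(), "shout")
    threading.Thread(target=server.serve_forever, daemon=True).start()

    host, port = server.server_address
    with ServerProxy(f"http://{host}:{port}") as proxy:
        result = proxy.shout("reuse me")  # any XML-RPC client could do this
    server.shutdown()
    ```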

    ---------------
    Sig
    abbr.
  • "How many people really jump up and down at the idea of not owning software."

    No one's jumping up and down for it, but corporations will sit up and beg for it if Microsoft says it will reduce their TCO.

    Peace,
    Amit
    ICQ 77863057
  • For those of you trying to make a quick buck. Find a good article, put a link, and put your comments in the submission with a lot of banner ads. i.e. "Check out this on cnn.com and my comments." [burckart.org]
  • You are correct. I apologize, for I was unable to get to the article.
  • While that might be the Microsoft sales plot, on the technological level there's a lot more to it than that. MSNet (.NET is the stupidest name in ages) is an extension of the principles of COM, which is a reasonably powerful, extensible architecture. The ability to access objects anywhere is a very revolutionary (and not Microsoft-exclusive) concept.

    Look at a newsreader, for instance. It contacts the server. The server gets the article out of memory and translates it from however it's stored into a transmittable form. From there it's sent along the wire and converted into a different form, and from there it pops up on your screen. Tin, and most other newsreaders, are reasonably huge. The servers are pretty massive, too. They're both a tangle of code around something that, from a modern perspective, smells a lot like a hack. Often you have to process a huge stack of data before you even see an article. "Loading Headers..."

    Were you designing Usenet today, using distributed object technology, the news indexes, lists, and articles would all be objects on the server that you call. NNTP would be unnecessary -- the communication is taken care of by whatever they're calling RPC today -- so the communication stuff gets trimmed out of each end. The client becomes thin, so you don't have to worry about state on the client end. It goes from technology that people have spent years trying to get right (because they have to solve half a dozen big problems other than news storage and indexing) to something a couple guys could hack up in a month in their free time, because they only have to worry about the news. Not protocols, timeouts, latency, client state, etc... but news. COM/CORBA/MSNet handles all of the communication.

    Am I looking forward to a leased, no-configuration-control, subscriber-based software model? Not on your life. Does that mean that MSNet and the technology therein is a smelly pile of elephant dung? No. COM, CORBA, JavaBeans, MSNet, whatever -- pick the product brand of your choice -- is some incredible stuff. It fulfills the promise that RPC introduced.

  • I know little about the server-to-server communications for NNTP. I know enough, though, to know that reasonably successful group mirroring across the world is quite an accomplishment. I think you miss my point -- the "distributed, replicated, client-server collaboration system" part should be -- and is now becoming -- separate from the "news".

    All of these systems are quite admirable, because they do everything themselves. They had to - nobody else did it then. You wouldn't design it that way now, though.

  • I thought you might have, but of course we have a world of readers following our conversation.

    I'll throw in at this point something that you might already know - Mozilla is built around XPCOM, which aims to bring most of the benefits of MSCOM to the open source world. I don't know how much they get into inter-machine communication, though.

  • What XML does is make reading in files MUCH easier. With one function call, I can turn a filename into a tree structure of its contents.

    CSV is great if you've got 'flat' data (like a list), but what about if your data is in a tree?

    Also, XML is somewhat self-documenting because the tag names hint at the purpose of each field. Which of these file formats would you rather try to figure out?

    CSV:
    "Bob", "756838", "124437"
    XML:
    <name>Bob</name>
    <phone>756838</phone>
    <fax>124437</fax>
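    The "one function call" claim above can be sketched with Python's stdlib parser (the poster doesn't say which language or library they use, and the `<contact>` root element is added here since the snippet has none):

    ```python
    # One call turns the XML text into a tree; fields are then looked
    # up by name instead of by position.
    import xml.etree.ElementTree as ET

    record = ET.fromstring(
        "<contact><name>Bob</name><phone>756838</phone><fax>124437</fax></contact>"
    )
    name = record.findtext("name")
    fax = record.findtext("fax")
    ```

    With the CSV line, nothing but documentation tells you whether "756838" is the phone or the fax; here the tag names carry that hint.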


  • When the Perl script gets the data from the RPC call, the data will be available as a standard Perl datatype.

    I've been doing this between PHP, Cold Fusion, and ASP (VBScript) for quite some time. See http://www.wddx.org [wddx.org]. Very simple, very elegant. It just doesn't come with a kitchen sink -- and that's fine by me.
  • whatever. xml doesn't make that necessarily any easier or more likely than any other alternative.

    Good point. But, XML seems to be the alternative that people have settled on. So...

    And I think your stock quote app was a good enough example. Say you feed it a ticker symbol and get back price and time of last trade. Then the quote provider gets generous and starts serving price, price at last close, time of last trade, and volume. If they change the order, your tab-parsing routine breaks. If they just add XML tags, in whatever order, your app will just look for the ones it wants and ignore the rest. Ideally, that is.
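    That reordering scenario can be sketched like this; the tag names and values are invented for illustration:

    ```python
    # The consumer asks for tags by name, so the provider can reorder
    # or add fields without breaking it.
    import xml.etree.ElementTree as ET

    v1 = "<quote><price>10.50</price><time>16:00</time></quote>"
    v2 = ("<quote><volume>900</volume><time>16:00</time>"
          "<price>10.50</price><close>10.10</close></quote>")

    def last_price(xml_text):
        return ET.fromstring(xml_text).findtext("price")

    # Both versions yield the same answer; a positional tab-split on
    # the second feed would hand you the volume instead of the price.
    ```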

    But fundamentally you're right: if we all agreed to parse tab-delimited files left to right, top to bottom, they'd be easy enough to use.

    Where it gets more interesting is with stuff like RDF, where an app that doesn't even know what tags to look for can read a document+schema and begin inferring facts about it.

  • True enough; I probably won't use .NET much either. But .NET isn't just a move towards thin clients and server-side processing. In fact, most products will NOT be written that way. Office will be essentially the same in terms of distribution -- a large binary client installation, with enhanced features IF you hook into a .NET server.

    But more importantly, Microsoft has made a critical decision in my mind -- and that decision is to separate the layers of storage, logic, and presentation into distinct parcels of code.

    You may not look forward to this now, and we may shun it as unnecessary, but I honestly see this move (though I wish it were from someone else, maybe Applix) as positive for the software industry AND *nix users.

    Right now, writing applications for multiple platforms is a trying experience. Delivering a high-quality app to multiple clients via distributed servers over an untrusted network is largely a build-it-as-you-go project. I've worked on projects like that before -- and we were constantly making things up as we went along -- we didn't have the benefit of standard transports, processing, and distribution methods.

    A real life example: in my spare time, I am writing an app, without using any .NET stuff, that is designed to manage a moderately complex website, edit documents, change site-wide variables, generate reports, and other functions. I want it to be server neutral, database neutral, and client neutral. I want the ability to write a client that works as a binary application, and I want the ability to write a client that works in a web-browser. I want the client to be able to run on a Palm-based portable, as well as PocketPC.

    I've been working on this for a while, and as such, had to develop my own standards for transporting commands and data. Right now, as it stands, I have a server-spec that allows a client to hitch into an Apache or IIS based site without modification to the client. Right now I am writing a binary Windows client, and next a Java client will be written.

    With .NET, things could be so much easier. Instead of my own crappy flat-file standard for messages, I could have used XML, which is easy to parse in Java and MS languages. Instead of writing a transport layer that can compile on any POSIX OS, I could have used XML-RPC. These two additions alone would have saved me months.

    But besides the technical aspects, you miss what .NET is really about. Trying to find out from MS is damn near impossible, but this is my perspective on why you should be interested in .NET, from a Linux/Windows end-user perspective.

    .NET will allow you to use Linux, or any other .NET-compatible platform, to communicate with the vast majority of users who are Windows subscribers. Moving data, in the form of Office documents, closed-type formats, and other "feature-added" (pronounced proprietary) formats, will be trivial. Now, moving data between platforms/programs is a major task unless you don't mind your data being mangled in the process.

    A specific example of why Word would be better with .NET capabilities: you write user manuals for a software company, freelance. You have a whiz-bang new computer/laptop, but not everyone at the mega-soft corp company has the new version of xyz app. Luckily, your software company runs MS xyz server and has decided to let you upload the documents to it weekly for progress review. Members of the review staff have Office 95, but luckily, the MS xyz server translates the document on the fly, delivers them a properly formatted version for Office 95, allows them to edit it, saves changes, and updates the original, still in the newest format. Sure, you COULD do this by hand, but when 10 editors take the file, edit it, and send it back to you in file versions ranging from Office 95, 97, 2000 and XP, things get messy. You have to compare each one and find the changes. But luckily, you used a server with .NET extensions, and all the mess was avoided.

    A final note, because I suspect that you won't be swayed by any of these arguments: the public has a penchant for leasing stuff. People like predictable, reasonable prices. It helps for budgeting and for long-term planning. Most businesses would rather lease a car than own it, mostly because a lease has a set pricing model -- you pay $XYZ per month, regardless of any outside factor. Usually, warranties are in place, so that regardless of accidents, malfunctions, or required fixes, the price per car never exceeds $XYZ/month. They are willing to pay a premium per car for that stability. I think this will translate into software exceptionally well for most companies. The formula will be simple: total number of computers * software cost * months = software budget. A company could sign a three-year lease for software and never have to worry about upgrades, updates, bug fixes, or other things that give IT people like me a headache.

    Anyways, food for thought.
  • I have an idea, why don't they just screw .NET and go with open file formats instead?

    I think that's a great idea; like I said, I won't be using .NET anytime soon. Open file formats would eliminate about 75% of the "problems" .NET is trying to solve.

    I really don't think .NET is going to be great technically, but I think that many businesses will use it because it solves problems that they have (whether created by MS or not is irrelevant).

    About FTP: yes, that solves the problem so long as you are using the same software/version/file format. .NET *would* be cool when it converts and manages file formats automatically across platforms/programs/architectures. FTP doesn't convert file formats or manage revisions, and it's definitely not secure by nature (unless we talk SFTP).

    As to why MS won't go with open file formats... it's just one more way to force you into using their software, and to further their market position in applications. That's my guess, at least.
  • I think you and I are on the same page. Rent-ware is very popular already. I now work for a company who essentially only sells software via a leasing-model. Sites sign up and get the whole package including hardware, software, service, upgrades, etc etc for a set period of time. VAR's are not new, and I think .NET might succeed just because anything that helps VARs will be used in the field in short order.

    Standards are a different issue. I love standards, and I don't own any. But sometimes a file format becomes intellectual property -- basically, some are closed because they are core to the product. That's when things get sticky. If MS released a fully detailed spec for the Word 2000 file format, then the value of Word would be eroded (from MS's standpoint, of course), because the world would be that much closer to making a knock-off of Word.

    I think when it comes down to it, file formats should be mostly open, but in some cases I support a closed format. Compatibility is a feature, and so I think file formats should be something that consumers shop around for. If you don't like how MS uses closed ones, how about not supporting them, or not buying their products? If compatibility with non-MS products isn't important to you, then go for it, and use whatever MS products you want.

    Anyways, from my perspective, it's a choice issue, but one I wish MS would eliminate by using only open standards.
  • XML is only part of the equation.

    In my own app, I have to form the XML (not hard by any means), then I have to encode it in packet form, transport it, and then receive and decode it. Basically, I have created my own stupid standard because there is no commonly used transport layer for XML. Using some .NET standard would have helped, namely in that I wouldn't have to code a few thousand lines of error checking, transport, decoding, encoding, and other similar things.

    Mind you, I can do all this without .NET, but it would have been easier; that's all I was saying. In fact, I am almost done, and 100% without crappy .NET.
  • He is not one to sugarcoat his criticisms. "Every time I use the Web I am reminded why I hate Microsoft," he wrote in a recent DaveNet posting. "What could have been a lovely competitive space, overseen and supported by a statesman-like Microsoft, turned into a cesspool of lawyers and dirty tricks."

    While I wouldn't be so complimentary to MS (short on coffee today), this sums it up nicely.

    Check out the Vinny the Vampire [eplugz.com] comic strip

  • Yeah, the nerve of those people... They replaced Netscape, still one of the dodgiest computer products ever released, with a more stable, more powerful, faster, easier-to-use alternative... meanwhile Netscape rested on their laurels... yeah, it must have been dirty tricks... "Oh, but they gave it away for free!" -- you bastards! Or maybe give away the razor (IE) and make money on the blades (IIS)?

    The primary complaint I have had with MS is the spread of inferior technology by means of superior marketing. Check out my blurb on Boiling Frogs [slashdot.org].

    Your comment on giving away the razor and making money on the blades is flawed, because for sheer quantity, IE outnumbers IIS. So to be precise, if they were giving away the razor and making money on the blades, they should be giving away IIS and selling IE at 5 or 10 bucks a pop. This is not the place to review the wonderful emails and duplicitous testimony of MS at the antitrust trial. "We gotta integrate IE into Windows because we'll never win otherwise."

    In other words, they couldn't win on the merits of the program. So they HAD to win by other means. This was well in advance of IE 5. Are you still enthusiastically using the ActiveX desktop?

    I used to like Microsoft, really. But I got ticked off every time they did something that tried to lock me into their system. Maybe you like handcuffs.

    "No these are not handcuffs, they are golden bracelets, really!"

    ha


  • IE won the browser wars despite that Active Desktop shit. I don't know anyone that uses it. It's completely irrelevant in any sort of "browser wars" discussion. While Microsoft may have thought they'd 'win' by making the desktop more like a web browser, they actually won by producing a stable, fast product more standards-compliant than anything else at the time.

    The Active Desktop has been renamed, and is coming back in all sorts of "user-friendly" features in the future. Just watch.

    IE on an open, fair market is one thing. Fine. Forcing me to use it and blocking out the competition is, I suppose, superior marketing. And making it damn hard to remove is another.

    The proof of the pudding is products like 98lite [98lite.net] that remove Internet Explorer from Windows and result in faster, more stable systems. Remember, MS testified in court that not only was this impossible, but that doing it would destroy Windows.

    ha. nice try


  • mm, if you ask me, IE won the browser wars despite that Active Desktop shit. I don't know anyone that uses it. It's completely irrelevant in any sort of "browser wars" discussion. While Microsoft may have thought they'd 'win' by making the desktop more like a web browser, they actually won by producing a stable, fast product more standards-compliant than anything else at the time.

    Just to let you know, I did the following.

    I went to www.98lite.net and downloaded their IEradicator program. This removes IE from Win9x, including Win98 and Millennium.

    It works great. And guess what: Netscape is 5 times faster and 5 times more stable.

    Now tell me that MS would never screw over the competition in their code. All this does is make me angrier at MS.


  • It was moot because MS had an infinite amount of money and kept undercutting their business with giveaways (I guess communism is sometimes good)

    Please explain what that has to do with Communism? PS: Your McCarthy-funded "Education on Communism" will not shine here... What do you know about *Communism* proper, and why do you choose to insult the ideals of Communists when you obviously have a sitcom/pop-culture understanding of it (read: know nothing)?

  • Your argument seems to be based on the premise that Microsoft is trying to make things cheaper for you. To that I'd say: are you insane? Microsoft may be willing to make things easier for you, add features, whatever. The one thing they will never do is reduce the amount of money they think they can get from you.
  • Right now the cost of Office for a large portion of users is $0 because it's warezed. By increasing copy protection and lowering the entry cost (through rental), MS is betting that they can get more money by getting all the deadbeats to pay up.
    The majority of Office users are either business or OEM. MS will not increase their sales in those segments by lowering prices. They may increase sales by reducing piracy, but the subscription model is quite separate from the anti-piracy (i.e. central authorization) features in the new products. Over here in Australia they've been trialling central authorization with Office 2000 for at least a year, without subscriptions.
  • How many people really jump up and down at the idea of not owning software. I don't see anything in this that will ever make me not want to own my apps outright.

    It could be because you appear to have no real idea of what this idea of "not owning software" is all about.

    Do I rent some movies? Sure. Do I own some movies? Sure. Why do you do both? Why not buy all the movies you like to watch?

    Hmm, what's that, you say some movies you only want to see one time? You don't wish to pay $20 - $30 and own the movie forever? You'd rather pay a small, one-time fee and watch it only once or twice?

    Are you starting to see the parallels with renting movies and renting software?

    The idea as I understand it is to leave the gui at home and move all of the processing onto the servers.

    Well, no not ALL the processing. As I understand it, this type of technology wants some of the processing on your end -- such as the handling of user events, typing, clicking buttons, and what have you, but the data storage and processing to be done on the server end.

    A typical example is a word processor. I write maybe three documents a year. I'd rather pay a small fee each time I create a document, rather than $50 for the whole word processor, which I may never use more than once before upgrading to the next version of Office.
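    Run the poster's numbers as a back-of-envelope rent-vs-buy calculation. The $50 purchase price and three documents a year come from the comment above; the per-document fee is an assumption, since no rental pricing was ever announced:

```python
# Back-of-envelope rent-vs-buy arithmetic for the word-processor example.
buy_price = 50.0       # one-time purchase price, from the comment
per_doc_fee = 2.0      # hypothetical pay-per-use fee (an assumption)
docs_per_year = 3      # the poster's stated usage

yearly_rent = per_doc_fee * docs_per_year
years_to_break_even = buy_price / yearly_rent

print(yearly_rent)            # $6/year of rental fees
print(years_to_break_even)    # buying only pays off after ~8 years
```

    At that usage rate, renting stays cheaper for years, which is the whole argument; heavy daily users would hit break-even almost immediately, which is why both models can coexist.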

    It also maintains flexibility. Use Word from any of your desktops, at home, at work, etc. Download it when you want it. (Kind of like Pay-Per-View, to keep the movie watching analogy.)

    And if you don't like renting applications, don't. Buy them instead. If there is a market of people who would rather own the software, believe me, there will be someone there to supply the demand, even if Microsoft doesn't (and they will).

    The really processor intensive things like encoding and image editing aren't going to really benefit from this.

    You're thinking about this all wrong ... these people don't want to sell you the use of their server resources for number crunching. They want to rent you lightweight software that can be built with web technology (DHTML, XML, Java, etc), where the programs are stored on the server, and possibly even your data. The idea that the server is going to be doing most of the work is just wrong.

    This is just a plan to get people hooked before they realize the newest Word isn't really any better than the last.

    That makes no sense. Either people are going to pay to rent Word (pay-per-use), or they aren't. If you don't use Word, then you don't pay for it, therefore what do you care if it's better than the last version?

    Even if linux does this I won't use it

    Well you've got that choice as a consumer, but it seems kind of close-minded to make that decision before you have all the facts, let alone a trial of it...

    -thomas
  • Hmm....

    I noticed that the NYT was much less flattering of Microsoft than Dave was.... His point [userland.com] seemed to be that this is a major strategic mistake on Microsoft's part.

    I say, let us zig to their zag-- as they try to centralize, let us decentralize.

    Also notice that Free software is the bane of this sort of model, because the major argument for centralized application serving is that it can often cut the number of licenses that need to be purchased for a given piece of software.

    I do think that there is a place for Application Service Providers, but that their role will probably be geared mostly to small businesses because this is the only market segment where real value can be added. This is .NET's potential market.

  • .NET will allow you to use Linux, or any other .NET-compatible platform, to communicate with the vast majority of users who are Windows subscribers. Moving data, in the form of Office documents, closed-type formats, and other "feature-added" (pronounced proprietary) formats, will be trivial. Today, moving data between platforms/programs is a major task unless you don't mind your data being mangled in the process.

    I can already do this. It's called ftp.

    A specific example of why Word would be better with .NET capabilities: you write user manuals for a software company, freelance. You have a whiz-bang new computer/laptop, but not everyone at the mega-soft corp company has the new version of xyz app. Luckily, your software company runs MS xyz server, and has decided to let you upload the documents to it weekly for progress review. Members of the review staff have Office 95, but luckily the MS xyz server translates the document on the fly, delivers them a properly formatted version for Office 95, allows them to edit it, saves the changes, and updates the original, still in the newest format. Sure, you COULD do this by hand, but when 10 editors automatically take the file, edit it, and send it back to you in file versions ranging from Office 95, 97, 2000 and XP, things get messy.

    So then, what you are saying is that .NET is gonna be great because it's a solution to the problem deliberately created, perpetuated, and maintained by Microsoft? I have an idea, why don't they just screw .NET and go with open file formats instead?

  • My understanding is that the main scalability bottleneck at Slashdot is the database server, primarily because it's running MySQL, and it's still running MySQL because the core slashcode is not database-independent.

    But anyway, saying SOAP or any sort of message passing is the magic bullet without knowing the app is foolhardy PHB talk. Since Slashdot runs in such a controlled environment, you could do distributed processing with far lower-level, roll-your-own socket code, for example.

    Although, it would be cool to get Slashdot as an XML document that you could format client-side using an XSLT or CSS stylesheet. Or insert a middle tier that does the XML-to-HTML formatting separate from the content generation.
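    Such a middle tier can be sketched in a few lines. This toy version uses Python's standard XML parser rather than a real XSLT processor, and the story feed's element names are invented for illustration, not Slashdot's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical content feed; the element names are invented, not a real schema.
STORIES = """<stories>
  <story>
    <title>Dave Winer On Microsoft, SOAP, XML-RPC In NYT</title>
    <comments>229</comments>
  </story>
</stories>"""

def to_html(xml_text):
    """Middle tier: content XML in, presentation HTML out."""
    root = ET.fromstring(xml_text)
    items = [
        f"<li><b>{s.findtext('title')}</b> ({s.findtext('comments')} comments)</li>"
        for s in root.findall("story")
    ]
    return "<ul>" + "".join(items) + "</ul>"

html = to_html(STORIES)
print(html)
```

    The point is the separation: the content generator only emits `<stories>` XML, and any number of presentation tiers (HTML, WAP, RSS) can be bolted on without touching it.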
  • And the difference between that situation and the situation today is only the cost of writing a middleware piece that transforms the format and data between these systems.

    Nobody said that integration is easy. It is in fact the most expensive and problematic part of information technology. On one hand, anything that makes it easier is a good thing. On the other, tech like XML formats are just another thing that you have to integrate everything with in 2^n fashion.

    Don't make IT PHBs out to be that stupid. These problems are old as the hills and nobody expects an easy or cheap answer.
  • Netscape's original business plan was never to charge for the browser. The place I worked at the time wanted to roll Navigator 1.0 out to 5000 desktops, but wouldn't do so unless Netscape took their money for a support contract. Our Netscape 'rep' had no idea what to do with this offer. By v1.1N, they had figured out that having money waved in your face wasn't such a bad idea.

    The entire idea of Netscape's pseudo-free browser was that it was supposed to be free advertising for their server products. You can blame Netscape's death on Apache/IIS and free SMTP/IMAP servers to a much greater degree than you can blame IE.

    The browser was never "their business." The sad fact that the only real asset AOL picked up was a whole bunch of homepages pointing to the default setting of http://www.netscape.com/ just shows that their gamble at being an enterprise server vendor failed mightily.
  • So you take a specific problem domain (health care), and because someone hasn't generalized it out to every problem domain, you conclude that you might as well be using flat text files. XML is supposed to be the infrastructure to make this possible, BTW -- if MS and Oracle can't agree on their dataset format, that's not really a problem with the underlying technology.

    I can't tell you how many interfaces I have built in my life

    And how many bugs have you created in low-level parsing and message-passing code? Don't get me wrong - I know there are programmers out there who have spent their entire professional lives maintaining fragile interface code, and sure, it puts food on the table, but there's also got to be a better way.

    Data Integration has always been the #1 problem in IT, and always will be. Nobody thinks they've got the magic bullet -- but putting new tools in the toolbox can't be a bad thing, assuming they work.
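    To give a sense of how small that new tool can be: here's a minimal XML-RPC round trip using only Python's standard library. The endpoint address and the method name (`sample.add`) are made up for the demo; the marshalling, parsing, and HTTP plumbing that hand-rolled interface code keeps reimplementing all live in the library:

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

# Bind a toy XML-RPC endpoint on a free local port; names are illustrative.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]
server.register_function(lambda a, b: a + b, "sample.add")

# Answer exactly one request in the background, then exit.
t = threading.Thread(target=server.handle_request)
t.start()

# The caller never touches sockets or XML -- just a proxy object.
proxy = ServerProxy(f"http://127.0.0.1:{port}")
result = proxy.sample.add(2, 3)
t.join()
server.server_close()
print(result)
```

    The same call works unchanged against any of the 29 implementations in the XML-RPC directory, regardless of language or platform -- which is the interop point the article is making.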
  • by Petrophile (253809) on Monday April 09, 2001 @07:53PM (#303953) Homepage
    The nice thing about Microsoft is that they tell you exactly what they are going to do before they do it.

    And when MS made the "Pearl Harbor Day" announcement that they would build a Netscape-class browser and integrate it into the Windows shell, I thought they were insane. Netscape was too slow, too bloated, too crashy to ever be considered a fundamental component in the user interface.

    And sure, MS's early attempts like IE 4 were pretty much the clusterfuck that I expected. However, while integration made for a pretty crappy user application (and probably always will, even with more doodads under XP or things like Nautilus), it did put the pressure on MS to look at the browser as a mission-critical application for the desktop. And by v5.01, they had pretty much delivered. This shift in thinking about the importance of the browser was something that Netscape, with its endless crappy .01 releases, could never manage.
  • by Petrophile (253809) on Monday April 09, 2001 @08:16PM (#303954) Homepage
    Features (that I use) that are in Word 2000 and not in Word 6.0:

    + Red underline marks for misspelled words.
    + Word 2000 file format compatibility
    + Improved stability with documents with very large embedded pictures and objects.

    Word 6.0 was essentially a feature complete program, and there just hasn't been that much that even all the geniuses at Microsoft could think of adding. If you have any doubts about this, ask Mr Clippy "What's New in Word?" (Ooo, new table border styles!)

    Most users are currently on Office 97, and will be for a year or more past the upcoming Office XP release. If it weren't for the interoperability issues, most users would still be on Word 6 or Word 95.
  • One of SOAP's principal designers is Henrik Frystyk Nielsen, one of the key authors of HTTP and the lead author on HTTP-NG. SOAP is simply a logical extension of stuff he had been working on for years before he joined Microsoft.
  • Secondly, it is being claimed that "services" will reside on various servers. But a simple mathematical argument shows that distributed web services will be *much* less reliable than centralized servers:

    You're wrong, since you assume that adding more servers means adding more single points of failure, and that the failure rate for each server stays constant.

    This is misguided for two reasons: a major reason for splitting things up over multiple servers is to run less complex services on each machine. As you reduce complexity, you also reduce the likelihood of a failure on that specific machine (if nothing else, because you may have moved the buggy parts of your code to another server).

    Now, if you keep a sequential path through your servers, with no failover, you still may (depending on the quality of your hardware and OS maintenance) achieve greater stability, since most software handles high-load/low-memory etc. situations worse than it should. There's a good chance your applications will have a lower rate of failure if you can split them up to reduce the chance of resource allocation problems.

    But with a well-engineered distributed system you will also either make calls persistent, so that short-term outages of services that aren't time-critical don't matter because the call will be completed when the system is back up, or you will add multiple servers handling a specific service and make the caller automatically fail over to one of the other servers if one is down.
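    A caller-side failover loop of that sort can be sketched in a few lines. The "replicas" here are plain callables standing in for real network endpoints, so the whole thing runs locally:

```python
# Caller-side failover across redundant replicas of one service; the
# replicas are plain callables standing in for real network endpoints.
def call_with_failover(replicas, request):
    """Try each replica in turn; return the first successful response."""
    last_error = None
    for replica in replicas:
        try:
            return replica(request)
        except ConnectionError as err:
            last_error = err  # short-term outage: fail over to the next one
    raise last_error  # every replica was down

def down(_request):
    raise ConnectionError("replica unreachable")

def up(request):
    return "handled: " + request

# Two dead replicas don't matter as long as one is still up.
answer = call_with_failover([down, down, up], "register example.name")
print(answer)
```

    A real system would also want timeouts and retry backoff, but the shape is the same: the caller, not the topology, decides what a single machine's failure means.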

    Redundancy is a much safer way to get to five nines than running everything on a single server: five nines implies about 5 minutes of downtime a year, which doesn't give you much room to deal with hardware failure. I'd much rather have a distributed system where non-critical parts are farmed out to other machines and critical parts are duplicated on multiple servers.
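    The arithmetic behind this: chaining machines in series multiplies their availabilities, while redundant replicas multiply their failure probabilities. The 99.9% per-machine figure is an assumption for illustration:

```python
# Series vs. redundant availability; 99.9% per-machine uptime is assumed.
a = 0.999

series = a ** 3                # three single points of failure in a row
redundant = 1 - (1 - a) ** 3   # three replicas, any one of which suffices

print(series)     # ~0.997: worse than a single machine
print(redundant)  # ~0.999999999: nine nines
```

    Which is the original poster's mistake in a nutshell: more servers only means more points of failure if every one of them is on the critical path.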

    Actually, we're designing an XML-based distributed communications system for our mission-critical system right now: the ".name" gTLD registry, which I'm head of development for.

  • by snoop_chili_dog (314897) on Monday April 09, 2001 @04:26PM (#303972)

    How many people really jump up and down at the idea of not owning software. I don't see anything in this that will ever make me not want to own my apps outright.

    The idea as I understand it is to leave the gui at home and move all of the processing onto the servers. There's just no reason for that. We live in a day of Celerons and Athlons. The most commonly used apps are web browsing and word processing. Word processing isn't processor-intensive, or at least it shouldn't be; if a word processor chews up my 433 MHz Celeron, it needs to go. The really processor intensive things like encoding and image editing aren't going to really benefit from this. What time would you save by uploading a huge WAV, letting a remote server turn it into an MP3, and shipping it back to you?

    This is just a plan to get people hooked before they realize the newest Word isn't really any better than the last.

    Even if linux does this I won't use it

  • You should probably ask before you post. What this discussion is about is Microsoft's (and others') proposed BUSINESS PLAN for distributed software. .Net is something different altogether. .Net is a software development platform based on the CLR (Common Language Runtime) that changes VB and introduces C#, along with introducing the Web Services concept to developers. One of the capabilities of .Net is the ability for software components to reside on disparate and heterogeneous computers all over the Internet. This might enable a company to make desktop software that uses SOAP to gather bits of data from the 'net, but most likely NOT. The main uses for SOAP have been from one web server to another. This is because:
    • Not everyone has a constant connection to the Internet. Office can't crap out just because your friend beeps in on Call Waiting. Developers know this.
    • Not every Web Services provider can guarantee 100% uptime, which would be necessary to support the availability of Web Services.
    • ROPE (MS's implementation of SOAP) isn't fast enough to provide the kind of responsiveness that people look for on the desktop.
    • Not even Microsoft has worked out the revenue model for Web Services.


      • In closing, before everyone freaks out about "M$" taking their cash (read: actually having to pay for Microsoft software they use anyway), visit msdn [microsoft.com].
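      For the curious: the wire format those server-to-server Web Services speak is just XML over HTTP. A minimal SOAP 1.1 envelope can be built and picked apart with nothing but a standard XML parser; the `getQuote` method and its `urn:example-quotes` namespace are invented for illustration:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# A hand-built SOAP 1.1 request; the method and its namespace are made up.
ENVELOPE = f"""<?xml version="1.0"?>
<soap:Envelope xmlns:soap="{SOAP_NS}">
  <soap:Body>
    <getQuote xmlns="urn:example-quotes">
      <symbol>MSFT</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

# What a receiving server does: find the Body, pull out the call.
root = ET.fromstring(ENVELOPE)
body = root.find(f"{{{SOAP_NS}}}Body")
call = body[0]                              # the single method element
method = call.tag.split("}")[1]             # strip the namespace prefix
symbol = call.findtext("{urn:example-quotes}symbol")
print(method, symbol)
```

      That the whole exchange is readable XML over plain HTTP, rather than a binary RPC protocol, is exactly why it passes through firewalls and why interop between vendors is plausible at all.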
