The Internet

Pervasive Computing: Microsoft, MIT And The Future

illuin writes: "There's an interesting article over on BetaNews with a potential take on Microsoft's vision of the future internet and internet-based applications. Of course, it sounds quite a bit like Project Oxygen (press release), currently being pursued by MIT's Laboratory for Computer Science." The recent "dot-Net" announcement by Microsoft throws new light on Oxygen, and on other distributed projects like Gnutella and Freenet. Project Oxygen and Microsoft may have radically different views on how all this diffuse computing ought to act and be organized (read "Who pays, how much, to whom?"), but the fact of widely disseminated files and an increase in ASP-style distribution seems inevitable.
This discussion has been archived. No new comments can be posted.

  • Leaving a "spare" P75 on and attached to the net in a corner will cost you $5-10/month in power bills.

    Extra CPU power comes from systems that are currently being used for something but are not nearly using all of their CPU (i.e., most people's desktop computers), not old ones sitting in the corner to be a CPU farm.
  • by mikpos ( 2397 ) on Sunday July 02, 2000 @10:05AM (#962667) Homepage
    I must say, good job. It's not often that we see Godwin's Law enter at the *beginning* of a thread.
  • You raise an excellent point. It's dirty business to try and pass off characters higher than 127 as ASCII.

    And you are correct that it may not be a "proprietary Windows-only" thing (or even a Mozilla thing; I use Opera). But let's face it: when someone dumps de jure standards for de facto ones, it's probably* done with the ignorant phrase, "well, it works under MS...".

    (* Microsoft is not responsible for the Sorenson vs. MPEG fiasco. But they are still an evil empire.)
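    To make the above-127 point concrete, here is a small illustrative Java sketch (mine, not from the post above; the charset names are standard JDK identifiers). The same byte, 0x92 -- the Windows-1252 curly apostrophe -- means something different under every charset a reader might assume:

    // Decode the byte 0x92 under four charsets and show the resulting code point.
    import java.nio.charset.Charset;

    public class CurlyQuote {
        public static void main(String[] args) {
            byte[] data = { (byte) 0x92 };   // what Word-ish tools emit for a curly apostrophe
            for (String cs : new String[] { "windows-1252", "ISO-8859-1", "US-ASCII", "UTF-8" }) {
                String decoded = new String(data, Charset.forName(cs));
                // windows-1252 -> U+2019, ISO-8859-1 -> U+0092 (a control character),
                // US-ASCII and UTF-8 -> U+FFFD (the replacement character)
                System.out.printf("%-12s -> U+%04X%n", cs, (int) decoded.charAt(0));
            }
        }
    }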
  • VRML makes you a type of pirate too

    My VRML software is being built on some Open Source components (zlib, libpng, and libjpeg), but notice that I drew a distinction between market pressures due to piracy and market pressures due to Free Software/Open Source. The latter are a legitimate, natural part of the development of things in a free market economy. If you lose to Open Source, you lose fair and square.

    Nothing I've done is "warezed" and any use of Open Source in my work is legal and properly documented. I don't see how you can put me in the same category as some gangster from Hong Kong with a CD replicator.


    #VRML V2.0 utf8
  • by Lysander Luddite ( 64349 ) on Sunday July 02, 2000 @04:12PM (#962670)
    So if all these wonderful services are brought to me by corporate providers and all my data is stored on corporate servers, what exactly do I own? And if I own nothing what kind of legal rights do I have should something happen to me or my data?

    Let's pretend that all my personal data is magically protected from 'hackers'. What's to stop the providers from using my information any way they want? This could be like many contests where the sponsor gets ownership of your creations as a condition of entry. Remember the flap when Geocities claimed to own or be able to use anything created on their site because you were using their services and resources? Will this be SOP in the ASP/.NET world?

    What's to stop them from putting continual commercials in the applications that I use online? Like the recent Eudora release, applications could flash commercials or even track my movements as a condition of using the service. And if the service reaches critical mass, where the penalties for non-use are greater than acceptance (similar to using a word processor that cannot read/write the most recent Word format), what real alternatives do I have?

    If the providers lose my information, what legal recourse do I have? At least now I can back things up locally. If files are stored arbitrarily on different servers, can I still make my own backups? I know I don't have much recourse in states that have passed UCITA, but who can I trust or blame should something go wrong?

    And what benefit do I as a consumer actually get? Nothing really changes from my perspective. Am I forced to upgrade at the whim of my provider? Can I easily switch providers? Let's say I prefer an old copy of Freehand and don't want to upgrade to the latest version. Will I have a choice or will the upgrade be forced? Can I easily switch to Illustrator or am I stuck in a contractual obligation?

    Let's not forget this won't happen for the next few years. I remember when push and interactive TV were going to be big. Unless consumers see some material benefit this thing will have a lot of hurdles to overcome. If the MS spinmeisters couldn't explain .NET to a room full of 'technology journalists' after several hours what hope do they have of convincing the general public?
  • Why don't people see that there is an alternative view here.

    Since my machine is connected to the Internet, it can function as a client and a server. Napster/Gnutella show us this. If I had bandwidth into my home, I could use that bandwidth to get back to my home resources. I look toward the time when some smart hardware company develops the Personal Data Server. You could buy two and they could back up to each other...to offer redundancy (or online to an (encrypted) vault in case of disaster). You could also buy a Processing Server. Imagine...just RAM and CPU...you could get several, stack them up, and serve your family. This hardware will get CHEAPER, we should remember that.

    Remember, no ASP in the world will guarantee you 800MHz of dedicated CPU and 25GB of dedicated storage...not for a price you can afford.

    I, for one, want my PDA and WAP phone to browse my personal computing "servers", not someone else's. I want control of my information, and when I interact with a vendor, I will disclose it at my discretion.

    I believe in pervasive computing, just not the "centralized" model of it. I was at a conference when Bob Lucky (former Exec. Director of Bell Labs) spoke. He talked about how, when they built the phone networks, they made the end-user devices dumb and the network "smart" (and complex). He said if he could do it all over again, he'd do it the other way around. I agree.
  • This reminded me of electricity.

    In the First World, we get our electricity from the network. We put an enormous trust in the power company. It's ubiquitous. Everywhere you go. (Of course, you don't go to places without power.)

    In the backwaters, the power company is not trustworthy. You have generators to supplement or substitute for the network.
    __
  • Please read the "Halloween Papers" again!

    This is "embrace and extend" -- applied to the whole internet!

    "Next Generation Internet"... Sounds GREAT, eh?

    Open-source is now being challenged.

    -Ben
  • Laugh. It was a joke.
  • Sure they do.
    I detected no humor or irony or joke in his post.
    Did you?

  • I feel comfortable giving as well as receiving. Selling, on the other hand, has always bothered me. Lying, hyping, etc., I won't do that.
  • Jini relates more to "Universal Plug and Play". I'm not an expert on either, but it's important to understand one major distinction: Jini is based on Java. Now, I personally like Java, but shouldn't any long term solution not be tied to a single language?
  • )
    (Forgot to close a paren in previous post.)
  • Gonna sic Ralph Nader and the Green Gang on me because I have a Mac SE/30 running NetBSD that logs huge amounts of uptime?

  • Re: Some of us know what happened when the 'net finally scaled up but didn't adopt a security model that was newer than ten years old. It resulted in the most prominent email platform being attacked because it was vulnerable and really only had security scaled to a workgroup level.
    That's a failure of the concept of a free and open Internet, dude. You can continue to blame Microsoft if you want to play chicken-little. Some of us won't buy your B.S. though.

    It's the failure of the operating system giving every script root access while connected to a global network. Data has been moved around for decades before Microsoft unleashed their VBS (virus building system) into the hands of clueless users.

    Make no mistake, MS had every opportunity to make their operating system more secure, but they failed to execute, opting instead to dump an insecure OS and script interpreter onto the net that caused billions in damages.
    ___

  • Godwin's law doesn't apply to Slashdot threads.

    It applies to Usenet threads that often run on and on for weeks and months.

    The discussions on Slashdot die after a day and a half by design. Not enough time for Godwin's law to even become relevant.
  • by (void*) ( 113680 ) on Sunday July 02, 2000 @10:16AM (#962682)
    Ubiquitous computing sounds really good. And it might actually sweep our whole society toward it, what with MIT, Microsoft and other tech companies behind it.

    This is the future? Can't we have something else? What are the alternatives?

    For a very good, poetic rant against this vision, I recommend Dan Simmons's Hyperion and The Fall of Hyperion. Highly recommended reading. If you haven't read them, be warned of spoilers.

    In brief, Dan Simmons paints a world where computers control everything, and are truly ubiquitous. With the help of their inventions, Mankind colonizes the stars, using farcaster portals. But there is a price for this technology. Humankind becomes enslaved, dependent on ubiquitous computers, so much so that they cannot fight an interstellar war. What does the novel offer as an alternative? Real starships. "Real" tech in the form of FTL ships and weapons.

    Regardless of whether you agree or not, this is but one of the many threads in the Hyperion story. For those overly enamored of ubiquitous computing, thinking it will liberate us, this novel is a very good antidote.

    But like all Sci-Fi, things will never truly happen this way. The novel may be presenting a false dilemma. A good read and interesting viewpoint, nevertheless.

  • by Admiral Burrito ( 11807 ) on Sunday July 02, 2000 @10:18AM (#962683)

    The words interoperability, security, uptime, connectivity, and cross-platform are phrases that never enter the minds of anyone at Microsoft.

    "We must limit cross-platform connectivity and interoperability even at the expense of uptime and security!"

    ;)

  • That's an Apple IIGS, "dood". My very first computer. If only there'd been more software written that actually took advantage of the multimedia (like some actual games), it'd have been a good machine.

    Do you honestly want your development tools to be battling PublicFreeGnutapsterNet 12.0 for bandwidth? I don't think we'll ever truly have a persistent excess of carrying capacity. Software expands to consume all resources which are made available to it.
  • /. /. Oh what a relief it is! Dis site ish a fuuken jok (as my Liverpudlian friends say).
  • It is fun to be an Open Source developer. No need to think; leave it to the commercial "greedy" segment. All you have to do is just clone, clone ...
  • To become an engineer, you study the best way that it's been done before, and software engineering is no different. Since the best way it's been done before is UNIX, the ready availability of the blueprints for Linux makes it a natural choice for study. Mucking about with Linux is equivalent to moving I-beams around in the Twin Towers to see what happens -- except, naturally, quite a bit cheaper. The theory of software engineering -- computer science -- is more readily studied if the tools you use for the study are more readily available. The downtime associated with Windows means that it's not readily available...

    -_Quinn
  • OK, so maybe I'm not sure what his work is going to be, but if you check out Project Oxygen's page, you'll find that Ronald Rivest is among the researchers on the networking portion of the project. Makes me feel a bit safer already :-)
  • I also suggest Ray Bradbury's short story "There Will Come Soft Rains"-- a similar idea. Note that it predates this article by several decades.
  • by TheDullBlade ( 28998 ) on Sunday July 02, 2000 @06:26PM (#962690)
    Oh? So you're willing to sign an affidavit certifying that this virtual machine is absolutely free of security holes and cannot be compromised?

    If I wrote it with that in mind, of course I would. What kind of coward won't stand up and take responsibility for the quality of his own work?

    A secure virtual machine for making arbitrary calculations can be very simple indeed; you only really need a few operations. It would be like signing a statement that you totaled a column of numbers correctly; you'd want to check it over until you're certain, and charge extra for the time and worry of that, but it's a simple enough task that you can eventually be certain that you're correct.

    How do you think hardware designers ever get anything done? There's no magical difference that makes bug-free hardware possible and bug-free software impossible.

    No chance of somebody inserting malicious code into the machine so that when I say "What's the VA stock price" the car-computer gets sent "Set cruise control to 5 trillion miles per hour. Set steering to target that cliff over there. Lock controls, set unlock password to '!seineew era sreenigne droF'"?

    I never said anything about that. I was very clearly responding to "I personally wouldn't want to be in charge of maintaining a machine which is set up to accept and execute arbitrary tasks from passing users." and talking about protecting the machine from the tasks (and the tasks from each other). I'm not talking about communications security (which, of course, can never be perfect, for physical reasons; all theoretical communications security models rely on the absolute physical security of certain things, which is impossible in real life), I'm talking about the security of one machine and the processes that run on it.

    BTW, what kind of idiot would let their car be controlled by a distant server over a network? Lines get cut, solar flares disrupt communications, networks go down.

    I don't know why this got modded up. You never made any arguments, just asserted that I was wrong, and threw in a few non sequiturs.
  • This is _not_ flamebait but: Who needs ASP (as in application service provider) when you have telnet, the oldest ASP around? If I need graphics: xhost, setenv DISPLAY and so on... you get my point...

    Thank you.
    //Frisco
    --
    "At the end of the journey, all men think that their youth was Arcadia..." -Goethe

  • To save the world from vision-less companies that only aim to make money out of the masses.

    Linux .NET for the masses ...now, that's what I call "vision".

  • by Carnage4Life ( 106069 ) on Sunday July 02, 2000 @10:46AM (#962693) Homepage Journal
    What the hell are you talking about? The email bugs are due to MSFT's faulty security model. Shit, I have friends at MSFT who have admitted that the Outlook team got chewed out for not implementing a sandbox model after the I LOVE YOU virus got out. The virus wouldn't have spread if the developers at MSFT had chosen security over perceived usability.
    Secondly, what exactly do you mean by Internet Security Model? Do you mean a restructuring of TCP/IP with security in mind, or the use of routers to block certain packets (how this would have stopped I LOVE YOU is beyond me)? Frankly, both your posts seem like the ravings of a clueless non-techie who is pro-MSFT simply because he has bought the hype. Watch this... I LOVE YOU, Melissa and the others were emails sent by users carrying attachments... No Internet Security Model will suddenly be able to tell the difference between programmatically sent email and user-created email, unless of course you believe some central authority will be able to direct all the mailservers on the 'net to filter certain emails dynamically. Then what happens when some other 'net protocol becomes widely used for proliferating viruses, e.g. MSFT's .NET?

    Finally, about your little crack about Joel Klein and billions of dollars, what exactly is your point? MSFT got where it was by committing crimes and breaking federal laws. The fact that it was making money for a few investors should not change the fact that they should be punished for their crimes. If you're trying to pin the fall of the NASDAQ on MSFT... get a clue. The fall of the NASDAQ can be blamed on the fact that the Dot Com Bubble Has Officially Burst [yahoo.com], film at 11.

  • JINI is a nice distributed entity architecture with good automatic discovery of new and local system members, but it's little more than a communications framework and not the "X and Y will work together automatically" system that Sun's PR machine hyped it as. It's also got a particularly virulent license which has really hurt its adoption, even within the not terribly pro-open-source Java community.
    --
  • iCab's error report found 362 errors, and it's really picky. But the use of ’ for an apostrophe was not reported as an error.
  • Of course I don't see them as being completely dry either. The Open Source movement is quite capable of salvaging the best ideas from what is out there and innovating from there. From Napster came Gnutella, which will have a very long life even if the RIAA brings Napster down.

    The analogy I'm trying to make is that the idea of having resources accessible on the Net from wherever you want to be is something that will appeal to a lot of people. However the real solution is for people to configure their home servers that way. I'd love to be able to set up a desktop at home that I could take wherever I went. And not some big networked server I can't control, but my personal computer that I am in control of.

    In the end, things like Office are too easily re-engineered by the Open Source movement. The operating systems and basic application suites are too easily recoded by independent developers. Microsoft trying to clamp down on piracy and going to a subscription model for Windows and Office will end up driving people over to Linux and the software suites being developed for it, where there are no such costs.

    In the end I see the Open Source movement taking on the ASP's by letting everyone become their own ASP. Everyone loads the applications onto their home servers that they like and then software is developed to let them run it from anywhere they like. And since this software is going to be Open Source, there are no monthly access fees like Microsoft plans to charge for their efforts.

    So that is how Microsoft will be defeated: the Open Source movement takes the ideas from the whole .NET concept that make sense and are useful, leaves out all the stuff that lets Microsoft remain in control, and puts in all the stuff that gives users control of their own systems.

    Microsoft does not understand that the Net is too big for it to dominate and control the way it used to do with the personal computer industry. Or they haven't realized that they have an opponent that they can't simply beat in the end. The computer industry is growing sicker and sicker with Microsoft's control and everyone else is coming to the realization that while they can't control things themselves, with Open Source they can make sure no one controls them.
  • I'm not so sure about that: if distributed processing takes hold, manufacturers will stop producing full-fledged PCs (at least the cheap ones) and start producing network PCs.

    I don't see that as much of a problem; after all, you can build a computer from parts for cheap, and parts stores are easy enough to find, despite the fact that most people don't know what a motherboard is. What worries me more is if Microsoft and friends pass a law (well, get one passed, same difference) that makes regular PCs illegal, say because they can be used for pirating--er, for "circumventing a copy-protection device" (thank you, DMCA). And upon finding it they could even take it upon themselves to wipe your entire hard drive clean--watch them define documents and data files as "part of the application" they're allowed to delete upon belief of infringing use.

    Likely? No. Scary? Yes.

  • Maybe I'm giving Mr. Nielsen too much credit (I don't know him from Adam), but this decentralization of computing may happen without Microsoft. Gene Roddenberry thought of this several decades ago, and it has been a staple of SciFi for a long time. (I seem to remember stories that pre-dated Star Trek doing this, but I don't remember specifics, and Niven didn't reach as many average-joe types as Star Trek has in the last 30 years.)

    Whether MS does this, or AOL, or VALinux, it still might happen. The flavor will be different, but it will most likely happen, regardless of who the major players are.

    Louis Wu

    Thinking is one of hardest types of work.

  • I don't think .NET will do very well - I think people would put up too much of a fight about paying a monthly fee for access to programs, especially with free alternatives available.

    Furthermore, they will be pretty stymied by a shortage of wireless bandwidth and availability (even if they get the connections up to 56k [which might be OK to send compressed voice over for control], you still even now have fairly terrible digital phone access across the US [though admittedly I've tried only Denver, New York, and San Diego]).

    And if that wasn't enough, you face a very ugly upcoming war of standards for these wireless devices with slow connections and small displays. I just got back from an XML conference where they also touched on WML and alternatives - what I got out of that was that WML is being pushed here and in Europe, though not in wide use yet (only one phone on the market has the most recent WML browser). In Asia, cHTML (compact HTML), endorsed by the W3C, is already a standard - and in Japan they use iMode (a derivative of cHTML) all over the place. Sure, you can transcode to take away some of the differences, but there are enough mindset differences to make any content provider really want to support only one.

    Anyway, that's my own personal off-topic rant about how I hope WML folds and cHTML is accepted. But my original point was meant to be this - that I think in the end the popular alternative will be for people to still buy software, but basically have a home .NET for each person - they just get a simple box that acts as a server for the house as well as the remote stuff, with perhaps a subscription to a backup provider if they care enough. You could buy more powerful home boxes if you wanted to do the remote voice stuff, and multiple boxes for redundancy.

    And the point I was trying to make THERE is that I don't trust Microsoft to know where it makes "the most sense" for my computing to be performed - I might want quite a bit more local power if I go to remote areas often, and I don't want to be locked into one very hostile .NET service.
  • The research projects described under Oxygen are excellent and worthwhile. But they hardly represent a new vision, as the press release seems to claim. Ubiquitous computing, human centered computing, wireless networking, deictic interactions, speech, vision, and all the other technologies mentioned in the Oxygen vision statement are technologies that have driven thousands of researchers for years, not only at MIT LCS, but at many research labs. I wonder whether Dertouzos really believes that he is articulating something new.

    This article has caused me to browse around the MIT LCS pages a bit more, and I find it pretty depressing. Many pages are written like press releases, complete with quotations that make them look like journalism (the way press releases try to imitate journalistic style). Everything screamed "commercial relevance", "spinout", "efficiency", "scalability", "commerce", "break-through technology", and "invest now".

    In part, that seems to reflect a genuine reorientation of research focus and motivation. In part, it also seems to reflect a misrepresentation of the reality of research: research is hard work, true breakthroughs are rare, and almost everything is built on other work, even at MIT.

    I find these trends worrisome for the future of basic research. Where is basic research going to happen if even universities think about the commercial value of everything and graduate students are motivated by how they can sell the next billion dollar startup to VCs?

  • >In the past, computers were expensive.

    Well, now that computers are "inexpensive", why do we want to move away from personal computers back to terminals? (Sure, fancy ones, but terminals nevertheless.)

    Even assuming we had infinite bandwidth, who really benefits from the ASP model?

    The users? Most users use their computer at work or at home and that's about it. I doubt people need to write letters in the bus or in their car (and personally I would rather that drivers focus on driving rather than some other task). And some plainly enjoy not having a computer handy.

    It seems the ASP model is more some Big Brother wet dream... they charge you per use, plus they store all your information; can it get any better than that?
    I think a few years ago, when Bill Gates started his MSN, he said he wanted to make a few cents on every computer/network transaction... The ASP model seems to fit that vision perfectly.

    Also, given the state of reliability of the software offered by Microsoft, AOL, and other software giants, I would be more than reluctant to have to rely on their ASP service to get any work done, even if I trusted them with the integrity and privacy of my data (which I don't).

    Janus

  • Pervasive computing, bah, that's oldschool... Which operating system will reign in the future? Not Windows, but ours... yes, the brain is an operating system, and the programmers of the future will write applications for it... the Internet will not connect hardware machines but minds. Sure, I acknowledge that there would be trouble (hackers - well, more importantly, crackers), but then again we are all dreaming, aren't we; we are about to become extinct anyway.
  • I don't put much faith in this guy's prognostications; if he was a real usability guru he would have set up his DNS so that useit.com => www.useit.com. Saving me typing == usability in my book.

    As for ASP, once consumers and businesses get bitten by someone else holding their data and apps hostage (or even just due to a network outage), they'll be moving over to free, unencumbered, and open systems like never before.

  • The network is not the computer, the computer is not the network, and as far as this user is concerned, there are times when I'd like the network to fuck off and leave me alone with a completely functional machine.

    Amen! Here are a couple of interesting scenarios for you. A couple of weeks ago Inacom declared bankruptcy and closed their doors, completely shutting down an outsourced help desk operation for Blue Cross/Blue Shield of Michigan. Imagine your ASP failing to achieve critical mass, filing for bankruptcy, and leaving you without access to your applications, databases, etc. Your corporate masters will only want ASPs with deep pockets... like MS, AOL, or Oracle.

    A more ominous scenario is the effect on Open Source software of a shift from the machine to the network. The Open Source movement is focused on replicating the functionality of OSes and applications built for the desktop or server market. If you want to use an automotive analogy, you could say that they build vehicles of various types and let you use them for nothing. What happens when the functionality is no longer in the cars but in the network of roadways? Where do the resources come from to maintain a parallel network of value-added roadways that can be used for nothing?

    Of course, the .NET, AOL, and Oracle folks have to actually get some value added to the roadways first...

  • by Anonymous Coward
    Actually, Jini has an amazing number of licensees. It's a fundamental shift in infrastructure that is taking a while to take place, but that's to be expected.

    It's hard to compare the two, since Jini is thoroughly documented and implemented, while MS' .NET is pretty vague and targeting two years from now. But in principle I support the idea of pay for use as an option for users, as long as the service provider's behavior is legally constrained to be reasonable. For example, it's not acceptable that your office software get cut off at the whim of the service provider any more than your phone service or power.

    The other thing that would need to be in such a system is a means of locating competing implementations of desired functions. For example, in Jini I can ask the world what's available near me that can print a document, and then I can choose which one I want to use based on the printer's capabilities, physical location, etc. By extension, I should be able to locate available spell-checking services, and pick the one I use based on specific vocabulary, pricing, etc. Fundamentally, Jini is completely distributed; there is no "root" -- everything is locally coordinated, so that it can scale to support billions of devices on a global scale. This means that while Sun (somewhat) controls the protocol, it cannot control the services provided via Jini, which means that the open marketplace rules.

    If those same attributes are true of MS' .NET, that would be great.
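    As a rough illustrative sketch of the multicast lookup described above (assuming the Jini 1.x net.jini.* libraries are on the classpath; PrinterService is a made-up interface, not part of Jini), discovery plus lookup looks roughly like this:

    // Discover nearby Jini lookup services and ask each for anything that can print.
    import java.rmi.RMISecurityManager;
    import net.jini.core.lookup.ServiceRegistrar;
    import net.jini.core.lookup.ServiceTemplate;
    import net.jini.discovery.DiscoveryEvent;
    import net.jini.discovery.DiscoveryListener;
    import net.jini.discovery.LookupDiscovery;

    public class FindPrinter {
        // Hypothetical service interface a printer proxy might implement (illustration only).
        public interface PrinterService { void print(byte[] document) throws Exception; }

        public static void main(String[] args) throws Exception {
            System.setSecurityManager(new RMISecurityManager());

            // Multicast for any lookup services ("djinns") on the local network.
            LookupDiscovery discovery = new LookupDiscovery(LookupDiscovery.ALL_GROUPS);
            discovery.addDiscoveryListener(new DiscoveryListener() {
                public void discovered(DiscoveryEvent ev) {
                    // Match any service that implements the (hypothetical) PrinterService interface.
                    ServiceTemplate wanted =
                        new ServiceTemplate(null, new Class[] { PrinterService.class }, null);
                    for (ServiceRegistrar registrar : ev.getRegistrars()) {
                        try {
                            PrinterService printer = (PrinterService) registrar.lookup(wanted);
                            if (printer != null) {
                                printer.print("hello, djinn".getBytes());
                            }
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }
                public void discarded(DiscoveryEvent ev) { /* a registrar went away */ }
            });

            Thread.sleep(10000);   // give multicast discovery a moment, then clean up
            discovery.terminate();
        }
    }

    Choosing which printer (or spell checker) to use would then come from filtering on attributes (location, price, capabilities) rather than taking the first match, which is where the "open marketplace" described above comes in.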
  • Whatever, Anonymous Coward. Programs suck nowadays for the same reason we got left with the Y2K bug: management decisions. Too much work, too few coders, and no time to do anything right, much less test. You think kids can't program nowadays? Can you say Linux? Ever seen Half-Life?

    Don't make blanket statements out your ass like "And some people wonder why software is so bad. Because the goddam kids can't program at all. Not like my day..."

    I'm not even that good a programmer, but I would throw the gauntlet down with you any day. Bring it on, old cranky anonymous codger.

  • by TheDullBlade ( 28998 ) on Sunday July 02, 2000 @11:08AM (#962707)
    I personally wouldn't want to be in charge of maintaining a machine which is set up to accept and execute arbitrary tasks from passing users. (Yes, you can use sandboxing and other such strategies, but every security protocol is vulnerable.)

    This is sheer and utter nonsense. A virtual machine can easily be simple enough to be bug-free and handle every kind of overflow without hurting the machine it's running on.

    Not every security protocol is vulnerable, just those ones where the expense of perfect security wasn't justified by necessity (for example, when you want to sell a "secure" system, but you can hire marketers to hype it as secure more cheaply and effectively than you can hire programmers to make it secure).
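    For what it's worth, here is a minimal sketch of the kind of few-operation calculator VM being argued about (the opcodes, memory size, and instruction budget are invented for illustration). Every effect is bounded: fixed private memory, masked indices, an instruction budget, and no I/O at all, so a hostile program can at worst throw an exception inside the interpreter:

    // A tiny stack-based interpreter: arithmetic and scratch memory only, no host access.
    import java.util.ArrayDeque;
    import java.util.Deque;

    public class TinyVM {
        // Opcodes: PUSH n, ADD, SUB, MUL, LOAD addr, STORE addr, HALT
        public static final int PUSH = 0, ADD = 1, SUB = 2, MUL = 3,
                                 LOAD = 4, STORE = 5, HALT = 6;

        public static long run(long[] program, int maxSteps) {
            long[] memory = new long[256];            // fixed, private scratch space
            Deque<Long> stack = new ArrayDeque<>();
            int pc = 0;
            for (int steps = 0; steps < maxSteps; steps++) {   // instruction budget
                switch ((int) program[pc++]) {
                    case PUSH:  stack.push(program[pc++]); break;
                    case ADD:   stack.push(stack.pop() + stack.pop()); break;
                    case SUB:   stack.push(-stack.pop() + stack.pop()); break;
                    case MUL:   stack.push(stack.pop() * stack.pop()); break;
                    case LOAD:  stack.push(memory[(int) program[pc++] & 0xFF]); break;   // index masked to 0..255
                    case STORE: memory[(int) program[pc++] & 0xFF] = stack.pop(); break;
                    case HALT:  return stack.pop();
                    default:    throw new IllegalStateException("bad opcode");
                }
            }
            throw new IllegalStateException("instruction budget exhausted");
        }

        public static void main(String[] args) {
            // (2 + 3) * 7 = 35
            long[] program = { PUSH, 2, PUSH, 3, ADD, PUSH, 7, MUL, HALT };
            System.out.println(run(program, 1000));
        }
    }

    A malformed program simply dies with an exception inside the interpreter; it never gets a handle to the host machine.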
  • hmmm. could it be that you are a troll whore? that is to say, a 'karma whore' of sorts, but only looking for brownie points from the jackass fakers from inchfan?

    I am the only real troll here, i can't even help it.
  • JINI builds on the standard Java protocols, but AFAIK it doesn't require Java. Normally, just a JVM would be sufficient.

    It is possible to write in another language, and compile that to Java bytecode. (I don't know of any existing compilers, but it can be done.) These programs will run in a JVM.

    It should be possible to access JINI services/devices from such a program.

    (It should also be possible to implement the JINI protocols outside of the JVM, but then you'd have to do more work.)

  • Do the words "Irony", "Humor", "Joke" mean anything to you?

  • OS/390 is the latest name for MVS/XA, MVS, OS/VS, OS/MVT, OS/MFT, etc.

    It is the operating system that runs on most IBM mainframes (and the Hitachi and Fujitsu compatibles).

    RACF is (one of) the security systems you can use on the OS/390 OS.

    It is interesting how little "mindshare" this OS has. Consider:

    Every time you get a bill, there is a 90% chance that it was produced by one of these machines.

    There is something like a 95% chance that your bank account is managed by one of these machines.

    When you buy an airline ticket the booking will be processed by an IBM mainframe -- for sure!

    Something like 90% of all the business data in the world is sitting in databases managed by IBM mainframes.

    So now you know :-)

  • I did my Master's degree research with MIT LCS, and an interesting fact that has escaped this discussion is that Microsoft has just donated around $25 million to LCS for their new building and research. At the 25th anniversary party last year, when Oxygen was first presented with lots of fanfare, Bill Gates gave the keynote address. So these visions are close for a reason...

    Of course, most of the grad students aren't too happy about this; I remember one guy who immediately donated $100 to the FSF and posted their thank-you letter on his door. Oh, and at least when I was there (1 year ago), Stallman had an office in LCS and spent a lot of time there. Wonder how he feels.

  • It seems like the main focus of this plan is to turn your computer into a glorified terminal. How is this different from the terminals that were connected to time-sharing systems in the '60s and '70s? Why were people so excited to buy the Apple II when they could have simply had all of their processing handled remotely by big computers?

    The fact is that Americans, and most geeks to a degree, like to own things. It's why DIVX failed, it's why people take exception to the Microsoft End-User License Agreement, and it's why people buy VHS tapes of movies they will probably only watch a few times, when they could've just rented them from Blockbuster.

    We like to at least have the illusion that if anything goes wrong, the box is right here, and we can fix it (even if we really can't). Why else would people still feel the need to own cars in cities with good public transportation systems?

    ASP essentially violates the principle of the personal computer. Now, the processing is being done somewhere else, a place that you can't possibly reach. Most importantly, you're computing, using YOUR computer that YOU paid lots of money for, on somebody ELSE's terms.

    If you don't like the services that the provider offers, and you aren't in the majority, then your software preferences will have to adapt to everyone else's. Suppose I have a piece of software that I really like. It's my absolute favorite piece of software, and unfortunately, there are only 200 other folks out there who use it. Now, under the current distribution model, the software publisher simply stops selling that software.

    No sweat, I've still got my copy. I can use it as much as I want. Under the ASP model, however, it's not that simple. The provider can no longer afford to keep that service available. It's eating valuable space that could be used instead for the Next Big Thing(tm). So it cancels the service, and I don't get to use my favorite piece of software anymore.

    People don't want that. Maybe they don't know exactly why they don't want it, but the thought of centralization sends chills down their spines. Unless someone can find a killer app for ASP (and I haven't seen one yet), I expect it to go the way of Netcast.
  • 3) Poor Bill's lost his Word 2005 file for the 10th time today... I'm glad I bought Office 97 before they pulled it.

    2) Steve, will you let off those hardcore movie downloads just long enough for me to save?

    And, of course,

    1) Hex editor + netscape.exe = no more <BLINK>.

  • I raised this issue at Andrew Layman's talk on .NET at XML DevCon 2000 in New York last week.

    Microsoft gave this lovely humorous video presentation of a "Jetsons"-like world where all this chap had to do to get medical service was connect to his insurance company and approve their access to his medical records, and then approve access to the doctor for the same. All done using XML of course ;-)

    Sadly, the privacy implications of this are incredibly far-reaching, and are being almost totally avoided in favour of rapidly building this ubiquitous system. My suggestion to Andrew Layman was to go out and read Simson Garfinkel's book "Death of Privacy...". I sincerely hope he does, and many other people do, because otherwise we're going to end up in a situation of people getting bogus records, or transferring records to people hacking into the network (you approved the record to be transferred to who you thought was your insurance company, but it was in fact joe bloe hacker). And we'll get my Grandma approving her own lobotomy because she didn't know which button to press. Let's all celebrate this joyous new technology!
  • If I wrote it with that in mind, of course I would. What kind of coward won't stand up and take responsibility for the quality of his own work?

    Ummm... read any software licenses lately? There's a reason they disclaim all warranties beyond "If the CD is scratched and you notice within 90 days of purchase, you can have a new one." Unless you're coding for a very limited environment (like, say, a Space Shuttle's control readouts), it is IMHO very unwise to start making guarantees. This is especially true if your code is exposed to the Internet where anyone can try to break it. "With enough eyeballs, all bugs are shallow" can cut more than one way.

    A secure virtual machine for making arbitrary calculations can be very simple indeed; you only really need a few operations. It would be like signing a statement that you totaled a column of numbers correctly; you'd want to check it over until you're certain, and charge extra for the time and worry of that, but it's a simple enough task that you can eventually be certain that you're correct.

    Can you? How? Can you prove that your double-checking mechanisms are not in error? Can you prove that everything was implemented correctly? What about the libraries against which you linked? No floating-point bugs in the hardware? Nothing corrupted by a recent system crash (perhaps deliberately induced)?

    I see nothing to refute my claim that 100% certainty is impossible. I see some good techniques for getting the uncertainty to very low levels, but you're claiming perfect security.

    How do you think hardware designers ever get anything done? There's no magical difference that makes bug-free hardware possible and bug-free software impossible.

    This is not strictly true. Hardware is governed by a few simple physical laws. You can simulate and model hardware under realistic conditions in order to test its operating range. Software, on the other hand, is a set of arbitrary finite state machines. It cannot be simulated with nearly the same accuracy or precision --- there are fundamental theorems of computational theory that demonstrate this. It is mathematically impossible to say "this program will always work in this fashion". If you wish, I will post the proof.

    I never said anything about that. I was very clearly responding to "I personally wouldn't want to be in charge of maintaining a machine which is set up to accept and execute arbitrary tasks from passing users." and talking about protecting the machine from the tasks (and the tasks from each other).

    I'd say that protecting the tasks from the machine also falls within the duties of someone maintaining a machine running arbitrary submitted code. Why do you think this is not the case?

    I'm not talking about communications security (which, of course, can never be perfect, for physical reasons; all theoretical communications security models rely on the absolute physical security of certain things, which is impossible in real life), I'm talking about the security of one machine and the processes that run on it.

    It's unreasonable to restrict discussion to one machine when talking about ASPs, though. The whole point of such things is that data and code are flying around the network like mad and getting executed all over the place. The communications behavior of the machine is a fundamental part of the system under discussion, and if that can be compromised, I believe my original claim is correct.

    BTW, what kind of idiot would let their car be controlled by a distant server over a network? Lines get cut, solar flares disrupt communications, networks go down.

    Read the articles, especially the Dertouzos press release. Now extrapolate. Which do you think makes more sense: me hunting around for a button on the panel, or me saying "Car, set cruise control."? Cars are likely to eventually have a limited autopilot. If I want to take a nap and don't want my teenage kids trying to drive, I'll want to be able to lock the controls. (Admittedly, a verbal password is a bit of the wrong idea here --- probably want a voiceprint or other biometric.) According to the article, all of this is going to be handled by the car's computer sending my voice to a distant server, which processes the voice into commands for my car and sends back the commands. There's your remote-controlled car.

    To look at it another way, companies can profit if everyone is dependent on stuff being shipped through private networks to private servers via proprietary protocols. The more things that are transmitted and controlled, the better. IMHO, this will lead to some things being networked that shouldn't, but that won't change until some early adopters get hurt or killed.
  • by istartedi ( 132515 ) on Sunday July 02, 2000 @11:14AM (#962717) Journal

    The ASP model is all about piracy prevention. You can't pirate a service as easily as you can pirate a product. Will it benefit the consumer? Of course not. Thank the pirates. Welcome to the future, where you will hear people saying "I can't use my word processor, the network is down".

    You might want to thank the Free Software movement too. You can't really sell free software. You can sell a service. Software vendors pressured by falling values for software sold in the traditional manner will do what they can to follow the ASP model.


    #VRML V2.0 utf8
  • "...an infinite bandwidth..."

    are you sure you meant to say that on slashdot? there are some pretty merciless trolls around here, they may try and take your lunch money.
  • intimidated IT guy rants as AC, film at 11.

    is anyone else's colon twitching like mine is right now?
  • thank you, pirates.

    including you, istartedi, because everyone knows VRML makes you a type of pirate too, eventually.

    do I need to spell it oot?

    I'll give you a hint: a good euphemism for this type of piracy would be 'livin next door to Versace' or 'friends of the he-haunch'.

    just my $.02 US.
  • But ASPs are easier to clone into Free Software packages than traditional software. The interface can be cloned. And the equivalent networking/sharing functionality can be done with encryption + TCP/IP networking. Alternatively, if users do not want this kind of cloning, we can go the other way: make a client clone. Then the network protocols can be monitored and reverse-engineered. Class action lawsuits can be filed against companies to keep the protocol open for interoperability. And they cannot sue for copyright violation because only they have the running binary, so how can we be copying anything?

    MS should go ahead and make their .NET. Makes it easier for Open source/Free software to kill them off.

  • Thank you, Free Software movement, for the falling price of software. Thank you also for obsoleting such a traditional rip-off. What little programming I have to offer shall be yours.

    I have no fear of losing things such as word processing. I've got at least three editors with source code, ispell, and Word Perfect 8 to work with. In fact, I expect better things in the future, regardless of what MicroBoft does.

  • I was looking over the .net front page [microsoft.com] and I noticed this:

    "Microsoft .NET is a revolutionary new platform, built on open Internet protocols and standards, with tools and services that meld computing and communications in new ways."

    The part that gets me is the "built on open protocols". Hmm, how can they take open protocols (such as XML, which they mention in the white papers) that are freely used and distributed, bunch them together loosely with some unstable code, put it in a box and stick it on store shelves for $500 (or whatever the price will be)? Well, they did it with Kerberos, so I guess they can do it again...


    --
  • by orpheus ( 14534 ) on Sunday July 02, 2000 @11:33AM (#962724)
    I can't believe no one has mentioned the privacy implications of this.

    1) You can encode as much as you want during transmission, but there is currently no way to work on the data without decoding it first. The main CPU (and the 'rented' program) sees all your personal data, and it's not under your control.

    2) Even if you trust all commercial CPU cycle providers (I don't), what about cracked/compromised systems? More importantly, what evidence do we have that commercial enterprises can be trusted in this fashion? Even EU privacy laws have limited utility against a US server.

    3) Consider this as well: if your apps are following you around, running on whatever machines are nearby, and those machines are programmed to configure themselves to your custom settings, then trojan/virus/macro checking becomes tougher. Each machine can (at best) only detect the known public viruses. Meanwhile that custom reporting macro your employer put in a 'petty cash' template follows you from bar to bar, to the house of your college friend (who has had more social diseases than Don Juan's taste-tester, and eighteen misc. misdemeanor arrests for DUI, drug possession, disorderly conduct, and trolling /.). Etc., etc. You can *know* the proper configuration on your home/office machine. An anonymous machine can't recognize what 'belongs'. And (you heard it here first) what about a macro that is set *not* to load from the server to your home/work machine? One that effectively lives outside your door and follows you only when you're out?

    Sorry, I'll stick with my private hardware, running my privately owned copy of software -- and the PDA with electrical tape over the IR port.
  • don't count on it. computers crash (or at least most windoze machines do) so people are used to downtime. if the OS gets corrupted and your system is down, you send it to the computer repair shop, which means a coupla days off for joe random luser. with the ASP model, if your PC dies, you buy a new one and you're connected 24/7 again. or if your network link is clogged, plug into a cybercafe and you're online.
    it's much easier... just like electricity. are you held hostage by your electricity/gas company? sure. but you tolerate it because the alternative is costlier (gas generator etc).
    think about maintaining the computer... most people have no clue. in the ASP model data is backed up, stored in a safe place, etc. how many people rely on webmail? how many run their own mail server? the ASP model *does* work for email - what makes you think applications like office are so different?
    micro$hit *is* going to win this one if we don't watch it - the ASP model is superior and M$ is banking big time on it. and no - joe random hacker running linux isn't going to be able to participate -- and there's nothing you can do about it.
  • A fantasy with hostile AIs does not equal a luddite rant. It wasn't even dependence on computers that was the danger; it was the dependence on the farcaster network invented and controlled by the AIs.

    If any prophetic message is to be gleaned from those novels I'd say it is something like Don't trust a technology you do not understand.
  • by Anonymous Coward
    A year or so ago, we were trying to design something almost like what Microsoft is trying to do. Of course there was no handwriting recognition, speech recognition, or any of that recognition crap, but it was designed to capture knowledge, appointments, project work, etc., on Windows and Unix systems using the applications you had at the time. Actually it was quite simple on the Windows side; with their filesystem API, creating hooks or TSR systems was a cinch. We called it InSync. It was mainly focused on businesses, mid-size and up.

    The company that was financing us $7 million to start the project pushed us into putting in a whole lot of crap, basically to get more people on board, because they thought capturing knowledge wasn't a seller, or they didn't see how efficient it was for companies like Intel, who were losing a lot of great talent at the time.

    Anyway, I love what Microsoft is doing, but I think they will run into the same walls we were hitting. One, getting people to trust that their data is secure (this also includes lost data, etc.). Two, trying not to load it down with a lot of nice-sounding but bloated stuff that doesn't make any sense being on the system in the first place.

    Also, I just thought of something that scares me out of my mind. If Microsoft controls your data, then basically Microsoft controls you. Haha, they know when you are going to merge with Warner Brothers, when you will meet this corporate executive to discuss price gouging. Hmmm, this might not be so bad, but then again, my source code will be there for them to take. Damn Microsoft!!!! Actually they might solve this problem by allowing the person to pick whether they want their information sent to an outside source or not. This was one of our solutions for having a totally replicated system: the user could have all their information on their own system while giving people access to it. The only downside to this is that if you want to share some information with dear old mom, you would have to leave your system on. Sad, ain't it?
  • This'll make for a crazy re-alignment of bottlenecks. No longer will the memory bus or the HD access times matter, no longer will the modem suffice. Video frame rate will continue to suffer though, with transmeta integrated video thin clients and set top boxes ruling the scene.

    Letting this stuff flow could have some positive effects though:

    It will serve to encourage the large-scale adoption of fat pipes for grandma.

    Every user will be easy prey for those of you 133ts that need it. How hard will it be to open a franchise app server? Like a quickie mart or a McDonald's. Better than banner ads for joe serverman?

    This will surely allow for easy 133t h4x0r1n6. Like what's-his-name from Mona Lisa Overdrive hacking the Northern Seaboard Fission Authority's power grid.

    Level load times in MMORPGs will include transmission of all graphics, geometry data, and inhabitant/inventory data rather than just player state info. Maybe this will open up a million new exploits for me.

    Just a few thoughts off the top of my head...

  • bullshit. slashdot *is* an ASP. data is being passed untrusted from client to server. and it works. as does hotmail or any other webmail provider. they're ALL ASPs.
  • Yes, computers are cheaper now than they were.

    But the total cost of ownership isn't. What you have to include in the figure are the costs of managing them, which isn't trivial when you have tens of thousands in your organisation.

    There are a number of ways of reducing the TCO, including:

    • Making systems more stable, less vulnerable to viruses.
    • Reduce the software distribution effort by only sending software to a set of servers, rather than individual workstations.
    • Reduce the cost of support by keeping software in strictly controlled configurations, again on servers rather than uncontrolled on users' desktops.
    Now MS are probably pushing to reduce piracy and put you on an ever-increasing payments plan rather than reducing your TCO. However, this doesn't detract from the fact that moving to a server-based application infrastructure (not necessarily an ASP) should reduce overall costs. This doesn't apply everywhere, but it does apply where you have many workstations all running the same small set of applications.
  • I think people would put up too much of a fight about paying a monthly fee for access to programs, especially with free alternatives available.

    You're forgetting corporations who may well find the ASP model cheaper than maintaining their own MIS and support staff.

    that I don't trust Microsoft

    Blah, blah blah... change the record, slashbots.

  • Seriously, though, I can't imagine ever saying "are emm dash are eff asterisk" to make room, or "cursor right, cursor right, select, cut, damn it, undo, copy, cursor down . . ."

    That's because you're trying to marry standard text & keyboard interfaces to a completely different input system. It won't work, just as trying to use a mouse to type won't work; it wasn't designed to do it.

    Interfaces provide a layer between the input devices and the computer; therefore the interface is designed for the input devices available. If you have voice recognition, you'll have a different interface.

    For example, instead of "arr emm dash arr eff star" you could use more natural language, such as "delete all files in the temp directory", or "Open the abiword file that contains my current cv".

    All of this is very obvious, but it seems a lot of people will have to be dragged away kicking and screaming from their text consoles.
  • I know, and I'm one of those C-writing old COBOL programmers in search of a job... teaching myself XML and PHP3 at the moment...
  • The author seems to be getting really het up about this idea that one company will control "Root.NET". It will only happen if we allow the underlying protocols to be proprietary. I have to wonder exactly what new protocols the lookup service will require (I guess any hosted applications may need specific protocols; maybe there is a use for BXXP). Isn't this the kind of stuff that CORBA has supposedly been doing for years?
  • Making systems more stable, less vulnerable to viruses.

    Microsoft, obviously, will go backwards on this one, so doesn't that kind of negate the other two points somewhat?

    --

  • You have a point - people haven't really deserted MS in the past over the quality of their software or their version-upgrade merry-go-round. Maybe people are used to getting jerked around in the software market and can't imagine any other way. ASP is only going to increase that feeling of lack of control - people will only use it if they don't think they have another choice.

    Of course, there hasn't been a free, usable desktop with available apps until recently. I see this as an opportunity for Linux distributors. Why pay MS (even micropayments) to hold your data hostage and maybe even "upgrade" your apps for you, when you can set up an easy, reliable server for your data, provide an intuitive, locked-down desktop for your users, and relax? It might depend on the size of the business that you have - smaller businesses are going to be less willing to maintain their own IT department. But on the other hand, ASPs are going to be less willing to cater to the really small accounts, so I think things will balance out.

  • Looks like a lot of "innovation" for the court, which will probably turn out to be vapor, but at least they're exercising their "freedom to innovate" by tying everything together so tightly that it'd be nearly impossible to split the company.

    I thought of this too, but no, they'll simply split it so the MS-Apps company (Microsoft, Inc. with president and CEO Bill Gates) gets all the .NET stuff, along with everything else (Office, Internet Explorer, MSN, Hotmail, DirectX, the X-Box, Halo, etc. etc.).

    --

  • by Anonymous Coward
    I'm not so sure that voicerec is the way to go except for simple tasks like "Lights, on".
    This "computers of the future will not be controlled by keyboard or mouse but speech and vision" is nonsense. (Though I would be happy to see mice disappear for many tasks; vision could be a solution here). How about simple text editing and coding? It is diffucult to tell a person "remove that character there" so try telling that to a computer. Speech is not an exact input method, keyboard is. Before someone comes up with a better text-input method, I think workstations will have a keyboard, although it can be enhanced in many ways.
  • I pretty much agree with you about ASPs. The only thing I might want an ASP for is backups (as long as all the data is encrypted before it leaves my computer).

    But I disagree with your assessment of the security situation. First, ASPs can work just fine without mobile code or agents or whatever. Consider your example of a speech-recognition ASP: It will already have the code installed on it, so there's no need for it to run untrusted code.

    However, it is still possible to run untrusted native code safely if you have the right OS; too bad "the right OS" in this case isn't any flavor of Unix or NT.
  • Speaking of ASPs and the future of computing: what happened to Broadway [broadwayinfo.com]? Broadway was the X Consortium's X11R6.4 server and was intended to "web-enable" (cool buzzword) X11. It did so by including a new X protocol (LBX) with lower bandwidth requirements and adding features necessary to run applications from untrusted sources on your X server. I think this sounds like a great and relatively simple way to provide applications over the net. The last thing I heard was that XFree 4.0 should be based on X11R6.4, but the release notes don't mention any of the Broadway features, nor could I find any application for this. Info, anyone?
  • I just finished the Hyperion series (Endymion and The Rise of Endymion included). The key to the pervasiveness of the artificial intelligence entities in the novel was their ability to provide humanity with instantaneous interstellar travel.

    >>"Who pays, how much, to whom?" The humans pay by letting their brain's spare come "cycles" be used by the AIs

    Today we are far from having transporters on hand. Instead we are faced with the prospect of never having to leave our internet connections. We are everywhere we want to be. Are neural jacks that far off?

    The current internet is just a shadow of the cyberspace dreamed up by Gibson. Distributed computing is the next step in making it a reality. Of course, letting M$ be the center of this new realm doesn't seem like the best idea to me.

    I'm more worried about letting someone take over a possibly beautiful human endeavor through patents or whatnot than I am about seeing the AI Demons of Hyperion evolve.
  • In the past, computers were expensive.

    Well, now that computers are "inexpensive", why do we want to move away from personal computers back to terminals? (Sure, fancy ones, but terminals nevertheless.)


    "Back to terminals" is a load of BS. It is the same load that PDA and wireless sales-people where pushing at PC Expo! They want to be the first ones to support the next big thing. But are PDAs and wireless the next big thing? It may be.... but there is another reason to push new technology.

    The reason is: They can't sell any more of the old technology. Personal computers are cheap and plentiful. If you set out to compete with Compaq, Dell, IBM, and Sony in this arena, be prepared to eat it. The profit from a nice new PC is not what it was in the old days. Competition has cut profits to the bone.

    In order to sell you something that they can profit from, they have to sell you something new. Of course Microsoft is going to try to sell you something new. They have 95% of the market! What else can they do, besides go after pirates! That is chasing crumbs for Microsoft (they'll do it though).

    Hey Microsoft! What happened to the old idea you were pushing a couple of years ago? About a convergence of TV and PC? Did that go out the windows? I guess so, since PCs are out this year. So much for your vision of the future.

  • hei. i m sory if i make u look stooped. but i m jus smartter den u. noe need 2 kall namez
  • Um, why don't you just ask Red Hat how they manage to do it?

    Good point... but then again RedHat is freely distributed and open source, so that may provide the loophole they need...



    --
  • I'm not so sure that voicerec is the way to go except for simple tasks like "Lights, on".

    Oh, great--and this right after I ordered The Clapper® [cornells.com]!

    Seriously, though, I can't imagine ever saying "are emm dash are eff asterisk" to make room, or "cursor right, cursor right, select, cut, damn it, undo, copy, cursor down . . ."

    I've no doubt we're on our way to some cool input technology for general use (e.g. chord keyboards, gesture recognition)--voice is not it.
  • nope. they didn't. ever notice M$ never lets an idea go? in this case they want to converge the PC, TV, MP3 player, handheld, Palm, Pocket PC and *everything* in between to form a seamless ASP.
  • by Anonymous Coward
    hmmmm.. do we win something for /.ing microsoft?
  • The whole ASP/distributed computing thing still hasn't taken off, even though it's been hyped for years; I still don't understand why everyone's so desperate for it. And as for MS's ".Net" (hey, can -I- copyright a TLD too?), I'd be impressed if even their marketing department could get it off the ground.
  • This is the kind of situation that authentication is supposed to avoid.
  • by Masloki ( 41237 ) on Sunday July 02, 2000 @09:39AM (#962750) Homepage
    Sorry, I am too lazy to actually code html, so check out http://www.useit.com .

    Jakob Nielsen heralds .Net as the dawn of a new era. It scares me that, with so little information and Microsoft's current track record that we all complain about, Nielsen offers very strong support for this. As some have said, this can be a good idea, but at this point I don't trust MS to pull it off well. (I don't hate all of MS either, just the management. A lot of excellent programmers and designers go to MS because that is where the money and security is.)

    More on topic, pay-to-play IMHO sucks. I personally prefer the system as it is now: pay once, play close to forever. Of course, this option won't go away, but it will cost a bloody fortune. OTOH, the idea of paying for only what you use has been a dream of mine. To pull off these two ideas so that the consumer wins means one thing... pay once for only what you need and have the option to buy components later. Sounds a bit like the auto industry. And yet, the auto industry is set up to screw the consumer. So will the IT, IS, etc. industry pull it off in a better fashion? Or are they slavering at the profits automakers are able to pull off?
  • The users? most users use their computer at work or at home and that's about it. I doubt people need to write letters in the bus or in their car

    Yes, but I do want to read e-mail in the airport. And I want to read the NY Times when I'm on the bus. And it sure would be nice if I didn't have to go searching for where I left off every time I take my "book on tape" into another rental car.

    Think of how many people put up with the bullshit of toting a laptop around with them all the time -- everyone laughs about the old fashioned "sneaker net", where we were forced to cart around data and applications on floppy disks. Now, we have "oxford net", where dumb-asses cart around the whole damned computer!

    I'll give you an example of what I'd like to be able to do someday. Burn your computer and your telephone. Set them on fire. Don't enjoy that plastic smell too much, of course. Now, call up the ITS department, and ask for replacements. How long is it going to take to get your computer up and running like before? A day? A week? Is it still going to impact your productivity for a month? Now, how long before your phone is up and running again? A half hour?

    Or, go spend a weekend in Hawaii. Don't pack or plan -- just go, with nothing but your wallet and some spare socks (my feet always sweat on planes). At the hotel, ask to have a computer brought up to your room, and some more towels. Use the computer to finish a project from work -- write some code, or proofread a report, or respond to some e-mail. How long will it take you to set up this new machine to do those things? A half hour? An hour? A week? Or would it take a month to be productive on the new machine?

    In an era when a computer costs the same as a good telephone or a VCR, it has reached the point where every time I get a new machine, I easily spend far more money/time setting up the machine than I spent on the machine itself. Some people like this -- but that's OK, because some morons enjoyed dicking around with autoexec.bat and config.sys files, too. Other people don't have that problem, apparently because they haven't discovered that their machines are configurable. I, personally, can't stand it, and I definitely look forward to the day when I can sit down at a computer in an airport terminal and have it work exactly like the computer I'm using right now.

    (Of course, I can't imagine what Bill Gates thinks he's going to do to solve the problem. I'd love to have my data on a central server, but I sure don't want it on his server...)
  • of course, he's a fictional character, right?

    George
  • Using &#8217; is NOT an error - it's a valid entity, specifying a Unicode character. That's even better than the usual character 146 used for apostrophes by Windows apps (as character 146 is the Windows apostrophe). By HTML standards, that entity is perfectly valid. Character 8217 (0x2019) is in fact a standard Unicode character (RIGHT SINGLE QUOTATION MARK), and should be supported by Unicode-compliant browsers. If it fails to render in Mozilla, then that's a Mozilla problem, not a "proprietary Windows-only" thing.
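
    For anyone who wants to check, here's a quick Python sketch (nothing browser- or Windows-specific about it) showing that &#8217; is just a numeric reference to U+2019, and that byte 146 only means the same thing under the Windows-1252 code page:

    # &#8217; names Unicode code point U+2019 (RIGHT SINGLE QUOTATION MARK);
    # byte 146 (0x92) maps to that character only in Windows-1252, and is an
    # unprintable C1 control character in ISO-8859-1 (Latin-1).
    import html
    import unicodedata

    print(html.unescape("it&#8217;s"))            # it's, with a curly apostrophe
    print(unicodedata.name("\u2019"))             # RIGHT SINGLE QUOTATION MARK

    print(bytes([146]).decode("windows-1252"))    # the same curly apostrophe
    print(repr(bytes([146]).decode("latin-1")))   # '\x92', a control character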
  • The concept of an ASP world frightens me, especially in light of the trend to turn everything into spyware. I don't want to run an installer for the win98 driver for my new printer, let alone the RealNetworks installer. And I sure don't want to rely on some mission-critical application when I can't even read the binary.

    From this day forward, I will run no program that I didn't compile myself from source code. Okay, that's currently unrealistic, but it's something to shoot for. So join me, won't you? Spread the word! No source, no cycles!
    --
  • Since when does ASP have anything to do with distributed computing?
  • by kerskine ( 46804 ) on Sunday July 02, 2000 @01:25PM (#962765) Journal
    The late Mark Weiser of Xerox PARC wrote an article for Scientific American [ubiq.com] in 1991 that laid much of the groundwork for things like Project Oxygen. Give it a read; it changed my career.
  • by Alik ( 81811 ) on Sunday July 02, 2000 @01:40PM (#962767)

    This is sheer and utter nonsense. A virtual machine can easily be simple enough to be bug-free and handle every kind of overflow without hurting the machine it's running on.


    Oh? So you're willing to sign an affidavit certifying that this virtual machine is absolutely free of security holes and cannot be compromised? No buffer overflows? No hidden back doors? No chance of somebody inserting malicious code into the machine so that when I say "What's the VA stock price" the car-computer gets sent "Set cruise control to 5 trillion miles per hour. Set steering to target that cliff over there. Lock controls, set unlock password to '!seineew era sreenigne droF'"?

    Every security model has a vulnerability, be it in the trust model, the underlying implementations, or an actual protocol flaw. There is no such thing as a 100% secure system; all there is is a system which is very hard to break into and which thus discourages most crackers. Unfortunately, given enough time, you will run into a true hacker out to cause you grief, and then your system is going down.

    This, IMHO, is another reason why the "network is the computer" philosophy is bad. Removing the net connection from a computer almost always decreases vulnerability by orders of magnitude.

  • But I disagree with your assessment of the security situation. First, ASPs can work just fine without mobile code or agents or whatever. Consider your example of a speech-recognition ASP: It will already have the code installed on it, so there's no need for it to run untrusted code.


    But does it have every single module it needs? For example, let's consider an ASP providing voicerec in Darkest Africa. Now let's say Bubba comes sauntering along through the jungle and sees a brightly colored snake. He asks his palmtop "Is that thang friendly?" The palmtop contacts the nearest ASP via wireless, indicating that it needs recognition for English, and would like dialect-specific services for Alabama. Most likely, this server does not have English(Alabama,USA), or if it does, it does not have the latest version. Therefore, it will query the local network. Let's say that Billy-Sue, angry at Bubba for not getting her new hubcaps, has compromised a node on the local network such that it appears to have the necessary software. However, her software is specifically instructed to reverse the sense of any safety-related query. Thus, when the palmtop sends the image of the snake for image-recognition, it ends up asking "Is animal (deadly redneck-eating snake) dangerous?", which returns true. Bubba therefore hears his palmtop say "Yes, it is." Thinking that the snake is friendly, he is promptly devoured.

    The whole principle of ASPs, IMHO, is getting the latest code to the user just in time. If somebody compromises the delivery chain, malicious code can be inserted. (I can't wait to see the sorts of things a hacked Office.net could deliver. All you'd have to do is muck with the target's DNS resolution and point them to a malicious site.) (Yes, the code might need a signature, but many users won't bother to read the resulting dialog box, will click on 'trust code', and get screwed.)
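
    For what it's worth, here's a rough sketch of the kind of signature check that dialog box is hiding. The function and key handling are invented for illustration (using the Python "cryptography" package); it's the general shape of detached-signature verification, not any ASP's actual protocol.

    # Verify a publisher's detached signature over a downloaded code blob
    # before agreeing to run it.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def code_is_trusted(code: bytes, signature: bytes, publisher_pem: bytes) -> bool:
        """True only if `signature` is the publisher's valid signature over `code`."""
        public_key = serialization.load_pem_public_key(publisher_pem)
        try:
            public_key.verify(
                signature,
                code,
                padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                            salt_length=padding.PSS.MAX_LENGTH),
                hashes.SHA256(),
            )
            return True
        except InvalidSignature:
            return False

    Even when a check like this passes, all it proves is who signed the blob, not that the blob is harmless -- which is exactly the problem with users who click 'trust code' without reading the dialog.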
  • The words interoperability, security, uptime, connectivity, and cross-platform are phrases that never enter the minds of anyone at Microsoft. Anyone who takes their mono-platform view of the future seriously needs to take another look at the internet and its goals.

    As an example of this, I just used last night's build of Mozilla and attempted to load this page (http://www.microsoft.com/net/) [microsoft.com]. I got the message "The connection was refused when connecting to www.microsoft.com." They can't even get a server-side referer script right, so how can any right-minded IT professional not look at their vision as just more useless PR?

    On the issue of Microsoft going forward with this horrific idea, the unresolved DOJ case has their hands tied. Yes, they are in a strong position to exert their will onto the market, but they are not as strong as they would be if this case were resolved.

    We all know what happened when MS took an operating system with a terrible security model and unleashed it in the hands of clueless users . . . it resulted in billions and billions in lost revenue and time when the ILOVEYOU VBScript worm rippled around the world.

    .NET - yes, it's a vision, but so was Mein Kampf.
    ___

  • The article opens by saying: I don[]t have any inside track with Microsoft. Heck, our company doesn[]t even have a Microsoft developer or marketing rep!

    The first thing that caught my eye was the blocks that appeared where apostrophes should have gone. The page (produced by Adobe GoLive) suffers from the same glitch that FrontPage-created pages do, which stems from assuming you are reading it with de facto browsers.

    When an article discussing a Microsoft-dependent market contains HTML errors that punish me for using Opera and not Internet Explorer, is that irony or hypocrisy?

  • by Alik ( 81811 ) on Sunday July 02, 2000 @09:50AM (#962777)
    I'm personally not convinced that voice-rec is the way to go for mobile computing. If I'm on the bus or anywhere else where someone can hear me, I don't want them to know what I'm saying to my computer. OTOH, if you're actually the one driving the car, it does make sense.

    More important, though, is this vision that most mobile machines will stream all their data to nearby big iron, which will crunch the numbers and stream back finished product. Let's pretend for a bit that the bandwidth issues can be worked out. Who's going to actually be running the machines that provide all these spare cycles? Are we going to have companies which simply maintain large computers for performing standard tasks like voice recognition and Web searching on behalf of mobile users? I personally wouldn't want to be in charge of maintaining a machine which is set up to accept and execute arbitrary tasks from passing users. (Yes, you can use sandboxing and other such strategies, but every security protocol is vulnerable.)

    I did mobile-code research for a few years, and the resource question was always coming up. There were some papers [dartmouth.edu] written by a grad student with a background in economics, and some modeling was done, but it was never quite proven that this could work. (One can't really model all the various kinds of automated maliciousness that could occur.)

    Finally, I'll add the standard gripe that I think ASPs are a step in the wrong direction. I don't want to be continually dependent on a manufacturer for access to an application. Let someone arbitrarily deny me word-processing services because they don't like what I write? Be forced to use a new version of software which adds features I hate and removes the ones I love? No thank you. If I want to take my laptop to Mars and do my word-processing there, I want to do so without interplanetary network lag.

    Of course, if played right, this could be a big win for Linux and other free-software projects. I believe that once users get bitten by the ASP model, they will want to get away from it. Obviously, the big companies won't let them. If, however, they can just switch to a purely-local free-software office suite, we might see a large jump in the use of free systems.

    The network is not the computer, the computer is not the network, and as far as this user is concerned, there are times when I'd like the network to fuck off and leave me alone with a completely functional machine.
  • Yeah.. we don't have to look at it.
  • " . . . all there is is a system which is very hard to break into and which thus discourages most crackers."
    I have OS/390 running RACF in my office. Yes, real security is real expensive, but it works.
    1Alpha7
  • 640K? No 64K should be enough for anyone, 640 is going way over the edge, what are you trying to run there, some kinda VR simulator.
  • by Anonymous Coward
    I expected more from MIT, but things seem to have gone downhill since my undergrad days there.

    Well, what would you expect? They're all fucking around with their Linux kernel when they should be learning something new, something that isn't a remake of Unix-1989.

    To become an engineer you don't just muck around on the beach with a sand pail and shovel. To become a computer scientist you don't just muck around in one particular re-implementation of Unix.

