Upgrades Technology

Pointless IT Innovations Considered Harmful 143

Makarand writes "According to a comment column in The Guardian, innovations in IT are most often simply more trouble than they are worth. Most innovations in IT today are platform-specific, and such innovations are easy to come up with in computing; innovating gets easier the longer a platform sticks around. These innovations accrue incompatibilities, making it difficult for users to switch platforms and absorb the costs of moving to a new one. Users will not switch to a competitor's product if they believe their platform will later be updated to deliver the same benefits."
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward
    You never know when you're going to stumble across the next big revolutionary idea.
  • by AndroidCat ( 229562 ) on Sunday March 02, 2003 @11:14AM (#5418476) Homepage
    Are the companies that need the continuous cash-flow of selling the latest upgrade every couple of years. (Naming no names. :)
    • by Angry White Guy ( 521337 ) <CaptainBurly[AT]goodbadmovies.com> on Sunday March 02, 2003 @11:19AM (#5418490)
      You mean the tech industry, the automotive industry, etc. This is part of the cycle which we have made ourselves slaves to. Why does the auto industry have to change their models by x percent a year? Cars would be cheaper if they built the same car year after year, then changed models once enough aggregate advances have been made. But we pretty much tell them that they have to follow this yearly cycle of 'improvement'.
      • Sure, all industries try to sell the Latest and Greatest. One point of the article was that in technology, companies frequently try to bind us with innovations. I don't have to buy the latest Fnord product. I can switch to someone else's or keep my current model. If the rest of the office switches to a new version of Word, I'd probably be forced to go along. (And if I'm running OpenOffice, I'd better be really and truly compatible.)

        It's the "lock-in" factor that makes innovation in computers different from cars, or as Admiral Ackbar said: "It's a trap!"

      • by Anonymous Coward
        Most innovations should come in small steps rather than large, newly redesigned leaps. Figure if, say, Ford discovered 50 new technologies in cars over the next ten years but didn't put a single one of them into their models until enough accrued to 'justify' changing the model... you'd probably get a piece of shit that can't last 100,000 miles, with all that suddenly-new built-in stuff that hasn't been time tested. I don't know about you, but a car from the 2000s is most likely vastly improved over one from the 1980s (even the same model), and if you've ever driven a car from the 1960s or 1950s... cool to look at... but to drive daily? Blech.

        The cycle of small constant innovations is better than giant leaps forward, imnsho.

        But I have to say one thing about software: most 'innovations' are pure feature bloat that 99% of users probably never use, but it lets the company print on their marketing 'hey, we have this new feature foo and it increases productivity by 1000000000000000000000000000000000%!'
      • Why does the Linux kernel have to be changed? Why not just stop where we are? Unless, of course, you want the latest innovative ideas.

        The way to ensure that innovation is real is to use Linux to innovate with. Then, if the innovation works with Linux, it should work with the proprietary platforms. Demand it. No lock-in.

        tb
        • But the Linux kernel is held in a development phase until there are major, tested differences from the older 'model'. Sure, some features get backported from the development tree, but for the most part, the next major release is significantly changed from the current major release.
      • Cars would be cheaper if they built the same car year after year, then changed models once enough aggregate advances have been made.

        Are you sure about this? Certainly continuous processes of small incremental changes are much cheaper than bringing out fully tested massive overhauls, in manufacturing and in software.

        follow this yearly cycle of 'improvement'.

        I suggest you get a used car which is 20 years old and try it out. The improvements are not fake; just because the car industry doesn't move at the same speed as the computer industry doesn't mean there aren't improvements being made.
        • I agree with you (and the test-driven development people) that "continuous processes of small incremental changes are much cheaper than bringing out fully tested massive overhauls in manufacturing and in software", and add that they are more reliable too. I have to disagree with you on the automotive example, because my "new" car is 17 years old and my "old" car is 20 years old, but there is nowhere near the difference in functionality, usability, and even reliability as there has been in computers and software over the past 20 years. Indeed, I think my TI 99/4a or Apple IIc was much more reliable than my latest and greatest hardware and software (especially from companies like MS who have the advantage of market entrenchment), while every year my computer becomes more useful to me. Very few people buy a new car every year in order to make driving more productive, or because the car they bought the previous year didn't quite work right, but most of us update our software often for both reasons.
        • I am not saying that technology has not changed at all in the last 20 years; I am merely stating that many of the changes from year to year are cosmetic, aside from the changes in the second model year where the company fixes mistakes in the manufacturing processes and quality issues. Each year machines are adapted for these changes. The taillight lenses become more rounded, so a new mold has to be made for the lens and quite possibly the fender. This in turn has to be prototyped to ensure that the minor changes do not affect the structural integrity of the quarterpanel that the lens sits in. Machines have to be retooled for this, and for what? A fucking rounded taillight. All I am suggesting is that if we were more lenient in what we considered a 'new model', we might not have had car prices which increased at over 20 times the rate of inflation. As for major improvements, like antilock brakes, airbags, etc., I am not implying we should hold off on deploying these technologies; I am merely saying that a taillight lens or a different ground-effects kit is often changed merely to reach the quota of changes for the new model year.

          I suggest you get a car that was one of the last off the 2002 production line and one that was the first off the 2003 production line, and tell me what vast improvements have been made in the 6 or so months between when these cars' designs were finalized.

          I realize that the auto industry doesn't move as fast as the computer industry, but both suffer from artificial innovation. And if you can't make a better car in 20 years, you don't deserve to be in business.
          • I suggest you get a car that was one of the last off the 2002 production line and one that was the first off the 2003 production line, and tell me what vast improvements have been made in the 6 or so months between when these cars' designs were finalized.

            As I said in the original, probably nothing vast. OTOH, if you ask the dealer, he can probably give you 50 little changes that are improvements along with another 50 which have to do with style. BTW, I don't care much about the style issues year to year either. OTOH, when you look at a 15-year-old car you can see the cumulative effect of the improved styles. Even the headlight example is probably there for good reason.
          • Do a little bit of digging into the history of auto makers; the constant micro-updates of body styling is pure marketing. It's not misdirected efforts to innovate; it's a very deliberate, time tested marketing technique.

            There are quality improvements being made; as other posters have said, compare a 2000 model to a 1990 model to a 1980 model... wear aside, the engineering quality of virtually every component is steadily improving. Complexity is up too, but if you want airbags, seatbelts that lock only in a crash, antilock brakes, traction control, cleaner emissions, and better fuel efficiency, you need some microprocessors. I drive a 1995 model year car now, and I remember my 1980 model year car. It's laughable to say that in 15 years there wasn't any real innovation. People now don't need a car that goes 1000 times faster than a 1988 model year car, or that has 1024 times as much trunk space, but that makes sense in the computer market, so that's what we got. With cars we got improved safety, fuel efficiency, increased power, handling (especially on slippery pavement), and improved reliability. Airbags are smarter and safer now than when they first came out, and in some cars there are side-cushion airbags. That's innovation on top of innovation. Forget about navigation systems and Java cars and heads up displays; the basic car systems are way way better.

            But, there is definitely a lot of changing stuff (I like the taillights example) just to change it, so that you can still sell a 2003 car when it's basically the same as the 2000 model, and so you can sell lots of expensive repair parts instead of giving OEMs an easy time tooling up to make repair parts or allowing owners to hunt in junkyards as easily.

            Cars in general are a monumental scam, and not a new one, and not a secret one.
        • Oh yes I love the innovations of the last 20 years in car manufacturing:

          - the sealed engine block,
          - computers everywhere inside, just great, how easy they are to tune now. Wait, they don't need to be tuned, right?
          - the 6-litre engines in SUVs, love them. The noise, the pollution, the mileage. Not to mention the great view from behind the whole behemoth.
          - the nice soft bumper bars that need replacement if a tennis ball thrown by a 2-year old bounces into them.
          - the great headlight optics at $500 a pop (which also break if a bee flies into them)
          - The false sense of security given by ABS. Now everyone can go faster and be sure to brake on time!
          - The great air bag safety measure. Now even less reason to put on your seat belt.
          - For some reason newer cars seem to have a very poor air-inflow system, so that as soon as the temperature reaches 25C/77F you *have* to turn on the air-con, otherwise you broil.

          I could go on. The fact is the automotive industry is very conservative, and changes for the better have been evenly matched with changes for the worse. I remember reading that in America the average mpg rating hit an all-time high in 1988 or so, and has been steadily declining since (especially since SUVs are so popular and only have to meet a lower mpg standard). Sorry, that can't be progress.

      • Evolution is measured by the amount of cars a civilization can produce, which is why humans are ranked nr 2 in the universe. Germans are nr 1.

        The auto industry is worse than IT in that respect. It sometimes seems that they make no effort at all to use existing off-the-shelf parts (just how many configurations of spark plug do we need?). They apparently have no appreciation of modular design. They certainly don't design them to be fixable. My favorite is when a part has 2 SAE bolts and one metric. The result is that parts and service cost more. At the same time that PCs roll out the tool-less case, the auto industry gives us engines where nothing can be reached without partial disassembly of something else.

        Of course, recent CPUs seem to be trying to catch up with the auto industry. For the last 2 or 3 years, new CPU has meant new motherboard.

      • If they stopped changing bumper and tail light shapes every year I bet cars would drive themselves by now. Damn marketing.
  • Added Value (Score:3, Interesting)

    by MisterMook ( 634297 ) on Sunday March 02, 2003 @11:15AM (#5418480) Homepage
    If added value is subtracted value in the scheme of big business, it's clear why most of the big media outlets are still shaking in their boots about the internet. Not only does Sony Music have to keep up whatever dubious value it promotes in its acts, but it has to compete with the innovations of a still-blossoming media sector. Every new innovation on a computer is something else to distract a person from the lack of value on a CD?
  • by keyslammer ( 240231 ) on Sunday March 02, 2003 @11:19AM (#5418489) Homepage Journal
    The gist of this article seems to be that it is innovations that deviate from established standards, rather than innovation per se, that are harmful.

    This is pretty much a no-brainer at this point in time.
    • You can deviate, but only if you don't force people to throw away everything else to get there. [Stupid analogy warning] It's like: rather than replacing the entire baby, you only change the diaper. However, after 20 years, it's time to rethink that strategy. :^)
    • The gist of this article seems to be that it is innovations that deviate from established standards, rather than innovation per se, that are harmful.

      I don't think so.. I think the author is basically saying that people choose conservatively and are annoyed by innovations that don't actually add value, and companies try to protect their market share with lock-in technologies and patents/copyrights. OOoo!! I see nothing of significance in the article about innovation for the sake of innovation. Defensive "innovation", sure. Crush the competition and control customers, sure, I see that in the article.

      I'm all for breaking standards for reasons of actual innovation - that's how we got everything important. But that's not what the article is about.

      • I don't think so.. I think the author is basically saying that people choose conservatively and are annoyed by innovations that don't actually add value

        The author points to two examples (microchannel bus and Motorola 68K) which did, in fact, add value but were not widely accepted because of the cost of adopting an incompatible new technology.

        More generally, he also discusses how it is easy to introduce innovation that improves a technology given the experience of that technology (the 20/20 hindsight rule) and talks about the circumstances in which an incompatible standard has produced a market shift. That said, this article appears to me to be about the trade-off of innovation versus established standards, a concept that should be pretty obvious to most /. readers.

        But you're right about there being nothing about "pointless innovation" in the article, which gets back to the part of the point I was originally trying to make: where the hell did this headline come from?
    • by Erris ( 531066 ) on Sunday March 02, 2003 @02:06PM (#5419194) Homepage Journal
      The author has cleverly confused "standards" with market lock-in in order to push M$.NOT. Standards come from organizations like the IEEE, W3C, ISO and what not. Market lock-in comes from commercial vendors who corrupt those standards and make it painful for users to do anything but lease their software. Real standards allow for innovation because they don't change: you can add onto and improve real software without losing anything. Market lock-in is a product of closed-source development, which he details very well in the first few paragraphs of the article to create fear. He details some of the wasteful losses suffered because of closed-source development, furthering that fear, then paradoxically concludes that the answer is the newest closed-source monster. Let's look at some of the silly things he says to support this dubious chain of thought.

      If we could start again from scratch, with hindsight, we might well decide to adopt the MCA bus, or something similar. Since we are not starting from scratch, we have to consider the switching costs.

      PCI, anyone? MCA failed because IBM made it too expensive relative to the hordes of imported clones that soon swamped the market. Yet CERN made a better bus and it was adopted under reasonable use terms. The more open standard won.

      One of the many reasons that Apple lost the desktop wars was the conclusion arrived at by every rational person: that Apple was bound to lose. One day,

      Apple is dying, he says. Right. I can't think of a better computer for most people to own. But that pales in comparison to the finishing touch:

      One day, Microsoft could face a similar problem [that Apple supposedly suffered] with GNU/Linux. So not only must it maintain Windows' dominance, it has to maintain the perception of future dominance. In this case, of course, the answer is Microsoft.net.

      Of course! Now I see the answer: all of the illogical strings above have tied my thoughts into a knot, but M$.NOT will set me free. I am free of fear and confusion, knowing that M$ Office will always predominate, and that my platform's performance and security are much less important than conforming so I don't look foolish. Yes, free from fear, uncertainty and doubt. I am a rational person and now know that market lock-in is more important than standards. I'll just sit in my single-window-manager (AKA Windoze) prison and watch as warring companies smash all the amenities so that nothing ever works right, and what does work won't for long. I'll eat whatever new trash M$ throws into my cage.

      What a laugh. It is so obvious that free code, with its transparency and freedom of modification, solves all of the problems the author can dream up and that others suffered. Free software is modular, replaceable and never dies. MCA runs just fine under Linux, and a 486 PS/2 makes an OK workstation that can effectively interoperate with more modern hardware. Under proprietary code, a PS/2 is simply junk, like most any older computer. Free software has been ported to all manner of hardware, and its users can make use of anything out there, ARM to IA64. Because XFree86 is free and open, I can have any number of window managers, each vastly superior to M$'s, and they can all interoperate. Even the silly, painful world of M$ Office formats has been made less painful by OpenOffice, KOffice and other free and open code that can read that crap and extract the information out of it. It's amazing that the article started off with a very perceptive view of the evils of proprietary closed software development but ended up recommending no change except the adoption of some new M$ garbage.

      • Just a dispassionate analysis of what Microsoft needs to do to maintain its dominance. The article is most definitely not saying that .NET is the solution to all the lock-in problems it pointed out earlier. All it's doing is pointing out that, to keep from slipping, Microsoft has to convince people that .NET is an amazing architecture that will keep it safe from competitors like Macs and Linux.

        And the thing about Apple? The author never said Apple was dying, just that it had lost the desktop wars. He pointed it out, in fact, because they lost primarily once it became "common knowledge" that Apple was losing, and they just couldn't climb out of that hole.

        It seemed to me as though the author was pointing out .NET as a continuation of the trend towards lock-in, not as a solution to the trend.
      • It is so obvious that free code, with its transparency and freedom of modification, solves all of the problems the author can dream up and that others suffered. Free software is modular, replaceable and never dies.

        I agree very much with what you said, but I have to troll about this one thing. This weekend I had a losing fight with GNOME, compiling a GNOME application on a non-GNU/Linux platform. The cobweb of dependencies is sickening, and even when I got the app to compile without errors, it crashed without a core dump (wonderful, time for gdb-land).

        I love how far Free & OS software has come, but there are still interoperability issues that even autoconf falls far short on. Actually, I am wondering if autoconf is a liability (false sense of security, perhaps). Regardless, there are instances of GNU/Linux lock-in and even distribution lock-in, which aren't nearly as painful as Windows lock-in, but still put a poker in the ass occasionally.
  • by FreeLinux ( 555387 ) on Sunday March 02, 2003 @11:22AM (#5418496)
    innovations in IT are most often simply more trouble than they are worth

    I see this everyday. Not just in the areas that they are talking about in the article. I see it most commonly on enterprise applications.

    For example, a company will have a mainframe-based app that they have used for years through a terminal emulator. Everyone knows how to use it and flies through the application, often typing several screens in advance. But, some bright spark thinks that green screens are passé and insists on "updating" the application. They spend LOTS of money developing some GUI database application or, worse yet, some browser-based interface to the application.

    Suddenly, the application is slower than molasses going uphill on a cold day. No one knows how to navigate the new interface and productivity takes a major dive.

    Naturally, the bright sparks assume the problem is old hardware and spend another fortune upgrading equipment to get performance back to where it was before. It's a total waste of time and money, not to mention that it pisses off the user community in a major way.
    • If the head of IT makes this sort of mistake frequently, or allows it to happen, then he's doing a terrible job.

      Of course, at my company, it happens plenty of the time too, and the IT manager has no choice: These changes are pushed down from higher up the food chain. Our users usually understand the politics of these events better than anyone in IT, so they seem surprisingly understanding.

      I'd be pissed as hell.

      The only things we work on "fixing" are things that should be cheaper, or things that don't work/aren't reliable. Very few traps to fall in.
    • by MaxwellStreet ( 148915 ) on Sunday March 02, 2003 @12:42PM (#5418805)
      One of the big reasons for upgrading from a fast, efficient mainframe or (yes, it's still out there) DOS applications is ease of integration.

      These days enterprise apps are being required to talk to each other - and some obscure data format from 15 or 20 years ago would cost a -ton- to get integrated with something modern.

      I'm not necessarily disagreeing with you - it's management's responsibility to choose a product that won't kill productivity for too long while the users learn the new system. And an even larger responsibility to prove that the cost of integration (both in user experience and hardware/software/consulting costs) is more than offset by the benefits of integration.

      Too often, upgrading for the reasons you mention happens, with disastrous results. But that doesn't necessarily mean that it's -always- a bad idea.
      • by Kenneth Stephen ( 1950 ) on Sunday March 02, 2003 @12:53PM (#5418847) Journal

        Your comment exemplifies a major cause of the problem. The PC-based, Java-only mindset of so-called "architects" leaves them incapable of solving problems without using the platforms that they are familiar with. If a data format is your problem, fix the data format. Today's mainframes are perfectly capable of interfacing and integrating with any hardware or software platform. Yet, when faced with such problems, these "architects" apply the only hammer (PC solutions and Java designs) that they know to the nail.

        I'm not sure why these people reach architect-level positions. Just the other day, one of these architects was advocating Adobe Distiller as a cost-saving solution, when GNU Ghostscript solves the same problem (converting .ps to .pdf) for much, much less. Another was re-architecting an asynchronous application to use SOAP when the existing email-based solution had no known problems (other than that it used a technology (sendmail and Perl) that the Java-only architect didn't want to learn about).
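
        (For reference, the Ghostscript route is close to a one-liner. A minimal sketch in Python, not from the thread: the helper and file names are hypothetical, it assumes Ghostscript is installed, and it only builds the command line, preferring the ps2pdf wrapper and falling back to invoking gs with its pdfwrite output device.)

```python
import shutil

def ps_to_pdf_cmd(src: str, dst: str) -> list[str]:
    """Build a Ghostscript command line that converts PostScript to PDF."""
    if shutil.which("ps2pdf"):
        # ps2pdf is the convenience wrapper shipped with Ghostscript
        return ["ps2pdf", src, dst]
    # Otherwise call gs directly with the pdfwrite output device
    return ["gs", "-dBATCH", "-dNOPAUSE", "-dSAFER",
            "-sDEVICE=pdfwrite", f"-sOutputFile={dst}", src]

# To actually run it:
# subprocess.run(ps_to_pdf_cmd("report.ps", "report.pdf"), check=True)
```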

        • Another was re-architecting an asynchronous application to use SOAP when the existing email-based solution had no known problems (other than that it used a technology (sendmail and Perl) that the Java-only architect didn't want to learn about).

          If security is an issue, then sendmail's monolithic design is an issue. While countless man-years of work have made sendmail more secure than before, newer vulnerabilities are bound to be discovered. However, unless the architect you refer to is including a firewall with a SOAP application gateway in his/her design, it's not clear SOAP is that much of an improvement w.r.t. security.
        • I'm not sure why these people reach architect level positions. Just the other day, one of these architects was advocating Adobe Distiller as a cost saving solution, when GNU/Ghostscript solves the same problem (converting .ps to .pdf) for much much less
        • I'm not sure why these people reach architect level positions.

          Sympathy. They weren't good programmers, probably didn't get along in teams...but otherwise were nice people and we'd hate to see them go...okay, promotion!
      • by PD ( 9577 )
        I've used Java applications that take their data from mainframes, sending everything over MQ Series. It's actually a nice product, with guaranteed delivery of messages. It was even simple to use in my Java programs, and I found it fun to use. You don't need to move apps off a mainframe to allow communications with other apps.
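
        The decoupling described above can be sketched with nothing but a standard-library queue. This toy stand-in (hypothetical message names, Python's queue.Queue in place of MQ Series) shows the shape of the pattern; a real message broker adds the persistence and transactional acknowledgement that make delivery guaranteed.

```python
import queue
import threading

msgs: queue.Queue = queue.Queue()

def host_side() -> None:
    """Producer: the mainframe app just enqueues records."""
    for record in ("ORDER-1", "ORDER-2", "ORDER-3"):
        msgs.put(record)
    msgs.put(None)  # sentinel: no more messages

def client_side(received: list) -> None:
    """Consumer: the client app drains the queue at its own pace."""
    while (m := msgs.get()) is not None:
        received.append(m)

received: list = []
producer = threading.Thread(target=host_side)
producer.start()
client_side(received)
producer.join()
```

        Neither side calls the other directly; each only touches the queue, which is what lets one end stay on the mainframe while the other is rewritten freely.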
    • ... often typing several screens in advance.

      I've dubbed that phenomenon "muscle memory". The only way to combat the muscle memory of secretaries, etc., is to give the higher-ups more real-time application reporting capabilities (how many widgets were sold today, etc). The higher-ups can never get enough reports, and they'll forgo anything to get them.

      • The only way to combat the muscle memory of secretaries, etc., is to give the higher-ups more real-time application reporting capabilities (how many widgets were sold today, etc). The higher-ups can never get enough reports, and they'll forgo anything to get them.

        Are you trying to help this company or just make money?

        • Are you trying to help this company or just make money?

          Sometimes the best way to help a company is to give the management types lots of cool new toys to play with while allowing the 'grunt work' folks to keep doing their job with the best tools possible. This keeps the PHBs out of the way of the people actually doing the work, and still allows everyone else to be productive.

          Of course not every company is composed of ignorant PHBs and smart, hard-working underlings, but in my (limited) experience it's most often better for the business overall if you can give the execs something to keep them busy. :)

          • Of course not every company is composed of ignorant PHBs and smart, hard-working underlings, but in my (limited) experience it's most often better for the business overall if you can give the execs something to keep them busy. :)

            This, I can understand. It just looked like you were looking to change the stuff that the grunts used and using fancy report generators to justify it.

        • Hey now, she speaks the truth.

          The business people need to know all sorts of things that reports provide. They're always going to need to see it some new way for a client, or figure out a better way to view the data.

          How is giving them what they need not helping a company?
    • We are running an older mainframe app through terminal emulators. The company that makes it announced a new GUI-based app that used a different database structure and everything.

      I told anyone who would listen this was a bad idea. Before I knew it, our president forked over $40,000 as a down payment. I begged that we run it in-house, in parallel, for a couple of months to see how it worked. It sucked big time. The GUI was awful, the database didn't make any sense, and we would have had to customize the hell out of it.

      In the end our president asked for our 40K back.

      • Very glad to hear that. Most of the time, my group gets ignored by upper management, and (to save whatever influence we have) I have to block protests before the execs get too defensive, then calmly (privately) see if I can get any kind of compromise. It usually doesn't work, usually for political reasons.

        If you have other tips on how you accomplished this, post 'em! Begging and pleading tend not to work here -- and are often counterproductive.

        • To be honest, I didn't expect it to work for me. I had almost resigned myself to working with this new piece of software.

          I knew our president used our mainframe just like everyone else, so I asked him to help compare the two. That is what I've done in the past: try to get the person(s) using the new product alongside their current product. Even the most bone-headed people will complain if it doesn't work. Granted, you can't do that for everything, but it works in some cases.

          The problem is when those who decide on things like that don't use them. Like a Sales VP who never uses the company's CRM product, so he doesn't know it sucks.

          • The problem is when those who decide on things like that don't use them. Like a Sales VP who never uses the company's CRM product, so he doesn't know it sucks.

            Agreed... and often the lower-ranked won't complain to them either. Example:

            1. On a US Fed. Government contract, I pleaded with the government managers to get users into the loop. Instead, the managers insisted that since we (the contracting company) did this work, we knew better. That is: get to work, stop asking me these silly questions -- my people are busy!
            2. Months went by, with repeated requests for a user and/or one or two of the managers to simply comment on the design documents, if not join the project as oversight, and the same mildly annoyed response was always returned.

              Within weeks of the project's end, the managers finally sent 8 people from different regional offices. For 3 days, the users pointed out all the defects in what we (the contracting company) produced. Many little and not-so-little problems in the design and implementation. Some comments were useless, but overall they were right. It was crap; it didn't handle the work that needed to be done.

              At this point, I had to agree, and my sympathy was with the users. Plans were put in place to address the comments, and a new release schedule was drawn up so that the needed changes could be made. Thankfully, the back end was designed well and would not have to be gutted -- though the changes would take up to 1/2 a year to implement.

              At the end of the 3-day flame fest, one of those government managers came in and asked what the regional users thought. Without hesitation, they all gushed over how wonderful it was and said they couldn't wait to see it in the field.

            I'm still trying to figure out that project.

    • The scenario you outline is kind of true, but there are still reasons for a GUI front-end. A modern GUI at 1024x768 can display a lot more information on a single screen than a 24x80 character display (I've never seen a mainframe-type app use 132 x yy displays, although I'm sure some have, but it'd be hard to see).

      This alone is valuable; it requires fewer screen switches, fewer DB transactions, and the user can possibly draw different conclusions based on more data.

      It can lead to better/faster navigation; the user may be able to view ancillary data without losing the screen they're on, or they can go to an arbitrary application stage without navigating a bunch of unneeded screens.

      It can even be possible to have a GUI client and the old screen-based method at the same time if the DB host can be accessed client/server style, either directly or through a middle tier.

      The most important thing, though, is to not just translate screen-for-screen a character-mode app to a GUI; it's important to re-think the application's flow and interface when converting it to a GUI mode, and to do so in a way that preserves whatever investments in hardware and technology you have.
    • For example a company will make an operating system that people have used for years through a command line. Everyone knows how to use it and applications fly. But, some bright spark thinks that command lines are passe and insists on "updating" the operating system. They spend LOTS of money developing some gui interface, worse yet, some browser based interface to the operating system.

      Suddenly, the operating system is slower than molasses, going up hill on a cold day. No one knows how to navigate the new interface and productivity takes a major dive.

      Naturally, the bright sparks insist the problem is old hardware and the world spends another fortune upgrading equipment to get performance back to where it was before. It's a total waste of time and money, not to mention that it pisses off the user community in a major way.

    • This is exactly what my employer did. Just over a year ago, they switched from a 3270-based front-end to a NT4-based system, still accessing the same mainframe servers on the back end.

      Aside from the learning curve associated with the switch, and the loss of years of muscle memory, the company spent an obscene amount of money replacing the terminals with workstations, redesigning the physical layout of the work area, and developing the GUI.

      And you know what? We're not one iota more efficient than we were before the switch. And our IT guys are out on the work floor MUCH more often.

      (Before the switch, you basically only saw the IT guys out on the floor when someone's terminal died (about once every couple of weeks). Now, they have to come out several times a day to reset workstations, fix login problems, and do other NT-specific junk that was totally unnecessary with the terminals.)

      But the funniest thing is that the GUI app only replicates some of the functions. So we still have to access the 3270 screens through terminal emulation to get all the functions done. So much for innovation.
    • Yeah, one of our recent "innovations" where I work:
      • Runs slower, much slower
      • Has no keyboard shortcuts, HEAPS of mousing.
      • Doesn't always work, requiring us to revert to the old system.
      • Didn't solve a single problem that the existing system had. In fact, it introduced some.
      • The client and server software is young and very buggy.
      But we have to use it because:
      • We bought it.
      • It's supposed to be better.
      • As far as snr management is being told, it is better.
      • Supposedly has plenty of benefits, though the package we bought doesn't include them.
      The last six months at work have been a mess. Management have been saying we're just reluctant to learn, but we've been avoiding it so we can do better, faster and more accurate work.

      The system does do its job a lot of the time, most of the time even, but the fact is, we had something better and simpler, and we've only taken a step backwards. But because it cost hundreds of thousands, nobody (authoritative) has told the General Manager that it's a major cockup. Anybody who has gets reminded why it's (potentially) superior in so many ways, with claims that can't be backed up in any way - claims shot down emphatically by half a shift's work.

    • I hope there are more people out there like you. It's amazing how hard it is to convince others of what you are saying - they have been in the path of the market cannons for too long, and it only makes sense to them that the only way to get a machine to run twice as fast is to buy twice-as-fast hardware [never mind that the current hardware is running a bloated OS, running bloated GUI programs, often with spyware/adware/virusware/JUNK running in the background.]
    • Suddenly, the application is slower than molasses, going up hill on a cold day. No one knows how to navigate the new interface and productivity takes a major dive.

      I think every big company has done this somewhere. I saw a beautiful one-page web form for entering labor hours and billing codes get transformed into a behemoth web app that literally takes ten clicks of the mouse where there was one. On top of that, it all runs on one Win NT server that gets trounced when thousands of employees "attempt" to enter their time at the end of the week. I've wasted hours just trying to enter the hours I've worked! Ugh.
    • Thanks for the term. I've always known to shy away from people who came up with 'bright ideas', for just the reasons you said. Now I know that they actually have a name, 'bright sparks'. Very good.
  • Crux of the matter (Score:5, Insightful)

    by Tri0de ( 182282 ) <dpreynld@pacbell.net> on Sunday March 02, 2003 @11:23AM (#5418503) Journal
    Is the tension between "innovation" and compatibility. Nothing new there.
    from the story

    " Which isn't to say that the ThinkPad was not innovative. However, the innovations came in things like colour and finish, screen size, the new TrackPoint pointing device and short-lived "butterfly keyboard", bundled software, price (low by IBM standards), marketing and support. The ThinkPad innovated in areas that were valued by customers, and customers were therefore prepared to pay for them. However, it did it without departing too far from accepted industry standards, which would have made customers reject it as "incompatible". Lesson learned."

    I have seen very few end users even *THINK* about future compatibility if it has the bells and whistles they want/need today. Quite frankly, the typical customer does not see WHY there should be so much problem: I've never heard a good reason why the new software can't at least do what the old software did, the same way it did it; pretty piss-poor UI design in their opinion. Unless one has a Microsoftian stranglehold, why should anyone upgrade to new stuff that doesn't work as well as the old stuff? ('Working well' being defined by the end user, not the IT department, which exists to serve the end user, not the other way around.)
    • by Spoing ( 152917 )
      I have seen very few end users even *THINK* about future compatibility if it has the bells and whistles they want/need today. Quite frankly, the typical customer does not see WHY there should be so much problem: I've never heard a good reason why the new software can't at least do what the old software did, the same way it did it; pretty piss-poor UI design in their opinion.

      Agreed, till compatibility issues bite them. Then, they are gun shy. For example, my brother-in-law refuses to archive his analog photos -- even though he wants to -- since he doesn't think he'll be able to get his photos back later. I've discussed ways to make sure it is not a problem and he remains unconvinced.

      The reason? He ran into compatibility problems when switching between different versions of a word processor. Then, it happened again with a different word processor. Importing and exporting Word docs into WordPerfect added other complications.

      So, I agree, most people don't care -- initially. Later, once burned, they care and feel paralyzed over it.

  • He's right! (Score:4, Funny)

    by Anonymous Coward on Sunday March 02, 2003 @11:25AM (#5418505)
    We never should have moved away from the 8" floppy disc! MFM hard drives were the best! Networking only leads to trouble!
  • Users will not switch to a competitor's product if they believe that their platform will be later updated to deliver the same benefits.
    This is a bad thing? No, to those making the platform and driving innovation on it, that's a very *good* thing.
  • by Anonymous Coward

    Innovation considered harmful articles considered harmful.
  • This is typical "if we can't weave it out of granola then it tortures puppies and brown people" Guardian prattle. How the hell do they think innovation actually happens?

    Do they propose that all computers ever needed was a grooooovy 12th generation permanently backwards compatible rillly rilly kewl AT bus?

    Wankers.
  • by shoppa ( 464619 ) on Sunday March 02, 2003 @11:32AM (#5418524)
    moving from large centralised machines (mainframes with dumb terminals) to decentralised client/server systems (mainframes, minicomputers, and other servers talking to PCs and other smart terminals)

    This shows a remarkable lack of insight into how similar things today are to a few decades ago. A few decades ago we had IBM mainframes and terminals with local blockmode editing; today we have web servers and PCs with web clients with form-filling capability. Are the PCs capable of much more? Yes. Are they often used to do much more? No, not really. The only real difference (ignoring frilly graphics) is that Internet Exploder and Netscrape crash a whole lot more often than a 3270-type mainframe terminal :-)

    • I think you are missing an intermediate step, which was very important.

      1) Mainframes
      2) Decentralized programs
      3) Recentralization (web interface)

      The complaint people had with the old system, and have with the new system, was that it required IS/IT to write a program - which meant that the casual business user couldn't get the functionality they wanted.

      For a while there was a healthy 3rd party software industry and employees loaded their machines with a custom suite of 3rd party apps which came much closer to meeting their needs than the mainframe apps did. On the other hand these PCs were so unreliable and the data so hard to combine that there has been a counter reaction back to centralized solutions.

      But if you look at a company from the business user's perspective, you will still see substantial decentralization. Many crucial database tables are maintained in Access on some guy's desk, with the mainframe being updated monthly if at all.

      Of course, from an IT perspective little has changed; the decentralized model was an employee rebellion against their IT departments. Now that IT departments are back to being underfunded on day-to-day operations and not being given the money to support most customer-requested new projects, it's likely we will see a push away from centralization. Business departments can't develop large solutions, but they can deploy small apps on individual PCs.
    • No, not really. The only real difference (ignorning frilly graphics) is that Internet Exploder and Netscrape crash a whole lot more often than a 3270-type mainframe terminal :-)

      It is orders of magnitude harder to guarantee the operation of a 50-million-line program than it is the operation of a tens-of-kilobyte terminal app.

      Many people have yet to learn that complexity is the enemy.
  • I don't think that innovations are bad per se; there is much more involved in this: inertia, marketing, managers. None of these alone explains why some innovations failed, but joined together they can be a recipe for disaster.

    I think innovations are a good thing, but not necessarily patented innovations (or ones otherwise restricted to only a few players) - things that one way or another get tied to a product or company and sink with it, with no opportunity for the same company or others to rebuild them in some more successful way.

    Also, revolutionary innovations, those that change the way we see some technology, are the ones that have the potential to change it all, or to fail, because adoption will not be as fast as for evolutionary innovations, the ones that are improvements over existing technologies (a simple example could be MCA vs EISA: one completely new, the other an evolution of an existing standard).

  • by Brown Line ( 542536 ) on Sunday March 02, 2003 @11:59AM (#5418631)
    The article's point, IMHO, is that change for change's sake is not good. Sometimes change is clearly the right thing to do - for example, replacing job-control language with a modern operating system is (usually) the right thing to do, as is replacing assembly language with a high-level language for writing applications. The gains in reliability and maintainability make the effort worthwhile. However, change just for the sake of change often - usually? - leads to a degradation of reliability and maintainability, rather than the other way around. Companies that pursue a will-o'-the-wisp often rush into a bog. The point is, it's not too much to ask managers to perform some basic cost-benefit analyses before they sign on to the latest fad.
    • IMHO the article really makes the valid point that oftentimes innovation does not have a future. If you make the wrong choice, you get caught. It is then in the realm of fortune telling to make a good long-term decision. Take for example the innovation of the PC. Who knew at the beginning that it would take off? Or the innovation of the Internet. Or the innovations of Java, say... well, that one is being challenged now by .NET... here, let me throw the bones to see which direction at the fork I should take so as not to reach a dead end, end up in the swamp, or worse yet run into robbers and thieves and multiple toll gates... throwing the bones... looking down... hmmm..


  • It was nice as long as it was a flourishing community of hackers in true spirit [like the MIT AI lab, the "altair" revolution etc].

    The more IT has become mainstream, the more the trend seems towards dumbing down towards the common good, and innovation has been quite slow relatively [or at least purely based on "consumer demand", which is defined by the "market survey" folks].

    I think this is the case for any industry. A maturing industry generally tempers the speed of superb innovation found in its initial stages - except in the case of IT the hype has been much greater and the effects global.

    thanks for reading,
    vv
  • by hillct ( 230132 ) on Sunday March 02, 2003 @12:08PM (#5418663) Homepage Journal
    This should come as a surprise to no one. 'Innovation' is the strategy by which platform vendors differentiate themselves in a bid for greater market share. It's beneficial to the vendors, so we won't see this sort of thing end any time soon. It locks their customers into their platform for the long haul, which is why you will never see the same 'innovation' made to all or even several platforms at the same time. Leveraging innovation to facilitate greater synergy is the IT industry's answer to the advertising verbiage such as 'new and improved' you often see on consumer product marketing materials.

    It certainly is harmful, but it isn't going to stop any time soon. There was something of a backlash against proprietary innovation back in '98 and '99, so vendors began to work more with open standards as a half measure to appease consumers. Microsoft is a good example of this. Their strategy to 'embrace and extend' open standards, to again differentiate their product offerings, has worked out extremely well to date. It certainly isn't ideal for the IT consumer, but this is where modern marketing and business practices meet the IT industry's little piece of the world-wide technology market.

    --CTH
  • Down with the hype (Score:3, Interesting)

    by Netmonger ( 3253 ) on Sunday March 02, 2003 @12:24PM (#5418709) Homepage
    I'm sick of vendors thinking that it is OK to re-invent everything every couple of years and force all their customers to adapt to it.

    I'm even more sick of watching silly companies spend the re-structuring costs because they believe all the hype that's said.

    Some IT departments spend all their time upgrading to get the latest versions of everything working together instead of just using what they have to its full potential.

    That's ridiculous.. just pick a technology and do something with it!

    If a particular technology works for you - use it! Don't let the marketing hype dissuade you from using it.

    It's more important to get your project done than it is to develop it on whatever someone tells you is the latest greatest technology.

  • by Booie Paog ( 640418 ) on Sunday March 02, 2003 @12:30PM (#5418745)
    it's wrong in the way that the words 'innovation' and 'pointless' are used and defined. Innovation, in and of itself, is not bad at all. It is the product or process or idea that has no merit, or is realized to have no value. But even with those *without* any value to their creators, the public can, and does, benefit (sometimes greatly) from the exercise. Example: the innovation that ANYTHING could be sold on the internet, or the innovation that brought about the idea of a business model resting solely on advertising revenue. Did it work? No. Did people lose their jobs, and billions of dollars? Yes. But that doesn't mean that the exercise was "pointless"... to the contrary, the current environment is now able to change and better predict future ideas based on these precedents. "Pointless" is a word that is not only subjective, but I would say incorrect in this article. Replace the word "pointless" with "sometimes doesn't produce something that the originators can make money from".
  • by eyefish ( 324893 ) on Sunday March 02, 2003 @12:31PM (#5418752)
    Today's ERP systems (SAP, PeopleSoft, Oracle, etc) suffer from the exact same problem: they promise you the moon (and many times actually deliver it), but once you depend on it you're completely stuck with it. In the case of SAP (and the same happens with other ERP systems), if later you want to change something, it's going to cost you big. Plus you usually pay very high consulting and maintenance fees.

    The same can be said of other packaged applications which do not make public their data storage formats and/or communication protocols.

    This is why I think it is such a big deal to have (1) a true cross-platform executable platform (e.g., Java), (2) a true cross-platform communications protocol and data interchange (e.g., XML), and whenever possible (3) a comprehensible and standards-compliant-as-possible data repository (e.g., mySQL, Postgres).
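    Point (2) can be sketched in a few lines: data exchanged as plain XML can be produced and consumed on any platform with an XML parser, with no dependence on one vendor's binary format. This is only an illustrative sketch using Python's standard library; the `order` record and its fields are invented for the example.

```python
import xml.etree.ElementTree as ET

# Serialize a record to plain XML: any platform with an XML parser can read it back.
order = {"id": "1001", "customer": "ACME", "total": "249.95"}
root = ET.Element("order")
for key, value in order.items():
    ET.SubElement(root, key).text = value
xml_text = ET.tostring(root, encoding="unicode")

# Deserialize on the "other" platform: no shared binary format required.
parsed = ET.fromstring(xml_text)
restored = {child.tag: child.text for child in parsed}
assert restored == order
```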

    Note that regardless of whether the article is biased or not (as some other readers here point out), the reality is that many IT managers are beginning to realize this now. This is why the huge push to Linux, Java, PHP, XML, and many open-source technologies.

    It is also why Linux, XML, and J2EE (Java 2 Enterprise Edition) have had such success, and why many IT managers are thinking twice about Microsoft .NET.
    • (3) a comprehensible and standards-compliant-as-possible data repository (e.g., mySQL, Postgres).
      It's quite possible that I've missed something, but in what sense can "mySQL" be said to be "standards-compliant"?

      The Postgresql folks clearly care a lot about complying with the SQL standard, but despite the name "mySQL" doesn't seem to care that much (maybe that was the joke? "It's *my* SQL now!").

    • "It is also why Linux, XML, and J2EE (Java 2 Enterprise Edition) has had such a success, and why many IT managers are thinking twice about Microsoft .Net."

      If only that were true. Many companies sincerely believe that .NET is an open industry standard that will lower support costs and integrate better with every other MS-only product. Go to ZDNet and read any .NET article. It's full of people who have fallen for the hype and say things like "....7x performance and 1/4th the code of java...ms = innovation" or "..I can't wait for the open IEEE complaint .net for linux...". They do not see the danger of proprietariness and look at it as an advantage for integration.

      I think a lot of coders who hate the Win32 API are the ones talking up the .NET hype, not to create web services but to convince their bosses to ditch the hell that is the Win32 API. Many improvements like WinForms get rid of a lot of nasty code that needs to be typed in.

      Is it just me, or isn't a Perl CGI or a Java servlet a web service? My sig below tells of MS VP Jim Allchin claiming that Amazon.com is a non-web-services-enabled website. Why? Because they do not use .NET. Isn't that silly? XML is another thing. Isn't it just a text file with tags? Sure, you can create SOAP objects with it, but Unix/Linux has been doing "everything is a file" for decades. Just use a program to parse an ASCII file and create an object based on the attributes. Whoa, totally new concept. I wonder if KDE uses this....
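      The DIY approach described above - parse a plain text file and build an object from its attributes - really is a few lines of code. A hypothetical sketch in Python (the name=value format and the Config class are invented for illustration, not any particular product's API):

```python
# Hypothetical "text file with tags": simple name=value lines.
raw = """\
host=db.example.com
port=5432
user=app
"""

class Config:
    """Plain object whose attributes come straight from the parsed text."""
    def __init__(self, text):
        for line in text.splitlines():
            if "=" in line:
                key, _, value = line.partition("=")
                setattr(self, key.strip(), value.strip())

cfg = Config(raw)
print(cfg.host, cfg.port)  # db.example.com 5432
```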

      It's all the same old hype and crap. In this age of penny pinching and bean counting, I am glad that CIOs now have to decide what works and what does not, instead of playing follow-the-leader with the hype. I hope most of the big companies who still have big IT budgets will see this. Maybe this is why MS is disappointed that not everyone is switching to .NET as fast as they hoped. But sadly the .NET trend is still growing, thanks to a full active force of salespeople at Microsoft promising the moon to CIOs. The climate of penny pinching is really only slowing down adoption, not actually stopping it.

      In fact, the Gartner Group did a study showing that close to half of all companies want to ditch Java and go with Microsoft. They are still hyped up about them, and many IT executives do not tinker with Linux or Perl. They view VB as the standard of what every language should be.

  • "It's a trap!" (Score:4, Insightful)

    by AndroidCat ( 229562 ) on Sunday March 02, 2003 @12:57PM (#5418866) Homepage
    One of the ironies that the article points out is that you can use innovation to lock people into your products. They'll think twice before discarding what they have to switch to something completely new.

    But on that day that the ground moves under you, all that "lock in" suddenly turns around and bites you. Until now, the idea of going all Microsoft was good. All the Office products work well with each other, they work well with the OS. But now that plus is turning into a minus. If you want to keep using Office, you have to accept the next OS from Microsoft. You can't keep using what you have now. And if you have to make a change, why not look around? And if you know that everyone else is thinking twice, think three times!

    Microsoft benefited from the last Great Change when Win 3.0 took off. Suddenly all the kings of the DOS world suffered a Reality Reset and had to compete on a new playing field. Microsoft's playing field. Why not switch from Word Perfect and Lotus 1-2-3 if you're changing the "OS"?

    We live in interesting times.

  • From the article:


    The issues that actually drive the corporate agenda in the IT industries therefore include: how do we develop something that makes it hard for users to switch to a competitor, or for a competitor to clone us and compete with us? Their weapons include patents, copyrights and other intellectual property rights, and especially the proprietary control of the hardware, software and peripheral interfaces that affect compatibility.


    Whereas in the world of Open Source the impetus is to make software that works well with other software.


    The simple solution to this problem? Demand that your vendors implement solutions that run on Linux. It's a quick cure for being "locked in" to a particular hardware platform or individual piece of software.

    • The simple solution to this problem? Demand that your vendors implement solutions that run on Linux. It's a quick cure for being "locked in"..

      No it isn't. It's a quick route to ensure being locked into the Linux platform. You might not be paying any money for that platform, but you're locked in just the same.

      Now ensuring cross-platform compatibility where possible, now that's a better route. Most of the open source infrastructure solutions that ever get discussed actually do things this way.

      Cheers,
      Ian

  • hubris (Score:1, Insightful)

    by Anonymous Coward
    When the author provides us with a perfect formula for predetermining viable innovations vs. nonviable ones, I'll take his assertions more seriously.

    You'll notice that his examples of gratuitous, and supposedly damaging, innovations (IBM, Motorola) were both squelched by the market. Heaven forbid the Guardian acknowledge the utility of market forces.
    • by Anonymous Coward
      Darth Guardian: "Hmm, the Market Forces are strong in this one!"
  • by frdmfghtr ( 603968 ) on Sunday March 02, 2003 @01:21PM (#5418973)
    The problem with pointless technical innovation is that it is trivially easy to do, especially in computing, but hugely expensive for users.


    This isn't innovation; this is marketing attempting to create a demand to cover the costs of developing a new technology. GM did this in the early part of the 20th century: the product drove the market demand. When GM developed the GTO in the '60s, the initial plan was only to produce 5,000 of them. However, when they were flooded with 15,000 orders, suddenly the market demand was driving the product development. Innovation needs to provide INCREASED value to the customer at the same or REDUCED cost. "Innovation" that is expensive for users isn't innovation at all. It's creating a sustained revenue stream for the company developing the technology at the EXPENSE of the user.

    If we could start again from scratch, with hindsight, we might well decide to adopt the MCA bus, or something similar. Since we are not starting from scratch, we have to consider the switching costs. And unless the benefits are much bigger than the switching costs, we are not going to switch. Certainly we are not going to switch every six months, or every time some manufacturer brings an "improved" but incompatible system to market.


    Now, the MCA developed by IBM may have been innovative on a purely technical basis; however, to adopt this innovation would have cost end users a lot of money in terms of replacing already existing hardware that was incompatible with the new architecture. If the MCA had been made compatible with existing hardware, then it would have been innovative. (Risking exposing my hardware-design naivety here, but that's OK.)
    • If the MCA had been made compatible with existing hardware, then it would have been innovative.

      A backwards-compatible MCA would have defeated the whole purpose of its creation. The MCA was designed for pretty much one reason only: So IBM could wrest control of the market back from the cloners.

      IBM assumed the public would rush to adopt whatever they, IBM, deemed The Standard. But they were wrong-- the PC was already well on its way to becoming a commodity thanks to the cloners, and the cloners were also beginning to innovate on their own instead of just cheaply copying what IBM did.

      Ignorant of this, IBM expected stiff royalties from cloners who wanted to put MCA in their machines (including royalties for a percentage of the original IBM PC-compatible machines the cloners had previously shipped). The cloners balked at this two-for-me-one-for-you deal, the bus fizzled into obscurity, and IBM's ploy failed. It was at this point that IBM essentially became a nonfactor in the desktop personal computer market.

      ~Philly
  • Of course... (Score:2, Insightful)

    by wouterke ( 653865 )

    Of course innovation isn't always a good idea. As a common sysadmin's saying goes: "if it ain't broke, don't fix it".

    That doesn't mean there should be no innovation at all; however, most "innovations" from commercial software writers are just there to make you open your wallet, not to improve the quality of your IT-infrastructure.

  • Seems to me that the author is simply stirring the pot.

    Move along.

  • Remember how UNIX started out, with all these researchers from all these companies working together, sharing code and building a unified OS standard, until AT&T decided to claim back the copyright? Then all of a sudden DEC, IBM, HP, and AT&T all started to have their own flavors of UNIX, which became notorious for fragmenting and impossible to keep together on future standards. In fact, in spite of constant industry pushes for a standardized UNIX, and MS taking advantage of this fragmentation to march in and kick everybody's butt, the only thing that really brought UNIX innovation back onto the same page was Linux with its free license.

    The fact is, this happens everywhere in the software industry, and is an "intellectual property" problem, not an innovation problem. IP may have been bearable when all it involved was a librarian saying what you could xerox, but in the information age it is not workable and causes us to come up with crazy philosophies - like trying to define certain types of innovation as bad.
  • by Anonymous Coward
    Changes made to perpetuate one's monopoly / undercut the competition != Innovation. What a wild misinterpretation of the word.
  • The trick is to stay with the things you know inside your company, and sell the new stuff to your customers.

    Low "innovation" costs, high profits.

    Added benefit: when customers like and adopt it, you can follow later on, having your customers discover problems so you can circumvent them.

    "Plain text" is the standard I beleive in... ;-)

  • Although I agree with most of what the author says, I believe he is missing some important points. MCA failed for a variety of reasons, most of them dealing with marketing. Here is my list:

    1. Wrong type of innovation
    2. Too proprietary
    3. Too expensive

    Wrong Type Of Innovation
    Although MCA was highly innovative, this fact was only obvious to computer geeks. It alleviated the problems with IRQ/DMA/Memory conflicts as well as device identification, and it also added the ability to attain faster bus speeds. But the typical business user didn't understand most of that. From the user perspective the first PS/2 machines were not a huge improvement. The author's examples of positive innovation (CDs, ThinkPad) have significant benefits that are obvious to everyone.

    Too Proprietary
    IMHO, part of the reason that MCA never "took" was because it would have put a good number of computer and peripheral manufacturers out of business. So they fought back by continuing to produce ISA machines and creating the EISA standard. This worked until the PCI and 'plug and play' standards were implemented. If IBM had shared the MCA technology, instead of trying to exclusively license it, the industry may have adopted MCA.

    Too Expensive
    The final nail in the coffin was price, and the article hits this one on the head. This was back when peripherals other than mouse, keyboard, and monitor could be recycled from PC to PC, and upgrading to a PS/2 meant buying all of them over again. Today this is not as much of an issue (every time I upgrade, I have to buy a new type of memory, for example), but back then it was a big deal.

  • It's not just software that has problems in this regard.

    One place where innovation is obvious is in car radio features over the last 10 years. In the old days, you had a manual tuner and a volume knob. You had some push-buttons for half-a-dozen stations. And that was about it.

    A positive innovation was RDS - now I can select a station and keep tracking it even if I move out of range of one transmitter and into another on a different frequency. This is appropriate technology - find a problem and solve it as unobtrusively as possible.

    A negative innovation (on my model at least) is the lack of a volume knob. Instead, there's a multi-function control button marked +/-. Normally, that's not too bad, but the mode buttons are immediately above it. If I hit a bump while trying to change the volume, I suddenly get stuck in a mode where I'm changing the balance, fade, or something else. More functionality, sure, but badly implemented. Setting up tone, fade, etc. is something you do once. Why not have a single, recessed, "setup" button and keep it out of harm's way?

    My mother has had such a bad experience with the radio in my step-father's car that she will not even turn it on. "It never does what I want" she complains. That strikes her as very annoying. After all, she has been working radios quite satisfactorily for several decades.

    All of these problems are UI design problems, not problems with innovation per se.

    Another non-UI negative innovation in the car is the introduction of proprietary protocols for in-car audio. My radio is also a head unit for a CD changer, but my efforts to build or source a MP3 jukebox that interfaces to it have been scuppered because Ford do not publish the protocols (or pinouts) for the interface.
