Technology

Worst and Best Predictions on Technology

prostoalex writes "Dow Jones News asked several major scientists and technologists about their worst and best predictions of the future. The story, republished at Yahoo! Finance Singapore, quotes Lester Thurow, Professor of management and economics, Massachusetts Institute of Technology's Sloan School of Management; Nicholas Negroponte, Founder and director, Massachusetts Institute of Technology Media Lab; Glover Ferguson, Chief scientist, Accenture; Alan Nugent, Chief technology officer, Novell; Peter Cochrane, Director, ConceptLabs; Michael Earl, Dean, Templeton College, University of Oxford. There seems to be a common agreement on having overrated the ability of machines to talk back to users and vice versa."
This discussion has been archived. No new comments can be posted.

  • predictions (Score:2, Funny)

    by harks ( 534599 )
    I'm still disappointed and waiting for my nuclear powered vacuum cleaner.
  • ...over-rated the ability to make the technology work properly in the first place?
  • Overrated (Score:2, Insightful)

    by Snaller ( 147050 )
    People always overrate the future, it's the one constant.
    • Re:Overrated (Score:3, Insightful)

      by richie2000 ( 159732 )
      +1 Overrated
    • People always overrate the future, it's the one constant.

      "The future looked much better in the olden days." -- Grandfather
    • Actually, people overrate the changes in the near term and underrate them in the long term.
  • Talkback (Score:4, Funny)

    by richie2000 ( 159732 ) <rickard.olsson@gmail.com> on Saturday September 28, 2002 @09:37AM (#4349763) Homepage Journal
    I predict we need more machines that talk back to authors when they find mahor spelling mistakes.
    • MANY more machines. And this effect, the effect of being drowned in messages about grammatical errors and/or spelling errors, should appropriately be called "Slashdotting".
      • No, "Slashdotting" is the DDOS attack initiated by posting a "story" (an euphemism for "target request"). This should more appropriately be called "Slashdoting". ;-)
  • by realgone ( 147744 ) on Saturday September 28, 2002 @09:37AM (#4349764)
    Today's favorite: Biotechnology advances will radically transform our world and our bodies.

    Mr. Thurow says higher IQs and more beautiful children will be among the benefits of biotech advances. "For the first time in history, people will be able to change themselves," he says.

    Will someone please make sure that Christopher Walken is in the balcony with a rifle at this guy's next public speech?
  • What!? No flying cars?
  • by Anonymous Coward
    Nothing really imaginative on the horizon.

    CPUs, for example - who cares anymore? 95% of us couldn't care less about upgrading, and the chances are that the last upgrade was just for the sake of it, not because we really needed a faster processor. Not like 10 years ago, when it was, 'Wow! The Pentium is going to be a big leap forward'.

    Memory - so cheap, who cares anymore? Even 5 years ago, I was thinking, 'Wow! I've finally managed to afford 128 megs of RAM!!!' Most other people had 32 or less. Now, who cares? I could afford a gig of RAM, but what's the point?

    Hard disks - mine is about 20% full, and has been for months. No need to upgrade.

    Monitors - the few people who actually need a screen bigger than 17 inches can now afford them. LCD monitors are no longer a novelty.

    Mice - optical mice are no longer a novelty

    Bandwidth - OK, so ADSL is still 'exciting', but for how long? In two years, anybody who wants it will have it.

    Optical storage - recordable DVD is here. CD-R is ridiculously cheap. Who needs more storage than that?

    OK, that's hardware, what about software?

    Linux kernel - it's excellent. However, the excitement of a few years ago is dwindling. Don't get me wrong, Linux is excellent, but now that we've got a really good free *nix, the fun of developing a really good free *nix isn't there.

    GNU/Hurd - maybe one day this will become interesting :-)

    Windows - I hate Windows, but at least the launch of 95 was interesting. The lack of initial enthusiasm for 98 was interesting. After that, it got boring. Now, it's just more and more waffle about DRM. It's *boring*.

    The only things I can see on the horizon that might be interesting are:

    * IPv6
    * Linux on non-i386 platforms.
    • Since lots of CPU, RAM, hard drive space, and other stuff are now available, who cares? I do. Once more computer capabilities become available, software will be made that will take advantage of them. I know that games are pushing technology forward because they are always finding some way to fill all of the capacity of modern computers. Doom 3, for example, now has souped-up lighting because computers can handle it. Realtime raytracing (and raytracing in general) could definitely benefit from faster processors--and as processors become faster, neat new computationally expensive things will become widespread, because they can be.

      I agree with you on some other things that could become interesting. IPv6 could allow IP addresses everywhere, which will probably be taken advantage of. It also supports packet prioritization, which would be very good for VoIP and related technologies.

      Linux already runs on several non-i386 processors, and it is commonly used on these in, for example, embedded systems. Embedded systems, I think, are quite exciting. And Linux (or one of the *BSD's) will probably be the kernel of choice for those, since the idea of putting an OS on one of those is to allow the device to be programmed easily and not be noticed by the user. From that perspective, Linux is obviously superior to any harder-to-develop-for OS that you have to pay for, like Windows.

      I'm still excited about the future. Are you?

    • by Mac Degger ( 576336 ) on Saturday September 28, 2002 @12:08PM (#4350222) Journal
      First off: you don't use your computer for anything intensive, do you? I use it for 3d modeling and animation, and boy-oh-boy do I need the extra cpu-power, the extra ram, that superduper new gfx card. At least, if I want to move the objects at anything but frame-by-frame on my monitor.

      As for the HD... yeah, I photoshop my own textures. You bet that I need that HD space for something other than DivX files.

      And all this certainly comes in handy when I have to do some finite-element analysis for school (or any other simulation for that matter).

      Added bonus: I can play computer games with realistic graphics on it, too!

      Now, secondly: there is more to life than the computer itself. Read the very last line of the article... damn if that's not true, and maybe the most important piece of the whole shebang. Also, the bottom-up telephone system... that got me thinking big time. I like that idea.

      Oh, and just to prove I can't count, here's number three: you want new stuff? There are whole areas of the universe not understood yet, where breakthroughs are coming (just you wait). Just a couple are: the nature of time (we still have no clue!), human nature in mind and body (what is the mind? the soul? and what about huge breakthroughs in understanding becoming possible through biochips?). There's loads more, all only coming within reach because technology is making it possible for us to simulate/look at/describe these systems and phenomena.

      Trust me, we don't know nothing yet.
  • by CommandNotFound ( 571326 ) on Saturday September 28, 2002 @09:51AM (#4349795)
    ...is those pesky users and their fickle minds. For instance, who would have thought that most people actually don't *want* video phones or flying cars or talking computers? Or at least, they don't want them enough to drive the technical development of these things, since standard phones, autos, and Windows seem to do the job well enough.

    In other words, just because a technology looks like it's the "right" way to progress next, doesn't mean the market will allow it to move along.

    I think we'll see this with Web Services (noted in the article as the current Next Big Thing). At its core it's simply a formalization of how CGI developers have been working for years, yet most people and developers still prefer to use a generic web browser to disseminate most information, vs. using a custom client and a web service. Why? Because developers don't want to support another client program, and users don't want to download another one when they can just enter www.weather.com/my-zip-code to get the current weather forecast. I don't think it's been the lack of a formal parameter/return value standard that has held this idea back. (A quick sketch of the contrast appears at the end of this comment.)

    Don't get me wrong, I think Web Services are a nice tool, but unfortunately I see it as a solution in search of a problem. For most end-users it will mostly be a poor substitute for a URL (wait until your co-worker comes in to show you his spiffy new .Net web services demo! It will show you the current news and weather! OVER THE INTERNET! Oh, just install this 300MB library+runtime first. Ok, now install my 30MB client app. Oh, yeah, that didn't refresh properly, did it? Exit out and restart. Dang. [this is better than a browser how?]), and for most in-house developers it will be just another call to use instead of dlopen() to open a shared routine. And until the Net becomes totally ubiquitous and telecom-reliable, I don't see many shrink-wrap developers linking in lots of remote Web Services on the fly, when most of that functionality can be placed locally during the install.
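
    A minimal sketch of that contrast, using made-up endpoints (weather.example.com and get_forecast are placeholders, not a real service): the first request is the plain CGI-style URL a browser would hit, the second wraps the same question in an XML-RPC call with a named method and a structured return value.

        import urllib.request
        import xmlrpc.client

        zip_code = "30301"

        # The "generic web browser" route: one GET against a CGI-style URL.
        url = "http://weather.example.com/forecast?zip=" + zip_code
        with urllib.request.urlopen(url) as resp:
            page = resp.read()   # HTML meant for a human to read

        # The "web service" route: the same request dressed up as a remote
        # procedure call with a formal parameter and a structured result.
        proxy = xmlrpc.client.ServerProxy("http://weather.example.com/rpc")
        forecast = proxy.get_forecast(zip_code)   # e.g. {'high': 78, 'low': 61}

    Either way the wire traffic is an HTTP request; the service layer just formalizes the parameter/return-value convention, which is the point made above.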
    • just install this 300MB library+runtime first

      I didn't have to install anything unusual to use the following web services demos [slashdot.org].

      Also, web services are a back-end b2b communication platform first and foremost. Like any other back-end protocol, the front-end can be whatever you want -- http, http + applet, Flash, thin client, thick client, C, C++, C#, C&%^$&*?, whatever.

      I check my mail using both a thick client, and a web browser. Both work fine and have their uses.

      Sounds like your real beef is with .Net. Don't forget, .Net != web services. There are many of us developing web services with no intent to use .Net in any way.
    • Exactly! I think the interesting prediction is how long this hype over web services will go on before everyone realizes it's a bust.

      For me the best prediction in the article was this: "The Internet will ultimately be more about information than transactions."

      I think the IT sector's current fling with web services is just another dot.bomb waiting to happen.
    • For instance, who would have thought that most people actually don't *want* video phones or flying cars or talking computers? Or at least, they don't want them enough to drive the technical development of these things, since standard phones, autos, and Windows seem to do the job well enough.

      I think that all three are more of a case of poor technology than lack of desire.

      I would love a working video phone where I could use my TV as the screen; but I don't have that, and neither have I ever seen it.

      I would love a flying car--a VTOL, efficient, computer-controlled flying vehicle that is no larger than a current large automobile. But I'm not going to get it, because no one can figure out how to make the darn things float when powered down.

      I would love, love, love it if my PC really could hold an intelligent conversation; but the voice-command programs are no better than a keyboard (a natural-language command line would be a better place to start), the voice-recognition programs require too much time to train (and still get words wrong), and the text-to-speech programs just sound bad.

      • I would love a flying car--a VTOL, efficient, computer-controlled flying vehicle that is no larger than a current large automobile. But I'm not going to get it, because no one can figure out how to make the darn things float when powered down.

        Flying cars aren't quite the market they're aiming for, but Cartercopters [cartercopters.com] look to have that problem sewn up: they don't float down, but they are designed to survive complete loss of power without any major issues and can land in any small clearing.

    • In other words, just because a technology looks like it's the "right" way to progress next, doesn't mean the market will allow it to move along

      This should be engraved on the proverbial tombstone of the dot-com era.
  • by stewby18 ( 594952 ) on Saturday September 28, 2002 @09:51AM (#4349797)
    Worst prediction: People would be talking to computers.

    What's he talking about? I talk to computers all the time, especially Windows machines. "What the hell do you mean the zip drive can't be found?! It's right there!"

    • The problem with people talking to computers is that it is not Star Trek.

      I can remember when the first Mac came out with the first voice recognition technology of any kind. At least one couple was heartbroken that it wasn't "like Star Trek".

      In other words a computer that you could speak to, that would answer your questions, tell you what it needed or what you needed, do all of the calculations, and also have the infinite patience that a machine would have in dealing with a human.

      Being able to say things correctly is important too.

      I'm sorry George. The bank says that you do not have enough money in your bank account for that purchase

      is much better than

      stupid, you do not have the money

    • Seems to me that computer speech recognition has gone mainstream in the last year or so. I'm talking about telephone services to confirm reservations or get information for airlines, rail travel etc. Almost all of these now work by voice around here. You say your flight number or departure city and the system retrieves the information.

      They work pretty well in my experience, recognizing even long numbers. Probably helps that they are looking for matches in an internal database.

      So I'd say this is one prediction which is finally starting to come true.
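
      A toy illustration of why the internal database helps (difflib is real, the flight list is made up): instead of recognizing arbitrary speech, the system only has to snap a noisy transcript onto the closest entry in a short list of valid answers.

          import difflib

          valid_flights = ["DL1420", "DL1430", "UA211", "AA77"]   # made-up data

          def match_flight(transcript: str):
              """Pick the closest known flight number for a noisy transcript."""
              cleaned = transcript.upper().replace(" ", "")
              hits = difflib.get_close_matches(cleaned, valid_flights, n=1, cutoff=0.5)
              return hits[0] if hits else None

          print(match_flight("D L 14 20"))   # -> 'DL1420' despite the noise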
  • Predictions.... (Score:3, Insightful)

    by I_am_Rambi ( 536614 ) on Saturday September 28, 2002 @09:52AM (#4349799) Homepage
    Worst prediction: People would be talking to computers....Mr. Negroponte would welcome a breakthrough. "I've been wrong for a long time," he says. Isn't there a program called something like ViaVoice? Doesn't Office XP come with Voice Recognition? Doesn't Mac OS 9 (I believe) have voice passwords? Don't people use it? I don't think this is a worst prediction. Yes, the recognition program isn't that great, but it is getting better and better. Where has this guy been living (and what computer has he been using) to say that he is wrong?

    Oh, and btw, don't we all talk to computers even if we don't have voice recognition? "Come on, you can do it", "Stupid Windows", "Good job", "You stupid dimwit" are just some examples. This would be considered talking to a computer. In light of that, talking to computers is done every day by almost every person.
    • I predict that in 10 years /. will have been renamed to ./ and will focus on local news on each continent.

      I predict that in 10 years weblinks will have become illegal because
      a) they are almost invariably a copyright violation
      b) they can be used to direct slashdot DDoS attacks

      I predict that in 10 years we will have moved on and slashdot will be read by another generation of pimple-faced nerds.
    • Voice recognition is not sufficiently stable right now to make it easier to use than the old fashioned keyboard. It makes a novel toy, or a useful workaround for people who can't use a keyboard, but it really isn't ready for prime time.
      -aiabx
      • Re:Predictions.... (Score:2, Informative)

        by AvitarX ( 172628 )
        Voice will never be more effective than a good keyboard. A good typist using a bad (QWERTY) keyboard gets 60+ WPM; try talking that fast and keeping your thoughts together, or even accurately reading that fast. Even a modest typist can type better than they can speak. Imagine an office with everyone talking to their computers.

        Even in Star Trek there was very little actual voice command; they had keyboard things all over the place. I would say most voice interaction was information lookups of the sort that Google will be able to do in 15 years. But for real commands and interfaces it will be non-voice.
        • ...Even a modest typist can type better than they can speak. Imagine an office with everyone talking to their computers.

          Even in Star Trek there was very little actual voice command...


          Yeah, but imagine how much smarter everyone will sound.

  • 30-year rule (Score:2, Insightful)

    by mmoncur ( 229199 )
    Most futurists follow the same "30-year rule" that science fiction writers follow: If you want to predict a sweeping change that will revolutionize everything, place it about 30 years in the future. If you doubt this, just look at virtually every mainstream sci-fi flick that takes place in the future. This might have started with George Orwell's "1984", first published in 1954.

    I think people tend to come up with 30 years because (a) it sounds far away enough for anything to happen, and (b) it's soon enough that they might be alive to see it.

    [obPrediction: by 2032, Slashdot will have its own TV show]
    • Nice theory.

      Shame it is flawed.

      "1984" was published in 1949, not 1954.

      Oh, and Orwell set it in 1984 because he wanted to pick a time reasonably in the future, and as he was writing it in 1948, he just swapped the last two year digits round, thought it sounded like as good a future date as any, and used it.

      No "30 year rule".

      Nothing to see here.

      Move along.
      • Oh, and Orwell set it in 1984 because he wanted to pick a time reasonably in the future, and as he was writing it in 1948, he just swapped the last two year digits round, thought it sounded like as good a future date as any, and used it.

        Actually, the title and the year it was written have more significance than that, but if you've read it and decided to accept that it just 'sounded as good as any', I don't think I want to try to explain it.
      • Shame it is flawed.

        Someone points out that most sweeping-social-change sci-fi is set about 30 years in the future. You point out that when 1984 was written, it was set about 30 years in the future. Conclusion: the "30-year" rule is wrong.

        ?????
      • > he just swapped the last two year digits round, thought it sounded like as good a future date as any

        Think of the fictional future as a reflection of the (then) present.
      • There is (probably) a 30 year rule though. Things really do seem to take 30 years. That is probably because the people who get control/power/money want to maintain the status quo. They take that long to get old and retire. They need to be out of the way for progress to be made, in many cases.
    • Re:30-year rule (Score:2, Informative)

      by Chuq ( 8564 )
      Maybe you've watched too much Back to the Future? (1955 - 1985 - 2015).

      Zemeckis and Gale said they chose 1955 in the first one because it's the typical generation gap - a typical age at which married couples have children. They wanted to choose a time where Marty's parents would be teenagers. They were mid-40s in 1985 (47 to be exact), so 1955 was a nice round number, and would put them at 17 in the past.

      Similar reason when they went to 2015 - they wanted Marty and Marty Jr to be the same age (as Michael J Fox played both of them).
  • by Snaller ( 147050 ) on Saturday September 28, 2002 @09:58AM (#4349818) Journal
    Make a subject where the users can enter their predictions about the future - then we return in ten years and check it out :)
    • by InfoVore ( 98438 ) on Saturday September 28, 2002 @10:48AM (#4349979) Homepage
      You should check out the Foresight Exchange [ideosphere.com].

      Basically it is an idea stock market. When you become a member, you receive a small amount of fake investment money. You can then buy and sell against ideas posted by other members. The premise is that the closer an idea is to being true/possible, the higher its value will be in the market. Ideas do have adjudicators who are responsible for judging when and if a stock has met its criteria and can be pulled off the exchange. (A toy illustration of the price-as-probability idea follows the table below.)

      Here is an example of the top 10 traded ideas on Foresight Exchange now:

      Rank  Volume  %      Symbol  Short Description
      1     26234   83.4%  T2007   True on Jan 1 2007
      2     1034    3.3%   BBRP    Bal Bdgt 2002 w/2000 GOP Pres
      3     803     2.6%   USIraq  US attacks Iraq in a year.
      4     437     1.4%   HURR02  Atlantic Tropical Storms 2002
      5     371     1.2%   ObL1yr  Osama bin Laden 1 year after
      6     275     0.9%   $bill   U.S. Prints New Dollar Bill
      7     222     0.7%   SCHRDR  Schröder Remains Chancelor
      8     193     0.6%   Clone   Human Clone before 2005
      9     160     0.5%   King    Prince Charles remains heir
      10    154     0.5%   SLvl    1 m rise in Sea Level
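
      A toy sketch of the pricing idea only (not the real Foresight Exchange rules; it assumes a 0-100 coupon that pays 100 if the adjudicator judges the claim true): the traded price can be read as the market's implied probability, and you trade when your own estimate disagrees with it.

          def implied_probability(price: float) -> float:
              """Read a 0-100 claim price as the market's implied probability."""
              return price / 100.0

          def expected_profit_per_coupon(price: float, my_probability: float) -> float:
              """Expected payoff of buying at `price` if the coupon pays 100 when
              the claim is judged true and 0 otherwise."""
              return my_probability * 100.0 - price

          # Example: suppose "Clone" (Human Clone before 2005) trades at 35,
          # but you think the real chance is only 10% -- buying loses on average.
          print(implied_probability(35.0))               # 0.35
          print(expected_profit_per_coupon(35.0, 0.10))  # -25.0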

  • This sounds like a good time to sort out an old (possibly apocryphal) quote I heard; it was allegedly attributed to Isaac Asimov. He said something to the effect that "any time an expert says something is absolutely impossible, it is certain to happen, eventually. Any time an expert says something is possible, it will happen sooner than anyone expected."
    Now I'm sure I've mangled that; Asimov could spin a phrase much better than that. But it does sound like Asimov: an ironic skepticism toward skeptics, disbelief in pundits, and a belief that we are most fallible when we claim things are impossible.
    • Clarke's Law (Score:2, Informative)

      by ColdGrits ( 204506 )
      You mean Arthur C Clarke's First Law - "When a distinguished but elderly scientist states that something is possible he is almost certainly right. When he states that something is impossible, he is very probably wrong."

      Do at least try to attribute the correct author!
      • Thanks for the attribution, that's what I was looking for. But I'm not so sure that what I heard was Clarke's Law. I think Asimov was spoofing Clarke's Law with one of his own. Hard to recall after all these years, that's why I'm tossing this out to the /. hive-mind.
        • But I'm not so sure that what I heard was Clarke's Law. I think Asimov was spoofing Clarke's Law with one of his own.

          Asimov's Corollary to Clarke's First Law: When the lay public rallies round an idea that is denounced by distinguished but elderly scientists, and supports that idea with great fervor and emotion -- the distinguished but elderly scientists are then, after all, right.

          • Asimov's Corollary to Clarke's First Law: When the lay public rallies round an idea that is denounced by distinguished but elderly scientists, and supports that idea with great fervor and emotion -- the distinguished but elderly scientists are then, after all, right.

            Asimov hedged his corollary with the statement that the distinguished elderly scientists are "quite probably right", and cited vaccination as one of the rare exceptions.

            Of course, the distinguished but elderly scientists were (mostly) brought around in the face of hard evidence that a cowpox inoculation really did confer immunity to smallpox, which distinguishes that case from the various new-age popular enthusiasms that were Asimov's intended target.

    • It's Clarke's 1st Law (Score:3, Informative)

      by InfoVore ( 98438 )
      You are close. What you are referring to is known as Clarke's 1st Law. Arthur C. Clarke (scientist, futurist, and one of the great Science Fiction authors of all time) came up with 3 laws:

      Clarke's 1st Law

      When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

      Clarke's 2nd Law

      The only way to discover the limits of the possible is to go beyond them into the impossible.

      Clarke's 3rd Law

      Any sufficiently advanced technology is indistinguishable from magic.

  • "Telecom could invert itself and become a bottom-up phenomenon," is a deliciously subversive idea. Like the local currency [zmag.org] systems - LETS [u-net.com] and whatever comes after major labor music distribution this promises to really shake things up in a good way (read: shaft the bad guys) and is also right around the corner.

    Then I'll get a cell phone

  • Glover Ferguson, Chief scientist, Accenture;

    I predict [fuckedcompany.com] that Mr. Ferguson might need to find a new job before too long.
  • Tech predictions (Score:2, Insightful)

    by craigeyb ( 518670 )

    When it comes to making predictions regarding technology, it is typically much safer to predict the possibility of something than the impossibility of it. Human ingenuity is truly amazing.

    Perhaps that's what makes all these old predictions about talking, thinking computers so intriguing. Computers have advanced in so many ways as people have boldly predicted (perhaps the most astounding of which is that Moore's Law continues to hold true), yet AI has accomplished very little. And unfortunately, speech recognition and AI (which might be the same) are probably the most important for making computers truly useful for the ordinary end users that don't have the time to learn complex interfaces.

    This sig is false.

    • And unfortunately, speech recognition and AI (which might be the same) are probably the most important for making computers truly useful for the ordinary end users that don't have the time to learn complex interfaces.

      Well, that's the problem, isn't it? I agree with this when the interface is overly complex or cumbersome, but if a person doesn't want to learn the concepts of a technology, job, or process, then a talking computer won't help them any more than a teacher or helpful co-worker would. Better to just make the computer a little bit smarter and leave this non-learning user out of the loop altogether, IMO. For the classic consumer uses of computers like kiosks and teller machines, touchscreens work just as well, keep information more private, and cost less.

      This notion of the talking computer making everything possible reminds me of the notion of self-programming computers. The problem is this: most users can't describe their software needs to human analysts and developers. Why should they be able to describe ("program") the software to a machine any better? Or, to put it another way, who cares if the Enterprise mainframe can increase your Tachyon emission field by 75% if you have no idea what a Tachyon field is, or how increasing it 75% can save you from the enemy ship (the computer said it was a "Klingon". What's that?).
    • Hard problems (Score:5, Interesting)

      by Gerry Gleason ( 609985 ) <gerry@@@geraldgleason...com> on Saturday September 28, 2002 @11:50AM (#4350161)
      Yes, I think it is very interesting that so many AI problems continue to be much more difficult than many predict. Even with successes like chess-playing programs beating the best human players, the way it is done is not particularly satisfying.

      Thurow is an economist, not a scientist or engineer, which is why his predictions about biotech are particularly bad. The science is on the edge of a lot of new understanding and breakthroughs, but that will only put us up against the really interesting and hard problems. As if we would be able to find genes that more or less directly influence something as subtle as IQ.

      I find the predictions about the future importance of web services and the junk about "insight" to be particularly inane. On the first, nobody should forget that GM and Ford are still among the only companies that individually represent a noticeable percentage of the U.S. economy. Manufacture of physical goods (and commodities production, etc.) will continue to be the drivers of economies.

      In my opinion, the most important trend is a favorite of this forum. The growth factors that have been working for Free software are fundamentally exponential, even if the constant factor is small. If it isn't killed off by legal/social influence of current big players, and I don't think this is likely if it is even possible, then the exponential term will eventually dominate.

      When this plays out, the winners will be the companies that make their reputations by being the best at efficiently building and servicing products that are mostly designed in the "Creative Commons". People will pay for quality in goods and services, and there will always be value in good execution. Customers do not value "insight" as described in one prediction. They find this sort of thing invasive and manipulative, and you won't be able to keep it secret.

      It was when I was chasing down some secondary links from the GNUradio interview that I came across the stuff about the value of a network increasing at greater-than-linear rates. You get O(N) for broadcast networks and O(N^2) in peer-to-peer networks, but the exponential (O(2^N)) comes in when you have group-forming networks (GFNs).
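
      A quick back-of-the-envelope sketch of those three growth rates (commonly associated with Sarnoff, Metcalfe, and Reed respectively; the numbers below are just illustrative), showing how fast the 2^N group-forming term swamps the other two:

          def broadcast_value(n: int) -> int:      # O(N): one sender, N receivers
              return n

          def p2p_value(n: int) -> int:            # O(N^2): value grows with possible pairs
              return n * n

          def group_forming_value(n: int) -> int:  # O(2^N): every possible subgroup counts
              return 2 ** n

          for n in (10, 20, 30):
              print(n, broadcast_value(n), p2p_value(n), group_forming_value(n))
          # 10 ->   10   100         1024
          # 20 ->   20   400      1048576
          # 30 ->   30   900   1073741824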

      When you think about it, this is what drives the GPL software phenomenon. Every project fork or new initiative forms a new group or groups in the network, and every project is a nucleus for new group formation. The only way this could be stopped is to destroy the possibility of the group forming that leads to the exponential growth. While this might be possible, our robust institutions that support free speech make this very difficult if not impossible.

      So my prediction is that Linux on the desktop will overtake Windows in the next ten years, and the RIAA and MPAA will finally lose out to the best interests of the actual artists they claim to support. Also, derivatives of GNUradio will be core technology in establishing cooperative wireless mesh networks. This is the only prediction of any of the pundits in the article that will come true.

  • Good! (Score:3, Funny)

    by chrisbro ( 207935 ) on Saturday September 28, 2002 @10:19AM (#4349884)
    There seems to be a common agreement on having overrated the ability of machines to talk back to users

    This is a strong point. Now I don't have to worry about getting yelled at by both my girlfriend and my computer, which combined occupy 95% of my time.

    "You moron! Windows XP is SO not my look!"
  • My Fav (Score:4, Funny)

    by Gregg M ( 2076 ) on Saturday September 28, 2002 @10:20AM (#4349890) Homepage
    "The internet will collapse in 1996." -Bob Metcalf, Ethernet inventor and 3Com founder.

  • Re: (Score:2, Insightful)

    Comment removed based on user account deletion
  • by edgrale ( 216858 ) on Saturday September 28, 2002 @10:23AM (#4349895)
    "I think there is a world market for maybe five computers." -- Thomas Watson Senior, Chairman of IBM, 1943
  • "Surround sound is going to be increasingly important in future offices"

    Actually, the article [slashdot.org] is FULL of horrid predictions IMHO. Especially the office being dark and hushed? So we'll get eye strain? Great idea!
    • Actually, I read an article several years ago which confirmed my preference for a dimly lit (but not dark) office. The article claimed that significant eye strain is created by glare on monitors from the ambient light in bright offices. When I asked my optometrist, he agreed with the opinions in the article.

      I've always found it more comfortable to work with dim lighting, and thankfully most people at my company agree. We have desk lamps for when you're not looking at your monitor or for people who insist on having bright light. Works out better for everyone, and let's face it, with the decor of most modern offices, the less you see of it the better.
  • by JohnTheFisherman ( 225485 ) on Saturday September 28, 2002 @10:25AM (#4349904)
    Like Microsoft collapsing in 6 months [slashdot.org] back in 2000, and more recently, Windows becoming obsolete [slashdot.org] with the advent of the new $299 Linux boxes from WalMart.
  • "The Internet will ultimately be more about information than transactions."

    Heh...I don't think this is much of a prediction as this has always constituted the Internet as I've known it.

  • I wonder if Microsoft's Vision of the Future Workplace [slashdot.org] qualified in time for this competition... Just remember that...
    "Surround sound is going to be increasingly important in future offices," says group marketing manager Tom Gruver in leading a tour of the new facility.
    I'm just waiting for those days where I come into the office and the person in the next cube is BLASTING DVD movies in their full 5.1 surround sound glory for everyone in the entire office (and possibly for everyone in a 1/2 mile radius) to hear... Those will be the days...
  • by wowbagger ( 69688 ) on Saturday September 28, 2002 @11:14AM (#4350049) Homepage Journal
    I wish the article had presented a bit more background on these guys' predictions than "Here's the worst, here's the best, here's the current". That really doesn't let me gauge whether these guys are making good predictions or not.

    Consider Slashdot posts: You might say that my highest rated post is 5, my lowest -1, and my most recent is 3. But, does that give you any real feel for whether you want to read my posts? Now, if you said that my mean post value was 3.5, my mode was 4, and that only 10% of my posts are rated less than 2 (NOTE: all figures are made up - I don't keep that close track on my moderations) then you might be able to judge better.

    Similarly, when judging someone's ability to predict where things are going, I'd like to know what their ratio of hits to misses is. If somebody is right no more often than they are wrong, then I can weight their prediction accordingly.
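
    As a rough sketch of that weighting (the track record below is made-up data, not anyone's real scorecard): score each past prediction as a hit or a miss, then discount the pundit's stated confidence by their hit rate.

        from statistics import mean

        def hit_rate(track_record: list) -> float:
            """Fraction of past predictions that turned out correct."""
            return mean(1.0 if hit else 0.0 for hit in track_record)

        def weighted_confidence(track_record: list, claimed_confidence: float) -> float:
            """Scale a stated confidence by how often the predictor has been right."""
            return claimed_confidence * hit_rate(track_record)

        past_calls = [True, False, True, True, False, False, True]   # made-up data
        print(round(hit_rate(past_calls), 2))                  # 0.57
        print(round(weighted_confidence(past_calls, 0.9), 2))  # 0.51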

    That's one of the problems I had with Tomorrowland at Disney - it's nothing but a bunch of predictions from the past. I'd rather they have done a "Yesterday's Tomorrow" - for every decade show what people thought the future was going to look like, along with a reality check. Show the things they got wrong (flying cars), the things they got right (television), and the things they completely missed (computers).

    OT: is anybody else having problems getting to /.? For the past week I've had a timeout on about 1 in three connections to /., both from work and from home.
  • From Alan Nugent, Chief technology officer, Novell

    "Like Mr. Negroponte, Mr. Nugent thought people would be conversing with their computers years ago. He also thought computers would be able to emulate human thought. He says IBM's champion chess-playing computer is evidence of the progress that has been made, but the field still falls short of early expectations."

    IBM's chess-playing computer was just a massively parallel search assisted by human-generated heuristics. It was not progress toward emulating human thought. The only thing it progressed was building a computer to play chess.

    If this guy is Novell's CTO, that explains Novell's problems.
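
    To make the "search plus hand-written heuristics" point concrete, here is a toy depth-limited minimax sketch (Deep Blue's real engine added alpha-beta pruning and custom hardware on top of this idea; this is only an illustration). All of the apparent intelligence lives in the human-authored evaluate() function and in grinding through moves, not in anything resembling thought.

        def minimax(state, depth, maximizing, moves, apply_move, evaluate):
            """Search `depth` plies ahead; return the best achievable heuristic score.

            `moves`, `apply_move`, and `evaluate` are supplied by the caller for
            the game in question -- `evaluate` is where the human heuristics go.
            """
            legal = moves(state)
            if depth == 0 or not legal:
                return evaluate(state)
            scores = [minimax(apply_move(state, m), depth - 1, not maximizing,
                              moves, apply_move, evaluate) for m in legal]
            return max(scores) if maximizing else min(scores)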

  • I think Lester Thurow was wrong on Japan because their economic collapse showed that Japan's cultural norms could not accommodate the changes necessary to improve their economic systems.

    Look at South Korea--after the horrid experience of the Asian financial crisis of 1997-1999 this country was willing to take drastic steps to improve its economic system; as a result the country is doing quite well indeed.

    Here in the USA, the fact that we're more than willing to make changes in our economic system to correct problems shows why the USA will do well economically.
  • by Anonymous Coward
    Each of them predicts an outcome while the coin is in the air.
    10 of them are correct in that their predictions of either heads or tails came true. And they believe that their high level of intelligence led them to the correct conclusion.
    These 10 geniuses are given HUGE book contracts for their obvious ability to tell the future.
    Millions of other monkeys soon believe in the SUPER 10's abilities and send these geniuses millions of dollars.
    Things are looking good.
    5 years pass. Another flipping of the coins is called for, and the original SUPER 10 attempt to repeat their original success.
    But NONE of them succeed. In fact only 7 monkeys predict the correct outcome of their coin toss.
    The original SUPER 10 retire to the Cayman Islands.
  • Flying phones
    Video cars

    in the next - well - real soon...
  • Peter Cochrane

    Director, ConceptLabs; former chief technologist, British Telecommunications PLC

    Worst prediction: Voice over Internet protocol technology would fall flat.

    Mr. Cochrane says 10 years ago he was extremely skeptical of the voice over Internet protocol systems that let people make voice telephone calls over data networks. He thought the networks couldn't handle it. Now he concedes that it's been successful at least on single data networks, like those used within a company.


    Shouldn't he be working on a warp drive instead of making these stupid predictions?!?
  • In the next 30 years:

    Personal transportation will be more efficient and quite possibly cheaper

    Processors will become much, much faster than they are today. It is likely that processor-powered devices will become smaller.

    There will be people in the general public interested in space travel.

    Most of the world will use the Internet. Some may even use it for pornography.

    Now where are my bags of money?

  • Just over a century ago he said:
  • take a look at this [min.net].

    (old compuserve ad)
  • by Sabalon ( 1684 )
    I remember our college had a CASE lab setup in around 1992-1993. It had some pretty high end machines (at that time) and still ran slow. I remember all the CIS majors and IS grad students were always in there using it.

    Us CS people just wrote our damn programs and moved on.

    It seemed kinda stupid at the time and I had forgotten about it until reading this article - still sounds stupid.
