Has Productivity Peaked?

Putney Barnes writes "A columnist on silicon.com is arguing that computing can no longer offer the kind of tenfold-per-decade productivity increases that have been the norm up to now, because the limits of human capacity have been reached. From the article: 'Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent. I just can't work any faster.' Peter Cochrane, the ex-CTO of BT, argues that 'machine intelligence' is the answer to this unwelcome stasis: 'What we need is a cognitive approach with search material retreated and presented in some context relative to our current end-objectives at the time.' Perhaps he should consider a nice cup of tea and a biccie instead?"
  • Cough (Score:5, Interesting)

    by caluml ( 551744 ) <slashdot@spamgoe ... minus herbivore> on Monday November 27, 2006 @09:01AM (#17000326) Homepage
    Cough [wikipedia.org]
    • Re:Cough (Score:5, Insightful)

      by Shaper_pmp ( 825142 ) on Monday November 27, 2006 @09:45AM (#17000622)
      Indeed. I don't mean to trivialise what may become a serious problem as society gets ever more complex at an ever-increasing rate, but this article basically boils down to:

      1. Technology has now reached the point where it's increasing faster than I can keep up.
      2. I now need technology to make up for deficiencies in my intellectual processes, as well as my work processes.

      Happily, many kids today don't seem to have nearly as much of a problem as their parents/grandparents do with future shock/information overload - having been raised in an age of rich media, near-ubiquitous networking and information overload as a daily part of their lives, kids these days seem perfectly happy to keep up.

      I don't see this as a huge problem for society, so much as for the older segment of it.

      Of course, as development accelerates the age before which one can stay relevant is likely to drop, with interesting consequences - either we develop some kind of mental process-prosthesis to enable adults to continue interacting usefully with society, or we learn to live with the important decision makers of technology being pre-pubescent teens.
      • Re: (Score:2, Interesting)

        I agree! At my current employer, the processes in the accounting department are in need of help. Ugly Access databases with hideous queries. People creating and distributing three different versions of the same report. People producing reports that no one uses. "This is the Tuesday report. I don't know what it is, but I run it on Tuesday. Definitely, definitely Tuesday. It's the Tuesday report." Don't ask the drones what it is, and God help you if something goes wrong - like a spreadsheet that
      • Re:Cough (Score:4, Insightful)

        by Anonymous Coward on Monday November 27, 2006 @12:21PM (#17002540)
        I don't see this as a huge problem for society, so much as for the older segment of it.

        Of course, as development accelerates the age before which one can stay relevant is likely to drop, with interesting consequences - either we develop some kind of mental process-prosthesis to enable adults to continue interacting usefully with society, or we learn to live with the important decision makers of technology being pre-pubescent teens.


        What mindless babbling. In an age when we have to go to school longer and longer to acquire the skills for technical and academic jobs, do you honestly think the relevant ages are getting younger and younger?

        Oh, wait, these kids grow up with computers. I forgot. What a technical wonder it is to run Windows. I often have to teach my kids how to do certain things on the computer that go beyond surfing a web page. And these are teenagers.

        But it's true - the older generation might be a little lost when it comes to myspace or whatever the next fad is.

        BTW, it's not a matter of "keeping up", it's a matter of ignoring/blocking more and more irrelevant information in your life. The signal-to-noise ratio is dropping ever lower. I can spend time keeping up with the news, but 99% of that is a waste of time, especially since I'm not a politician. So it is with /., unless something truly revolutionary comes along once in a blue moon.

        Seriously, if I hadn't read /. in the last 5 years - for all that not keeping up - I would have missed maybe a day's worth of reading that's truly relevant and applicable to my situation. Big whoop.
      • Bad interfaces. (Score:3, Interesting)

        by MikeFM ( 12491 )
        The biggest productivity limitation of today's computers is the user interface. The desktop metaphor simply is not powerful enough to make accessing and manipulating large amounts of information efficient. Anyone who is good at using the command line, working through scripts, etc., knows that you can accomplish much more with these methods than with a desktop environment, and that when a task is even possible on the desktop, it's quite a bit slower than on the command line.

        What we need to do
    • by neoform ( 551705 )
      You trying to say it's too early for the government to reveal that we actually live in the matrix?
  • On the Other Hand (Score:5, Insightful)

    by Anonymous Coward on Monday November 27, 2006 @09:02AM (#17000330)
    One might argue that such access to information actually decreases productivity. We're easily distracted creatures, after all. Maybe productivity peaked after the introduction of the personal computer, but before ubiquitous Internet access.

    I wonder how many people spend their entire working day browsing MySpace or Slashdot. ;-)
    • by Alien54 ( 180860 )
      "What we need is a cognitive approach with search material retreated and presented in some context relative to our current end-objectives at the time."

      Sounds like he's had an overdose of technobabble. The bean counters are telling him that he needs to upgrade the humans, and he can't. You can only lay down so many Lego blocks in a day. Humans are not robots: upgradable, programmable, disposable. We do not yet have an underclass of robots serving the billions of human masters that exist today on Planet E
    • by iocat ( 572367 )
      Maybe the reason productivity is going down is the need to attach fashion to computing. My problem isn't so much too much information; it's the fact that both the Windows and Macintosh OSes waste precious processing cycles and seconds playing stupid, pointless animations when I want to do things like close a window or drop down a menu. The "fade-in" menu is probably responsible for a significant, measurable drop in human productivity.

      And yes, I know that on Windows at least, you can turn all th

    • Effective training (Score:2, Insightful)

      by msobkow ( 48369 )

      Do you have any idea how few people know how to use a search engine effectively? Without the vocabulary to use the right search terms and narrowing characteristics, they get back page after page of irrelevant drivel. It takes them an hour or two to find what I can locate within a page or three.

      I dislike the periodic push for AI enhancements. The approach encourages the further dumbing-down of the population, when what we need is to increase the education levels and effective intelligence (i.e. wise us

  • by Digital Vomit ( 891734 ) on Monday November 27, 2006 @09:05AM (#17000356) Homepage Journal
    Unfortunately I am now approaching stasis. Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent.

    Obviously Mr. Cochrane has never tried using Microsoft Vista.

    • Re:Obviously... (Score:4, Interesting)

      by name*censored* ( 884880 ) on Monday November 27, 2006 @10:55AM (#17001296)
      Any amount of basic machine upgrading, and it continues apace, won't make a jot of difference, as I am now the fundamental slowdown agent.
      So HE'S the one slowing us down? Well that's easy, we just get rid of him. Problem solved.

      In all seriousness, computers have simply reached the point where the interfaces are outdated compared to how much data the machine can simultaneously accept and act on (e.g., I can click on an icon and it will be told both "click" and "open program" fast enough that I don't have to wait for it). Seems to me that it's just calling for the UIs to be upgraded - we could start using other body parts (cue jokes), such as eye focus for mouse pointer position (not my idea, another slashdot pundit). Or, as has been suggested in this topic, better voice commands and audible hotkeys (like that light-clapper thing, except it opens your web browser instead of turning the light on/off). Or we could have interfaces with more complex meanings than a single ASCII value - such as internet keyboards with buttons for various programs, or hotkeys speeding up productivity.

      OR... we could have interfaces that don't rely on physical movement, since even the fastest typist (keyboard) or gamer (mouse) is still much slower than their own brain. All the real-life influences - the actual physics of arm momentum (don't go for the numpad too fast or you'll overshoot), appendage-anatomy limitations (RSI anyone?) and allowing for other obstacles (don't knock that coffee over!) - slow them down. Perhaps we could have more intuitive machines, as the post suggests. Perhaps we could just have MORE task-queueing technology, which performs background tasks while waiting for user input (indexing the hard disk for searching, defragmenting, virus scanning, etc.) so that the machine is ALWAYS waiting for user input, and we cut out that last little bit of having the user wait on the machine. Maybe we could enlarge UI areas, like the control centres in The Matrix or Minority Report - it might be especially useful for coding (grab a variable name or three from one place and a chunk of code from another window of related code) or graphics/design work (grab colours, picture segments, morph shapes - you could assign a different line thickness to each finger!). Perhaps body alterations - installing extra "memory" for multitasking, a telly in your tubby, a USB in your knee, bluetooth in your tooth or WIFI in your thigh...
      • Re: (Score:3, Insightful)

        First, somebody mod parent up as +1 on-topic-in-the-increasingly-puerile-sea-of-/.-irrelevance.

        Second, my view is that the author has a sterile view of what productivity is. If we limit productivity to typing in sales figures, then sure, we're well into diminishing returns; however, if you're talking about recording multi-track music, or God forbid editing HDTV, then we're still a long way from the end of the trail. The question might be WHAT technology is expected to improve in the mid-term. Personally,
  • Centuries-old saw (Score:5, Insightful)

    by gvc ( 167165 ) on Monday November 27, 2006 @09:06AM (#17000366)
    At the end of the 19th century it was commonly thought that pretty well everything that needed to be known about science and technology was known; that only incremental development would occur from then on.

    Similar lack of imagination has been expressed in many contexts over the years.

    And, by the way, who says that 'productivity' is a useful measure of anything?
    • Re:Centuries-old saw (Score:5, Interesting)

      by FooAtWFU ( 699187 ) on Monday November 27, 2006 @09:56AM (#17000732) Homepage
      Economists, since productivity determines how much stuff will get produced, which determines how much stuff per person there is, and that's pretty much a measure of the standard of living that will result ("real GDP per capita").

      When you're talking about productivity in the entire economy, you can draw a graph - on the Y axis is "real GDP per capita" while on the X axis is "capital / labor" (K/L for short). If you add more capital (machines, computers, tools) people get more productive, but less so as you add more and more and more. This means the line you graph will start somewhat steep, but then level off as you get higher (not entirely unlike the graph of sqrt(x)). The rough guideline for the economy at present is the "rule of one third" - if you increase your capital stock by 100%, you'll get about 33% more output. This sort of rule determines how much capital we end up having - we will increase our capital stock with investment until we have reached the "target rate of return", which is actually a slope of this productivity curve. This is the point at which investment pays for itself.
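
      A minimal numeric sketch of that curve (assuming, purely for illustration, a Cobb-Douglas form y = A * k^(1/3); the one-third exponent is the small-change version of the "rule of one third" above - a full doubling of capital compounds to a bit under 33%):

      def output_per_worker(k, A=1.0, alpha=1.0 / 3.0):
          # Output per worker ("real GDP per capita") as a function of K/L.
          return A * k ** alpha

      k = 1.0
      for _ in range(5):
          print("K/L = %5.1f   output per worker = %.2f" % (k, output_per_worker(k)))
          k *= 2.0  # each doubling adds 2**(1/3) - 1, about 26%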

      Then there are wonderful things like increases in technology. These end up shifting the productivity curve upward: people can do more with their technology than they could before. This increases real GDP per capita directly, but it also means that for the same level of capital, we're below the target rate of return, and can invest in all sorts of new capital, which will pay for itself - so we increase our capital stock as well.

      The good news is that technology keeps coming, and while it may not be quite the same Spectacular Breakthrough as the introduction of computers, there is plenty happening in a variety of industries. Take, for example, Wal*Mart (the company everyone loves to hate, yes...). They have achieved a substantial portion of their success by becoming more productive at managing their warehouses and inventories, and are actively looking to increase their productivity in this area. (In fact, I've seen studies that claim they were responsible for the bulk of retail productivity growth in the late '90s, directly or indirectly.) "Supply chain management" is trendy. And perhaps some day we will see RFID tags at the check-out line (to replace the last great checkout productivity enhancer, bar codes).

    • by jellomizer ( 103300 ) on Monday November 27, 2006 @10:02AM (#17000784)
      Well, I still see a lot of places where people are doing easily programmable repetitive tasks that take them all day to do. When I bring up making them a program to do it, I get 2 responses.

      1. You can do that on a computer!
      2. Nah it is easier this way.

      #1 is just ignorance: they assume that if the job is difficult for them to do, it will be difficult for the computer to do. Conversely, they also assume that if it is simple for a person to do, it is simple for a computer to do.

      #2 I normally get when it is the person's primary job, or when they like doing these tasks - so a program would put their livelihood at risk.

      A common fallacy is that the computer makes our lives easier. Rather, it makes us more productive by doing all the easy, mind-numbing tasks, giving us more time to focus on the hard stuff that requires more thinking. There is much room left for improving productivity: technologies such as character/speech recognition, improvements in robotics, business intelligence.

      Go and ask almost any mid-size company if they can give you a list of the top-selling items by state, or by city. I bet most wouldn't be able to do that, and that is just a simple database query. There is a lot of room for expansion. We tend to fail to see it because we are now used to the speed at which things change. Just think about the power of the newest laptops, and compare them to the servers of 5 years ago. Each core is now 3-4 times faster, and now we have dual-core laptops. A system back in 2001 with that amount of juice would cost over $10,000 (figuring an 8-CPU system with 3 GB of RAM, 100 GB drives, DVD/CD-RW, and a 17" LCD screen - well, let's make it two to match the resolution...). That is just 5 years ago. A single person now has enough power to run a mid-size company of 5 years ago. We just don't realize the change because we are used to moving up at the same speed. As computers improve, so do our skills at our jobs; as we get better at our jobs, we also get better tools that help us improve them.
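
      For what it's worth, a minimal sketch of that "top sellers by state" report (the sales table and its columns here are hypothetical stand-ins for whatever the company actually stores):

      import sqlite3

      conn = sqlite3.connect(":memory:")  # stand-in for the company's real data
      conn.execute("CREATE TABLE sales (item TEXT, state TEXT, quantity INTEGER)")
      conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                       [("widget", "OH", 40), ("gadget", "OH", 75), ("widget", "NY", 90)])
      # The whole report is one GROUP BY - no all-day manual collation needed.
      query = ("SELECT state, item, SUM(quantity) FROM sales "
               "GROUP BY state, item ORDER BY state, SUM(quantity) DESC")
      for state, item, sold in conn.execute(query):
          print(state, item, sold)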

      All this assumes that your company is not one of those cheap bastards who won't buy new programs because they don't see the value in them.
       
      • by gad_zuki! ( 70830 ) on Monday November 27, 2006 @11:21AM (#17001650)
        >#1 is just ignorance: they assume that if the job is difficult for them to do, it will be difficult for the computer to do.

        I'm not sure about that. The difficulty lies in getting a good programmer and whether or not a program is worth the cost.

        I think there's no shortage of consultants who do nothing but fleece small businesses by coming in with an automated solution that is either an Excel macro or some craptacular Access database - usually flaky, crash-prone, half-assed, and difficult to back up properly. Not to mention it ties them further into the MS monopoly.

        Even if you find yourself a good app developer, there are costs to consider. If it is still cheaper to do it by hand, then why bother? Especially considering the glut of labor in the US. Heck, people go to college, get saddled with loans, and are happy to take $30,000-a-year jobs. Toss in all the foreign workers champing at the bit to come here too. From a business perspective, having them do the same old thing makes financial sense, and I'm sure some people look at automation with some amount of fear, as it might make them redundant.
        • Re:Centuries-old saw (Score:4, Interesting)

          by Skim123 ( 3322 ) on Monday November 27, 2006 @11:36AM (#17001854) Homepage

          The difficulty lies in getting a good programmer and whether or not a program is worth the cost.

          I agree that it is too difficult to get a skilled programmer, but I think it will almost always be worth the cost.

          Even if you find yourself a good app developer, there are costs to consider. If it is still cheaper to do it by hand, then why bother? Especially considering the glut of labor in the US. Heck, people go to college, get saddled with loans, and are happy to take $30,000-a-year jobs. Toss in all the foreign workers champing at the bit to come here too. From a business perspective, having them do the same old thing makes financial sense, and I'm sure some people look at automation with some amount of fear, as it might make them redundant.

          In the short term, yes, it may make sense to stick with a person doing the job. But in the long run, automation will be more profitable. For example, imagine it takes $90K to write the software to replace the job of a $30K/year worker. That will pay for itself in three years, and by year four the investment will have a positive ROI. While you're still paying that $30K worker, I'm getting the work done for free. Also, since I'm assuming this $30K worker has some intelligence, some ideas, and some skills in the marketplace, by automating his mundane job I can now turn him loose on more interesting projects. He can help lead new product lines, while you are still paying his equivalent just to do repetitive tasks that are only fit for a computer.
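
          The break-even arithmetic above, as a quick sketch (all figures are the hypothetical ones from this example; maintenance costs and the time value of money are ignored):

          dev_cost = 90_000  # hypothetical one-off cost of the software
          salary = 30_000    # hypothetical yearly cost of the manual worker
          for year in range(1, 6):
              print("year %d: cumulative savings = $%d" % (year, salary * year - dev_cost))
          # Break-even at year 3; from year 4 on, the ROI is positive.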

          I think the real hesitation people have about moving to an automated system comes from familiarity with the old system or fear/experience of failure with an automated one. All it takes is one bad experience - a poorly written program that crashes one day and wipes out weeks of data because the backups weren't set up properly, for example - and many decision makers will insist on more manual approaches. Another factor may be that some business partner or regulating agency requires that work be performed in a particular manner, or that certain items be made available, in ways that essentially have to be done by humans. I work on software for the health care industry, and some of the "complexities" in dealing with county and state agencies greatly reduce the amount of automation that can be applied to a given task.

    • by elrous0 ( 869638 ) *
      Sometimes I wish I had a nickel for every "Has x peaked?" story on /.

      No, come to think of it, I'd rather have a nickel for every "Why don't they design games for women?" or "Why aren't there more women in IT?" posts.

      -Eric

    • I'm not sure why you've been marked insightful; your points are completely orthogonal to TFA.

      Anyone who deals with the real world understands that if you're looking for the next big thing, you should look at existing technological concepts: the ones that will receive widespread adoption as costs come down (e.g. mobile phones), or where someone figures out the "trick" (e.g. Google), and generally where it gets past the early adopter phase. (note: the situation is different if you are actually the technologist t
      • Whoops, bollocks, that should be "if you've not met anyone that finds it useful".

        Note to self: Don't fuck up when flaming another. And read the goddamned preview post before hitting submit.
    • And, by the way, who says that 'productivity' is a useful measure of anything?

      I say that 'productivity' is a useful measure. It measures how much stuff you make or do with your time. If two people make toy blocks, the more 'productive' one ends up with a bigger pile of blocks at the end of the work day. If you want a way to measure that, 'productivity' is your guy.

      It's useful, e.g., because you might employ block-makers. You need something to base performance reviews on. You might decide to pay the

      • Re: (Score:3, Interesting)

        by gvc ( 167165 )
        Productivity measures money. A Manhattan lawyer is more productive than one in Grand Forks because he or she bills more per hour. The argument that the Manhattan lawyer makes more "stuff" other than money is tenuous at best.
    • This one feels appropriate:

      Everything that can be invented has been invented.

      Charles H. Duell, Commissioner, U.S. patent office, 1899 (attributed)

      (And yes I know that he probably didn't actually say that. But I saw it on the Internet so it must be true.)
  • by techmuse ( 160085 ) on Monday November 27, 2006 @09:08AM (#17000380)
    It sounds like he just hasn't attempted to run Vista yet...
  • by kahei ( 466208 ) on Monday November 27, 2006 @09:10AM (#17000392) Homepage

    My local lawyer, for example, used to get about 20% of the town's law traffic 10 years ago. It's now computerized and processes far more documents and communications, at a far faster rate, than it ever used to. It still gets about 20% of the town's law traffic, as its competitors have upgraded in exactly the same way. The courts, of course, receive far more documents and messages from these lawyers than they ever used to, but the courts themselves have also computerized (just barely) and can handle the extra traffic.

    In terms of 'productivity', I'd think that the lawyers, paralegals, court administrators and so on have improved by 10 times. In terms of how much useful stuff gets done, it's exactly constant.

    So yeah, by all means integrate Google technology with your cornflakes to achieve a further tenfold increase in productivity. Go right ahead.

    In more important news, I currently have a co-worker who spends all day reading his friends' blogs (which doesn't bother me) and giggling over the witty posts he finds (which is driving me fucking mad). Can any slashdotters suggest a solution that will not result in jail or in me being considered 'not a team player'?

    • Re: (Score:2, Funny)

      by Anonymous Coward
      > In terms of 'productivity', I'd think that the lawyers, paralegals, court administrators and so on have improved by 10 times. In terms of how much useful stuff gets done, it's exactly constant.

      And in a law office, that constant is 0.
    • Re: (Score:2, Funny)

      by tjrehac ( 951304 )
      Solutions:
      1. Recommend him for a position somewhere else in the company or beyond...
      2. Tell him his laugh is a little distracting at times.
      3. Tell him that girl in marketing keeps asking about him - you've heard she wants him to ask her out, but that she'll probably turn him down the first ten or so times he asks.
      4. Enlist him in the National Guard.
      5. Add a link to a porn site in one of his friends' blogs - and turn him in when he follows it.
      6. Ask to be moved, or ask to have him moved.
      7. Pay to have a l
    • Hopefully he has some sort of supervisor you can report this to. He's wasting company time doing that. Unless you work somewhere that being a 'Team Player' is given more importance than a concept I call 'Getting Work Done' - in which case, you should be wasting your time on monster.com instead of slashdot.org.

      Or you can do what I do: Headphones + Viking Death Metal.

    • by bytesex ( 112972 )
      Start posting at his friends' blogs. A lot.
    • Write a slashdot story about his friends' blogs. Once they are slashdotted, he won't be able to access them, and that should stop the giggling.
    • If you're dealing w/ a person-interaction-limited profession, yes, but if it's just a function of sales, then no.

      For example, here at work, I created a several thousand line WordBASIC macro which 16 times a year crunches a ~200 pg. Word manuscript, setting about 90% of the text to have the correct style / formatting --- the remaining 10% has to be done by hand, but it's a quick search to find it, and I've created a button bar to make each style assignment a single click (why there isn't a default Word tool
  • Obligatory (Score:3, Insightful)

    by tttonyyy ( 726776 ) on Monday November 27, 2006 @09:11AM (#17000396) Homepage Journal
    I, for one, welcome our new tea and biccie munching AI overlords.

    Anyway, once we've invented AI that can do our jobs, the whole human race is pretty much redundant. Sounds like the next logical evolutionary step. They'll look back on us as The Flesh Age and perhaps keep a few of us as pets (or stuffed humans in a museum). Beyond that, our usefulness is exhausted.

    I love the smell of optimism burning in the morning.
    • Re: (Score:3, Interesting)

      by lawpoop ( 604919 )
      "Anyway, once we've invented AI that can do our jobs, the whole human race is pretty much redundant. "

      Unless that AI can self-replicate, our new jobs will be building and maintaining that AI.

      We are now in the situation you describe, except with machines and labor. It used to be that we toiled in the field with sticks and rakes, smacking oxen on the back to keep them moving. Now, we ride in air-conditioned cabs of giant combines, listening to satellite radio and resting our buttocks on a leather seat, wat
      • by qwijibo ( 101731 )
        The majority of the human race always has been and always will be redundant. In your example, it takes fewer people to operate the machines that produce more, but that is only possible now because the top 1% have created leather-seated combines and GPS satellites. There will continue to be improvements over time, but the majority of the work that needs to be done is going to be dull and repetitive. The sad part is how many companies have only recognized half of this equation and think the lowest common d
    • Re: (Score:3, Insightful)

      once we've invented AI that can do our jobs, the whole human race is pretty much redundant.

      Not quite. There are lots of things that we could use AI for to help us do our jobs better -- as technology is supposed to do for us in the first place. Think of a plow, or a tractor, or even the computer itself. How the hell do you think programming or systems administration was done before computers?
    • Anyway, once we've invented AI that can do our jobs, the whole human race is pretty much redundant.
      Except that the purpose of life is life itself, to replicate our genes, and the purpose of business is to allow us to impress our potential mates with what good specimens we are by defining our place in the social hierarchy. When machines can replace humans in every business endeavour, we'll replace business with something else; it'll become irrelevant.

       
  • by cucucu ( 953756 ) on Monday November 27, 2006 @09:12AM (#17000406)
    He states clearly that he is talking about hardware (not that I agree). He says himself that software can still bring improvements. From TFA:



    So if raw processing power, storage and bandwidth can't help, what will? What is it I need to leap forward by another factor of 10? In a word: intelligence. In two words: machine intelligence. I need something that monitors my activities, anticipates my next move and automatically satisfies my needs.



    I think the current trend in software is not intelligent software, but software that allows us to enlist our collective intelligence - collaboration software such as wikis, SharePoint, simultaneously edited spreadsheets, etc.
    The author of TFA makes heavy use of the word 'I'; he should start to think in terms of 'us', and install the software that lets him do so productively. Then he will see himself start to leave the stasis he feels he is in.
    • The author states: "I need something that monitors my activities, anticipates my next move and automatically satisfies my needs."

      He deserves a paperclip jabbed in his eye - or, even worse, somebody could turn on his MS Office assistant and unleash the fury of Clippy on his ass!
    • The flip side of productivity is the value gained. If my standard of living increases relative to my income, or stuff becomes cheaper, I gain. Likewise, stuff like Wikipedia represents an increase in value relative to an encyclopedia. We can argue the usefulness and accuracy later.

      There is an increasing amount of free valuable stuff created by people for next to nothing. I wouldn't want to be a publisher ten years from now, but anticipate huge shifts in how we assign value to effort and increases in
  • Little things get better and help productivity. A simple example: something like Spotlight. No matter what I'm looking for: Command-Space, type the first few letters of it, and it's there for me to use...
  • AFAIK, computers have not been shown to produce a 10-fold increase in productivity. Productivity has been increasing at slightly over 1% per year, and computer technology has been only a small part of that. It takes about 40 years or so for an invention to create a leap in productivity. This held true for the steam engine, electricity, the telephone, the fax machine, etc., and each one of them changed substantially between the time of its invention and the time of elegant use. My guess is that computer aided intelligence I
    • Ugh... should have read the parent more closely. The article says 10-fold per decade... which is about right.
    • by ThosLives ( 686517 ) on Monday November 27, 2006 @09:31AM (#17000516) Journal

      I'm just curious as to what is meant by 'productivity' anyway. I hate the numbers that are thrown around in the media. I want to see hard numbers like "bushels of produce per man-hour" and things like that - not something in silly relative units like dollars of economic activity (especially when a lot of economic activity is actually not 'productive' at all - for instance, selling a house in my mind is not productivity, but building a house is. Heck, if selling a house was 'productive', I could just keep selling a house back and forth between two parties and be the most productive real-estate agent in the universe - except that nothing actually changed. Note that I don't mean that selling a house isn't valuable; it's just not, in my mind, related to productivity).

      • Productivity is an economic term. It comes from the theory that governs how much stuff we all have - essentially, the change in the amount of stuff we have is the difference between the rate at which we can build stuff (called production) and the speed at which stuff breaks (called depreciation). The rate at which we can build stuff changes with 3 variables - the amount of stuff we have held back in investment (we could have made cars, but instead built car factories), the number of people around (as population increases,
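
        A minimal sketch of that accumulation story (purely illustrative: the textbook form in which the change in the capital stock is saved production minus depreciation, with made-up rates):

        s, delta = 0.25, 0.05  # saving rate and depreciation rate (made up)
        K = 10.0               # starting capital stock
        for year in range(100):
            Y = K ** (1.0 / 3.0)    # production: the rate at which we build stuff
            K += s * Y - delta * K  # minus the speed at which stuff breaks
        print("near the steady state: K = %.1f, Y = %.2f" % (K, K ** (1.0 / 3.0)))
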
      • by RevMike ( 632002 )

        a lot of economic activity is actually not 'productive' at all - for instance, selling a house in my mind is not productivity, but building a house is. Heck, if selling a house was 'productive', I could just keep selling a house back and forth between two parties and be the most productive real-estate agent in the universe - except that nothing actually changed.

        Selling a house can be productive, in that it helps two parties both better align their resources with their needs. By facilitating this alignmen

      • I agree 100%. Lost in all of these productivity stats is any attempt at differentiating between "value productivity" and "physical productivity". The former is how valuable my output is per unit time, while the latter is how many physically-measurable units I output per unit time. An example I like to give: "Between 1978 and 1998, I went from being able to make three pairs of bell-bottom pants in an hour to twelve pairs an hour. What happened to my productivity?" Well, I can produce more units per h
  • This is true to a degree; there is only so much that one person can do by themselves. Yes, there are tasks that can be completed by a single person, but it will get to the point where we will change as people, and the way that we work, and we will be able to collaborate better. Virtual teams and the like exist now, but we will still have a requirement for interaction. When mobile, although this is possible, it isn't as possible as face to face; this change will allow us to give even more output while movi
    • by pubjames ( 468013 ) on Monday November 27, 2006 @09:43AM (#17000604)
      Having been to one of Peter Cochrane's talks before, and having spoken to him, I know this guy is many years ahead of the rest of us.

      I've been to one of his talks as well. He is not years ahead of the rest of us, he is full of bollocks. Have you read one of BT's future predictions documents? (Which I believe come out of Cochrane's department) They are full of things like "in 20 years time, we will control computers with our minds, and we won't have lunch, we'll eat a pill!" If you find the stuff he says to be visionary, you don't have much imagination...
      • From what I saw of his talk 10 years ago to today, I can see a lot of things that haven't played out and a lot of things that have. WAP was one of them. The interfaces sucked, and that was brought up. Everyone is using wireless data (in all forms) to get access to their standard interfaces, essentially. Wireless is getting faster, and it's being used to replace a cable. That's not visionary at all; anyone could have seen that happening. We also know that the cable is always going to be faster, as it's a kno
        • by pubjames ( 468013 ) on Monday November 27, 2006 @10:34AM (#17001046)
          From this article on the BBC website [bbc.co.uk]:

          The latest technology timeline released by BT suggests hundreds of different inventions for the next few decades including:

                  * 2012: personal 'black boxes' record everything you do every day
                  * 2015: images beamed directly into your eyeballs
                  * 2017: first hotel in orbit
                  * 2020: artificial intelligence elected to parliament
                  * 2040: robots become mentally and physically superior to humans
                  * 2075 (at the earliest): time travel invented

          So, according to BT research, in 14 years' time we are going to have computers sitting in parliament, in 34 years' time there are going to be robots that are mentally superior to us, and I may see time travel invented in my lifetime. Sorry if I don't take this stuff seriously. Wasn't it fashionable to predict this kind of thing in the 1950s?

          Yes, some of their shorter term predictions are better, but I can make good short term predictions too.
  • Productivity (as reported by the BLS) is measured in dollars per hour per person. Since the Federal Reserve has the ability and the desire to expand the monetary supply without limit, productivity can likewise be increased without limit. Or, at least, as long as China keeps buying our treasury bonds...
  • Why (Score:5, Insightful)

    by teflaime ( 738532 ) on Monday November 27, 2006 @09:35AM (#17000536)
    do we need continued 10 fold increases in productivity? If we are a society that is going to require work from our citizens, then we need to provide work for our citizens to do. We only need increased productivity if we are, as a society, going to support at a reasonable level those persons who have been automated out of the work force and can't be retrained (and there are a lot of them). Business has a social obligation to support the societies that it parasitizes. Besides, if it doesn't support the society that it feeds off, soon it will have exhausted its food supply.
    • by tezza ( 539307 )
      You are right in that society only _really_ needs enough work to keep everyone going.

      People worry about productivity because the world economy is tied to it.

      Inflation is closely linked to productivity. If workers are more efficient, the cost of paying them is lower, which means that the costs of services are lower and inflation is held in check. Currently, it is looking as though productivity can no longer be relied on to keep inflation low... i.e. a recession/depression is coming.

      Also people like to

    • For a business to remain competitive it really does need to look at continual gains in productivity. A business also has a client base that grows, and a customer base to support. To support those users effectively you need IS. Try managing 1,000,000 users in an ISP environment without some form of automation. It's essentially impossible. It is impossible to do so while keeping a price that makes the product accessible. There unfortunately isn't a way around it.

      Think about the associated human costs, are you
    • I need continued productivity improvements to keep my job from being outsourced.

      William
    • by khallow ( 566160 )

      Business has a social obligation to support the societies that it parasitizes.

      And most businesses do. But a business can only support a parasitic society if it is competitive.
  • "In the twentieth century one did not have to be a pontificating pundit to predict that success would breed success and the nations that first were lucky enough to combine massive material resources with advanced knowhow would be those where social change would accelerate until it approximated the limit of what human beings can endure."

    John Brunner, The Shockwave Rider [wikipedia.org]
  • by f00Dave ( 251755 ) on Monday November 27, 2006 @09:44AM (#17000616) Homepage
    Sounds to me like the old "information overload" phenomenon. The solution-pattern to this situation is never going to be found via incremental improvements in information processing, as the growth is exponential. Nor will an "add-on" approach solve the problem; while hyperlinks, search engines, and other qualitatively-impressive tools are awesome in their own right (and do help!), they only add a layer or two to an information-growth process that adds layers supralinearly ... they're another "stop-gap measure", though they're also the best we've come up with, so far.

    So how to solve an unsolvable problem? Rephrase it! IMO, the problem isn't "too much information", as that's already been solved by the "biocomputer" we all watch the Simpsons with: our senses/brains already process "too much information" handily, but with lots of errors. No, the problem is that we're using the wrong approach to what we call "information" in the first place! We're rather fond of numbers (numeric forms of representation), as they've been around for some eight thousand years, and words (linear forms of representation) go back even farther. Pictures, music, etcetera store far more information (qualitative, structural forms of representation), but usually get mapped back to bitmaps, byte counts, and Shannon's information theory when this discussion starts. And that's the heart of it right there: everyone assumes that reducing (or mapping) everything to numbers is the only way to maintain objectivity, or measure (functional) quality.

    Here's a challenge: is there a natural way to measure the "information-organizing capability" of a system? Meaning some approach/algorithm/technique simple enough for a kid or grandparent to understand, that most human beings will agree on, and that puts humans above machines for such things as recognizing pictures of cats (without having to have "trained" the machine on a bajillion pictures first). [Grammars are a reasonable start, but you have to explain where the grammars come from in the first place, and what metric you want to use to optimize them.]

    A constant insistence/reliance on numeric measurements of accomplishment just ends up dehumanizing us, and doesn't spur the development of tools to deal with the root problem: the lack of automatic and natural organization of the "too much information" ocean we're sinking in. If we're not a little bit careful, we'll end up making things that are "good enough" -- perhaps an AI, perhaps brain augmentation, [insert Singularity thing here] -- as this is par for the course in evolutionary terms. But it's not the most efficient approach; we already have brains, let's use 'em to solve "unsolvable" problems by questioning our deep assumptions on occasion! :-)

    Disclaimer: the research group [cs.unb.ca] I work with (when not on "programming for profit" breaks, heh) is investigating one possible avenue in this general direction, a mathematical, structural language called ETS, which we hope will stimulate the growth of interest in alternative forms of information representation.
    • ---Sounds to me like the old "information overload" phenomenon. The solution-pattern to this situation is never going to be found via incremental improvements in information processing, as the growth is exponential. Nor will an "add-on" approach solve the problem; while hyperlinks, search engines, and other qualitatively-impressive tools are awesome in their own right (and do help!), they only add a layer or two to an information-growth process that adds layers supralinearly ... they're another "stop-gap me
  • I remember reading a column Peter Cochrane used to write in a newspaper many moons ago. IMO the man is definitely a paid-up member of the Kevin Warwick (Reading uni, "notorious" AI professor) Pie in the Sky Club, whereby they both take bog-standard science fiction topics that can be found in 101 paperback books written in the last 40 years, mix in a large amount of 1960s technology-can-solve-any-problem attitude, ignore any negative aspects or complicated social issues of what they're proposing, and then se
    • What?

      Much of our Sci-Fi has come true because WE aimed for it. Star Trek Comms anybody? They're called cell phones now. When will they link back to a desktop supercomputer? I figure it won't be that long.

      And look at how powerful our computers are now. They're amazing. I'd dare to claim that just one of our desktops is as powerful as the whole world of machines 20 years ago. How much more powerful do they have to become before they equal human intelligence? From that "Moore's Law" (which was an observation, not
      • Re: (Score:3, Insightful)

        by Viol8 ( 599362 )
        "Much of our Sci-Fi has come true because WE aimed for it. Star Trek Comms anybody? They're called cell phones now."

        Actually most of it hasn't; we just notice the stuff that has. What about antimatter-driven warp engines? Transporters?
        Anyway, Star Trek comms were little more advanced than the walkie-talkies that existed in the '60s!

        "Id dare to claim that just one of our desktops are just as powerful as the whole world of machines 20 years ago"

        I'm guessing you weren't around 20 years ago then. The supercom
  • Time to upgrade ourselves.

    (only a solution if you think the increase should continue)
  • Microsoft's "New World of Work" initiative is all about this. If you look beyond its short term goals of selling and deploying more Office Software, there is a very compeling vision of the future, with widescale automation of low-value tasks. There is an extremely cool BMW video around this, with not a single MS logo in sight, but some ultra-cool hardware (desks and walls that are montitors with optimal transparency) that makes "Minority Report" look terribly crude.

    Of course nobody can deliver on this today
  • but I see an SOA (Service-Oriented Architecture) solution to the problem. These building blocks will be used to scale the productivity of the developer. As more and more services like Flickr, Google Maps, and the like continue to provide key services to developers, our mashups will become more compelling. Just remember to make your mashup a service so that someone may build upon it when you are done.
  • Joel Spolsky [joelonsoftware.com] makes some good arguments about the best programmers being significantly more productive than the rest.

    In France, the government found that some surgeons were able to achieve 12x as many procedures as others at the same quality level[1]. This is the basis of the NHS reforms in the UK [bbc.co.uk], i.e. provide a system that encourages the 12x surgeons/other staff to succeed.

    So one way to increase productivity is to identify those 12x people, and find less demanding work for the <12ers. This is done by h

  • Cochrane? (Score:3, Funny)

    by Chemisor ( 97276 ) on Monday November 27, 2006 @09:59AM (#17000762)
    I'd like to see what his grandson Zefram has to say about this...
  • by Pedrito ( 94783 ) on Monday November 27, 2006 @10:03AM (#17000790)
    Sure, for most people, productivity isn't going to increase 10-fold. Hell, as a software engineer, I can't imagine getting 10 times as much done in the same period of time anytime soon. Faster computers won't help, and about the only thing that would speed up my productivity as a programmer is software that would write the code for me, putting me out of a job.

    There are a lot of people working in the sciences who think differently, though. Chemists, biologists, and physicists could all do well with not just smarter programs but faster computers. As a couple of simple examples: molecular mechanics modeling for chemists and protein folding modeling for biologists (particularly the latter, and the two are related) are insanely computationally intensive, and if computers were able to provide the results in 1/10th or 1/100th of the time, it would make a big difference in their ability to get things done. So I think it kind of depends what you do. I mean, let's face it, if you're a secretary, a faster word processor isn't going to make you 10 times more productive. Maybe a faster copier would help...
    • You spend 100% of your time coding?

      You don't spend any time searching for solutions to problems, dealing with customers?

      Not knowing your response, do you think by any chance that it would be possible to save time doing the above?
  • So, increase the limits.

    "Improve a mechanical device and you may double productivity. But improve man, you gain a thousandfold."
    -Khan Noonien Singh

    We've already got a good start on it. [wikipedia.org]
  • There are human limits on things like how many items we can simultaneously hold in short-term memory (~7) or how fast our brains work, but that doesn't equate to a limit on "productivity". The key here is chunking and levels of abstraction - we can overcome the limit on the number of items we can manipulate by "chunking" simpler items into groups that we then consider as a whole (e.g. memorize a phone number as 3 chunks vs 10 digits), and we can gain power in our thinking by thinking at a higher level in terms of more pow
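
    A toy sketch of that chunking point (the number is made up; the point is just that 3 chunks fit comfortably in a ~7-item buffer where 10 loose digits don't):

    digits = "5551234567"  # a hypothetical 10-digit phone number
    chunks = [digits[0:3], digits[3:6], digits[6:10]]
    print(len(digits), "loose digits vs", len(chunks), "chunks:", chunks)
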
  • We're at the point where pretty much everyone is familiar with e-mail and a web browser now. We're a long way from the end of productivity. When the regular office staff are able to run a query on the fly to get the data they need, rather than manually cutting and pasting from some predesigned job because the legacy systems don't interact, we might get closer.

    There is still a ton of manual busy work that could be automated or sped up in most corporations. There's a lot more that could be done.
  • by hey! ( 33014 ) on Monday November 27, 2006 @10:52AM (#17001250) Homepage Journal
    I've never seen or heard of anything like a blanket ten-fold increase in productivity come from the introduction of a new system or even a new technology. Perhaps certain tasks were sped up 10x, but the volume of revenue generated does not increase 10x. Of course there are cost reductions from staff reduction, but for some reason it seems rare to see large-scale downsizing as a result of introducing IT (as opposed to new manufacturing technologies or new business practices).

    Mostly we are talking about marginal improvements - although these are often not to be sneezed at. Margins are where competition takes place; they're where the difference between profitability and unprofitability, or positive and negative cash flow, is determined. For things that are done on massive scales, marginal improvements add up. But even doubling actual productivity?

    What IT mainly does is shift expectations. When I started work in the early 80s, business letters and memos were typed. Now we expect laser printed output or emails. A laser printed letter doesn't have 10x the business impact of a typed letter. An email gets there 1000x faster than an express package, but it seldom has 1000x the business impact when looked at from the standpoint of the economy as a whole. You only have to use email because your competition is using it as well, and you can't afford a competitive differential in speed.

    Many changes created by information technology are imponderable. For example, one great difference between the early 80s and today is that there are far fewer secretarial staff. Memos and letters used to be typed by specialists who were often responsible for filing as well. Now these tasks are mostly done by the author, arguably eliminating a staff position. On the other hand, the author spends much more time dealing with computer and network problems; not only is his time more expensive than secretarial time on a unit basis, he also needs the support of highly paid technical staff.

    Some technology-mediated changes are arguably negative: we used to set deadlines for projects based on the delivery time plus a margin for delivery. Now it's common for proposals and reports to be worked on up to the last possible minute.

    There are, no doubt, many substantial business savings created by new practices enabled by technology. Outsourcing, specialization, accurate and precise cost of sales, these are things that over time may have a huge impact.

    • I write data processing and graphical interpretation software for oil exploration. With it, the average worker can find drilling sites about a hundred times faster than before 1990 - it's more like they look at 20 times more data with one-fifth of the people. There have been similar gains on the "smart" drilling side, where rigs accomplish in a week what used to take a season. The consequence was that the US oil industry cut its workforce 75% in the past 20 years, a cut that puts the auto industry to shame.
      Much
  • :::Takes a glance around the office:::

    Hell, no.
  • our current end-objectives at the time.

    At a concept-expansion factor of 2.0000 (end = objective, current = at the time), we have reached the semantic Schwarzschild radius. Make yourselves ready and prepare to cross and traverse the meaning horizon!

  • by michaelmalak ( 91262 ) <michael@michaelmalak.com> on Monday November 27, 2006 @11:22AM (#17001670) Homepage
    IRTFA.

    Yes, productivity has peaked if we adhere to his non-standard assumptions and definitions. His first assumption is that everyone is like him. I assume he's a writer, which involves a higher ratio of higher thinking to mundane tasks than average. I see EAI and data processing put people out of work (or allow an existing team to process more data) every day in the business world.

    Even if we focus on his narrow world, he says that a better search engine would help his job. But he labels all such improvements as "machine intelligence" and declares them out of bounds for the point he's trying to make, which is that hardware alone will not improve his personal productivity. He's basically declaring all software improvements out of bounds in order to declare a "peak of productivity".

    Finally, I bet his productivity has improved since 2004 despite his protestations to the contrary. Wikipedia is much faster than a search engine for getting a neutral, concise summary and a handful of the most relevant links. Shall we take away the author's access to Wikipedia? He obviously doesn't need it.

  • Ah ha (Score:3, Funny)

    by proxy318 ( 944196 ) on Monday November 27, 2006 @11:28AM (#17001744)
    What we need is a cognitive approach with search material retreated and presented in some context relative to our current end-objectives at the time.
    Oh is that what we need? Maybe we can synergize our core-concepts to think outside of the box, thereby ensuring we work smarter not harder, and then we can leverage our paradigms to holistically obtain next generation perspective.

    Is it so damn hard to say "we need a new approach"?
    • by Viol8 ( 599362 )
      He's yet another idiot who seems to think that trawling a thesaurus and using every alternative word and phrase he can find will somehow give his ideas more gravitas. To me it actually says style over substance, but each to their own.
  • I forget - what exactly were we trying to accomplish, such that a slowdown in productivity growth is a problem?

    I was never too keen on helping McDonald's require fewer people in the production of Happy Meal toys, and I'm not too sure I want AK-47 production (or M-16 production, for that matter) to be much cheaper either.
  • What we need is a cognitive approach with search material retreated and presented in some context relative to our current end-objectives at the time.

    What the hell am I supposed to do with that statement? If this is the mindset that will produce the next generation of "thinking machines" then sign me up for the Butlerian Jihad.

  • by srobert ( 4099 ) on Monday November 27, 2006 @12:28PM (#17002636)
    If productivity per man-hour has increased so much, then why the hell are we still working over 40 hours a week? Where is all this new wealth accruing? Why am I working more hours with a college degree to have a lower living standard than my father had 40 years ago? And he didn't even graduate from high school. We should have been on a 32 hour standard workweek many years ago.
    • Re: (Score:3, Insightful)

      by zCyl ( 14362 )
      You have a laptop, internet access, a flat-screen TV, and a microwave (or at least you COULD have these things). So you've traded your shorter work week for more toys than your father had back then.

      If you were willing to give up all these advanced toys which have now become ordinary, you might be able to get by just fine on the salary from a shorter workweek.
  • Teleworking (Score:4, Insightful)

    by wikinerd ( 809585 ) on Monday November 27, 2006 @12:58PM (#17003122) Journal
    I'm studying for an MSc in Management, and my Management books say it clearly: telecommuting and teleworking increase employee productivity by at least 20%, without exception, if implemented right. This is what we learn at a government-funded university. Therefore productivity, at least in business, has not peaked, as most businesses still require you to lose 3 hours commuting to a cubicle farm, where you sit all day in front of a computer similar to the one(s) you have at home, often doing exactly the same things (programming and Slashdot), only at a different place. It's crazy.
