I used to think that there is a threshold, a stage of development, after which machines would start developing so fast that it would modify Moore's Law: instead of "every two years" it would be "every two minutes".... Now I think there is a problem with that approach. The efficiency of computing resources is dropping. I am writing this message on a computer that has maybe ten times the storage and computing power my whole school had back in 1996, when I graduated. I used to be responsible for such statistics there: 13 k undergrads, 4 k students at master's level and above, and maybe 300 full professors, with lots of lower-level staff and assistants, were both doing scientific research and using those machines for everything from the Internet to games. At the moment, aside from writing to
/. the most serious thing I use this computer for is reading mail, and some gaming.
In short, we are producing and using a serious amount of IT resources, but they are not being utilized as efficiently as was expected in the past....
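Just to put the "every two minutes" scenario in perspective, here is a quick back-of-the-envelope sketch (my own illustration, not from any source) of how many doublings fit in a single year under the classic Moore's Law cadence versus that hypothetical one:

```python
# Doublings per year under two cadences:
# classic Moore's Law: one doubling every two years
# hypothetical runaway: one doubling every two minutes
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

doublings_classic = 1 / 2                 # 0.5 doublings per year
doublings_fast = MINUTES_PER_YEAR / 2     # 262,800 doublings per year

print(doublings_classic)  # 0.5
print(doublings_fast)     # 262800.0
```

Capacity after one year of the fast scenario would be 2^262800, which is an absurdly large number; the point being that the bottleneck was never raw capacity, but how efficiently we use it.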