Electric shock in games is not new. See the Tekken Torture Tournament from 2001: http://www.eddostern.com/tekken_torture_tournament.html
A capitalist economy partly guards against oversupply. In this case, however, the oversupply is a direct result of Chinese industrial policy: http://www.nytimes.com/2012/10/05/business/global/glut-of-solar-panels-is-a-new-test-for-china.html
Now both American and Chinese solar companies are failing. Further private investment in this oversupplied market seems unwise, and there is a distaste in the US for subsidizing failed business models (at least where green tech is concerned). Perhaps university research is the best alternative investment.
With respect to throughput and multitasking, your desktop OS may be better. In theory, a focused game OS can take steps to reduce worst-case latency (real-time OS techniques) and optimize common operations for game workloads (game-tuned memory allocators, perhaps?). Unfortunately, console makers are very secretive about how their OSes are designed and implemented. I would be interested to hear from anyone who is familiar with modern game OS development. Is there any secret sauce?
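For what it's worth, here's the sort of thing I mean by a game-tuned allocator: a fixed-size pool. This is a minimal sketch of my own, not anything a console vendor has confirmed; the point is that alloc and free become O(1) pointer swaps with no locks and no syscalls, which keeps worst-case latency predictable:

    #include <cassert>
    #include <cstddef>
    #include <vector>

    // Minimal fixed-size pool: every allocation is `block_size` bytes,
    // carved out of one up-front slab, with a free list threaded
    // through the unused blocks themselves.
    class FixedPool {
    public:
        FixedPool(std::size_t block_size, std::size_t count)
            : storage_(block_size * count) {
            assert(block_size >= sizeof(void*)); // need room for the link
            // Thread the free list through the slab.
            for (std::size_t i = 0; i < count; ++i) {
                void* block = storage_.data() + i * block_size;
                *static_cast<void**>(block) = free_;
                free_ = block;
            }
        }
        void* alloc() {                  // O(1), no syscall
            if (!free_) return nullptr;  // pool exhausted
            void* block = free_;
            free_ = *static_cast<void**>(block);
            return block;
        }
        void free(void* block) {         // O(1), no syscall
            *static_cast<void**>(block) = free_;
            free_ = block;
        }
    private:
        std::vector<unsigned char> storage_;
        void* free_ = nullptr;
    };

The trade-off is rigidity: one size class per pool. That suits games, where the hot object types (particles, projectiles, audio voices) have sizes known up front.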
Maybe this isn't a solution for FPS games, but I would love to be able to play Civilization V from the cloud with all the graphical bells and whistles.
Kind of makes me wonder why Slashdot almost never links to the REAL articles and instead just links to some fancy news sites with secondhand information.
Maybe because of the paywall?
The thief will have to sell the goods on the black walnut market.
Normally I would agree, but I must disagree in this case. The vast majority of people in the U.S. are scientifically illiterate and easily swayed by sensational headlines. (For example, last week Slashdot posted a story on how the background radiation in Fukushima is less than that of Denver, yet people panic over radiation exposure in Japan but not in Colorado.) I worry that a similar backlash against GM crops could negatively affect the world's food supply.
While we can disparage crops that have been crafted to withstand copious amounts of insecticide, please keep in mind that there are 7 billion people on the planet, and all of them need to be fed. Much of the world depends upon the United States' agricultural output, and GM helps boost that output. While the American consumer can withstand a few cents' increase in cost due to a decreased food supply, the same increase can trigger food riots in less fortunate countries. If the United States' agricultural output is enhanced by GM, then I'm all for it. I worry that shunning GM food in the US could hurt further investment and development.
Yes, and the rover could have sent "I'm on a boat!" for its first message home.
CUDA was released, with support on NVIDIA GPUs, in early 2007. The first OpenCL specification was not released until late 2008, so OpenCL has not been around for 4 years, as you claim. As for which is more popular, I'm afraid you have that backwards too. The dominant market force for GPU computing is supercomputing. How many of the top 5 supercomputers use AMD GPUs? Zero. How many use NVIDIA GPUs? Three. And they're all using CUDA because it's more feature-rich: it can do fancy things like direct memory copies between InfiniBand interconnects and GPU memory.
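To make that last point concrete: the InfiniBand path (GPUDirect) needs an RDMA-capable driver stack, but a related CUDA-only feature in the same family is peer-to-peer copies between GPUs that bypass host RAM entirely. A rough sketch (my own toy example, error handling omitted):

    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        int can_access = 0;
        // Ask whether device 0 can read/write device 1's memory directly.
        cudaDeviceCanAccessPeer(&can_access, 0, 1);
        if (!can_access) { printf("no peer access\n"); return 1; }

        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(1, 0);     // flags must be 0

        float *src, *dst;
        cudaSetDevice(1);
        cudaMalloc((void**)&src, 1 << 20);    // 1 MiB on device 1
        cudaSetDevice(0);
        cudaMalloc((void**)&dst, 1 << 20);    // 1 MiB on device 0

        // Device-to-device copy over the PCIe fabric,
        // never staged through host memory.
        cudaMemcpyPeer(dst, 0, src, 1, 1 << 20);
        cudaDeviceSynchronize();
        return 0;
    }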
FYI: OpenCL on NVIDIA is implemented on top of CUDA, so you're still using CUDA if you're using OpenCL on NVIDIA.
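You can even see this from the host side: NVIDIA's OpenCL implementation reports its platform name as "NVIDIA CUDA". A quick check using the standard OpenCL host API:

    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    int main() {
        cl_uint n = 0;
        clGetPlatformIDs(0, nullptr, &n);            // count platforms
        std::vector<cl_platform_id> ids(n);
        clGetPlatformIDs(n, ids.data(), nullptr);
        for (cl_uint i = 0; i < n; ++i) {
            char name[256] = {0};
            clGetPlatformInfo(ids[i], CL_PLATFORM_NAME,
                              sizeof(name), name, nullptr);
            printf("platform %u: %s\n", i, name);    // "NVIDIA CUDA" on NVIDIA
        }
        return 0;
    }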
Surprise, surprise: I have the feeling that most of you haven't actually read the article. The article is not arguing that GPUs are inherently flawed, and it is not an NVIDIA-vs-AMD competition. Rather, the author tests transcoding software on each platform, and it's the software that is bad, not the GPUs themselves. For instance, the NVIDIA GPU does quite well with Arcsoft and Xilisoft; that wouldn't be possible if GPUs were somehow broken for transcoding. After all, as others have pointed out here, floating-point support on modern GPUs is actually quite good.
Still, poor software shouldn't come as too much of a surprise. While CUDA and OpenCL certainly make GPU computing easier, it is still a relatively new technology that only a few programmers know how to use efficiently. I'm also not sure the market pressure from consumers for efficient GPU-based applications is there yet (how many of them even know what a GPU is?).
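As a toy example of what "using it efficiently" means (my own illustration, not from the article): memory-access patterns dominate GPU performance. These two kernels do comparable per-thread work, but in the first, consecutive threads touch consecutive elements (coalesced), while in the second they stride, which can cut effective memory bandwidth by an order of magnitude:

    #include <cuda_runtime.h>

    // Coalesced: consecutive threads read consecutive floats, so each
    // warp's 32 loads collapse into a few wide memory transactions.
    __global__ void scale_coalesced(float* x, int n, float a) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) x[i] *= a;
    }

    // Strided: consecutive threads read floats `stride` apart, so the
    // same warp can issue up to 32 separate memory transactions.
    __global__ void scale_strided(float* x, int n, float a, int stride) {
        int i = (blockIdx.x * blockDim.x + threadIdx.x) * stride;
        if (i < n) x[i] *= a;
    }

    int main() {
        const int n = 1 << 20;
        float* x;
        cudaMalloc((void**)&x, n * sizeof(float));
        scale_coalesced<<<(n + 255) / 256, 256>>>(x, n, 2.0f);
        scale_strided<<<(n + 255) / 256, 256>>>(x, n, 2.0f, 32);
        cudaDeviceSynchronize();
        cudaFree(x);
        return 0;
    }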
I suppose this is an improvement over a design from another Dutch firm for residential towers in South Korea: http://www.dailymail.co.uk/news/article-2072308/MVRDV-architects-reveal-plans-South-Korean-buildings-look-eerily-like-Twin-Towers-exploding.html
SLI is absolutely useless for CUDA-based GPGPU (and Cray's machines use NVIDIA GPUs). SLI exists to split rendering work across cards; CUDA sees each GPU as a separate device and addresses it directly.
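A minimal sketch of what I mean (my own illustration, not Cray's actual software stack): a CUDA app drives multiple GPUs by selecting each device explicitly, and SLI never enters the picture:

    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void work(float* x) { x[threadIdx.x] += 1.0f; }

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);      // every GPU shows up individually
        for (int d = 0; d < count; ++d) {
            cudaSetDevice(d);            // all later calls target device d
            float* buf;
            cudaMalloc((void**)&buf, 256 * sizeof(float));
            cudaMemset(buf, 0, 256 * sizeof(float));
            work<<<1, 256>>>(buf);       // independent kernel per GPU
            cudaDeviceSynchronize();     // wait before freeing
            cudaFree(buf);
        }
        printf("ran a kernel on each of %d GPUs\n", count);
        return 0;
    }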