Comment paywalled. (Score 2)
Kind of makes me wonder why slashdot almost never links the REAL articles and instead just links some fancy news sites with second hand information.
Maybe because of the paywall?
The thief will have to sell the goods on the black walnut market.
Normally, I would agree, but I must disagree in this case. The vast majority of people in the U.S. are science-illiterate and easily swayed by sensational headlines (for example, last week slashdot posted a story on how the background radiation in Fukushima is less than that of Denver, yet people panic over radiation exposure in Japan, but not Colorado). I worry that a similar backlash against GM crops could negatively affect the world's food supply.
While we can disparage crops that have been crafted to withstand copious amounts of insecticide, please keep in mind that there are 7 billion people on the planet, and all of them need to be fed. Much of the world depends on the United States' agricultural output, and GM helps boost that output. While the American consumer can absorb a few cents' increase in cost due to a decreased food supply, the same increase can trigger food riots in less fortunate countries. If the United States' agricultural output is enhanced by GM, then I'm all for it. I worry that shunning GM food in the US could hurt further investment and development.
Yes, and the rover could have sent "I'm on a boat!" for its first message home.
Jim.
CUDA was released, supported by NVIDIA GPUs, in early 2007. The first OpenCL specification was not released until late 2008 (OpenCL has not been around for 4 years, as you claim). As for which is more popular, I'm afraid you have that backwards too. The dominant market force for GPU computing is supercomputing. How many of the top 5 supercomputers use AMD GPUs? Zero. How many use NVIDIA GPUs? Three. And they're all using CUDA because it's more feature-rich: it can do fancy things like direct memory copies between InfiniBand interconnects and GPU memory.
FYI: OpenCL on NVIDIA is implemented on top of CUDA, so you're still using CUDA if you're using OpenCL on NVIDIA.
Surprise, surprise, I have the feeling that most of you haven't actually read the article. The article is not arguing that GPUs are inherently flawed. Also, the article is not an NVIDIA-vs-AMD competition. Rather, the author tests software on each platform. It's the software that is bad, not the GPUs themselves. For instance, the NVIDIA GPU does quite well with Arcsoft and Xilisoft; this wouldn't be possible if GPUs were somehow broken for transcoding. After all, as others have pointed out here, floating point support is actually quite good on modern GPUs.
Still, poor software shouldn't come as too much of a surprise. While CUDA and OpenCL certainly make GPU-based computing easier, it is still a relatively new technology that only a few programmers know how to use efficiently. I'm also not sure that the market pressure from consumers for efficient GPU-based applications is there yet (how many of them actually know what a GPU is?).
I suppose this is an improvement over a design from another Dutch firm for residential towers in South Korea: http://www.dailymail.co.uk/news/article-2072308/MVRDV-architects-reveal-plans-South-Korean-buildings-look-eerily-like-Twin-Towers-exploding.html
SLI is absolutely useless for CUDA-based GPGPU (Cray's systems use NVIDIA GPUs).
Well, Apple, it looks like you'll be the last major OS still running a terribly out of date file system. Ditch HFS+!
I think the time of the PS3 clusters has passed. The Cell processor was released back in 2006! IBM released a few upgraded processors, mostly improving double-precision performance, but those systems are really cost-prohibitive.
Assuming you can deal with PCIe latency, GPUs are the way to go.
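To make the "deal with PCIe latency" point concrete, here's a minimal CUDA sketch of the standard mitigation: splitting the work into chunks and using pinned host memory plus multiple streams so transfers overlap with compute. The names (process_chunk, N, CHUNKS) are illustrative, not from the thread, and error checking is omitted for brevity.

```cuda
#include <cuda_runtime.h>

// Stand-in kernel for real per-element work.
__global__ void process_chunk(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

int main() {
    const int N = 1 << 20, CHUNKS = 4, CHUNK = N / CHUNKS;
    float *h, *d;
    cudaMallocHost(&h, N * sizeof(float));  // pinned memory enables async copies
    cudaMalloc(&d, N * sizeof(float));

    cudaStream_t s[CHUNKS];
    for (int c = 0; c < CHUNKS; ++c) cudaStreamCreate(&s[c]);

    for (int c = 0; c < CHUNKS; ++c) {
        int off = c * CHUNK;
        // Each stream's copy-in, kernel, and copy-out overlap with the
        // other streams' work, hiding much of the PCIe transfer latency.
        cudaMemcpyAsync(d + off, h + off, CHUNK * sizeof(float),
                        cudaMemcpyHostToDevice, s[c]);
        process_chunk<<<(CHUNK + 255) / 256, 256, 0, s[c]>>>(d + off, CHUNK);
        cudaMemcpyAsync(h + off, d + off, CHUNK * sizeof(float),
                        cudaMemcpyDeviceToHost, s[c]);
    }
    cudaDeviceSynchronize();

    for (int c = 0; c < CHUNKS; ++c) cudaStreamDestroy(&s[c]);
    cudaFreeHost(h);
    cudaFree(d);
    return 0;
}
```

Whether the overlap fully hides the latency depends on the ratio of compute time to transfer time per chunk; for transfer-bound workloads, PCIe bandwidth remains the ceiling.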
"Only the hypocrite is really rotten to the core." -- Hannah Arendt.