Dr. Damage writes: The TSX instructions built into Intel's Haswell CPU cores haven't become widely used by everyday software just yet, but they promise to make certain types of multithreaded applications run much faster than they can today. Some of the savviest software developers are likely building TSX-enabled software right about now.
Unfortunately, that work may have to come to a halt, thanks to a bug—or "erratum," as Intel prefers to call it—in Haswell's TSX implementation that can cause critical software failures. To work around the problem, Intel will disable TSX via microcode in its current CPUs, and in early Broadwell processors as well.
Dr. Damage writes: AMD introduced its Mantle API for graphics in order to reduce CPU overhead and to create a means of programming that better maps to modern GPU hardware. Now, it appears Microsoft and the industry forces that collaboratively drive OpenGL may be moving to offer similar capabilities. Several GDC session listings suggest that a new version of Direct3D and "zero driver overhead" OpenGL will be introduced at the Game Developers Conference next month. The "future improvements in Direct3D" promise "an unprecedented level of hardware control and reduced CPU rendering overhead."
Dr. Damage writes: "AMD's next-gen Bulldozer architecture hasn't performed up to expectations: it's relatively power-hungry and delivers weak performance, especially in desktop-style applications that require strong single-thread performance. Fortunately, AMD revealed today at Hot Chips that it is working to improve single-thread performance and energy efficiency with Steamroller, an upcoming architectural refresh."
Dr. Damage writes: "Nvidia continues to fill out its lineup of Kepler-based graphics products. Today, it plugs the hole at $299 with the GeForce GTX 660 Ti, which is based on the same GK104 GPU as the $499 GTX 680, only with a few functional units disabled. The Tech Report's review of the GTX 660 Ti goes beyond average FPS numbers, looking at frame latencies to get a better sense of hiccups and stutters that interrupt smooth gameplay. The verdict? The 660 Ti offers roughly equivalent performance to the Radeon HD 7950, a card that recently gained clock boosting firmware but costs $50 more than the GeForce. The 660 Ti is quieter than its Radeon counterpart, and it consumes less power when playing games. As the conclusion notes, though, any of the cards in the GeForce's price range is more than adequate given the demands of today's games."
Dr. Damage writes: Nvidia first unveiled the more expensive graphics cards in its new GeForce lineup, but today, the GeForce GTX 670 arrives, and The Tech Report says there's no reason to buy anything else. They prove it by driving a six-megapixel, triple-monitor array competently with a single video card and measuring performance using some intriguing, latency-focused metrics.
Dr. Damage writes: Nvidia's new GeForce GTX 590 poses an interesting question to the subset of folks who buy $700 dual-GPU graphics cards: does performance rank above all else, or do considerations like board size and noise levels matter more? This latest high-end GeForce isn't quite as fast as AMD's similarly outrageous Radeon HD 6990, but it's smaller and substantially quieter. Based on the numbers, the Radeon's louder fan may be easier to hear than the card's slightly higher frame rates are to see.
Dr. Damage writes: Less than two years after introducing its quad-core Core i7 processors, Intel will soon unveil a six-core CPU for the desktop that works as a drop-in replacement for older Core i7-900-series parts. The first previews of the six-core "Gulftown" reveal a chip with 50% more cores and cache that fits into the same silicon area and power/thermal envelope as the quad-core it replaces. Performance in multi-threaded applications scales up nicely, but clock speeds, and thus single-threaded performance, remain the same. Do we really need six cores on the desktop? That depends, it would seem, on what you do with your computer.
Dr. Damage writes: AMD has a new $239 graphics card out, the Radeon HD 5830, that might be a good upgrade for some folks. What if you're upgrading from a graphics card in that same price range that's two, three, or four years old? How much of an improvement can you expect? And is the new Radeon a good value for the money? This review compares it, and a host of today's other graphics cards, to products dating back as far as four years, including a couple of GeForce 7900s.
Dr. Damage writes: How do current $74 CPUs compare to the $133 ones? To exclusive $1K Extreme Editions? Interesting questions, but what if you took a five-year-old Pentium 4 3.8GHz and pitted it against today's CPUs in a slew of games and other applications? The results are eye-opening.