Phopojijo writes: While it looks like multi-GPU support in DirectX 12 will take off, much of this functionality has been exposed to game developers for years. Previously, it was held back by the small number of machines with mismatched graphics hardware, the lack of compute shader support in consoles, and the difficulty of using OpenCL in a shipping game. While the editorial elaborates on each of these points, it doesn't mention Explicit Linked Multiadapter in DirectX 12, which lets game developers make the same sorts of assumptions that AMD and NVIDIA make for CrossFire and SLI, respectively, making their job even easier.
JoshMST writes: PCPer was able to conduct an in-depth interview with the Game Lead for the new DiRT Rally simulation. The interview is more technical than most when it comes to what makes the engine tick, where they are going, and how that relates to the user experience.
Vigile writes: NVIDIA announced its latest dual-GPU flagship card, the GeForce GTX Titan Z, at the GPU Technology Conference in late March with a staggering price point of $2999. Since that time, AMD announced and released the Radeon R9 295X2, its own dual-GPU card with a price tag of $1499. PC Perspective finally put the GTX Titan Z to the test and found that, from a PC gamer's view, the card is way overpriced for the performance it offers. At both 2560x1440 and 3840x2160 (4K), the R9 295X2 offered higher and more consistent frame rates, sometimes by as much as 30%. The AMD card also takes up only two slots (though it does have a water-cooling radiator to worry about), while the NVIDIA GTX Titan Z is a three-slot design. The Titan Z is quieter and uses much less power, but gamers considering a $1500 or $3000 graphics card are likely not overly concerned with power efficiency.
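The price-to-performance gap can be put in rough numbers. A back-of-the-envelope sketch, using the article's prices and its "as much as 30%" figure (the absolute frame rate below is a hypothetical placeholder, not a measured result):

```python
# Rough value comparison using the article's prices.
# The 40 fps baseline is an illustrative assumption, not a benchmark number.
titan_z_price = 2999.0   # USD, NVIDIA GeForce GTX Titan Z
r9_295x2_price = 1499.0  # USD, AMD Radeon R9 295X2

titan_z_fps = 40.0                  # hypothetical 4K average
r9_295x2_fps = titan_z_fps * 1.30   # "as much as 30%" higher

# Normalize to frames per second per $1000 spent.
value_titan = titan_z_fps / titan_z_price * 1000
value_amd = r9_295x2_fps / r9_295x2_price * 1000

print(f"Titan Z : {value_titan:.1f} fps per $1000")
print(f"295X2   : {value_amd:.1f} fps per $1000")
```

Because the baseline frame rate cancels out of the ratio, any starting number gives the same conclusion: under these assumptions the AMD card delivers roughly 2.6 times the frames per dollar.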
Phopojijo writes: Guennadi Riguer, chief architect of Mantle at AMD, answered a few questions about the technical details of their new graphics API. Of particular note, he discussed the potential for game developers to load balance across mismatched Mantle-supporting GPUs (for example, if an end user purchased a new video card and installed it alongside their old one). He also discussed how the graphics pipeline is evolving and the possibility of fixed-function hardware doing the same.
Phopojijo writes: So, you can encrypt your password library using a client-side manager or an encrypted file container. You could practice your password every day, keep no written record, and do everything else right. You then go in for a serious operation or get into a terrible accident and, when you wake up, suffer severe memory loss. Slashdot readers, what do you consider an acceptable trade-off between proper security and preventing a data-loss catastrophe? I will leave some details and assumptions up to interpretation (budget, whether you have friends or co-workers to rely on, whether your solution must defend against the government, chance of success, and so forth). For instance, would you split your master password into pieces and pay an attorney to contact you with one of them in case of emergency? Would you get a safe deposit box? Some biometric device? Leave the password with your husband, wife, or significant other? What can Slashdot come up with?
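The split-the-password idea can be sketched with XOR secret splitting, where the secret is divided into n shares and every share is required to reconstruct it. This is a minimal illustration, not a vetted implementation; the function names are my own, and a real k-of-n scheme such as Shamir's secret sharing would tolerate losing a share:

```python
import secrets

def split_secret(secret: bytes, n: int) -> list[bytes]:
    """Split `secret` into n shares; ALL n are needed to recover it."""
    # n-1 shares are pure random noise...
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    # ...and the last share is the secret XORed with all of them.
    final = secret
    for share in shares:
        final = bytes(a ^ b for a, b in zip(final, share))
    return shares + [final]

def recover_secret(shares: list[bytes]) -> bytes:
    """XOR all shares back together to reconstruct the secret."""
    out = bytes(len(shares[0]))
    for share in shares:
        out = bytes(a ^ b for a, b in zip(out, share))
    return out

parts = split_secret(b"correct horse battery staple", 3)
assert recover_secret(parts) == b"correct horse battery staple"
# Any subset of fewer than n shares looks like uniform random bytes.
```

The trade-off is the one posed above: no single holder (attorney, spouse, safe deposit box) learns the password, but losing any one share loses the secret forever.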
DavidGilbert99 writes: Within the next 10-12 years, the chips powering our PCs will have more transistors on them than our brains have neurons (that's around 100 billion, in case you were wondering). However, Intel's Mooly Eden told an audience at CES 2014 that adding more transistors alone won't make computing better; to do that, we need to make computing more natural, intuitive, and immersive.
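The 10-12 year figure is consistent with simple Moore's-law arithmetic. A sketch under stated assumptions (the starting transistor count and doubling period below are rough illustrative values, not from the article):

```python
# Back-of-the-envelope Moore's-law projection.
# Assumptions: ~4 billion transistors on a high-end 2014 CPU,
# doubling every two years. Both numbers are illustrative.
neurons_in_brain = 100e9        # ~100 billion, as cited
transistors_2014 = 4e9
doubling_period_years = 2.0

years = 0.0
count = transistors_2014
while count < neurons_in_brain:
    years += doubling_period_years
    count *= 2

print(f"~{years:.0f} years to pass 100 billion transistors")
```

With these assumptions the crossover lands about a decade out, right at the low end of the 10-12 year window; a slower doubling cadence pushes it toward the high end.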
Lasrick writes: This is a great read. As the author writes: 'Today, emerging military technologies—including unmanned aerial vehicles, directed-energy weapons, lethal autonomous robots, and cyber weapons—raise the prospect of upheavals in military practice so fundamental that they challenge assumptions underlying long-established international laws of war, particularly those relating to the primacy of the state and the geographic bounds of warfare. But the laws of war have been developed over a long period, with commentary and input from many cultures.' What would seem appropriate in this age of extraordinary technological change, the author concludes, is a reconsideration of the laws of war in a deliberate and focused international dialogue that includes a range of cultural and institutional perspectives.
mikejuk writes: Bribe.io announces itself as "a super easy way to bribe developers to fix bugs and add features in the software you're using." Recognizing that a lot of open source projects are maintained by developers working alone and in their spare time, the idea is to encourage users to pitch in by attaching a monetary value to a bug report or feature enhancement. Once an initial "Bribe" has been posted, others can chip in and add to the financial incentive. Obviously there are problems to overcome: will it lead to devs introducing bugs alongside new features just to get paid to fix them? And how does this fit with the underlying ethos of open source software? I can hear RMS already....
Phopojijo writes: The recently released AMD Radeon R9 290X has an advertised shader clock rate of "up to 1 GHz". The card brought formerly $1000-level performance down to a $550 price point. Its benchmarks tend to fluctuate wildly, however, based on the card's ability to hold its intended maximum temperature of 95C. Testing across a variety of fan speeds suggests that, at AMD's default settings, the card behaves as if it had a 727 MHz base clock with an average boost to 850-880 MHz. At these defaults, the card will not maintain 1 GHz for more than a couple of minutes, if that.
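The behavior described, full boost until the temperature target is hit and then a slide toward base clock, can be pictured with a toy model. All constants here are hypothetical illustrations; this is not AMD's actual PowerTune control loop:

```python
# Toy model of temperature-target boost throttling.
# Clocks are the article's figures; the 50 MHz/degree slope is invented.
BASE_MHZ, BOOST_MHZ = 727, 1000
TEMP_TARGET_C = 95.0

def clock_at(temp_c: float) -> int:
    """Run at full boost below the target; back off above it."""
    if temp_c < TEMP_TARGET_C:
        return BOOST_MHZ
    # Each degree over target sheds ~50 MHz, floored at the base clock.
    over = temp_c - TEMP_TARGET_C
    return max(BASE_MHZ, BOOST_MHZ - int(over * 50))

# A warmed-up card hovering around its target settles between base and boost,
# which is why sustained benchmark clocks land well under the advertised 1 GHz.
samples = [clock_at(t) for t in (80, 94, 95, 96, 97, 100)]
print(samples)
```

The point of the sketch is that "up to 1 GHz" is the cold-card ceiling: once the card reaches its thermal target, the sustained clock is set by cooling (fan speed, case airflow), not by the advertised maximum.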