Phopojijo writes: The recently released AMD Radeon R9 290X has an advertised shader clock rate of "up to 1GHz". The card brought formerly $1000-level performance down to a $550 price point. Its benchmark results fluctuate wildly, however, depending on the card's ability to stay at its intended maximum temperature of 95C. Analysis across a variety of fan speeds shows that AMD's default settings behave like a 727 MHz base clock with an average boost to 850-880 MHz. At these defaults, the card cannot sustain 1GHz for more than a couple of minutes, if that.
Vigile writes: AMD is releasing its fastest single GPU graphics card today, the $549 R9 290X based on a new, 6.2 billion transistor GPU called Hawaii. The brand new part packs 2,816 stream processors and has a peak theoretical performance of 5.6 TFLOPS. PC Perspective has done a full round of testing on the card to see where it stacks up, and it does in fact beat the GeForce GTX 780, a card that costs $100 more. In fact, it also compares well to the $999 GTX TITAN flagship. Perhaps more interesting is the completely redesigned CrossFire integration that no longer uses a bridge and fixes the CrossFire + Eyefinity/4K pacing issues that have plagued AMD for some time. As it turns out, with this new hardware, 4K tiled display CrossFire appears to be corrected.
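The 5.6 TFLOPS figure follows directly from the published specifications. A quick back-of-the-envelope check, assuming the standard convention that each stream processor retires one fused multiply-add (two floating-point operations) per clock at the advertised 1 GHz peak:

```python
# Sanity-check Hawaii's quoted peak of 5.6 TFLOPS from its spec sheet.
# Assumes one FMA (2 FLOPs) per stream processor per clock at 1 GHz,
# the usual way GPU vendors quote peak throughput.
stream_processors = 2816
flops_per_clock = 2          # one fused multiply-add = 2 ops
clock_hz = 1.0e9             # advertised "up to 1 GHz"

peak_flops = stream_processors * flops_per_clock * clock_hz
print(f"{peak_flops / 1e12:.3f} TFLOPS")  # → 5.632 TFLOPS
```

The exact result, 5.632 TFLOPS, rounds to the 5.6 TFLOPS AMD advertises.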
JoshMST writes: So why are we in the middle of GPU-renaming hell? AMD may be releasing a new Hawaii chip in the next few days, but it is still built on the same 28 nm process that the original HD 7970 debuted on nearly two years ago. Quick and easy (relative terms) process node transitions look to be a thing of the past, with 20 nm lines suitable for large ASICs not opening until mid-2014. This editorial covers the issues we have already seen, those present today, and those that will show up in the years to come. It is amazing how far the industry has come in the past 18 years, but the challenges ahead are greater than ever.
Vigile writes: A new system for testing performance of graphics cards that has been in the works for over a calendar year is being fully unveiled today, called Frame Rating. This technology uses hardware-based capture to record the output from the graphics card and GPU directly and then uses post processing to measure performance and experiences as the user would see them, not through basic logs recorded on the gaming system itself. This much more accurate representation of performance has revealed some interesting highs and some unfortunate lows for graphics vendors already. AMD's CrossFire and Eyefinity technologies take the brunt of the damage: Frame Rating proves that in many games adding a second GPU to your system will result in essentially zero improvement in performance, frame rate or animation smoothness. PC Perspective has detailed the new testing methodology and posted the first sets of data across several PC titles.
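The details of PC Perspective's Frame Rating pipeline aren't spelled out in this summary, but the core idea — deriving per-frame display times from captured video output rather than in-game logs — can be sketched. The function and sample timestamps below are hypothetical illustrations, not the site's actual code or data:

```python
# Hypothetical sketch of capture-based frame analysis (not PC Perspective's
# actual Frame Rating implementation). Given the timestamps (ms) at which
# successive frames appeared in the captured display output, derive frame
# times and a simple smoothness summary.
def frame_stats(timestamps_ms):
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg = sum(frame_times) / len(frame_times)
    # "Runt" frames: shown so briefly they contribute no visible motion,
    # a classic multi-GPU pacing symptom.
    runts = sum(1 for t in frame_times if t < 0.25 * avg)
    return {"avg_ms": avg, "worst_ms": max(frame_times), "runt_frames": runts}

# Alternating long/short frames: a software FPS counter reports a healthy
# average, but half the frames are runts adding nothing to smoothness.
print(frame_stats([0, 30, 32, 62, 64, 94, 96]))
```

In the example, the average frame time is a respectable 16 ms (about 60 FPS), yet three of the six frames are runts — exactly the kind of "second GPU adds essentially zero improvement" case the article describes.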
Vigile writes: When OCZ purchased Indilinx back in March, there were a lot of questions as to what the SSD vendor would do with a controller company that seemed so far behind performance leaders like Intel and SandForce. Today, with the release of the OCZ Octane, those questions are answered, as the new Everest controller is in fact a performance beast. According to the testing done over at PC Perspective, the Octane is able to beat out even Intel's controllers in terms of low latency and high IOPS, a feat no other controller had achieved, while also offering average sequential read speeds as high as 505 MB/s! Pricing is very competitive as well, with the 256GB model's MSRP set at $370 and a 1TB 2.5-in model pending release!
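Low latency and high IOPS are two sides of the same coin: at a queue depth of 1, IOPS is simply the reciprocal of per-operation latency. The figure below is an illustrative placeholder, not a measured Octane number:

```python
# Illustrative latency-to-IOPS arithmetic at queue depth 1.
# The 100 microsecond latency is a made-up example value, not a
# measurement from the Octane review.
latency_s = 100e-6            # hypothetical 4K random-read latency
iops = 1 / latency_s
print(f"{iops:,.0f} IOPS")    # → 10,000 IOPS
```

This is why shaving microseconds off controller latency translates directly into the IOPS leadership the review describes.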
Vigile writes: While $1200 graphics cards might get a lot of attention from enthusiasts, the majority of PC gamers fall into the sub-$200 world, and NVIDIA's latest graphics card fits perfectly into that niche. The GeForce GTX 460 comes in both 1GB and 768MB versions, selling for $229 and $199 respectively. The GF104 chip powering it is a reworked version of the existing Fermi design, with a fairly dramatic architecture shift that rebalances the CUDA cores (shaders) in relation to the tessellation engines and texture units. In the end, though, what matters is performance and value, and the GTX 460 delivers on both counts, handily beating the $199 HD 5830 from AMD.
Vigile writes: An editorial over at PC Perspective posits that even though the iPad is unquestionably the first successful modern tablet computing device, Apple deserves less of the credit for that success than most give it. As evidence, the author points to the applications that ship with a stock iPad, including the mail, notes, calendar, contacts and even the iBooks program, all of which are considered poor examples of touch-enabled applications. Instead, when users brag about the iPad they are usually referring to programs like Netflix or the ABC Player, or even casual games. PC Perspective believes there is no technical reason these couldn't have already existed on other tablets had Microsoft and others recognized the need for a slightly altered user input scheme. Apple may have done a great job with the iPad hardware, but it is really third-party developers that deserve all the credit for the iPad's success.
Vigile writes: AMD has had a difficult time over the last year or so keeping up with Intel on the consumer CPU front. While the Phenom processors have been decent, since the introduction of Intel's Core i5 and Core i7 lineup AMD has never really had a chance in the performance segment. They are hoping to change that with the release of the Phenom II X6 1090T processor, a 6-core CPU that will sell for about $285. Compare that to the 6-core offering from Intel: the Core i7-980X, which retails for $999 or above. No, the 1090T won't run as fast in the benchmarks as the i7-980X, but it does do well in media encoding tests and is one of the best available CPUs for performance/watt and performance/dollar. Add to that mixture the new Turbo Core Technology, which automatically takes the 3.2 GHz part up to 3.6 GHz when three or fewer cores are loaded, and the AMD 1090T is the best competition Intel has seen in some time.
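The Turbo Core behavior described above can be modeled in a few lines. This is a deliberately simplified sketch — the real mechanism also weighs thermal and power headroom, not just the active core count:

```python
# Simplified model of Turbo Core as described in the summary: the 3.2 GHz
# part boosts to 3.6 GHz when three or fewer cores are loaded. The actual
# hardware algorithm also considers thermal/power headroom.
def turbo_clock_ghz(active_cores, base=3.2, boost=3.6):
    return boost if active_cores <= 3 else base

print(turbo_clock_ghz(2))  # → 3.6 (lightly threaded workload boosts)
print(turbo_clock_ghz(6))  # → 3.2 (all six cores loaded, base clock)
```

The upshot: lightly threaded workloads get a 12.5% clock bump for free, while fully threaded ones keep all six cores at base clock.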
Vigile writes: While solid state drives, including those from Western Digital itself, continue to be the stars of storage technology today, platter-based traditional hard drives still remain the device of choice for most storage. Before there were SSDs, enthusiasts and PC gamers depended on the Western Digital VelociRaptor brand to keep their computers fast, and the brand is back with a new 600GB SATA 6Gb/s model released today. Although seek, noise and power ratings remain nearly identical to the previous model's, there is a nice boost in transfer rates over other standard hard drives. As for price, the new VelociRaptor's cost per GB makes it a better deal than previous iterations, though it remains well behind the ratio of any of the tested 2TB hard drives.
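The cost-per-GB comparison is straightforward division. The prices below are hypothetical placeholders for illustration, since the summary doesn't quote exact figures:

```python
# Cost-per-gigabyte helper. Both prices are hypothetical examples,
# not figures from the review.
def dollars_per_gb(price_usd, capacity_gb):
    return price_usd / capacity_gb

print(round(dollars_per_gb(279.0, 600), 3))   # hypothetical 600GB VelociRaptor
print(round(dollars_per_gb(149.0, 2000), 3))  # hypothetical 2TB 5400rpm drive
```

Even with generous assumptions, a 10,000 rpm performance drive lands several times higher in $/GB than a capacity-oriented 2TB model — the gap the review alludes to.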
Vigile writes: The solid state disk market keeps getting more crowded, but the Western Digital SiliconEdge Blue SSD marks the first offering from a player that currently dominates the market for traditional spindle-based hard drives. It was a year ago this month that WD purchased SiliconSystems for $65m, a small enterprise-level SSD vendor that had developed its own storage controller. Western Digital obviously made the move to prepare the company for the inevitable situation it finds itself in today: solid state has surpassed traditional media in performance and will likely soon become the mainstream storage choice for computers. PC Perspective has put the first consumer-level SSD option from one of the kings of HDDs through the wringer and found the drive to be a solid first offering, with performance on par with some of the better solutions in the market while not quite fast enough to unseat the top offerings from Intel and others. Western Digital has seen the writing on the wall; the only question is when the other players in the hard drive market will as well.
JoshMST writes: NVIDIA is late to the party with DX11-ready graphics hardware, but they are hoping to make up for it with a design quite different from the competing AMD Radeon GPUs. The GF100 GPU is essentially the same hardware that was detailed in the Fermi release late last year, but only now are details emerging on how the hardware is targeted at the gaming market. A total of 512 processing cores are combined with a new PolyMorph Engine that performs the tessellation required for DX11 in a new, less serialized manner. The chip is going to be big, and we already know it is going to be power hungry, but until actual products and benchmarks are revealed, the final answer between AMD and NVIDIA this generation will be up in the air.
Vigile writes: The new AMD Radeon HD 5670 launches today and is the first graphics card to bring DirectX 11 support to the sub-$100 market and truly offers next-generation features to almost any budget. The Redwood part (as it was codenamed) is nearly 3.5x smaller in die size than the first DX11 GPUs from AMD while offering support for DirectCompute 5.0, Eyefinity multi-monitor gaming and of course DX11 features in upcoming Windows gaming titles. Unfortunately, performance on the card is not revolutionary even for the $99 graphics market though power consumption has been noticeably lowered while keeping the card well cooled in a single slot design.
Vigile writes: The AMD GPU team has definitely been dominating NVIDIA lately, with both the release of the Radeon HD 4000-series and the Radeon HD 5000-series, the first to include DX11 support and the new Eyefinity multi-monitor gaming technology. AMD's latest addition is a dual-GPU variant called the HD 5970 that basically runs a pair of 5800 cards in permanent CrossFire mode on a single PCB. The gaming performance is incredible but might be overshadowed by the significant amount of overclocking headroom AMD left in the card for users who want to get 15-20% more out of their rig at the cost of 33% additional power consumption. If 400 watt GPUs and $599 price tags are something you can deal with, this card will impress just about any gamer.
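Those overclocking figures imply an efficiency tradeoff worth making explicit: gaining up to 20% performance for 33% more power means perf/watt actually goes down. A quick check using the numbers from the summary:

```python
# Efficiency arithmetic for the HD 5970 overclock described above:
# best case ~20% more performance for 33% more power draw.
perf_gain = 0.20    # upper end of the quoted 15-20% gain
power_gain = 0.33   # quoted additional power consumption

perf_per_watt_ratio = (1 + perf_gain) / (1 + power_gain)
print(f"perf/watt vs stock: {perf_per_watt_ratio:.2f}x")  # → 0.90x
```

So even in the best case, the overclocked card is about 10% less efficient than stock — a trade enthusiasts chasing absolute frame rates will happily make.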
Vigile writes: When Intel's consumer line of solid state drives was first introduced late in 2008, the drives impressed reviewers with their performance and reliability. Intel gained a lot of community respect by addressing some performance degradation issues found at PC Perspective, quickly releasing an updated firmware that solved those problems and then some. Now Intel has its second generation of X25-M drives available, designated by a "G2" in the model name. The SSDs are technically very similar, though they use 34nm flash rather than the 50nm flash of the originals and offer reduced latency times. What is really going to set these new drives apart, though, both from the previous Intel offerings and their competition, are the much lower prices allowed by the increased memory density. PC Perspective has posted a full review and breakdown of the new product line, which should be available next week.
Vigile writes: More than simply a faster 6.0 Gb/s data throughput speed, the SATA 6G standard offers improved NCQ support, better power management and a new connector to support 1.8-inch drives. While modern spindle-based hard drives struggle to keep up with even SATA 3G speeds, modern SSDs are nearly saturating the existing standard, so a move to SATA 6G was welcomed in the hardware community. It looks like that technology will be delayed, though, as the only chip supporting the standard today, the Marvell 88SE9123, is having major issues. Motherboard vendors including ASUS and Gigabyte, who had planned on shipping SATA 6G using the chip on Intel Lynnfield platform motherboards later this summer, are having to remove the Marvell 88SE9123 and redesign their boards at the last minute due to significant speed and reliability issues.