You're kidding, right? 60 to 90 Hz is a noticeable jump in smoothness, and many people can distinguish between 90, 120, 144, and even 165 Hz.
All testing was performed with the default setting (cache disabled). Further, cache settings have little effect on NVMe RAIDs on Z170. Additionally, our minimum latencies were 6us *longer* in the array vs. a single SSD, so clearly no caching was taking place.
PC Perspective's new testing demonstrates that the triple RAID-0 array has just 1/6th the latency of a single drive.
That was with a queue depth of 16. Not exactly representative of a normal desktop user.
It's reasonable for a peak power-user load. Folks running / considering triple-SSD RAIDs are not exactly 'typical desktop users'.
Yup, it's been corrected. Should have been 6 microseconds (µs).
Yup, we had a scale error as our Excel-fu was not as strong as we'd hoped when we made the custom format for the axis, and I totally fell for the error. I've updated the article with corrections.
That's pretty much it. The trick was showing it properly, which has not previously been possible without our new test method.
Storage Editor, PC Perspective
The SSD controller already does a form of this, as it is talking to multiple flash memory dies over multiple channels. RAID is just another layer to get even more performance out of more parallelism (and as we figured out in testing, to considerably drop the latency under load).
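To put the striping idea in concrete terms, here's a rough sketch of how a RAID-0 layer maps a logical address onto its member drives. The 128 KiB stripe size and the function name are just illustrative, not our exact test configuration:

```python
# Rough sketch of RAID-0 stripe mapping (illustrative values, not our test config).
STRIPE_SIZE = 128 * 1024   # bytes per stripe on each member drive
MEMBERS = 3                # triple-SSD array

def map_offset(byte_offset):
    """Map a logical byte offset in the array to (member drive, offset on that drive).
    Consecutive stripes rotate across drives, so large or queued requests get
    serviced by several SSDs (and their flash channels) in parallel."""
    stripe = byte_offset // STRIPE_SIZE
    drive = stripe % MEMBERS
    drive_offset = (stripe // MEMBERS) * STRIPE_SIZE + (byte_offset % STRIPE_SIZE)
    return drive, drive_offset
```

Same principle the controller already uses internally across its flash channels, just applied one level up.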
Storage Editor, PC Perspective
1. That is a false claim - Gamenab didn't even cite the correct FPGA model when he made that DRM claim.
2. G-Sync is actually good down to 1 FPS - it adaptively inserts additional redraws in between frames at rates below 30, so as to minimize the possibility of judder (an incoming frame arriving during an already started panel refresh pass). I've put a rough sketch of that frame-repeating idea after this list. FreeSync (in its most recently demoed form) reverts back to the VSYNC setting at the low end. Further, you are basing the high end of G-Sync only on the currently released panels. Nothing states the G-Sync FPGA tops out at 144 Hz.
3. I use the word 'experience' because it is 'my experience' - I have personally witnessed most currently shipping G-Sync panels as well as the FreeSync demo at this past CES. I have also performed many tests with G-Sync. Source: I have written several articles about this, including the one linked in this post.
5. I believe the reason it is not yet released is that Nvidia wants to be able to correctly cover more of the range (including the low range / what happens when the game engine hitches).
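As promised above, a rough sketch of the frame-repeating idea behind point 2. The real module has to predict when the next frame will arrive rather than knowing the interval up front, so the 30 Hz floor and the names here are just illustrative, not Nvidia's actual implementation:

```python
import math

PANEL_MIN_HZ = 30                  # illustrative: slowest rate the panel can hold an image
MAX_INTERVAL = 1.0 / PANEL_MIN_HZ  # longest allowable gap between panel scans

def panel_scan_intervals(frame_interval):
    """Given the time (seconds) between incoming frames, return the intervals at
    which the panel gets scanned. Below the panel's minimum rate, the previous
    frame is re-scanned enough times that no gap exceeds the maximum, which keeps
    a new frame from colliding with an in-progress refresh pass."""
    if frame_interval <= MAX_INTERVAL:
        return [frame_interval]          # normal variable-refresh case
    repeats = math.ceil(frame_interval / MAX_INTERVAL)
    return [frame_interval / repeats] * repeats

# e.g. a 20 FPS game (50 ms frames) gets two 25 ms scans per frame
```

FreeSync, at least as demoed so far, simply falls back to whatever the VSYNC setting is instead.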
Gamenab stumbled across the leaked driver and tried to use it to spread a bunch of conspiracy theory FUD. I hope most people here can correctly apply Occam's razor as opposed to the alternative: that he supposedly designed those changes himself, that they made it into an internal driver build which was inadvertently leaked, and that the leak happened to apply to the exact laptop he already owned.
ExtremeTech picked apart his BS in more detail: http://www.extremetech.com/ext...
1. The FPGA *was* required for the tech to work on the desktop panels it was installed in.
2. FreeSync (as I've witnessed so far), as well as the most recent adaptive sync, cannot achieve the same result across as wide a refresh rate range as G-Sync currently can.
3. Nvidia could 'make it work', but it would not be the same experience as can be had with a G-Sync module, even with an adaptive sync panel (as evidenced by how the adaptive sync panel in this laptop intermittently blanks out at 30 FPS or when a game hitches).
5. The driver was not a release driver, and the experience it gives was never meant to be called 'G-Sync'. It was meant to be internal.
Conclusion - Adaptive sync alone is not the same experience you can currently get with a real G-Sync panel, which is why any possible future G-Sync that does not need a module is not yet a real thing.
Care to tell us how you know that? Better yet, care to cite it?
Third pic down: http://www.ev1.org/
To do nothing is to be nothing.