Comment Re:Multi-Monitor Support in 2013?!? (Score 5, Informative) 278

Not quite. I used to work on the Windows display management kernel and did a ton of testing when we brought back heterogeneous multi-adapter support in Win7. In XDDM (the XP Display Driver Model), heterogeneous configurations were allowed, but they had issues when drivers conflicted. You could find some setups that worked and some that didn't, largely depending on the drivers, the cards, and the alignment of the planets.

When Windows Vista came out, the drivers moved to WDDM (the Windows Display Driver Model). That model initially disallowed heterogeneous configurations. In Win7, heterogeneous support was allowed again, partially because the OS now tracked monitor connectivity state itself (CCD: Connecting and Configuring Displays). Previous versions of Windows had left that to the individual drivers, which could cause conflicts and loops of bad behavior (for example, "value add" software from vendor X sets clone mode, then software from vendor Y sets extend mode, and they fight back and forth).
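For the curious, the mechanism Win7 exposed is public: the CCD API (GetDisplayConfigBufferSizes / QueryDisplayConfig / SetDisplayConfig in user32). Here's a minimal C sketch of querying the active topology. It's illustrative only, with error handling trimmed; it is not how our internal tests were written.

```c
/* Minimal sketch: enumerate active display paths via the Win7+ CCD API.
   Build with: cl ccd_demo.c user32.lib */
#define _WIN32_WINNT 0x0601  /* CCD requires Windows 7 or later */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    UINT32 numPaths = 0, numModes = 0;

    /* Ask the OS how large the path/mode arrays need to be. */
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS,
                                    &numPaths, &numModes) != ERROR_SUCCESS)
        return 1;

    DISPLAYCONFIG_PATH_INFO *paths = malloc(numPaths * sizeof *paths);
    DISPLAYCONFIG_MODE_INFO *modes = malloc(numModes * sizeof *modes);

    /* The OS, not each vendor's driver, owns this connectivity state. */
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &numPaths, paths,
                           &numModes, modes, NULL) == ERROR_SUCCESS)
        printf("%u active display path(s)\n", numPaths);

    /* Switching topology is a single OS-level call, e.g.:
       SetDisplayConfig(0, NULL, 0, NULL, SDC_APPLY | SDC_TOPOLOGY_EXTEND);
       so vendor tray apps no longer need to fight over clone/extend. */

    free(paths);
    free(modes);
    return 0;
}
```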

So in Windows, heterogeneous configurations were allowed in every release except Vista, though they weren't really supported or tested well until 7 and beyond.

Comment Re:Goes to show ya (Score 2) 175

Not quite. If we start reading patents, it opens us up to liability for treble damages should we be found in violation of a patent. For example: we're investigating patents, one of them doesn't appear to have prior art, and a few months or years later we're found to be in violation of that patent. At that point their lawyers say "hey, you guys were looking at patents and should have known about this one. Triple the damages!"

Comment Re:What does it tell you? (Score 1) 274

Don't get me started on the "absence of evidence" line. It's a BS saying used by people who want to support a particular claim without having any way to show others they're not just making stuff up. This isn't even absence of evidence, though. It's poorly-stated evidence, or possibly even misleadingly-stated evidence.

When a company that is "an exceptionally competent developer of high performance, low-level graphics software" fails to communicate test results in a fashion that can be meaningfully compared by the people interested in such a developer's blog, it calls into question either their competence or the content of their communication. A competent group that realizes "hey, that's really cool, we got a higher average FPS" and also sees "that's weird, there are a lot of sections of the benchmark where the actual FPS drops to 15-20" might choose to communicate the first result and hide the second in order to drum up support. Simply reporting the average FPS does nothing to assure us that the performance is better for an end user (other considerations include micro-stutter, periods of low playability, and the like).
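To put numbers on that, here's a toy C sketch with invented frame times (nothing to do with their actual data) showing how a healthy-looking average FPS can coexist with a hitch every tenth frame:

```c
/* Toy illustration: an average FPS figure can hide regular stutter.
   Frame times are in milliseconds and entirely made up. */
#include <stdio.h>

#define FRAMES 100

int main(void)
{
    double frame_ms[FRAMES];
    double total = 0.0;
    int i, slow = 0;

    /* Mostly fast 8 ms frames, with a 55 ms (~18 FPS) hitch every tenth. */
    for (i = 0; i < FRAMES; i++)
        frame_ms[i] = (i % 10 == 9) ? 55.0 : 8.0;

    for (i = 0; i < FRAMES; i++) {
        total += frame_ms[i];
        if (1000.0 / frame_ms[i] < 30.0)  /* below a playable rate */
            slow++;
    }

    printf("average FPS:         %.1f\n", FRAMES / (total / 1000.0));
    printf("frames under 30 FPS: %d%%\n", slow);  /* count == percent since FRAMES is 100 */
    return 0;
}
```

The average comes out near 79 FPS, which sounds great right up until you notice that one frame in ten is a 15-20 FPS-class hitch.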

Additionally, such "obvious test practices" really do need to be spelled out, and conformance results reported, for a reader to even infer a meaningful comparison. With the system they described, the results mean something completely different if they were pumping this out to a single 1920x1080 display than if they were driving one, two, or three 2560x1600 displays. I (and others in the field, or even just interested in it) would like that information so the number is meaningful and not merely an "ePeen number".

Perhaps it's [H]ard|OCP that has led me to expect better benchmark reporting, but what is shown here disappoints. A test case can be designed to measure any number of factors, so simply stating it's for "testing" suggests unfamiliarity with testing in the graphics area. Those factors include, but are not limited to: conformance, pixel throughput, vertex throughput, fill rate, average FPS, memory utilization, GPU utilization, CPU utilization, inter-frame jitter, and average dropped frames. Not all of these need to be reported every time, but some of them are linked and should be reported together.

So yeah, one factor is marginally faster, and the exercise was useful for finding a previously-missed bit of overhead. Cool, great work, etc. But don't report a single number which may or may not be misleading (and will obviously be touted by some as grounds to claim platform or API set x is better than y) without giving meaningful context for the results. As you said, Wraithlyn, they're experts. How about they demonstrate that in a meaningful fashion for the other experts who might gain some insight from their blog?

Comment Re:What does it tell you? (Score 4, Insightful) 274

Clearly not. They give a bare number that doesn't indicate whether it is a maximum FPS or an average FPS. They provide neither the test setup (screen resolution, detail settings, etc.) nor any meaningful analysis of overall performance. For example, if the average FPS is lower on one platform but the variability is also lower, the actual user-perceived performance will be better. No meaningful details are provided, just an ePeen number abstracted from any context. The statistician in me weeps.
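A toy illustration with invented numbers: platform A posts the higher average FPS, platform B the steadier frame times, and only the variability statistic tells them apart.

```c
/* Toy comparison: higher average FPS vs. steadier frame pacing.
   All frame times (ms) are invented for illustration. */
#include <stdio.h>
#include <math.h>

static void report(const char *name, const double *ms, int n)
{
    double sum = 0.0, sq = 0.0, mean, sd;
    int i;

    for (i = 0; i < n; i++) sum += ms[i];
    mean = sum / n;
    for (i = 0; i < n; i++) sq += (ms[i] - mean) * (ms[i] - mean);
    sd = sqrt(sq / n);

    printf("%s: average %.1f FPS, frame-time stddev %.1f ms\n",
           name, 1000.0 / mean, sd);
}

int main(void)
{
    /* A: faster on average, but it lurches (one 65 ms hitch). */
    double a[6] = { 5.0, 5.0, 5.0, 5.0, 5.0, 65.0 };
    /* B: slower on average, but perfectly steady. */
    double b[6] = { 16.0, 16.0, 16.0, 16.0, 16.0, 16.0 };

    report("A", a, 6);  /* ~66.7 FPS average, ~22.4 ms stddev */
    report("B", b, 6);  /* ~62.5 FPS average,   0.0 ms stddev */
    return 0;
}
```

A wins the headline number; B is the one that actually feels smooth to the user.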
