Comment Part of the problem will self-correct... (Score 1) 180

Right now, I'd say a substantial part of the problem is insurance protection against cyber attacks.

If a company can go to a bog-standard insurance company like Travelers or AIG and spend a small fraction of both the real breach cost and the cost of actually securing things, they will - the profit motive demands it.

What the profit motive DOESN'T demand is that the insurance company turn a blind eye to its own costs. Right now, I'm sure a large number of those policies go untriggered, so in aggregate they are still profitable. But when those costs become comparable, and a company factors in the lost productivity and PR issues (both of which are hard to quantify), companies will actually secure things - partly to save money on, or qualify for, their cyber insurance.

That's part of why news coverage of breaches and forced disclosure laws are so important - right now, both businesses and insurers find the productivity and PR costs too easy to ignore, and the insurer has little motive to force compliance. (In fact, it's theoretically more profitable for insurers to 'prove' to their customers that attacks happen and that no tightening will prevent all attacks - both of which are absolutely true no matter what happens.)
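To make the incentive math concrete, here's a toy expected-cost sketch in Python. Every figure in it (breach probability, premium, costs) is invented purely for illustration, and it assumes for simplicity that actually securing things prevents the breach entirely:

    # Toy model of the argument above; all numbers are made up.
    p_breach = 0.05            # assumed annual probability of a breach
    direct_cost = 4_000_000    # assumed direct breach cost, covered by the policy ($)
    hidden_cost = 30_000_000   # assumed PR + lost-productivity cost, NOT covered ($)
    premium = 250_000          # assumed annual cyber-insurance premium ($)
    security_cost = 1_000_000  # assumed annual cost of actually securing things ($)

    # Ignoring the hard-to-quantify costs, the policy looks like the obvious win:
    print(f"insure, ignoring PR/productivity:  ${premium:,}/yr")
    print(f"actually secure things:            ${security_cost:,}/yr")

    # Forced disclosure makes the hidden costs real, and the math flips:
    insure_true = premium + p_breach * hidden_cost
    print(f"insure, counting disclosure costs: ${insure_true:,.0f}/yr expected")  # $1,750,000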

Comment Re:Goodbye Windows. (Score 1) 585

This! Mind you, it's a bit worse than that - i.e., Intel won't ship a signed driver package that lets Kaby Lake work on Windows 7/8/8.1, because Microsoft won't sign or support new drivers for those platforms.

But let's play devil's advocate for a second - is this just Microsoft pushing Windows 10 for the sake of Windows 10? Or is it because the driver model has changed since Win7/8, so supporting every platform is a higher cost for driver makers (who, by definition, would have to spend more or split quality/features to cover them all)? Or because Microsoft won't sign a driver unless it goes through the whole WHQL certification process (to ensure it's a clean build, it's stable, it's malware-free, etc.), and can't financially justify keeping the WHQL pipeline running at Win7/8's current level of popularity and general downward trend? (Since Microsoft no longer sells or supports Win7/8, Win10 doesn't necessarily have to grow, but Win7/8's installed base will shrink through natural attrition.)

But there's nothing here stopping manufacturers from making Linux or Mac drivers, nothing preventing third-party, open-source drivers (albeit requiring users to allow unsigned drivers, with the inherent security risks), and nothing showing Microsoft artificially pushing Win10 for the sole sake of pushing Win10.

Comment Re:Latency (Score 5, Informative) 159

This!

Even with a stable framerate, this technique intentionally delays the next frame to add compensation frames.

As an example, take a magic VR helmet running at 120Hz with instant processing (i.e., 0ms gray-to-gray (GTG) time, which doesn't exist) and a video card capped at a perfectly stable 30 FPS (aka 30Hz).

We will split a second into ticks - we'll use the VR helmet's frequency of 120 Hz, so we have 120 ticks, numbered 1 to 120. (Just to annoy my fellow programmers!)

The video card therefore delivers a new actual frame every 4th tick - the 1st, 5th, 9th, etc.

Without motion compensation, we would display each new frame on the tick it arrives - 1st, 5th, 9th, etc.
With ideal (instant) motion compensation, we can't compute a transition frame until we have the new frame it transitions to. So we could, theoretically, go: real frame #1 on the 1st tick, a computed frame based on #1 and #2 on the 5th tick, real frame #2 on the 6th tick, a computed frame based on #2 and #3 on the 9th tick, etc.

This would also be jerky - 2 motion frames, then 3 at rest? We could instead push every real frame back one source frame (4 ticks) and fill the interval with three compensation frames, but then we increase the delay - and real-world delay is always higher than in this idealized example, and it compounds. So we'd have frame #1 at the 5th tick, computed frames at the 6th/7th/8th, frame #2 at the 9th tick, etc. You've now introduced a minimum 4-tick delay, which at 120Hz is 1/30 of a second, or 33ms - in an otherwise impossibly-perfect system!
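To make the two schedules concrete, here's a minimal Python sketch of the tick bookkeeping above. The 120Hz/30FPS numbers, the function names, and the output format are all just this example's assumptions, not any real compositor's API:

    # Which image is on screen at each tick, for the first 13 ticks.
    TICKS_PER_SECOND = 120  # the helmet's refresh rate
    SOURCE_INTERVAL = 4     # a new real frame arrives every 4th tick: 1, 5, 9, ...

    def no_compensation(last_tick=13):
        """Show each real frame on the tick it arrives; hold it in between."""
        shown = {}
        for tick in range(1, last_tick + 1):
            if (tick - 1) % SOURCE_INTERVAL == 0:
                shown[tick] = f"real #{(tick - 1) // SOURCE_INTERVAL + 1}"
        return shown

    def delayed_compensation(last_tick=13):
        """Hold each real frame back one source frame (4 ticks) and fill the
        gap with three interpolated frames, as in the schedule above."""
        shown = {}
        for tick in range(1, last_tick + 1):
            if tick <= SOURCE_INTERVAL:
                continue  # frame #1 is still being held back
            phase = (tick - 1) % SOURCE_INTERVAL
            frame = (tick - 1) // SOURCE_INTERVAL  # latest real frame available to show
            if phase == 0:
                shown[tick] = f"real #{frame}"  # arrived 4 ticks ago
            else:
                shown[tick] = f"interp #{frame}->#{frame + 1} step {phase}/4"
        return shown

    print(no_compensation())       # {1: 'real #1', 5: 'real #2', 9: 'real #3', 13: 'real #4'}
    print(delayed_compensation())  # {5: 'real #1', 6: 'interp #1->#2 step 1/4', ...}
    print(f"added delay: {SOURCE_INTERVAL} ticks = "
          f"{1000 * SOURCE_INTERVAL / TICKS_PER_SECOND:.1f} ms")  # 33.3 ms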

What about using historical frames instead, to PREDICT the next frame? Well, then, whenever something in-game (or, really, on screen) changes velocity, there would be mis-compensation: overcompensating if the object slows, undercompensating if it speeds up, and compensating in the wrong direction if it turns.
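Here's a tiny illustration of that mis-compensation, assuming the simplest possible predictor - linear extrapolation of 1-D positions from the last two real frames (purely illustrative, not any actual algorithm):

    # Predict the next position by assuming velocity stays constant.
    def predict(prev, curr):
        return curr + (curr - prev)

    # An object moving at +10 units/frame, then braking to +2:
    positions = [0, 10, 20, 22]  # real per-frame positions

    for i in range(2, len(positions)):
        predicted = predict(positions[i - 2], positions[i - 1])
        print(f"frame {i}: predicted {predicted}, actual {positions[i]}, "
              f"error {predicted - positions[i]:+}")
    # frame 2: predicted 20, actual 20, error +0  (constant velocity: fine)
    # frame 3: predicted 30, actual 22, error +8  (object slowed: overshoot)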

There are more problems, too:
- This doesn't help when the video card frameskips/dips.
- Instant GTG and instant motion frame computation do not exist. At best, they're sub-tick, but you'd still operate on the tick.
- Input delay already exists for game processing, etc.
- Perceived input delay grows much faster than linearly with its actual length. For example, 1-2ms between keypress and onscreen action? Hardly noticeable. 50ms just to start registering a motion on screen and course-correct? Maybe OK, maybe annoying. 150-200ms? Brutal. (A rough sum of where the milliseconds go is sketched below.)
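As a back-of-envelope illustration of how quickly that budget fills up - all three figures below are assumptions for this example, not measurements:

    # Hypothetical motion-to-photon budget; every number here is assumed.
    game_processing_ms = 10  # input sampling + game loop latency (assumed)
    compensation_ms = 33     # the 4-tick hold from the schedule above
    gtg_ms = 4               # a realistic pixel response instead of the magic 0ms (assumed)

    total_ms = game_processing_ms + compensation_ms + gtg_ms
    print(f"total added delay: ~{total_ms} ms")  # ~47ms: already past 'hardly noticeable'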
