Comment Re:ok .. (Score 1) 195
If only an open platform like Linux actually had enough marketing to stretch its wings.
It's not all about resolution.
What about polygon count, and memory size?
They wouldn't matter without extra resolution to display that extra detail... assuming, of course, that the input detail is equal to or greater than the output detail the screen resolution allows.
They must have been drinking the same software-rendering Kool-Aid as Tim Sweeney (with all due respect to an otherwise extremely bright programmer).
"The End of the GPU Roadmap" http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf
And while real-time ray tracing is the "Holy Grail" and is achievable, there is no way VRAM is going away to be replaced with traditional CPU memory. There are so many memory optimizations in the rendering pipeline that it would be stupid to suggest tossing them all out and using slow DRAM instead.
He was actually talking about something like CUDA or OpenCL programs that look similar to a typical software rendering engine.
GPUs would still be there... but you would "talk to them" in a similar way you would a CPU, only with slightly simpler commands that are parallelized across thousands of cores.
Basically, Tim Sweeney is annoyed at all the DirectX and OpenGL quirks engine developers need to dodge; he would rather program each engine from first principles, while still using the GPU for calculations that can be split into hundreds or thousands of independent parts.
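To make the idea concrete, here is a minimal sketch (illustrative only, not Sweeney's actual code) of rendering expressed as a pure per-pixel function mapped over a framebuffer -- the shape a CUDA or OpenCL kernel takes, where the runtime would launch one thread per pixel instead of looping. All names here are hypothetical.

```python
WIDTH, HEIGHT = 8, 4

def shade_pixel(x, y):
    """Compute one pixel's color independently of all others.
    On a GPU, this body would run as one thread among thousands."""
    r = x / (WIDTH - 1)   # horizontal gradient
    g = y / (HEIGHT - 1)  # vertical gradient
    b = 0.5               # constant channel
    return (r, g, b)

# On a CPU this is a sequential loop; a GPU runtime would execute every
# (x, y) invocation in parallel, since no pixel depends on another.
framebuffer = [[shade_pixel(x, y) for x in range(WIDTH)]
               for y in range(HEIGHT)]

print(framebuffer[0][0])    # top-left pixel
print(framebuffer[-1][-1])  # bottom-right pixel
```

The point of writing the engine this way is that the per-pixel function is plain code with no fixed-function API in the way; the only contract with the GPU is "run this over N independent items."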
Yet every iPhone vulnerable to that exploit could be updated within two weeks. I would like to see Android pull that one off...
If Apple gets around to it, of course.
They've been known to let vulnerabilities go until they can roll them all up into a nice 250MB-or-so patch.
Hey, what's the rush? They're not a target.