Comment Re:My 16 bit games cost 50 bucks (Score 1) 323

I don't know where you pulled those numbers from, but for those exact game types, I paid nowhere near that much at release. I think you need to re-examine your historical pricelists, son.

Of course you didn't - nobody pays future inflation-adjusted prices 20 years in advance unless they are completely insane!

Comment Glad to see "HD"TV is not killing DPI advancement. (Score 1) 155

For many years now I have been very disappointed by the stagnation, and outright reversal, of DPI trends. 13 years ago I was running a 19" CRT at 2048x1536; today it is very difficult to find a computer display with similar DPI, and harder still to find a CRT at all, despite the venerable technology's superiority in virtually every image-quality metric.
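
For concreteness, the DPI that display implies can be checked with a few lines of arithmetic (this assumes the nominal 19-inch diagonal; the viewable area on such tubes was closer to 18 inches, which pushes the figure higher):

```python
import math

# Worked example: a 4:3 CRT at 2048x1536 with a 19" nominal diagonal.
width_px, height_px = 2048, 1536
diag_in = 19.0

diag_px = math.hypot(width_px, height_px)  # 2560 px along the diagonal
dpi = diag_px / diag_in
print(f"{dpi:.1f} DPI")  # ~134.7 nominal; ~142 over an 18" viewable area
```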

TVs continue to get larger and larger, while 1080p is likely to remain the standard by which they are all measured for several years to come, and media with a native resolution above 1080p will take even longer to reach meaningful penetration. So there is little incentive for mass-produced panels (e.g. TVs, and even computer monitors) to improve on DPI; I am glad, however, that the emerging mobile device market is not shackling itself to the same philosophy.

Viva la DPI!

Comment TiVo has a valid point in securing their platform! (Score 1, Redundant) 251

TiVo would be obsolete if it openly allowed modification of its software, because CableLabs would withdraw TiVo's permission to transfer recordings in any capacity, and that capability is one of the reasons I will never use a cable company's or IPTV provider's inferior DVR solution. The TiVo software is outdated (granted, I do not own a TiVo Premiere, because without the OLED front display I view it as a downgrade), but it still does what it does better than anything else on the market. I do not get why people rag on TiVo for this: all of my TVs, going back to my 30" SONY Trinitron FD (circa 2004), run a Linux kernel, and none of them are modifiable either, yet nobody ever complains about that. I am sure SONY has sold more Linux-running TVs than TiVo has subscribers, and SONY is not the only manufacturer to ship Linux, so... what gives?

Comment Re:John you are blowing smoke (Score 1) 405

I do not know how you got all that from the article... he clearly states that he has no intention of switching APIs.

More or less, all he stated was that Direct3D has been driving the evolution of commodity real-time computer graphics hardware for the better part of a decade now. It is a very attractive API if you can limit yourself to a single platform, and it deserves respect. This contrasts with his (and virtually everyone else's) view of Direct3D from 15 years ago, when the API was a horrible mess. Direct3D was playing catch-up back then, but the situation has been reversed for a very long time.

The last truly amazing thing I saw (for its time) come out of the OpenGL development pipeline, that could have driven the development of the next generation of GPUs, was ATI's proposal for Uber/superbuffers. Said extension was shot down, and by now the functionality is easily accomplished using a myriad of other extensions. The fact is, Direct3D is the driving force behind what features a new class of hardware will support, and to claim otherwise would be truly foolish. This in no way implies that OpenGL cannot or will not utilize the new functionality, only that Direct3D evolves more quickly.

Comment APIs are gradually becoming irrelevant anyway... (Score 1) 405

I completely agree with Carmack, but as Tim Sweeney has stated, we are gradually moving to a point where the graphics API is irrelevant. As time goes on, more and more of the GPU is exposed to programmability, and with each release of DirectX and (to a lesser extent) OpenGL, the fixed-function pipeline shrinks. If the trend continues, we will eventually reach the point where everything is abstracted into buffers and programs that run on the GPU... even state management will become a task handled completely by the engine.

If you work on cross-platform console engines, chances are you already wrap the state machine into irrelevance to make up for differences between APIs. Multi-threading is simpler in DirectX, but that is to be expected from an API whose design only requires it to work on a single platform. If you code close to the metal, of course you will see benefits for a specific combination of API / hardware.
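
As a rough sketch of what "wrapping the state machine" can look like (the names RenderState, StateCache, and Backend here are illustrative, not from any real engine), the engine caches the last state it applied and filters redundant changes before they ever reach the platform API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RenderState:
    """A value type describing pipeline state, independent of any API."""
    depth_test: bool = True
    blend: bool = False
    cull_face: str = "back"

class Backend:
    """Stands in for a platform API (Direct3D, OpenGL, a console library)."""
    def __init__(self):
        self.calls = 0
    def apply(self, state: RenderState):
        self.calls += 1  # real code would issue API-specific calls here

class StateCache:
    """Engine-side wrapper: redundant state changes never hit the API."""
    def __init__(self, backend: Backend):
        self.backend = backend
        self.current = None
    def set_state(self, state: RenderState):
        if state != self.current:
            self.backend.apply(state)
            self.current = state

gl = Backend()
cache = StateCache(gl)
cache.set_state(RenderState())            # first call reaches the backend
cache.set_state(RenderState())            # redundant, filtered out
cache.set_state(RenderState(blend=True))  # actual change, reaches backend
print(gl.calls)                           # prints 2
```

Because the engine only ever talks to StateCache, the per-platform differences the paragraph above mentions are confined to the Backend implementations.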

OpenGL has traditionally been a monolithic API that tries to maintain legacy support while accommodating new hardware through extensions. It carries a lot of baggage that DirectX does not, since DirectX routinely drops legacy portions of its API with each major release. Even so, OpenGL comes in multiple flavors now, with the embedded profile (OpenGL ES) being much closer to the feature set modern commodity hardware offers.

Performance-wise, Direct3D has held the crown for many years; portability-wise, OpenGL has always been the king. I do not see this changing anytime soon: the two APIs tackle real-time computer graphics from very different angles. I, like many engine developers, do not have the luxury of committing to a single API, so it is very difficult to effectively exploit the theoretical advantages of either one.

Comment Storage medium? (Score 1) 244

The real question is how will they be distributing games to the system? If it is, as some have speculated, download-only, I will skip this platform altogether.

If it uses optical storage exclusively, the battery life and load times will likely be even worse than the PSP's. What would be nice is if they actually made use of MagicGate to allow optical games to be installed to a compliant Memory Stick for better load times and battery life (provided, of course, that the optical disc is present); it probably will not happen, but it would be nice.

Comment Re:Original Source and Actual Paper (Score 1) 462

The paper refers to none of these parts of an Operating System, however. You can safely assume that anytime the paper mentions "Operating System," it is talking about the kernel. User-land components have almost nothing to do with _how_ the kernel deals with resource allocation, aside from requesting a subset of the resources.

And there are plenty of Linux-based Operating Systems that do not use GNU, for your information; embedded devices are the prime example.

Long Dev Time Equals Better Game? 88

Via a GameSetWatch post, a piece on Treyarch Producer Stuart Roch's blog. He discusses the long development time of Shadow of the Colossus, and what four years of work did for that title. From the article: "Granted, it's a bit of a stretch to make a simple correlation between more development time and higher quality product based on this tiny product sample, but I have to admit, there is certain attractiveness to the argument. Can it be that in a given number of development cycles, those that had more time with less resources would create better games than those that had short dev cycles with monster teams? One might think that having more time would allow for more polish and iteration and therefore yield higher quality product, but as I'm sure you're thinking, examples can be made of both good and bad games that were in production for long periods of time."
