[...] I don't see how rendering at a higher resolution is necessary, though... I thought anti-aliasing basically gives you all the benefits of that without the drawbacks. [...]
That is how a supersampling anti-aliasing filter works. The initial image is rendered at a higher resolution (usually a power-of-two multiple of the target) and then filtered down to the desired target resolution.
All this is usually done automagically by the 3D graphics card, but some of the data I'll be viewing won't get rendered directly by the card. I see it as an opportunity to write my own AA filter that cross-samples the stereo images and balances the AA effect between the two views (sometimes allowing a higher-frequency signal on one side and letting the brain interpolate between the eyes). I'm hoping the result will be a sharper overall anti-aliased *stereo* image.
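The downfilter itself is the easy part. Here's a minimal sketch in Python/NumPy of the standard 2x2 box-filter step, assuming a frame rendered at double the target resolution (the stereo cross-sampling idea would replace this per-eye averaging with something smarter):

    import numpy as np

    def downsample_2x(img):
        """Box-filter a 2x-supersampled image down to the target resolution.

        img: float array of shape (2H, 2W, channels) rendered at double res.
        Returns the (H, W, channels) anti-aliased image: each output pixel
        is the average of the 2x2 block of supersamples covering it.
        """
        return (img[0::2, 0::2] + img[0::2, 1::2] +
                img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

    # e.g. render at 1280x960, filter down to the headset's 640x480
    frame = np.random.rand(960, 1280, 3)   # stand-in for a rendered frame
    aa_frame = downsample_2x(frame)        # shape (480, 640, 3)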
Even my crappiest stereo3d rig was 1600x1200.
I'm assuming you are referring to a monitor/shutter-glasses type system. That's an inaccurate comparison with a head-mounted display. You have to remember that the screen is < 2 in. away from your eyes. Most of the entry-level 'pro' stereo head-mounted displays ($5000+) don't get over 1024x768, so $500 for 640x480 really isn't that bad.
I think the motion tracking adds more to the experience than absolute resolution. Your mind can fill in the visual blanks with a little imagination. It can't make up for the lack of basic orientation tracking, parallax effects, and changes in sound with head position. You can add basic head tracking/parallax to a CRT/glasses system, but the tracking is heavily constrained by the user having to face the monitor. If you want to look skywards you have to adjust your avatar's view with the mouse/keyboard vs. just looking up to see above you. This constrained tracking really isn't useful for anything more than simple parallax effects.
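To illustrate how directly orientation tracking maps onto the view, here's a sketch assuming a Y-up coordinate system and yaw/pitch in degrees from the tracker: "look up" becomes just pitch, with no mouse/keyboard involved.

    import math

    def view_direction(yaw_deg, pitch_deg):
        """Turn tracker yaw/pitch (degrees) into a unit view vector.

        Assumes Y is up and yaw 0 / pitch 0 looks down +Z. With real
        orientation tracking, the camera just follows the head.
        """
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        return (math.cos(pitch) * math.sin(yaw),   # x
                math.sin(pitch),                   # y (up)
                math.cos(pitch) * math.cos(yaw))   # z

    print(view_direction(0.0, 90.0))  # looking straight up: ~(0, 1, 0)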
My interest in head-mounted displays is for exploring 3D data visualizations and for performing experiments. I can see how 640x480 might be a disappointment for a gamer, but it would work just fine for my uses... Although 640x480 sounds really low, it doesn't look that bad. If you render at a higher resolution and scale/filter down to 640x480, the results can look quite nice.
... However, no affordable, high-resolution headsets are available on the market today. (And when I say affordable, I mean for any reasonable price. You cannot get a high-resolution head-mounted display for even $2000.)
Depends on what you consider to be 'high-resolution' for a head-mounted display. The Vuzix iWear VR920 boasts dual 640x480 displays and 3DOF head tracking for $400 US. Add a Wiimote to the mix and you can get 6DOF head tracking for $450 + some time getting it all to work together. That's not too shabby when compared to the $2000+ pro headsets, or say a $15,000+ tactical HUD visor.
[...] Show me a file-oriented data storage method that automatically handles application-level semantics.
Any modern relational database has minimal semantics support [...]
I should have been clearer and said 'non-database'... But yes, you are correct: a formal relational database is a great way to store static information; it can automatically enforce its schema and basic semantic integrity. Anyone who uses XML as a replacement for a database is asking too much of it. As I've stressed before, XML is a poor format for static data storage. If the data isn't going to be transformed, aggregated, filtered, or translated at some time in the future, then XML may not be the best choice for storage. To state it differently: XML is meant for the transport of data, not its storage.
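A quick sketch of the kind of job XML *is* suited for: filtering a document in transit. The feed and element names here are hypothetical:

    import xml.etree.ElementTree as ET

    # Hypothetical orders feed; keep only the orders over $100 and pass
    # the filtered document along to the next process in the chain.
    FEED = """\
    <orders>
      <order id="1"><total>250.00</total></order>
      <order id="2"><total>40.00</total></order>
    </orders>
    """

    src = ET.fromstring(FEED)
    out = ET.Element("orders")
    for order in src.iter("order"):
        if float(order.findtext("total")) > 100:
            out.append(order)

    print(ET.tostring(out, encoding="unicode"))  # only order id="1" survives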
For example, if the message contains a tree data structure that is meant as an update to data defined as a graph (which is a common application of XML now), the standard has to include a definition of all operations that may have to be done when it arrives, including lookup and update, consistency checks, etc. The actual implementation that will perform those things in the application may be generated from it, and may be extended to perform more data handling [...]
Essentially what you are describing is IDL with a touch of COM. This approach is fine when you have control over your deployment environment and software ecosystem, as is the situation at Google. They [Google] are not advocating Protocol Buffers as a replacement for XML. Anyone who got that impression didn't RTFA. They had a well-defined problem and found an effective solution for a process that traditionally might have been done with XML. They are making this technology available in the hopes that others may find it useful in solving their own well-defined data transport problems. They are not advocating it as a de facto competitor to XML. Protocol Buffers is a framework for creating new binary protocols. To quote the article (emphasis mine):
And now, we're making Protocol Buffers available to the Open Source community. We have seen how effective a solution they can be to *certain tasks*, and wanted more people to be able to take advantage of and build on this work.
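To make 'binary protocol' concrete, here's a rough Python sketch of the varint wire encoding Protocol Buffers uses, for a hypothetical two-field message. Real protobuf generates this kind of code from a .proto definition; this is just the idea, not Google's implementation:

    # Hypothetical message: field 1 = name (string), field 2 = id (int32).

    def encode_varint(value):
        """Encode a non-negative int as a base-128 varint (protobuf style)."""
        out = bytearray()
        while True:
            byte = value & 0x7F
            value >>= 7
            if value:
                out.append(byte | 0x80)  # set the continuation bit
            else:
                out.append(byte)
                return bytes(out)

    def encode_person(name, person_id):
        """Serialize the message; each key = (field_number << 3) | wire_type."""
        buf = bytearray()
        data = name.encode("utf-8")
        buf += encode_varint((1 << 3) | 2)   # field 1, wire type 2 (length-delimited)
        buf += encode_varint(len(data))
        buf += data
        buf += encode_varint((2 << 3) | 0)   # field 2, wire type 0 (varint)
        buf += encode_varint(person_id)
        return bytes(buf)

    print(encode_person("Ada", 300).hex())  # 0a 03 41 64 61 10 ac 02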
Unfortunately the internet is not well defined and homogeneous. The graceful-degradation abilities of XML allow disparate clients and processes to interact without having an explicit contract. It means a well-designed document can be extended without breaking older clients, and that processes which were never explicitly designed to talk to each other still can.
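A tiny example of that degradation property, with a hypothetical document format: a client written against version 1 keeps working when the document grows a new element.

    import xml.etree.ElementTree as ET

    # Hypothetical v2 document; <rating> didn't exist when the old client
    # was written. Nothing breaks: the client just never asks for it.
    DOC_V2 = """\
    <book>
      <title>An Example</title>
      <author>Someone</author>
      <rating>4.5</rating>
    </book>
    """

    root = ET.fromstring(DOC_V2)
    print(root.findtext("title"))    # the v1 client reads only what it knows
    print(root.findtext("author"))   # unknown elements are silently ignored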
This is a big shift in program design (specifically, in what constitutes a 'program'). Instead of monolithic code bases you have distributed servlets and transformation processes. A 'program' can be an abstracted service, or it could be the description of a processing and filtering chain combining the resources of third-party data sources and services.
If you are writing components that only talk to each other and couldn't care less about interoperability or openness(*), then use whatever fits your needs. Nobody is forcing you to use XML (I don't care for evangelism of any kind). But don't poo-poo a technology because it doesn't fit your needs or your style of programming. If your gripe is with people doing stupid things with XML, more power to you (but please be clear).
* Yes, a published IDL and API libraries are open... One convenient thing about XML is that all you need are a parsing lib and a network lib and you can transact with any XML service; you don't have to install/compile/maintain a new library for every service (vs. every service having its own unique protocol). Yes, this does mean that XML tends toward a lowest-common-denominator solution, but to allow the creation of laissez-faire processes it has to. Accommodating the lowest common denominator is the whole point of degradation.
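In practice that looks like this (the URL is a placeholder): one network lib plus one parsing lib, and any XML service is reachable.

    import urllib.request
    import xml.etree.ElementTree as ET

    URL = "http://example.com/api/items.xml"  # placeholder XML service

    # One network lib + one parsing lib covers every XML service; no
    # per-service client library to install, compile, or maintain.
    with urllib.request.urlopen(URL) as resp:
        root = ET.parse(resp).getroot()

    for item in root.iter("item"):
        print(item.findtext("name"))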
[...] "WHO WOULD THINK THAT SUCH A THING MAY HAPPEN?" from format developers every time it is discovered that standard is ambiguous [...]
It's called documentation... Who creates a service, defines a document schema (enforced or otherwise), and doesn't release usage and implementation documentation?
Microsoft?
--
[this sig left intentionally blank]
A computer without COBOL and Fortran is like a piece of chocolate cake without ketchup and mustard.