The color model currently used on workstation computers is severely limited in dynamic range: it cannot come close to reproducing the brightness of the real world. The real world contains sun sparkling on water, car headlights, and bright outdoor scenes, while a normal computer environment can only reproduce the brightness of a sheet of paper. This is a blocking issue for producing realistic virtual worlds and images.
This is a legacy problem dating back to old monochrome monitors, which offered only two shades to work with. Naturally, text was represented as full white (or black) and the text's background as the opposite color. This meant that white's brightness could never be turned up very far without becoming uncomfortable for the user -- if the monitor had a white as bright as car headlamps, anyone reading a page of text would have an entire screen of intense brightness blasting at them.
This color model continued to be used through the era of 16-color and 256-color workstation environments. In each case, all existing software was written with the expectation that full white was the color of paper backgrounds. As a result, any new software package that used a darker color for paper would appear dingy and unusable on a monitor calibrated for the mass of existing software. It was impossible for software vendors to move away from the old color model, since they would run into visual compatibility problems with old software.
The only way to fix this is to introduce a compatibility mode into graphic display systems (like Xorg and the console framebuffer) in which a lower brightness is used for all software that does not flag itself as understanding that it is running on a "full dynamic range" display. The monitor's brightness could then be turned up by, say, 50% (perhaps aided by a software calibration utility displaying patterns on the screen). The display system would reduce the brightness of all old software's output by an amount chosen to bring 100% white down to the level of a piece of paper, while new software would be allowed to use the "extended" range of colors, with more brightness available. Common interface images and the like would not be extremely bright, but games could have bright explosions.
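As a rough Python sketch of how this compatibility scaling might behave (the names, the floating-point working space, and the 50% paper-white figure are illustrative assumptions on my part, not a real Xorg interface):

    # Sketch of a display server's compatibility scaling. The working
    # space is floating point, with 1.0 meaning the monitor's new
    # maximum brightness; all names here are illustrative.

    PAPER_WHITE = 0.5  # legacy "100% white" mapped to half the new range

    def composite_pixel(value: float, client_is_hdr_aware: bool) -> float:
        """Scale one client pixel (0.0-1.0) into the extended framebuffer."""
        if client_is_hdr_aware:
            return value              # flagged clients use the full range
        return value * PAPER_WHITE    # legacy white lands at paper white

    # A legacy terminal's white background lands at paper brightness,
    # while a flagged game can still emit a full-brightness highlight.
    assert composite_pixel(1.0, client_is_hdr_aware=False) == 0.5
    assert composite_pixel(1.0, client_is_hdr_aware=True) == 1.0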
Existing images are all calibrated for the traditional "white is paper" setting, so image formats would need to be extended to carry a "use extended colorspace" flag, much as PNG carries a gamma setting.
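An image loader honoring such a flag might behave roughly like this Python sketch (PNG's gamma chunk is real, but the "extended_colorspace" metadata key is a hypothetical stand-in for whatever flag a format would adopt):

    def decode_pixel(value: float, metadata: dict) -> float:
        """Map a decoded 0.0-1.0 pixel into the display's working range."""
        if metadata.get("extended_colorspace", False):  # hypothetical flag
            return value       # image authored with the full range in mind
        return value * 0.5     # legacy image: white means paper, not sun

    print(decode_pixel(1.0, {}))                             # 0.5
    print(decode_pixel(1.0, {"extended_colorspace": True}))  # 1.0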
Since this takes some work to implement, and there are two other changes to the workstation color model that need to be made, I would suggest that all three changes be made at once.
The first additional change is the move to calibrated color systems. Currently, desktops are almost never calibrated to a particular level unless they are being used by graphic designers. This leaves designers unable to produce images that end users will see (if they so desire) at the correct brightness and contrast levels. I strongly suspect that this is because even basic color calibration is made difficult (and often expensive) rather than the routine operation it could be. There is a need for operating system display configuration tools (like the Monitors control panel on Mac OS, the Display control panel on Windows, the Screen Resolution capplet on GNOME, and the KDE Control Center Display tab) to contain a set of calibration images like the ones here to help ensure that the majority of monitors are properly calibrated (again, if the user so desires).
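As a minimal sketch of the kind of test pattern such a tool could display, the following Python writes a 16-step grayscale wedge to a plain PPM file (the step count and dimensions are arbitrary choices of mine). If neighboring steps at either end of the wedge merge together on screen, brightness or contrast is set wrong:

    # Write a 16-step grayscale calibration wedge as a binary PPM file.
    # No imaging library is needed; PPM is a trivial uncompressed format.
    STEPS, STEP_W, HEIGHT = 16, 60, 120

    with open("calibration_wedge.ppm", "wb") as f:
        width = STEPS * STEP_W
        f.write(b"P6\n%d %d\n255\n" % (width, HEIGHT))
        for _ in range(HEIGHT):
            for step in range(STEPS):
                level = round(step * 255 / (STEPS - 1))  # 0, 17, ..., 255
                f.write(bytes([level, level, level]) * STEP_W)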
The second additional change is the move to 64-bit color. 64-bit color has been talked about for some time (at least for framebuffer use inside video cards), but hasn't yet caught on. 64-bit color will probably be the last color move ever made, as it should pretty much max out the human visual system's ability to distinguish colors. 32-bit color (at least with 8 bits used for each channel) produces banding faintly visible to most people even with a conventional "paper is white" calibration. If you want to see whether this affects you, use an image editing program (such as the GIMP) to create a fullscreen gradient, black to white, upper left to lower right, and see if you can spot bands of brightness in the image, despite the use of all 256 levels of gray available on your monitor. I can easily see bands (especially in the dark part of the gradient) on my monitor. Moving to a larger dynamic color range will exacerbate this problem if the number of bits used for each channel does not increase, since adjacent brightness levels will be further apart. Since applications must be updated to use 64-bit color anyway, the 32-bit to 64-bit transition is an excellent time to introduce a broader range of brightness levels -- one would simply map each old 8-bit channel value into the range from 0 to 2^15 (the lower half of each 16-bit channel's range), as in the sketch below.
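Here is a small Python sketch of that mapping and of the banding arithmetic, assuming "64-bit color" means 16 bits per RGBA channel (the function name is mine, not part of any existing API):

    LEGACY_MAX = 255          # 8 bits per channel today
    PAPER_WHITE_16 = 1 << 15  # half of a 16-bit channel's range (32768)

    def legacy_to_extended(v8: int) -> int:
        """Map a legacy 8-bit channel value into the lower half of 16 bits."""
        return v8 * PAPER_WHITE_16 // LEGACY_MAX

    print(legacy_to_extended(0), legacy_to_extended(255))  # 0 32768

    # Why banding is visible today: a 1920-pixel black-to-white gradient
    # drawn with only 256 gray levels repeats each level for about
    # 1920 / 256 = 7.5 pixels -- wide enough to perceive as a band.
    print(1920 / 256)  # 7.5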
By simultaneously adding calibration images to display control panels, increasing the dynamic range, and increasing the available bit depth, end users can be given truly realistic (rather than ink-on-paper-equivalent) visual reproductions of the outdoors, visually identical images across workstations, and freedom from banded, perceptibly imperfect images. There is no reason users should ever have to say "that looks like it's on a computer monitor" -- a properly set up monitor can reproduce everything that is seen in the real world.