Just about everyone who uses a multi-screen setup uses three screens in order to avoid a bezel in the middle. Usually the left and right screens are angled inward so as to form a viewing arc. That is actually not a bad idea, especially if the angles are such that the optical axes of the screens intersect at the user's viewing position.
BUT...
Eyefinity and Nvidia Surround don't work that way. They simply fool the rendering engine into believing that the aspect ratio of the rendering context is much wider. The result is that the virtual camera in the game uses a wider-angle lens (not quite, but it will do to make my point). This causes the edges of the left and right screens to look rather distorted. Adding more screens width-wise is really not worthwhile.
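To see why the edges distort, consider that a flat (rectilinear) perspective projection maps a view angle θ to a screen position proportional to tan(θ), so the pixels-per-degree stretch grows as sec²(θ) toward the edges. A minimal sketch (the function name and the example field-of-view figures are my own illustration, not anything from a real driver API):

```python
import math

def edge_stretch(hfov_deg):
    """Relative angular-to-pixel stretch at the screen edge vs. the
    center for a flat perspective projection.
    Screen x = tan(theta), so dx/dtheta = sec^2(theta)."""
    half = math.radians(hfov_deg / 2)
    return 1 / math.cos(half) ** 2

# A single monitor at a typical ~90 deg horizontal FOV:
print(round(edge_stretch(90), 2))    # 2.0
# Three monitors driven as one ~140 deg wide rendering context:
print(round(edge_stretch(140), 2))   # 8.55
```

So at 140 degrees of horizontal FOV, objects at the outer edges get smeared more than eight times wider per degree than objects at the center, which is exactly the distortion you see on the side screens.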
What is needed is multiple 3D contexts, like you can have in Microsoft Flight Simulator, where each camera looks at a slightly different heading. But why bother to solve that at the game-engine level? NVidia and ATI, pay attention: this tip is free!
It should be possible to build true multiscreen logic into graphics drivers. If NVidia can do stereo, they ought to be able to render outputs at different angles. Not only that, each output should not even assume that the optical center is in the middle of the screen. Enter head-tracking logic.
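The driver-level idea above can be sketched with a bit of geometry. Assuming equal-width screens angled so that each one faces the viewer head-on (optical axes meeting at the viewing position), every screen subtends the same horizontal FOV, and each screen's camera is simply yawed by that FOV relative to its neighbor. The function below is a hypothetical illustration of how a driver could derive per-output camera headings, not a real driver API:

```python
import math

def per_screen_cameras(n_screens, screen_width_m, distance_m):
    """For n equal flat screens arranged on an arc around the viewer,
    return one (yaw_deg, hfov_deg) pair per screen: each output gets
    its own camera turned to face its screen, instead of one
    ultra-wide distorted projection."""
    # Horizontal FOV one screen subtends when viewed head-on.
    hfov = 2 * math.degrees(math.atan((screen_width_m / 2) / distance_m))
    mid = (n_screens - 1) / 2
    return [((i - mid) * hfov, hfov) for i in range(n_screens)]

# Three 60 cm wide screens viewed from 70 cm away:
for yaw, fov in per_screen_cameras(3, 0.6, 0.7):
    print(f"yaw {yaw:+6.1f} deg, hfov {fov:.1f} deg")
```

Because each camera uses only the modest FOV of its own screen, the sec²(θ) edge stretch stays small on every output; the head-tracking extension would further replace the symmetric frustum per screen with an off-center one computed from the tracked eye position.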
I did lots of experiments with multiscreen setups and what it would take to get the ultimate multiscreen experience. I even wrote some demo software to prove the point; these old videos, which I made four years ago, show it:
http://www.youtube.com/watch?v=ZBdtPz2V_vY
http://www.youtube.com/watch?v=ku76aHq3pps
(Sorry about the cheesy sound track)
And still we are stuck with dumb, distorted multi-monitor widescreens!