Cameras can apply various blur and lighting effects to photos on the spot, using the lens and the flash before recording the image data. Most of these effects are used to make the subject of the photograph stand out more from the background.
But what if you could switch between these effects on a 'raw' photo after the picture has been taken? All it would take is a camera optimized for capturing with minimal distance blur and high low-light sensitivity. It would also reduce the complexity of camera controls, since you wouldn't have to switch between modes as often — everything would be done in postprocessing.
On top of this, photoshop filters could achieve many interesting effects with just a single slider and minimal tom-foolery. Want the background blurrier? Just pull this one slider right. Want the background less saturated? Pull this other slider. Since the photo is 3D, all of the needed distance data should be automatically available with no fuss.
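To illustrate the idea, here is a minimal sketch of such a "background blur" slider, assuming you already have a per-pixel depth map alongside the image (the function names, the box-blur approximation, and the `strength` slider parameter are all hypothetical, not any real camera's API):

```python
import numpy as np

def box_blur(img, k=5):
    # Crude box blur: average each pixel over a k-by-k window,
    # padding edges by repeating the border pixel.
    pad = k // 2
    padded = np.pad(img, ((pad, pad), (pad, pad)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def depth_blur(img, depth, strength=1.0):
    # Blend the sharp image with a blurred copy, weighted by
    # normalized depth: far pixels get more blur, near pixels stay
    # sharp. 'strength' plays the role of the slider in the text
    # (0.0 = no effect, 1.0 = full background blur).
    blurred = box_blur(img)
    d = (depth - depth.min()) / max(np.ptp(depth), 1e-9)
    w = np.clip(strength * d, 0.0, 1.0)
    return (1 - w) * img + w * blurred
```

The same depth weight `w` could just as easily drive desaturation instead of blur, which is why one slider per effect would be enough once the depth data exists.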
It seems there are more perks to 3D photography than just stereoscopic displays. So why hasn't this tech appeared yet, Slashdot? Am I missing something? Or are people so worried about the glasses that they can't see the effect 3D photography will have on 2D photos?