Depends on what you're willing to compromise on. I have Nvidia Optimus, and everything works for me. I had to use Bumblebee rather than Nvidia's own Optimus support, since Nvidia's doesn't work on Linux yet and Bumblebee does, but the result is pretty close to the Windows experience. It doesn't auto-detect apps and pick Intel/Nvidia automatically for me, but it does let me manually force Nvidia usage very simply, so I score that a wash.
Intel for 2D/desktop - works great.
Nvidia for performance 3D - works great.
Auto-power-off of Nvidia when not in use - works great.
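For the record, "manually forcing Nvidia usage" with Bumblebee is just a wrapper command: `optirun`, or `primusrun` if you have the primus backend. The app names below are placeholders, obviously, not anything I specifically run:

```shell
# Run one app on the Nvidia GPU via Bumblebee; everything else stays on Intel
optirun glxgears

# Or with the lower-overhead primus backend, if installed
primusrun glxgears

# Quick sanity check: compare the renderer strings
glxinfo | grep "OpenGL renderer"            # should report the Intel chip
optirun glxinfo | grep "OpenGL renderer"    # should report the NVIDIA chip
```

When the wrapped app exits, Bumblebee powers the Nvidia chip back off, which is how you keep the battery numbers below.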
The only thing I get in Windows that I don't get in Linux at the moment is multi-monitor docking. With 2+ monitors connected in Windows, the drivers understand the connection setup and keep the Nvidia chip hot/lit-up all the time to drive monitor #3. In Linux, if the Intel graphics can't drive all the connected displays, those displays just don't come up. I'm told I could fix this too with quite a bit of hackery/tinkering, but I just plain can't be arsed.
Setup literally consisted of 3 or 4 commands and some testing to verify it was working correctly. No reboot needed, just an X restart once everything was in place.
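If it helps, those 3 or 4 commands looked roughly like this on my machine. This is an Ubuntu-style sketch, so the PPA and package names are assumptions if you're on another distro:

```shell
# Install Bumblebee plus the Nvidia glue and the primus backend
# (Ubuntu-style; adjust the repo/package names for your distro)
sudo add-apt-repository ppa:bumblebee/stable
sudo apt-get update
sudo apt-get install bumblebee bumblebee-nvidia primus

# Add yourself to the bumblebee group so optirun works without root
sudo usermod -a -G bumblebee $USER

# Log out/in (or restart X), then verify the Nvidia chip answers
optirun glxinfo | grep "OpenGL renderer"
```

The group change is why an X restart (really, a re-login) is needed rather than a full reboot.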
Now, since I have a nice high-end Dell, I can bypass the entire issue by making the Nvidia chip the primary/only graphics adapter, but that kills my battery runtime (from an easy 5-6 hours down to 2-3) and makes the machine hotter and louder.