As far as I'm concerned they never left the game, they only lost popularity. I'm running a Piledriver-core chip at home and it doesn't get saturated on my day-to-day stuff, and it's quite speedy on my heavy-duty stuff as well. I'm not a gamer by the modern definition, so I'm not pushing it as hard as I could with Windows and the latest AAA titles, but I do run 3D games on Linux - and my limitations seem to come from my out-of-date GeForce GTX 750 Ti, not the CPU.
I just built a Piledriver-core machine for work that's being used for video editing. The editor originally wanted a Mac, but we talked him into a custom Windows machine instead, since Adobe actually caters more to the Windows side given their war with Apple. He's exceedingly happy - in fact, he likes the Windows/AMD combo we built him so much that he's asked us to remove his older Mac from his desk.
I'm still a fan of the AMD/Nvidia combo. From the old ATI Rage IIc almost always having issues in laptops in the late 90's, to my first Radeon literally smoking after playing Alice for about an hour and a half, to the conference room machine where I now work having to run an older version of the Radeon driver just to get sound over HDMI working - I've never been able to bring myself around to liking ATI/Radeon/AMD graphics, with the exception of saying they did great in the Wii and GameCube.