Firstly, graphics hardware makers prefer to keep parts of the driver proprietary so that they are not tied to published interfaces. This reduces the amount of documentation they have to produce and lets them make minor changes to the HW/SW interface when they need to without impacting anyone else.
Also, this is an area where Microsoft (and, to a lesser extent, Apple) has huge experience. Both companies put a lot of resources into making their platforms easy to develop for, and that includes a lot of help for people writing drivers. Microsoft's WHQL program (or whatever they call it these days) runs training sessions and labs with Microsoft experts in attendance, so the hardware folks can get their questions answered by people who really know the code and have helped many others do the same thing in the past. This is nitty-gritty stuff, and it costs a lot of money to staff, prepare, and keep going.
Of course, the motivation behind this is to make it easier to code for Windows (or MacOS) than for other O/Ses, thus keeping quality higher and improving the consumer experience. The hardware folks are only too happy to take advantage of this, so vendors write drivers for Windows first (both because of the size of the market and because all this assistance is available). And so the cycle continues.
I think it's unfair to say
Until graphics card manufacturers take Linux seriously
It would be more fair to say "Until the Linux community provides equivalent tools and assistance". And for that, someone has to see money in it. Don't expect the hardware guys to pay extra for the privilege of adding Linux support; they get paid the same for their hardware no matter which O/S is chosen.
Base 8 is just like base 10, if you are missing two fingers. -- Tom Lehrer