People who have run into those sorts of situations tend to remember them. The fact that 99.9% of people can install with no problems doesn't counteract the 12 hours they spent banging their head against the wall trying to fix a "simple" issue with their installation.
How is that different from Windows, though? When I got my GA-MA770-UD3P 1.0, trying to install XP produced a black screen no matter which of a broad variety of video cards I tried, with two different known-good power supplies. Oddly, my CPU and RAM might both have played parts as well. Gigabyte told me they couldn't explain it and wanted me to pay hourly for them to figure it out, but eventually they must have figured it out, because a BIOS update cured the problem. That's why I single out the motherboard.
Meanwhile, Linux installed just fine.
A Windows update is also staggeringly likely to send you back to the store to replace your peripherals. If you don't have any, whatever, but MFDs and scanners and whatnot often stop working on the new Windows for some dumb reason. Usually they speak the same protocol as still-supported devices... which is handy if you're a Linux user.
I've had Windows just mysteriously refuse to play ball on machines where Linux works great. Just trying to find a Windows driver for my Renesas USB3 card is ugh, but obviously, the driver comes with the Linux kernel. Blah blah blah. Anecdote, data, whatever.
If you have some sort of edge case, any OS can crap on you.