Partly, and primarily, latency. For a lot of modern SaaS applications, you're already pretty close to the so-called Doherty threshold (~400 ms response time) just from the latency of the application itself on both ends (on the servers, in communication between servers, and in the user's browser). Add a significant chunk of latency from the round trip to Alaska or northern Ontario and back for a user someplace like Seattle or Florida, and you're risking users perceiving your dating app or whatever as slow even if it's reasonably fast in actual execution.
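To put rough numbers on the geography alone, the speed of light in fiber (about 2/3 of c) sets a hard floor on round-trip time. The distances below are assumed round great-circle figures, not measured fiber routes, so this is a back-of-envelope sketch:

```python
# Light in fiber travels at roughly 200,000 km/s (~2/3 the vacuum
# speed of light). Real fiber paths are longer than great-circle
# distance and add router/queuing delay, so these are lower bounds.
FIBER_KM_PER_SEC = 200_000

def min_rtt_ms(route_km: float) -> float:
    """Theoretical minimum round-trip time in milliseconds."""
    return 2 * route_km / FIBER_KM_PER_SEC * 1000

# Illustrative distances (assumed, not actual cable routes):
print(f"Seattle - Anchorage (~2,300 km): {min_rtt_ms(2300):.0f} ms")
print(f"Miami - Anchorage   (~6,400 km): {min_rtt_ms(6400):.0f} ms")
```

Real-world RTTs run well above that floor once routing, queuing, and TLS handshakes pile on, so even the optimistic figures eat a meaningful slice of a ~400 ms budget.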
Then too, locating in a rural area doesn't save you from impacting the people who *do* live there (and there are always at least a few, unless you're in the middle of the Sahara or something) by raising their cost of water and power, at least temporarily, because those places have utilities sized to the existing demand. In fact, the fewer people there currently are on the local grid or water system, the worse the impact of even a moderately sized datacenter is, because it may represent several times the existing demand, whereas if you add "1 moderately sized datacenter" to the load of someplace like the Northeast Corridor, Con Ed won't even notice. (They will, actually, but one datacenter doesn't represent a multiple of the existing demand in, say, northeastern New Jersey, so a whole region's infrastructure doesn't have to be reengineered just to accommodate you.)
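A toy ratio makes the point. All figures here are assumed round numbers for illustration, not actual utility data; what matters is the proportion, not the absolute values:

```python
# Hypothetical figures: a 100 MW datacenter landing on a small rural
# grid vs. a large metro utility. Only the ratio matters.
DC_MW = 100             # a moderately sized datacenter (assumed)

rural_peak_mw = 30      # hypothetical small rural grid's peak demand
metro_peak_mw = 13_000  # rough order of magnitude for a big metro utility

for name, peak in [("rural grid", rural_peak_mw),
                   ("metro grid", metro_peak_mw)]:
    print(f"{name}: datacenter = {DC_MW / peak:.0%} of existing peak demand")
```

On the toy rural grid the datacenter is a multiple of everything already there, which is exactly the "re-engineer the whole region" scenario; on the metro grid it's a rounding error.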