1. Because those sites get a crap ton of users. You don't just build the site; you have to make it stand up to that many users, which is the hard part.
A billion registered users.
Suppose each of them visits the site once a week (come on, it's not Facebook).
Each visit touches 20 pages, and each page pulls in ~10 images, so call it ~200 requests per visit.

1,000,000,000 × 200 / (7 × 24 × 60 × 60)
≈ 330,000 requests per second.

Most of that is static images, though: cacheable stuff that a CDN or a plain nginx box serves by the tens of thousands per second. The dynamic page requests, where the expensive work (database queries, rendering) actually happens, come to ~33,000 per second. Unless you're doing really expensive per-request operations, that's a cache plus a small fleet of ordinary servers, not an exotic architecture.

Sure, this says nothing about spikes, which are a harder problem. But the claims about the crushing server load of having so damn many users are usually exaggerated; the steady-state numbers are just arithmetic.
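The back-of-envelope above is easy to check by just multiplying out the assumed figures (all of them are the scenario's assumptions, not measurements):

```python
# Back-of-envelope: steady-state request rate for the scenario above.
# All figures are the assumptions from the text, not real traffic data.
users = 1_000_000_000               # registered users
visits_per_user_per_week = 1        # each visits once a week
pages_per_visit = 20
images_per_page = 10                # ~10 images per page
seconds_per_week = 7 * 24 * 60 * 60

# ~200 requests per visit, counting only the images
requests_per_visit = pages_per_visit * images_per_page

rps = users * visits_per_user_per_week * requests_per_visit / seconds_per_week
page_rps = users * visits_per_user_per_week * pages_per_visit / seconds_per_week

print(f"total: {rps:,.0f} requests/second")    # ~330,688 req/s, mostly static images
print(f"pages: {page_rps:,.0f} requests/second")  # ~33,069 dynamic page hits/s
```

Note how insensitive the result is to the per-visit details: halve the pages or the images and you still land in the same order of magnitude, which is why the user count alone tells you so little about the real cost per request.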
(Of course, once you even *attempt* to *scale* beyond the single-server, single-point-of-failure model, you'll need somebody experienced enough not to shoot themselves in the foot trying.)