I never said it required a higher bitrate
You literally said:
higher resolution requires a higher bitrate
do you know of anybody who builds their media library so that their 1080p files are lower quality than their 720p files?
I do not know anybody who does this intentionally, but most people I know do not encode their own videos. If they get their videos from many different sources that have different standards of quality, it is likely that they will have some videos that are both higher resolution and lower bitrate than other videos in their collection.
Furthermore, you are assuming that all these files are encoded with the same compression algorithm. Some compression algorithms perform much better than others. It is entirely possible for a 1080p video encoded with an efficient codec (e.g. H.264) to be both better quality and lower bitrate than the same video encoded at 720p with a lower-performing codec (e.g. Cinepak).
The point is that the ability to stream a video depends directly on the bitrate of the video, not its resolution. The fact that higher-resolution videos tend to have a higher bitrate is just a correlation.
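To make that distinction concrete, here is a minimal sketch (assuming ffprobe is installed; the bandwidth budget is a made-up figure) of a check that decides whether a file is streamable by looking only at its bitrate, with resolution never entering the decision:

```python
import subprocess

def container_bitrate(path: str) -> int:
    """Read the container's overall bitrate (bits/s) with ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format=bit_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip())

# Hypothetical usable network throughput: 10 Mbit/s.
BUDGET_BPS = 10_000_000

def can_stream(path: str) -> bool:
    # Only bitrate vs. bandwidth matters here; resolution is irrelevant.
    return container_bitrate(path) <= BUDGET_BPS
```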
Absolutely.
I do encode my own videos (well, my colleagues' videos), professionally. They are watched by literally millions of people literally every day. I aim for a constant quality, not a constant bitrate. Most items I encode at a given subjective quality take about 50% more bits at 1440x1080 25i than at 768x576 25i; however, I've encoded some low-light stuff in SD that hit a peak bitrate of 45 Mbit/s for a given quality on a given H.264 video stream. The 3-minute piece came in at an average of 25 Mbit/s.
That was originally off a DV25 piece.
Recently, 5 minutes of glorious-looking pictures of floods, with a lot of running water, waving reeds, etc.: rather than the typical 4-8 Mbit/s for a 3-4 minute clip, this stuff came in at 18 Mbit/s. It was well lit, but had a lot of detail and a lot of motion. The piece is a brilliant test, far better than people skipping through a forest.
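If you want to try constant-quality encoding yourself, the rough equivalent with ffmpeg/x264 is CRF mode; this is just a sketch of that idea, not my actual pipeline, and the CRF value and file names are placeholders:

```python
import subprocess

# Constant-quality (CRF) encode with x264: the encoder spends as many bits
# as each scene needs to hit the target quality, so the resulting bitrate
# varies wildly with content (low light and running water cost a lot,
# flat cartoons cost very little).
def encode_crf(src: str, dst: str, crf: int = 20) -> None:
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "libx264", "-preset", "slow", "-crf", str(crf),
         "-c:a", "copy", dst],
        check=True,
    )

encode_crf("source.mov", "output.mp4")  # placeholder file names
```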
Now, encoding something simple like South Park will use far fewer bits for a given quality.
So for the same subjective quality and the same resolution, the bitrate can vary by an order of magnitude. Even if bitrate scaled linearly with resolution, you'd still have a minute of some SD material taking more bits to encode than a minute of other HD material.
I haven't run any tests, but my gut tells me that something cartoonish like South Park at 1080p60 will use far fewer bits per second than Blue Planet encoded at 480i or 576i -- for starters, I think South Park is animated at only 15 fps, so you're really only encoding 1080p15 (the inter-frame changes for the duplicated frames will be zero).
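A quick back-of-the-envelope comparison (the figures are illustrative, loosely based on the numbers above rather than measurements) shows how unevenly those bits spread per pixel:

```python
# Rough bits-per-pixel comparison (illustrative numbers, not measurements).
def bits_per_pixel_per_frame(width, height, fps, bitrate_bps):
    return bitrate_bps / (width * height * fps)

# Low-light SD piece: averaged ~25 Mbit/s (figure quoted above).
sd = bits_per_pixel_per_frame(720, 576, 25, 25_000_000)

# Cartoon-style 1080p that is effectively 15 fps of real change
# (duplicated frames compress to almost nothing); 2 Mbit/s is a guess.
hd_cartoon = bits_per_pixel_per_frame(1920, 1080, 15, 2_000_000)

print(f"SD low light: {sd:.2f} bits/pixel/frame")        # ~2.4
print(f"HD cartoon:   {hd_cartoon:.3f} bits/pixel/frame")  # ~0.064
```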