I'd argue that 100Mbps is woefully inadequate for a household where there are multiple users.
...
For reference, YouTube 4k is up to around 50Mbps, although the average is lower.
I think GP was accurate: "most consumers have little use for more than 100Mbps on a residential line in 2024."
Your own example would support 2x 4k streams at 50Mbps each with no issue. And yet, 4k streams generally run around 25Mbps (https://www.highspeedinternet.com/resources/how-much-speed-do-i-need-to-watch-netflix-and-hulu), and are not the norm. Most content is streamed at lower rates, like:
SD is 1 - 3 Mbps
HD (1080p) is around 5 Mbps
4k is 16 - 25 Mbps
You could have 10 active HD streams going and still have an extra 50Mbps left over.
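The headroom math above is easy to sanity-check. A quick back-of-the-envelope sketch in Python (the bitrates are the rough figures from the list, not measurements):

```python
# Rough headroom math for a 100 Mbps line.
# Bitrates are the approximate per-stream figures quoted above.
LINE_MBPS = 100
HD_MBPS = 5  # ~1080p stream

streams = 10
used = streams * HD_MBPS
leftover = LINE_MBPS - used
print(f"{streams} HD streams use {used} Mbps, leaving {leftover} Mbps spare")
```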
The example of downloading a 50GB day one patch is, IMO, silly. That's not a bandwidth figure; it's a total bytes transferred figure.
50GB at 10Mbps takes ~11hr
50GB at 50Mbps takes ~2hr 13min
50GB at 100Mbps takes ~1hr 6min
50GB at 242Mbps takes ~27min
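Those times come straight from dividing total bits by line rate. A minimal sketch, assuming ideal sustained throughput and decimal units (1 GB = 8,000 Mbit); real-world overhead and server-side limits only make it slower:

```python
# Time to pull a 50 GB day-one patch at various line rates,
# assuming the line is the only bottleneck.
PATCH_GB = 50
PATCH_MBIT = PATCH_GB * 8_000  # 400,000 Mbit total

for rate_mbps in (10, 50, 100, 242):
    seconds = PATCH_MBIT / rate_mbps
    hours, rem = divmod(int(seconds), 3600)
    minutes = rem // 60
    print(f"{rate_mbps:>4} Mbps -> {hours}hr {minutes:02d}min")
```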
You'll just wait longer if you don't have the bandwidth, and the upstream server is unlikely to be delivering a 50GB day one patch to every user at those rates anyway.
And your home users won't be impacted much. Traffic prioritization is in effect on most lines, and even without it your media streams will still get plenty of bandwidth. None of those streams needs low latency, so it'll all be fine.
Now, upload a fraction of that 50GB on your average US home cable connection and your latency will go to shit. But this /. post was about fiber, and fiber usually means a symmetrical line (same upload and download speeds), so 100Mbps will still be so much headroom that the average user will rarely, if ever, see issues related to their connection bandwidth.