This is so ridiculous that LTE-A (R10) is technically 4G only because it defines a specific UE category (a category defines the possible peak rate), Cat 8, that goes up to ~3 Gbps. But it's a paper category that nobody implements. Not practical. A hint at the level of ridicule: Cat 7 before it peaks at 300 Mbps, and Cat 9 after it at 450 Mbps. Spot the odd value!
Anyway, 5G could still open the door to much higher peak rates. The key is that 5G is supposed to leverage millimeter waves, i.e. the high end of the spectrum (roughly 10-100 GHz). That makes a lot of bandwidth available. But there's still a lot of work to do to get to a workable, practical implementation that fits in a pocket and has reasonable battery life (forget the early 5G demos done from a van, with a big engine supplying plenty of power).
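To make the bandwidth point concrete, here's a back-of-the-envelope sketch (my illustrative numbers, not from any spec): at a given spectral efficiency, peak rate scales roughly linearly with channel bandwidth, which is why mmWave's wide channels matter.

```python
# Rough illustration: peak rate ~ bandwidth * spectral efficiency.
# The 15 bit/s/Hz figure below is an assumed, illustrative value
# (high-order MIMO + high-order modulation territory), not a spec number.

def peak_rate_mbps(bandwidth_mhz, spectral_eff_bps_per_hz):
    """Peak rate in Mbit/s = bandwidth (MHz) * spectral efficiency (bit/s/Hz)."""
    return bandwidth_mhz * spectral_eff_bps_per_hz

lte_carrier = peak_rate_mbps(20, 15)    # a classic 20 MHz LTE carrier
mmwave_chan = peak_rate_mbps(400, 15)   # a hypothetical wide mmWave channel

print(lte_carrier)  # 300  (Mbit/s)
print(mmwave_chan)  # 6000 (Mbit/s)
```

Same spectral efficiency, 20x the bandwidth, 20x the peak rate. That's the whole appeal of going up-spectrum, battery and RF practicality aside.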
And if anyone cares, here's what I consider good generation transition points: 1G is analog cellular, 2G is narrow-band cellular (both TDM GSM and narrow-band CDMA), 3G is wide-band (5 MHz) CDMA, and 4G is wide-band OFDMA (10 MHz or more). This makes WiMAX and LTE 4G from the start IMHO. It doesn't match the IMT definition, but it's certainly a better match for reality. And those steps do represent qualitative changes in user experience too (noticeably higher throughput and lower latency).
Lastly: the only people at network operators who care about peak rates are the marketing folks. What's really important is getting a good average throughput, an acceptable worst-case throughput, and the highest capacity (served bits/s/Hz per cell). The latter is what makes the system cost effective. It turns out that many (but not all) techniques that increase capacity also increase peak rates, and peak rate is easier to sell. Hence the emphasis on big peak-rate numbers in the general press. But what matters most to end users and operators is really what I listed before.
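A tiny worked example of why capacity is the metric that matters (all numbers made up for illustration): what a user actually sees is the cell's served capacity shared among the active users, not the headline peak rate.

```python
# Hedged sketch with assumed numbers: capacity (bit/s/Hz per cell)
# times bandwidth gives total served Mbit/s; divide by active users
# to get the average throughput each one actually experiences.

def avg_user_throughput_mbps(capacity_bps_per_hz, bandwidth_mhz, active_users):
    """Average per-user throughput when the cell's served capacity is shared."""
    cell_mbps = capacity_bps_per_hz * bandwidth_mhz  # total Mbit/s the cell serves
    return cell_mbps / active_users

# 20 MHz cell, 30 active users, two capacity assumptions:
print(avg_user_throughput_mbps(1.5, 20, 30))  # 1.0 Mbit/s each
print(avg_user_throughput_mbps(3.0, 20, 30))  # 2.0 Mbit/s each
```

Doubling capacity doubles what every user sees, even if the advertised peak rate never changes. That's why capacity is what makes a network cost effective, and why the press fixation on peak rates misses the point.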