First and foremost, there's a practicality issue here. Listening tests are very hard to conduct, but they become considerably easier at lower bitrates. Everybody would love to see a 192kbps test or a 256kbps test or whatnot, but successfully ABXing differences at those rates becomes so rare that the test largely turns into a contest of which encoder outwits a set of brutally difficult problem samples that hardly ever occur in reality (or which encoder favors the artifacts heard by a very selective set of golden ears). The meaning of the results is thus compromised. At 128k the differences are significant enough to accurately compare encoders against one another across a larger set of samples.
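For context on why "successfully ABXing" gets hard: an ABX result is conventionally scored with a one-sided binomial test, since a listener who can't hear the difference is effectively flipping a coin on each trial. Here's a minimal sketch (the 12-of-16 criterion is just the commonly cited example of a passing score, not something from this test):

```python
# Minimal sketch of ABX scoring: under the null hypothesis (the listener
# hears no difference), each trial is a fair coin flip, and the p-value is
# the probability of getting at least this many trials correct by chance.
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value for `correct` hits out of `trials`."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

# A commonly cited passing criterion: at least 12 correct out of 16.
print(abx_p_value(12, 16))  # ~0.038, below the usual 0.05 threshold
```

At 192kbps and up, most listeners simply can't reach a score like that on ordinary material, which is what pushes high-bitrate tests toward exotic problem samples.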
Second, low bitrates still matter for cell phones, flash players, iPhones/iPod Touches, and so on. Applications shift as storage space increases, but there will likely always be devices out there with under 30GB of storage, and there will always be people who want to put 1,000 albums on said devices at high quality. So there will always be a use for low bitrates. (Heck, I don't even think my music collection would fit on my 60GB iPod at 256k!)
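The back-of-the-envelope math bears that out. A rough sketch, assuming an average album length of 45 minutes (that figure is my own illustrative assumption, not from the post):

```python
# Back-of-the-envelope storage math; the 45-minute average album length
# is an assumed figure for illustration.
def library_size_gb(albums: int, minutes_per_album: float, kbps: int) -> float:
    """Total size in GB of `albums` albums encoded at `kbps` kilobits/sec."""
    total_seconds = albums * minutes_per_album * 60
    total_bits = total_seconds * kbps * 1000
    return total_bits / 8 / 1e9  # bits -> bytes -> gigabytes

print(library_size_gb(1000, 45, 256))  # ~86 GB: won't fit on a 60GB iPod
print(library_size_gb(1000, 45, 128))  # ~43 GB: fits with room to spare
```

Halving the bitrate roughly doubles how much music fits on the same device, which is exactly why 128k-class encoding keeps mattering.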
Third, bandwidth still matters for online music distribution. MySpace does most of its streaming at 96kbps CBR (blech!). While this test specifically isn't that useful for MySpace encoding, the general question of how to eke out maximum quality at a nominal bitrate will remain important for as long as bandwidth costs money.
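To make the bandwidth stakes concrete, here's a rough sketch of data transferred per listener-hour at a given bitrate (the comparison bitrates are my own illustrative choices):

```python
# Rough sketch: data transferred for one hour of streaming at `kbps`.
def gb_per_listener_hour(kbps: int) -> float:
    """GB of data for one listener streaming for one hour at `kbps`."""
    return kbps * 1000 * 3600 / 8 / 1e9

print(gb_per_listener_hour(96))   # ~0.043 GB per listener-hour
print(gb_per_listener_hour(128))  # ~0.058 GB, i.e. a third more bandwidth
```

Multiply that by millions of listener-hours and the incentive to squeeze the most quality out of every kilobit is obvious.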