There is a real question regarding how much it can keep improving, though. Setting aside the 'symbolic vs statistical' issue, we are already seeing the beginnings of model collapse. Machine learning depends on training with real data... not synthetic data, and not the output of other models. As AI dumps more and more content into the same places that AI trains from, weird things happen to models. The only way to stay ahead of this collapse is bigger and bigger models, which is what we see in this massive build-out. The unknown part is what these curves will really look like... can they build fast enough to stay ahead, hold in place, or at least slow the decay?
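For intuition on why training on model output compounds into collapse, here is a toy sketch (my own illustration, not a result from any specific paper or production system): each "generation" fits a simple Gaussian to samples drawn from the previous generation's fit rather than from the real data, and finite-sample error compounds generation over generation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real-world data the first model trains on.
real_data = rng.normal(loc=0.0, scale=1.0, size=10_000)
mu, sigma = real_data.mean(), real_data.std()

for generation in range(1, 11):
    # Each later generation trains on the previous model's
    # output instead of real data: sample from the last fit,
    # then refit. Sampling error compounds multiplicatively.
    synthetic = rng.normal(mu, sigma, size=200)
    mu, sigma = synthetic.mean(), synthetic.std()
    print(f"gen {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")

# Typical outcome: the std drifts downward and the mean wanders,
# i.e. the model progressively forgets the tails of the original
# distribution. Bigger samples per generation slow this drift,
# which is the toy analogue of "bigger models delay the decay".
```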
Economically, it is not looking good. AI projects are consuming vast amounts of wealth for incremental gains, and that spending is a one-time burst of investment during the one period where large datasets exist that have not yet been contaminated. If they cut over from investment to self-sustaining revenue today, it would likely collapse.