Comment Re:What's the size again? (Score 1) 22
My point is that English text might be a weak indication of the state of the world. People learn how the world works from a lot of extra data, and a very large amount of it if we count everything our senses pick up. So a method that optimizes for heavily compressing a small amount of text may face inherent limits before it becomes useful in an AI context.
To put it differently: suppose the text were only 10 bytes long. Someone might come up with a brute-force approximation that tests every program of size 10 bytes or less to find which one reproduces the text at the smallest size, within some given time. But this brute-force approach does little good, and the optimal compressor for 10 bytes wouldn't be very useful for AI. What I'm saying is that if people absorb vastly more data than 1GB of text to be able to reason about the world, then even an excellent compressor of 1GB of text may still be working from too little data.
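For a sense of scale on that thought experiment, here's a quick back-of-the-envelope count (just a toy sketch, nothing specific to any actual compression contest) of how many candidate byte strings a brute-force search over 10-byte programs would have to consider:

```python
# Toy illustration: count candidate "programs" of up to max_bytes bytes,
# treating a program as an arbitrary byte string (256 values per byte).

def search_space(max_bytes: int) -> int:
    """Number of byte strings of length 1..max_bytes."""
    return sum(256 ** n for n in range(1, max_bytes + 1))

print(search_space(10))  # on the order of 10**24 candidates
```

Even at a billion candidates per second, exhausting ~10^24 programs would take tens of millions of years, which is why the brute-force route is only a thought experiment.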
The hardware lottery doesn't really matter in this context: perhaps there exists a compressor that's slightly worse at the 1GB scale but has much more favorable asymptotic complexity. Whether that compressor could benefit from dedicated hardware is beside the point.