But the things you listed aren't features of intelligence, they're bugs in our brains (or, more simply, things that natural selection de-emphasized as comparatively irrelevant in your basic caveman survival scenario).
Nope, they aren't "bugs." Learning is fundamentally about prioritizing information, making "higher-level connections," creating abstractions that lead to "understanding," etc. No AI system can do this on even the level of a small human child. And prioritizing information necessarily entails de-emphasizing most of the input that's less relevant. That input doesn't NEED to be forgotten, but these "bugs" are probably the most efficient way of dealing with the problem.
If those short-term memories were more reliably committed to long-term memory, or if there were no real distinction between the two, would that really be a disqualifier for intelligence?
Yes, if the "long-term" commitment was not accompanied by an incredibly complex (by current AI standards) abstraction process that effectively renders most of the irrelevant "long-term" data as "background" that would rarely or never be accessed anyway. Again, "forgetting" is not essential to the process of intelligence, but it likely makes the algorithms in our brains far more efficient and easier to run. A computer AI that refuses to prioritize information in this way will always lag far behind human comprehension.