I am surprised the parent said he could do 1080i without VDPAU.
Playing MPEG-2 high-definition streams (whether over-the-air or from FireWire) is easy. To oversimplify, video playback involves 1) decoding the compressed video signal and 2) rendering, or displaying, it. As mentioned, my Pentium 4 was fast enough to decode MPEG-2 streams in 2005, and the Xv hardware-assisted renderer (usable from Linux via any Nvidia or Intel video card/chipset made in the past many, many years) quite nicely displayed the video with the more-than-decent Bob 2X deinterlacer. The resulting 50-70% CPU usage I saw is perfectly adequate for a box that doesn't do anything else, and of course the usage would be lower with a faster CPU. Before VDPAU, software decoding plus Xv rendering is what the vast majority (I'd guess 95%) of MythTV users used for high-definition playback.
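To make the decode/render split concrete, here's a minimal sketch of that pre-VDPAU setup using mplayer (the filename is hypothetical; this assumes an X server with an Xv-capable Nvidia or Intel driver):

```shell
# Step 1 (decode) happens in software on the CPU;
# step 2 (render) is offloaded to the video card via the Xv extension.
# -vo xv selects the Xv video output driver.
mplayer -vo xv recording.mpg
```

MythTV does the equivalent internally when a playback profile is set to use Xv; the point is that only the display side is hardware-assisted, so the CPU still bears the full decoding load.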
Decoding high-definition h.264 video (such as produced by the Hauppauge HD-PVR, which shipped in May 2008) is much more difficult. My Pentium 4 was able to just barely play 720p 6Mbps h.264 recordings, but no more; people on mythtv-users were reporting in mid-2008 that the fastest Core 2 Duo boxes were just barely adequate to play 13Mbps (the best quality, more or less indistinguishable from the original) HD-PVR recordings, and sometimes were overstretched even then. In other words, MythTV users were beginning to create recordings they could not play back!
VDPAU has the video card handle everything. The card itself, not the CPU, decodes both MPEG-2 and h.264 streams and renders the resulting video using excellent deinterlacers. Given the dilemma that the HD-PVR created, VDPAU (late 2008/early 2009) arrived just in time; any later would have been too late.
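For comparison with the Xv case, here's a sketch of fully GPU-offloaded playback via mplayer's VDPAU support (codec names and suboptions as documented in mplayer's man page; the filename is hypothetical, and this assumes an Nvidia binary driver with VDPAU support):

```shell
# Both decoding and rendering happen on the video card:
# -vc forces the VDPAU hardware decoders for MPEG-2 (ffmpeg12vdpau)
#     and h.264 (ffh264vdpau);
# -vo vdpau renders on the GPU, with deint=2 selecting a temporal
#     deinterlacer.
mplayer -vo vdpau:deint=2 -vc ffmpeg12vdpau,ffh264vdpau recording.ts
```

With this pipeline the CPU does little more than shuffle compressed bits to the card, which is why even a weak host machine can play 13Mbps HD-PVR recordings.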
From what I understand, ATi's Linux drivers still lack adequate Xv support, and I don't know whether current drivers have finally solved this; most sensible people on mythtv-users just throw up their hands and buy a $30 Nvidia card.