It's not actually double the temporal resolution, though -- either you get the odd half of a frame and then the even half of the same frame, at which point you've got one full frame every 1/29.97th of a second, or you get genuinely different half-frames and experience interlace tearing during high-motion scenes. There's also telecine interlacing, which shifts 24FPS content to 30FPS by adding an intermediary frame (even rows from the current frame, odd rows from the next) after every 4th frame -- that's 6 additional frames for every 24 frames of content, thus 30 frames. If you need 29.97FPS from that, you drop roughly 1 out of every 1000 frames (and you'd better stick to dropping full frames, and then only ones that aren't adjacent to your interlaced frames, lest you introduce a noticeable artifact into the video every 33-1/3 seconds). There are a number of other encodings as well, but they're not really relevant here.
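To make that cadence concrete, here's a rough Python sketch of the scheme as described above -- one woven intermediary frame inserted after every 4th source frame, treating a frame as nothing more than a list of rows. Real 3:2 pulldown splits this across fields in a slightly different pattern, and every name here is made up for illustration, so take it as a sketch rather than a spec.

    def weave(even_src, odd_src):
        # Even rows from one frame, odd rows from the next (hypothetical helper).
        return [even_src[r] if r % 2 == 0 else odd_src[r] for r in range(len(even_src))]

    def telecine_24_to_30(frames):
        out = []
        for i, frame in enumerate(frames):
            out.append(frame)
            # After every 4th source frame, insert one woven intermediary frame,
            # provided there's a "next" frame to borrow the odd rows from.
            if (i + 1) % 4 == 0 and i + 1 < len(frames):
                out.append(weave(frame, frames[i + 1]))
        return out

    # 24 dummy 4-row frames; each row is tagged with its source frame and row number.
    source = [[f"frame{n}-row{r}" for r in range(4)] for n in range(24)]
    print(len(source), "->", len(telecine_24_to_30(source)))
    # 24 -> 29 here, because the last group has no following frame to weave with;
    # over a continuous stream the ratio settles at 30 output frames per 24 input.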
Interlacing isn't actually a thing done by TV hardware when it receives that signal anyway.
Yes, actually... well, not always, but on non-shit-tier sets, yes. But I also think you meant de-interlacing.
It's entirely up to the TV to reconstruct the full frame and then, if necessary, scale the result to match the resolution of the display panel. If you don't do that and instead just scale each field and display it as it comes in, you get an image that appears to oscillate at your framerate (up on the even fields, down on the odd), which is really super-noticeable on static objects, like the bug most networks put in the bottom corner of the screen, or in still scenes.
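For contrast, here's a similarly rough Python sketch of the two approaches: weaving a frame's two fields back together versus naively line-doubling each field, which is where the up-and-down bobbing on static content comes from. The field layout and function names are assumptions for illustration, not any particular TV's pipeline.

    def weave_fields(top_field, bottom_field):
        # Reconstruct the full frame: top field supplies the even rows,
        # bottom field the odd rows.
        frame = []
        for t_row, b_row in zip(top_field, bottom_field):
            frame.append(t_row)
            frame.append(b_row)
        return frame

    def line_double(field):
        # Naive "bob": repeat each field row to fill out the frame height.
        # Static detail lands one line lower whenever the other field is shown,
        # which is the oscillation described above.
        frame = []
        for row in field:
            frame.append(row)
            frame.append(row)
        return frame

    top = ["row0", "row2", "row4"]      # even source rows (top field)
    bottom = ["row1", "row3", "row5"]   # odd source rows (bottom field)
    print(weave_fields(top, bottom))    # ['row0', 'row1', 'row2', 'row3', 'row4', 'row5']
    print(line_double(top))             # ['row0', 'row0', 'row2', 'row2', 'row4', 'row4']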
It's been a good decade since I've worked with this stuff, but I still know a great deal of it.