But you'll be limited to 20 fps, since that's all you can copy during the vblank period.
But that's still good enough for nearly PlayStation 1 video quality, as a bunch of PS1 games ran at 20 fps.
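The 20 fps figure falls out of simple arithmetic: if a full frame takes several vblank windows to copy, the effective frame rate is the refresh rate divided by that count. A minimal sketch, with hypothetical byte counts (the function name and numbers are illustrative, not from any real console):

```python
import math

def effective_fps(refresh_hz, frame_bytes, bytes_per_vblank):
    """Frames per second when a full frame needs multiple vblanks to copy."""
    vblanks_needed = math.ceil(frame_bytes / bytes_per_vblank)
    return refresh_hz / vblanks_needed

# Hypothetical numbers: a 60 Hz display where only a third of a frame's
# worth of data fits in each vblank window -> 60 / 3 = 20 fps.
print(effective_fps(60, 30000, 10000))
```

Any frame that doesn't fit in one vblank drops you to an integer divisor of the refresh rate, which is why 30, 20, and 15 fps are the common fallbacks.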
a big ass-free CRT is desirable, while a big ass-CRT would be undesirable
That depends on whether or not you're playing an H game. For an H game, you want ass on your CRT.
I want scanlines that actually look like they're on a CRT (simulating how bright scanlines are thicker than dim ones)
That can be done in a shader by starting with linear interpolation and varying the gamma on different output scanlines, with a lighter gamma near the center of each input line and a darker gamma between input lines. The darker gamma will stay dark unless an adjacent scanline is light, at which point the signal bleeds over into the higher response part of the gamma curve.
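The gamma-varying idea above can be sketched in a few lines. This is a plain-Python stand-in for a shader's per-pixel function (the function name and the specific gamma values 0.8 and 2.5 are my own illustrative choices, not from any real shader):

```python
def crt_scanline(value, frac, gamma_center=0.8, gamma_gap=2.5):
    """
    Shade one output row of a simulated CRT scanline.

    value: linearly interpolated brightness (0..1) for this output row
    frac:  vertical distance from the input line's center, 0 = center,
           1 = midway to the next input line
    A lighter gamma (<1) near the center keeps the line bright and fat;
    a darker gamma (>1) in the gap stays near black for dim signals, but
    a bright neighboring line's interpolated value still bleeds into the
    higher-response part of the curve.
    """
    gamma = gamma_center + (gamma_gap - gamma_center) * frac
    return value ** gamma

# Dim signal: the gap row is crushed toward black.
# Bright signal: the gap row stays visible, so the line looks thicker.
print(crt_scanline(0.3, 1.0), crt_scanline(1.0, 1.0))
```

In a real shader this runs per output pixel, with `frac` derived from the output y coordinate and the input line pitch.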
It's surprising to me that nobody has put an FPGA between a composite input and an HDMI output, with a CRT-simulating pixel shader in the middle.
It's not surprising because such a product would be extremely niche.
That the US market had a crappier output option, combined with a worse video standard (nicknamed Never The Same Color)
The analog TV standard in Japan was NTSC with a different black level. This is why the Famicom and NTSC NES use the same 2C02 PPU, while PAL regions need a different 2C07 PPU.
The situation is completely different from the first home computer doing "composite synthesis" and achieving more colours on the screen than supported in the GFX hardware.
You're referring to the 7.16 MHz pixel clock of several early game consoles and home computers (Apple II, Atari 400/800, Atari 7800, IBM CGA, etc.), which was exactly twice the NTSC color burst frequency. This let the program synthesize the exact waveform going out the wire. The Genesis's pixel clock, on the other hand, was 15/8 times color burst. At that rate, patterns of thin vertical lines resulted in semi-transparent rainbow effects, which weren't quite as predictable but still fooled the TV into making more colors. The NES pixel clock of 3/2 color burst was coarser but had a diagonal bias, allowing games like Blaster Master to create more apparent colors than the four per 16x16 pixel area that an NES game usually has, by using small dots of different colors adjacent to each other and to black.
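The clock ratios above are easy to check against the NTSC color subcarrier (3.579545 MHz). A quick sketch (the labels are mine; "H40" refers to the Genesis's 320-pixel-wide mode, which is the 15/8 case discussed here):

```python
from fractions import Fraction

COLORBURST_HZ = 3_579_545  # NTSC color subcarrier, ~3.58 MHz

# Pixel clocks as exact multiples of the color burst, per the post above
clocks = {
    "Apple II / Atari / CGA (2x)": Fraction(2),
    "Genesis H40 mode (15/8x)":    Fraction(15, 8),
    "NES / Famicom (3/2x)":        Fraction(3, 2),
}

for name, mult in clocks.items():
    hz = COLORBURST_HZ * mult
    print(f"{name}: {float(hz) / 1e6:.3f} MHz")
```

The 2x machines land on 7.159 MHz, so each pixel spans exactly half a subcarrier cycle and the artifact colors repeat predictably; at 15/8 and 3/2 the pixel grid drifts against the subcarrier phase, which is why the Genesis and NES effects are less controllable.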
How about using an emulator instead?
That'd be fine if more people had a PC in the living room.
you can also find old TV shows and music that are no longer under copyright
How is that possible? Practical TV broadcasts didn't begin until years after the January 1, 1923, cutoff for the Copyright Term Extension Act. And U.S. copyright law allows state copyright in sound recordings to continue until 2067.
A lot of geographic areas don't have a second provider other than satellite and cellular. In most cases,* switching from a provider with a 150 GB per month cap to a provider with a 10 GB per month cap (source: exede.com) isn't a good idea. Nor is moving to a different town.
* Watch someone come up with an edge case.