I, for one, love my 28" 4K monitor. ~157 DPI was a vast improvement over the 96 DPI my previous monitor had (especially with text, but still visibly so with images & video), & even then, I can still see the pixels in large solid-color areas & especially diagonal lines (even if antialiased). (I think I sit around 2 feet from it on average.)
A multiple of 96 DPI would be a significant improvement for integer scaling of content (unless X & GTK decide to get their act together). But the overall screen size is also nice, so 8K at 288 DPI (about 30.6") would be good.
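For anyone who wants to sanity-check those numbers, here is a quick sketch in Haskell; the pixel dimensions & DPI values are just the ones mentioned above:

    -- Diagonal size in inches of a panel, given its pixel dimensions & DPI.
    diagonalInches :: Double -> Double -> Double -> Double
    diagonalInches w h dpi = sqrt (w * w + h * h) / dpi

    main :: IO ()
    main = do
      print (diagonalInches 3840 2160 157)   -- ~28.1 (roughly my current 4K monitor)
      print (diagonalInches 7680 4320 288)   -- ~30.6 (8K at 3x96 DPI)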
(I doubt anyone will make a 4:3 or even squarer display with high DPI & good color reproduction any time soon, but a girl can dream.)
My guesses (without knowing about all the systems it supports):
Having a resolution that is a multiple of every supported system's resolution (a rough sketch of what that implies follows this list)
Alternatively, having a high enough resolution that non-integer scaling is unnoticeable
Imitating quirks of the original displays (like the scanlines on non-backlit GBAs)
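To make the first guess concrete, here is a rough Haskell sketch; the resolutions are just a few well-known handheld/console ones picked for illustration, not a claim about what this device actually supports:

    import Data.List (foldl')

    -- Smallest width & height such that every listed system resolution
    -- divides them evenly, i.e. each system can be integer-scaled per axis.
    commonResolution :: [(Int, Int)] -> (Int, Int)
    commonResolution = foldl' (\(w, h) (w', h') -> (lcm w w', lcm h h')) (1, 1)

    main :: IO ()
    main = print (commonResolution [(160, 144), (240, 160), (256, 224)])
      -- Game Boy, GBA & SNES alone already give (3840, 10080),
      -- which hints at why the second guess may be the more practical one.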
I managed to type in a dream I had on a TI-89 Titanium in the dark in a tent in a thunderstorm. You just need muscle memory of the keys' locations. (Granted, it has physically separate buttons, which makes it closer to a PC than a smartphone in that respect.)
(Why did I have a TI-89 Titanium but not a source of illumination...priorities. Why was I in a tent in a thunderstorm...archaeological dig in Russia.)
Companies that dislike emulators of their devices (e.g. Nintendo) never successfully argued this with respect to the software interface presented by a device (as distinct from the BIOS code & such), so why should a software interface presented by software be any different?
In Nintendo's case, I seem to remember they also tried to use mandatory inclusion of the Nintendo logo to keep people from even using their interface without their permission, which then failed in court. That is different from implementing the interface, but I would think that if they could compel people not to release emulators, they would have long since done so.
I had that feeling at least once when I fell asleep on the way home from school, & my mom drove somewhere I had never been before. I distinctly remember telling her I felt like we had driven through an interdimensional portal when I woke up (& wondered what would happen if we did not go back through wherever the portal was).
So perhaps sleeping while someone else drives you somewhere random would work even better than driving yourself?
Wikipedia says singular "they" dates to the 14th century, although that usage was discouraged (but prevalent) in the 19th century before once again becoming more widely acceptable (& even preferred in some cases) in the 20th.
I do still occasionally need 16-bit support (usually for running DOS programs that for whatever reason do not have Linux equivalents—e.g. to restore a backup in a discontinued format). I have ended up using QEMU because VM86 is not supported in long mode.
I was previously unaware of it, but apparently (I just checked sandpile.org) 16-bit protected-mode tasks are supported under long mode (which does not help with DOS).
Not only that, but Q(uick)BASIC & Visual Basic by default started numbering at 0...the important difference from C is not the lower bound but the upper one: in C you give the number of elements, while in those BASICs you give the last element's index. They also had a variant where you could specify both the lower & upper bound when declaring an array, e.g. DIM X(-7 TO 7), which was sometimes convenient for things like sprites.
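For comparison, Haskell (the language I mostly use these days) has something similar in Data.Array: you give both bounds, & both are indices of real elements. Just a small sketch, not a claim about anything beyond that:

    import Data.Array

    -- Rough parallel to DIM X(-7 TO 7): both bounds are given explicitly,
    -- & both are indices of actual elements (15 of them here).
    x :: Array Int Int
    x = listArray (-7, 7) [0 ..]

    main :: IO ()
    main = do
      print (bounds x)          -- (-7,7)
      print (x ! (-7), x ! 7)   -- (0,14)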
I learned to program using various BASICs (mostly QBASIC, TRS-80 CoCo BASIC, & occasionally Applesoft BASIC). I still occasionally use QBASIC, because I do not know of any good way to quickly test graphics code in the language I usually use these days (Haskell). But of course when I do (or when forced to use C), my coding style is influenced by functional programming.
"Turn on, tune up, rock out." -- Billy Gibbons