Sure you do. The human eye doesn't have enough rods and cones to resolve the kind of detail you're talking about at the distances you imply, so unless you're a genetic freak, you're just another tedious troll.
Oh dear... you seem to think that the human eye takes a picture like a camera. That's not at all how our eyes work. The raw number of rods and cones isn't really that important; their density matters more, and the focal length between the front and back of the eye matters most of all.
You see (pun intended), the human eye and brain work by scanning and filling in the blanks. Your eyes are constantly making tiny scanning movements (microsaccades). Your brain stores what it saw in an extremely narrow field of view and builds up a mental image that makes you think you're seeing everything around you in one shot and in high detail. Nobody is. We're all seeing a very tiny sliver of high resolution dead center of our vision (well, minus the small blind spot where the optic nerve leaves the back of the retina), and the rest is composited together from memory, with some of it hallucinated (or inferred from experience by a lazy brain).
So you are correct that it's unlikely the person can actually see such detail at that distance, but if his eyes were able to focus light just right at that distance onto the important part of his retina, he could in theory make out the pixels, even if the pixel density is higher than the density of the cones and rods on the back of his eye.
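If you want to put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The ~1 arcminute figure is the usual textbook limit for 20/20 foveal acuity, and the 27-inch 1440p screen and the viewing distances are assumptions I picked purely for illustration, not anything from the OP:

    import math

    # Rough sanity check: how big an angle does one pixel subtend at the eye,
    # and is it above the ~1 arcminute acuity limit of normal foveal vision?
    # Screen size, resolution, and distances below are hypothetical examples.

    ACUITY_LIMIT_ARCMIN = 1.0  # typical 20/20 foveal resolution limit

    def pixel_pitch_mm(diagonal_in, width_px, height_px):
        """Physical size of one pixel in millimetres for the given screen."""
        diag_px = math.hypot(width_px, height_px)
        return (diagonal_in * 25.4) / diag_px

    def pixel_angle_arcmin(pitch_mm, distance_mm):
        """Angle one pixel subtends at the eye, in arcminutes."""
        return math.degrees(math.atan2(pitch_mm, distance_mm)) * 60

    pitch = pixel_pitch_mm(diagonal_in=27, width_px=2560, height_px=1440)  # ~0.23 mm
    for distance_m in (0.5, 1.0, 2.0, 3.0):
        angle = pixel_angle_arcmin(pitch, distance_m * 1000)
        verdict = "resolvable" if angle >= ACUITY_LIMIT_ARCMIN else "below the acuity limit"
        print(f"{distance_m:.1f} m: {angle:.2f} arcmin -> {verdict}")

On those assumptions, individual pixels drop below the acuity limit somewhere around a metre away; beyond that you're perceiving groups of pixels, not pixels.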
My guess is he really believes he sees individual pixels at that distance, but what he's actually picking up is artifacts from groups of pixels that his brain interprets as pixels. Not too dissimilar to how many people (myself included) can tell the difference between a CRT monitor refreshing at 60Hz, 75Hz, and even 85Hz: we're seeing secondary effects and calling it flicker. Heck, run a fluorescent tube below 400Hz and I absolutely see flicker, yet physiologically I shouldn't; clearly my eyes don't refresh at 400Hz. Likewise, the OP's eyes can't resolve the pixels at that distance, but his brain treats secondary artifacts as pixels, when they're very likely groups of pixels showing up as banding in his vision (a common quirk of human vision, and the reason sub-pixel rendering is used on fonts, to avoid that exact phenomenon).
So yeah, the OP is confusing his perception with what he is physically seeing. Lots of us think we see details we don't actually see. Our brains are fantastic at guessing and spotting patterns, then convincing us that we're genuinely seeing them; that's the foundation of most optical illusions. And if he really did have a genetic abnormality that let him resolve the pixels at distance as he claims, then he'd be blind looking at anything at any other distance, since he'd be unable to focus the lens of his eye due to its thickness.
Help! I am trapped inside the fortune cookie factory, being forced to write quips that sound funny if you append [in bed]!
I wish to subscribe to your fortune cookie newsletter.
Correction: They've been trained on a lot of UNCHECKED code.
Garbage in, garbage out.
The hidden cost, especially for up-and-coming devs, is that knowledge isn't being retained, so a solved problem will end up being re-solved by the LLM again. And again. And again.
Worse yet, given how human creativity works, this means we won't see novel applications or solutions. Just layer after layer of mostly functional AI slop.
Funny thing is, the more CPU and RAM AI consumes for its data centers, the more necessary it becomes to write performant, optimized code, both for CPU cycles and for memory, since consumer devices are going to have worse compute resources available going forward due to cost.
The dinosaurs among the devs (of which I sadly count myself as one) know how to write code that squeezes every bit of useful work from every clock cycle, and how to use the least amount of memory necessary to achieve the task at hand. So the more layers of slop the AI generates, the more job security actual skilled developers will have.
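A toy (and entirely hypothetical) illustration of the habit I mean, summing numbers from a large file in Python:

    # Memory-hungry version: materialises every line and every int at once.
    def total_greedy(path):
        with open(path) as f:
            return sum([int(line) for line in f.readlines()])

    # Frugal version: streams one line at a time, O(1) extra memory.
    def total_frugal(path):
        total = 0
        with open(path) as f:
            for line in f:
                total += int(line)
        return total

Same answer either way, but the second version never holds more than one line in memory at a time.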
We've all seen this comic strip before. Microsoft is Lucy. Windows is the football. Consumers (you and me) are Charlie Brown.
I identify with your visualization.
"Ignorance is the soil in which belief in miracles grows." -- Robert G. Ingersoll