Comment Re:Same with Roblox. Why do they need selfies? (Score 1) 85
always on camera recording for AI eyes only.
What's really scary is that we've reached a point where I can't tell if this is a legit suggestion or sarcasm... and that says a lot.
It's probably stuff like the age of the account. If it's 10 years old, the chance that the owner is under 13 and registered it as a toddler is quite small.
Can someone please tell this to Ebay. They regularly send me emails thanking me for being a user for more than 21 years, yet request that I use a credit card to verify my ID and prove my age if I try to buy a tool with a sharp edge (including a pair of scissors with a blade less than 1" long).
I live in the UK, and do not have a credit card. I do have several debit cards. Ebay does not seem to understand that some parts of the world are not in America.
You had me until you said you live in the UK. As a Canadian, it pains me to say, but the UK is even stricter than Canada on this surveillance and violence-prevention crap. Even if Ebay didn't demand you prove your age for wanting a pair of scissors with a blade less than an inch long, the UK Post would likely demand ID before delivering the 'dangerous goods', to track the movement of 'nefarious instruments'... lol
In case my sarcasm wasn't evident: I feel your plight, and share in it. This nonsense needs to end.
Sure you do. The human eye doesn't have enough rods and cones to resolve the kind of detail you're talking about at the distances you imply, so unless you're a genetic freak, you're just another tedious troll.
Oh dear... you seem to think that the human eye takes a picture like a camera. That's not at all how our eyes work. The raw number of rods and cones is not really that important; their density matters more, and the focal length between the front and back of the eye matters most of all.
You see (pun intended), the human eye and brain work by scanning and filling in the blanks. Your eye is constantly making micro-movements, your brain stores what it saw in an extremely narrow field of view, and it builds up a mental image that makes you think you're seeing everything around you in one shot and in high detail. Nobody is. We're all seeing a very tiny sliver of high resolution dead center of our vision (well, minus the tiny blind spot where the optic nerve leaves the back of the retina); the rest is composited together from memory, and some of it is hallucinated (or inferred from experience by a lazy brain).
So you are correct that it is unlikely the person can actually see such detail at that distance. But in theory, if his eyes focused light at that distance just right onto the important part of his retina, he could make out the pixels, even if the pixel density is higher than the density of the cones and rods on the back of his eye.
My guess is he really believes he sees individual pixels at that distance, but what he's actually seeing is artifacts from groups of pixels that his brain perceives. Not too dissimilar to how many people (myself included) can tell the difference between a CRT monitor refreshing at 60Hz, 75Hz, and even 85Hz: we're seeing secondary effects and claiming we see the flicker. Heck, put me under a fluorescent tube driven below 400Hz and I absolutely see flicker, yet physiologically I should not; my eyes clearly don't refresh at 400Hz. Likewise, the OP's eyes can't resolve the pixels at that distance, but his brain is treating secondary artifacts as pixels, when they are very likely groups of pixels that appear as banding in his vision (a common issue with human vision, and the reason sub-pixel rendering is used on fonts, to avoid that exact phenomenon).
So yeah, the OP is confusing his perception with what he is physically seeing. Lots of us think we see details we don't actually see. Our brains are fantastic at guessing and at spotting patterns, then convincing us that we are indeed seeing them; that's the foundation of most optical illusions. And if he really did have a genetic abnormality that let him resolve pixels at that distance, he'd be blind looking at anything at any other distance, because his lens would be too thick to refocus.
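If anyone wants to sanity-check the resolving-power argument, here's a back-of-the-envelope sketch in Python. The ~1 arcminute acuity figure is a commonly cited best-case value, and the 55" 4K panel viewed from 2 metres is an assumed example, not anything the OP specified:

```python
import math

# Best-case foveal acuity is commonly cited as ~1 arcminute.
# (Individuals vary; this is an assumed ballpark, not a measurement.)
ACUITY_RAD = math.radians(1 / 60)

def smallest_resolvable_mm(distance_m: float) -> float:
    """Smallest feature separation (in mm) resolvable at a given distance."""
    return math.tan(ACUITY_RAD) * distance_m * 1000

def pixel_pitch_mm(diagonal_in: float, h_px: int, v_px: int) -> float:
    """Approximate pixel pitch (in mm) from a display's diagonal and resolution."""
    diagonal_px = math.hypot(h_px, v_px)
    return diagonal_in * 25.4 / diagonal_px

# Assumed example: a 55-inch 4K panel viewed from 2 metres.
pitch = pixel_pitch_mm(55, 3840, 2160)
limit = smallest_resolvable_mm(2.0)
print(f"pixel pitch: {pitch:.2f} mm, acuity limit at 2 m: {limit:.2f} mm")
# pitch (~0.32 mm) is well below the limit (~0.58 mm), so individual
# pixels sit under the acuity threshold at that distance, consistent
# with the point that he's perceiving aggregate artifacts, not pixels.
```

Plug in your own panel size and viewing distance; the crossover where the two numbers meet is roughly where pixels stop being individually resolvable.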
Help! I am trapped inside the fortune cookie factory, being forced to write quips that sound funny if you append [in bed]!
I wish to subscribe to your fortune cookie newsletter.
Correction: They've been trained on a lot of UNCHECKED code.
Garbage in, garbage out.
The hidden cost - especially for up-and-coming devs - is the fact that knowledge isn't being retained, so a solved problem will end up being re-solved by the LLM again. And again. And again.
Worse yet, given how human creativity works, this means we won't see novel applications or solutions. Just layer after layer of mostly functional AI slop.
Funny thing is, the more AI consumes our CPU and RAM for its data centers, the more necessary it becomes to write performant, optimized code, both for CPU cycles and for memory, since consumer devices are going to have worse compute resources going forward due to cost.
The dinosaurs among the devs (of which I sadly count myself as one) know how to write code that squeezes every bit of useful work from every clock cycle, and how to use the least memory necessary for the task at hand. So the more layers of slop the AI generates, the more job security actually skilled developers will have.
We've all seen this comic strip before. Microsoft is Lucy. Windows is the football. Consumers (you and me) are Charlie Brown.
I identify with your visualization.
Don't connect it to your wifi.
I have an LG C2 (circa 2019). I disabled everything I could through the menus. I never plugged in ethernet, and wifi was disabled. One day my wife turned on the TV and it started doing an update. I flipped out on her for obviously having turned on wifi (how else could it update?). Nearly got divorced over it (okay, not quite, but it felt like it), until I realized that, no, she really hadn't. The damn TV had formed a mesh network with my neighbour's TV over bluetooth (I live in an apartment, and apparently the concrete floor wasn't enough to block the signal).
These TVs will stop at nothing to phone home and get connectivity. When I phoned LG and freaked out at them, they told me there was a setting about 6 menus deep, labelled as totally unrelated to mesh or network connectivity, that would prevent it from connecting to other TVs if disabled. It was not obvious, and it was not mentioned anywhere in the documentation or in the many, many pages of EULA that I actually read through when I first turned on the TV after purchase.
I want a TV that takes an HDMI (or DisplayPort) input and displays it. No TV tuner, built-in apps, or internet-enabled anything. It also needs to support HDCP, or else my AppleTV refuses to send video to it, and regular monitors don't have HDCP support without being a smart TV. It's infuriating.
If it has to be thicker for battery life reasons, that's fine. It's going in a case with a holster anyway, and getting clipped to a holster on my belt. Thickness doesn't matter. Features matter. Width matters. Height matters. Usability matters.
I thought I was the only one that still used a holster! I found my people!
Also, I fully agree about TouchID being superior to FaceID. I wish Apple actually did proper market research and talked to those of us who are willing to buy a 'pro' iPhone but don't want a bloody tablet. Give me a good camera, storage, battery life, and a 5S or 6S form factor. Hell, I'd accept a 7! And to your point, it can be a few mm thicker, that's fine!
The only person who always got his work done by Friday was Robinson Crusoe.