Comment Also ironic because of how Reddit got its start (Score 2) 116
https://www.vice.com/en/articl...
> How Reddit Got Huge: Tons of Fake Accounts
Witness the Johnson Space Center in Houston, which was LBJ's quid pro quo for getting the Apollo program through Congress, and which is responsible for the handoff of manned launches from Florida to Texas just after the vehicle clears the tower.
It was about the principle of the thing when it was Apple and Google.
Except it wasn't when it was about Xbox and PlayStation.
This is just an extension of the $ituational ethics here.
and "We're not doing that anymore."
And nothing else will happen...
Offer everyone who has either of the old plans a subscription to the new merged service.
Which will cost more since it is "worth more" because both sides now have access to new content.
Anyone who doesn't want to pay more is free to quit.
This will create a bunch of new revenue for the merged service, or a mass exodus.
in the historical training data, so it's hard for an LLM to induce those "rules" from so little example text.
I bet they know a lot more about chemical weapons than they do about biological weapons, for similar reasons.
AI, that doesn't suck, running locally on your phone.
You know who's most likely to get to the "doesn't suck" stage first?
The phone maker who makes money off hardware, and not ads and surveillance, which inherently suck.
Now, all they have to do is get their software under control again... *sigh*...
He drew on chess as precedent: 15 to 20 years ago, a human checking AI's moves could beat a standalone AI or human, but machines have since surpassed that arrangement entirely.
Chess programs got better because the machines running them got Moore's Law denser and Dennard-scaling faster, so a machine at the same cost could search deeper.
Machines running LLMs are not improving at the double-every-18-months rate we had until about 2012. Moore's Law for storage and gate counts is not quite dead yet, but clock speeds are basically standing still.
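To make the "same cost, deeper search" point concrete: if search cost grows exponentially with depth, a hardware speedup of S buys roughly log_b(S) extra plies. A minimal sketch of that arithmetic, where the ~6 effective branching factor for alpha-beta chess and the 18-month doubling period are illustrative assumptions, not sourced from the comment:

```python
import math

def extra_search_depth(speedup: float, branching_factor: float) -> float:
    """Extra plies a fixed-time search gains from a hardware speedup,
    assuming search cost grows as branching_factor ** depth."""
    return math.log(speedup) / math.log(branching_factor)

# A decade of 18-month doublings: 10 / 1.5 ~= 6.67 doublings of compute.
doublings = 10 / 1.5
speedup = 2 ** doublings

# With an assumed effective branching factor of ~6 (pruned chess search),
# all that compute buys only a couple of extra plies of lookahead.
extra = extra_search_depth(speedup, 6)
```

The logarithm is why exponential hardware gains translated into steady but modest depth gains for chess, and why flat clock speeds now cap that route.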
Government dragnet searches used to be unconstitutional - Fourth Amendment and all that.
Then we got the third party doctrine which says it's OK for the Government to ask third parties to give it info on people without a warrant, and it's OK for those third parties to comply. Some of the biggies like Apple and Google usually won't comply without a warrant, others cooperate.
Today, there are third parties like Flock who basically sell dragnet searches to anyone who can pay - auto dealers who want to repossess cars, etc. Other third parties like Ring default to taking their users' data and making it available to the Government - you can tell them not to when you sign up, but the vast majority don't see the harm in contributing to mass surveillance.
Short of new overarching privacy laws like the GDPR, we're screwed.
I wouldn't carry a device in my pocket that contains a lithium battery that hasn't been tested for safety in a vacuum.
The real risk isn’t that a generation can’t read code. The risk is that we stop expecting them to. If we treat LLMs as training wheels instead of prosthetic eyesight, we get a generation that ships faster and understands deeper. If we treat them as a replacement for learning, we get brittle systems and brittle people. That’s not a technology outcome. That’s a cultural and educational choice.
The ability to read and write code without support was in decline long before LLMs - FizzBuzz as a low bar dates from Imran Ghory and Coding Horror circa 2007.
If we're hoping for the right cultural and educational choices to save us...
Until then, it's pure theoretical speculative jabber.
as a meaningful category when chocolate is added to it.
It's like Paula Poundstone's reasoning about Ding-Dongs:
They only have three ingredients: devil's food cake, creamy filling, chocolaty coating.
I don't know how the next Python is going to get any traction, if table stakes for adoption is "language is understood by LLMs".
The current generation of coders won't use it if their LLM of choice doesn't understand it.
LLMs won't understand it if there's no training data, which comes from users.
I've always told people that coding would be automated last, if ever.
Apparently I was wrong, and will have to settle for being part of the last generation of coders that can actually read and understand code without LLM support.
Anyone who owns a Cybertruck bought it relatively recently, so Musk's predilections were priced into that. A shotgun rack in one of those would definitely be on point.
As for older Teslas, there's a lot of angst and buyer's remorse out there.
Witness the brisk trade in bumper stickers like:
I Bought This Before Elon Went Crazy
Vintage Tesla - Pre Madness Edition
This Vehicle Reflects Climate Consciousness, And In No Way Aligns With The Political Views And Actions Of Its Company's Founder
I have a 2020 Bolt, so I have no dog in that fight other than as a somewhat bemused observer.
The universe is all a spin-off of the Big Bang.