Ah - yes, I did misread it. Jobs did push people and created the Reality Distortion Field, but he could not have gotten much going without Woz.
And, yes, I was talking about Woz's accomplishments.
You're correct that Woz is brilliant, and did brilliant things, but it's completely incorrect to discount what Jobs did.
But what did he do that actually counts as innovation? What new did he bring into the world?
Some of his logic designs were amazing. I was learning digital logic when I got my Apple II.
I can't remember other specific examples, but his habit of keeping chip counts down, formed because he had to build what he wanted on a tight family budget, came through in a number of ways in his designs.
Plus Jobs was little more than a used-car salesman.
Hmmm...
That's being a little bit harsh.
He sold *new* cars!
They really should be honoring Steve Wozniak instead. He's the one who did the work and the innovation: he made a floppy disk drive work at a price lower than anyone else could imagine. He's the one who did the designs and made it all possible. But Jobs was more visible and knew how to capture headlines.
Seriously, Jobs and Apple would have been NOTHING without Woz doing the kind of stuff he can do.
Clicked to update my password, and now the Plex site login won't work at all. I don't mean it won't take the new password. I mean you can't even get to the login page.
Not surprised, really. So freaking many bugs in Plex never get fixed that I've questioned their code quality for a while now.
I mean sure, the average layperson is gonna fuck it up. But what about professionals, e.g., a PhD scientist?
I use LLM-based tools for lit searches (typically dedicated lit-search tools, though I've tried it with ChatGPT too). I don't use the summaries, but I do use the lists of papers they come up with, and I generally go through them in whatever ranking they spit out.
Works pretty well, and it saves a ton of time in *starting* lit searches. You still have to do the reading. The AI sucks at interpreting papers; I wouldn't ever trust it for that as it stands now.
In computing, the mean time to failure keeps getting shorter.