Comment It can't answer basic questions factually... (Score 4, Interesting)
...like "Tell me about Tiananmen Square" or "Tell me about Xinjiang".
Is this what you want for the future?
My thoughts back when R1 came out:
Yep. As a grizzled greying-hair, I'm not super worried about myself. I have the experience and knowledge to continue to do things AI simply can't, and probably won't before I retire. For that, I'll continue to command pretty good pay at a phase in my life when my kids are moving out and my expenses are going down. Bully for me.
Automation always shoots for low-hanging fruit. When that's factory assembly line workers, it's not such a big deal (it is for the workers, obviously, but I'm talking systemically) because those types of work don't usually have a significant career ladder, and after you automate, you don't need much of that role at all anyway.
When it's chopping off the bottom rungs of longer career ladders, it ultimately creates real supply chain issues for talent at the high end.
Computer science isn't supposed to be vocational training.
This guy isn't trolling. I remember 25 years ago being told exactly this by a professor at a Big 12 university.
The colleges are flat out TELLING YOU TO YOUR FACE that they aren't going to teach you what you think you're going there for, and you're going there anyway and signing away tremendous amounts of money for the privilege.
Meantime, technical/community colleges often DO try to fill that niche, and far more cheaply, but they're looked down upon as a "lesser education" -- and it's true. It IS a lesser education. They're 2-year degrees, and you can often pay for them in cash. You don't get anywhere near as much "general education" (i.e. the stuff that most people AREN'T going to Uni for anyway), and most of them are every bit as far behind the state of the art as most universities are (i.e. the tech education is BS anyway).
So there's a choice between spending an absolute ASS TON of money on a university that doesn't want to teach you vocational stuff, a technical school that's far cheaper but still doesn't really teach you anything modern and gives you a degree no one respects, or just teaching yourself. In tech, honestly, just teaching yourself is the way to go. You'll learn the programming in a few years on your own, and you won't be in debt for the next 20 years. Whatever couple of rungs of the ladder you're docked are easily recovered in your first few years of experience, and after that almost no one gives a shit about your degree anyway.
...an article worth considering from Princeton University's Zeynep Tufekci:
We Were Badly Misled About the Event That Changed Our Lives
Since scientists began playing around with dangerous pathogens in laboratories, the world has experienced four or five pandemics, depending on how you count. One of them, the 1977 Russian flu, was almost certainly sparked by a research mishap. Some Western scientists quickly suspected the odd virus had resided in a lab freezer for a couple of decades, but they kept mostly quiet for fear of ruffling feathers.
Yet in 2020, when people started speculating that a laboratory accident might have been the spark that started the Covid-19 pandemic, they were treated like kooks and cranks. Many public health officials and prominent scientists dismissed the idea as a conspiracy theory, insisting that the virus had emerged from animals in a seafood market in Wuhan, China. And when a nonprofit called EcoHealth Alliance lost a grant because it was planning to conduct risky research into bat viruses with the Wuhan Institute of Virology (research that, if conducted with lax safety standards, could have resulted in a dangerous pathogen leaking out into the world), no fewer than 77 Nobel laureates and 31 scientific societies lined up to defend the organization.
So the Wuhan research was totally safe, and the pandemic was definitely caused by natural transmission; it certainly seemed like consensus.
We have since learned, however, that to promote the appearance of consensus, some officials and scientists hid or understated crucial facts, misled at least one reporter, orchestrated campaigns of supposedly independent voices and even compared notes about how to hide their communications in order to keep the public from hearing the whole story. And as for that Wuhan laboratory's research, the details that have since emerged show that safety precautions might have been terrifyingly lax.
And the likeliest explanation is something connected with the GDPR "right to be forgotten":
Gawd. I felt that.
Though for me it's not that people are moving up faster, it's that companies can't keep focus on anything long enough to see even a medium-term play pay out. If it's not short-term tactical bullshit, then it's taking too long and we need to cancel it. You can spend all your time fixing all the previous short-term tactical shit instead. And anyway, I know we took up all your time on horseshit and told you not to work on it, but why isn't that medium-term project delivered yet?
Meanwhile, the people who ARE moving up are the people who "solution" with unsolicited clever-sounding bullshit in meetings, and over-simplify everything that
This isn't why I became a programmer. There's no craft left in it anymore.
The last time this comet came around was the Stone Age. To give an idea of how old the planet is, this comet has had time to visit Earth over 90,000 times!
I don't know if it was actually orbiting at all that far back, but that's the time scale. This uses the Wikipedia-quoted age of Earth: 4.543 billion years.
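The arithmetic above can be sketched in a few lines. The ~50,000-year orbital period here is an assumption inferred from the comment's own numbers ("last came around in the Stone Age", "over 90,000 times"), not a figure from the story:

```python
# Back-of-the-envelope check of the "over 90,000 visits" claim.
earth_age_years = 4.543e9        # Wikipedia-quoted age of Earth
orbital_period_years = 50_000    # assumed period, consistent with a Stone Age last visit

possible_visits = earth_age_years / orbital_period_years
print(f"{possible_visits:,.0f} possible visits")  # ~90,860, i.e. "over 90,000"
```

Dividing out, a ~50,000-year period over 4.543 billion years gives roughly 90,860 orbits, which matches the "over 90,000 times" figure.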
I've had trouble with Zoom the several times I've tried it. Sometimes it worked ok for a little while, but then the camera would get laggy, or the sound would get choppy, or it would crash, or it wouldn't be able to find the camera, or some other weird misbehavior. Granted, it's been a while since I've tried it, because I hate having problems during meetings, so maybe they've worked some of this out. Also, I've now got pipewire instead of pulse, so maybe that'll make a difference. The point is that it just works on my Mac, and I don't have to think about it. I've already thought about it more in this post than I have the entire time I've used it on my Mac. Maybe I'll try it again for some low-risk meetings and see how it goes.
When I say "office" tasks, that's just "generally office stuff required by my company." Nowadays, a lot of that is web-based, but for a long time, I had a lot of mucking about with VPNs, or they required an antivirus to be installed to connect, or we used some software that wasn't available for linux, or a dozen other little things that just made Mac a lesser hassle. Now with Google Docs and Office 365, a lot of the required apps, like Outlook, are web-based, and perfectly accessible from Linux, but it's still nice to have the app installed in some cases. Linux has Teams packaged up on Electron, but I don't think any of the other apps are similarly available.
Point is, you have to choose your battles and pick your tools. I've found that the types of problems that I don't want to have are exactly the ones the Mac does absolutely flawlessly and without a second consideration. 0 brain cells dedicated. That said, I much prefer Linux for my development activities for the exact same reason. I tend to have to screw with things a bit on Mac to get them working, where on Linux they just work most of the time. So I do the tasks on the platforms that make them no-brainers. I've got enough problems.
I've been using Linux since the late 90s. It's amazing, and it's been my bread and butter my entire career. That said, I also really like Apple's stuff for some really pragmatic reasons.
* Macs are the closest "mainstream" thing to Linux. For the first half of my career, I wrote most of my Linux-targeted code on Windows because my companies wouldn't even consider anything else for corporate desktops. Talk about frustrating!! Over time, they became convinced (begrudgingly at first) to allow Macs, which are at least unixy enough to make life a lot easier. Non-tech people just like 'em more, cuz they're pretty and easier to use. Once Macs had a foot in the door, lots of people wanted them, and they became pretty standard.
* While my main developer workstation is a kickass Linux workstation, I do still keep a kickass Mac on my desk as well. I find that it just does some things better: Media, "office" tasks, Zoom, proprietary software that doesn't support Linux, and development tasks for projects where the usual workflow/scripts assume you're on a Mac. When I'm not using it, it makes a nice jukebox.
* The hardware lasts *forever*. When it starts getting long in the tooth for MacOS, you install Linux on it. I'm still using 2012 Mac Minis running Ubuntu as a Kubernetes cluster. My 2015 Macbook Pro runs Linux Mint like a champ after a simple battery replacement several years ago. Seriously, I almost never throw the stuff out. So what do I do? I buy high-spec'ed hardware when I get it, and spend the money on it, because I know I can reasonably expect to still be using it in ten years. I upgrade my high-touch machines every 3-5 years, and the existing machines go to family members or my server cluster for a second life.
I know MS has changed their stripes somewhat over the last decade, but as an old timer who saw their antics under Ballmer and Gates, I still struggle with considering Windows or its ecosystem for anything at all. It might be better in some ways, but old habits and prejudices die hard, I guess, so, for me, it's not even in the running.
I tend to work for startups and SMBs, most of which do not have the attitudes and drives many people here seem to assume. They don't have these fabled real estate portfolios, and they tend to be pretty light on management. They've almost all not only embraced work from home for current staff, but they've been hiring remote-only workers in states (and countries!) where they have no presence at all.
The place I work now is a truly distributed company with tech workers spread across the world, and very few people going to offices. We have offices in a couple major cities, and some people who live in those cities do go there if they choose, but they sit at their desk on Zoom for most of their meetings anyway, talking to their team mates all over the planet.
This distributed nature was often the case in the huge companies I've worked at too. Seemed like most of our meetings were video conferences anyway, with people sitting in other offices.
Working from home is just an extension of the way most companies were already operating in an increasingly globalized and distributed environment. It's not some new thing, it's just the next step.
What is
Hype.
"So why this hype? Because the cryptocurrency space, at heart, is simply a giant Ponzi scheme where the only way early participants make money is if there are further suckers entering the space. The only 'utility' for a cryptocurrency (outside criminal transactions and financial frauds) is what someone else will pay for it, and anything pretending a possible real-world utility exists is there to help find new suckers."
Ah yes, the disgusting and false refuge of Chinese (and Russian) apologists and propagandists for the worst abuses of authoritarians: "the US does it too."
These abuses are not "alleged"; they are happening, and they are not based on dubious "researches" [sic]:
https://www.propublica.org/art...
There is a genocide happening in Xinjiang; one that is erasing an entire culture, language, religion, and history of a people.
https://www.nytimes.com/intera...
When Google killed Reader, I did, and never looked back.
The best things in life go on sale sooner or later.