This does not appear to be holding up in practice, at least not reliably.
It holds up in some cases, not in others, and calculating an average muddles that.
Personally, I use AI coding assistants quite successfully for two purposes: a) as more intelligent auto-complete and b) for writing a piece of code that uses a common, well-understood algorithm (i.e. lots of sources the AI could learn from) in the specific programming language or setup that I need.
It turns out that it is much faster and almost as reliable to have the AI do that than to find a few examples on GitHub and Stack Overflow, check which ones are actually decent, and translate them myself.
Anything more complex than that and it starts being a coin toss. Sometimes it works, sometimes it's a waste of time. So I've stopped doing that, because coding it myself is faster and the result is better than what I get from babysitting an AI.
And when you need to optimize for a specific parameter - speed, memory, etc. - you can just about forget AI.
Hey, industry, I've got an idea: if you need specific, recent skills (especially in the framework-of-the-month class), how about you train people in them?
That used to be the norm. Companies would hire apprentices, train them in the exact skills needed, then at the end hire them as proper employees. These days, though, the training part is outsourced to the education system. And that's just dumb in so many ways.
Universities should not teach the flavour of the moment, because by the time people graduate, it may have already shifted elsewhere. Universities teach the basics and the thinking needed to grow into nearby fields. Yes, thinking is a skill that can be trained.
Case in point: When I was in university, there was one short course on cybersecurity. And yet that's been my profession for over two decades now. There were zero courses on AI. And yet there are whitepapers on AI with me as a co-author. And of the seven programming languages I learnt in university, I have never used a single one of them professionally and only one privately (C, of course. You can never go wrong learning C. If you have a university diploma in computer science and they didn't teach you C, demand your money back). Ok, if you count SQL as a programming language, it's eight, and I did use that professionally a few times. But I consider none of them a waste of time. Ok, Haskell maybe. The actual skill acquired was "programming", not a particular language.
Should universities teach about AI? Yes, I think so. Should they teach how to prompt engineer for ChatGPT 4? Totally not. That'll be obsolete before they even graduate.
So if your company needs people who have a specific AI-related skill (like prompt engineering) and know a specific AI tool or model - find them or train them. Don't demand that other people train them for you.
FFS, we complain about freeloaders everywhere, but the industry has become a cesspool of freeloaders these days.
"When the chef said, 'Hey, Meta, start Live AI,' it started every single Ray-Ban Meta's Live AI in the building. And there were a lot of people in that building,"
The number of people isn't the problem here.
The "started every" is.
How did they not catch that during development and find a solution? I mean, the memes where a TV ad starts Alexa and orders 10 large pizzas are a decade old now.
Many want to use rust.
Oh yeah? How many? You can't actually quantify it, but you pretend that you can.
As an exercise, just try to develop your own code to support HTTPS fully and correctly - including common add-ons such as Digest and OAuth security - you must depend upon a library to do that.
OAuth isn't an HTTP extension; it's an authorization protocol that runs on top of HTTP. Also, of all the protocols you could have picked, OAuth is a really simple one: you could code a custom implementation in an afternoon. HTTP isn't even so bad. Even with the HTTP/2 modifications, you could definitely do it yourself, depending on your reading comprehension (reading protocol standards is not a skill every programmer has).
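To make the "an afternoon" claim concrete: the core of the OAuth 2.0 client-credentials grant is one POST to a token endpoint and then echoing the returned token in an Authorization header. Here is a minimal sketch in TypeScript using only the built-in fetch API; the endpoint URLs, credentials, and function names are hypothetical placeholders, not anything from the posts above.

```typescript
// Minimal OAuth 2.0 client-credentials flow (RFC 6749, section 4.4), sketched
// with built-in fetch. All URLs and credentials below are made-up examples.

interface TokenResponse {
  access_token: string;
  token_type: string;
  expires_in?: number;
}

async function fetchAccessToken(
  tokenUrl: string,
  clientId: string,
  clientSecret: string
): Promise<TokenResponse> {
  // POST the grant type; authenticate the client with HTTP Basic auth,
  // which is what most authorization servers expect.
  const response = await fetch(tokenUrl, {
    method: "POST",
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      Authorization: "Basic " + btoa(`${clientId}:${clientSecret}`),
    },
    body: new URLSearchParams({ grant_type: "client_credentials" }),
  });
  if (!response.ok) {
    throw new Error(`Token request failed: ${response.status}`);
  }
  return (await response.json()) as TokenResponse;
}

async function callProtectedApi(apiUrl: string, token: TokenResponse): Promise<unknown> {
  // Every subsequent request just carries the bearer token.
  const response = await fetch(apiUrl, {
    headers: { Authorization: `Bearer ${token.access_token}` },
  });
  return response.json();
}
```

Token expiry handling and the redirect-based grants add more moving parts, but nothing that forces a heavyweight dependency.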
The built-in functions for JavaScript (and their typical runtime environment - the browser) are minimal, leading to a lot of dependence upon 3rd-party libraries.
Ever since most of jQuery's functionality got added to the JavaScript standard library and the browser APIs, you mostly don't need third-party libraries. Something like React can be useful if you are working on a web app with a large team, because it gives you encapsulation, but even then the number of third-party dependencies is small enough that a security team can review them all, which some companies do.
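For concreteness, here is a small sketch of the classic jQuery idioms done with built-in browser APIs instead of a library. The selectors, element IDs, and URL are made up for illustration.

```typescript
// Classic jQuery use cases covered by plain browser APIs.
// Selectors, IDs, and the URL are hypothetical examples.

async function demo(): Promise<void> {
  // $(".item")  ->  document.querySelectorAll(".item")
  const items = document.querySelectorAll<HTMLElement>(".item");

  // $(el).addClass("active")  ->  el.classList.add("active")
  items.forEach((el) => el.classList.add("active"));

  // $("#save").on("click", handler)  ->  addEventListener
  document.querySelector("#save")?.addEventListener("click", () => {
    console.log("saved");
  });

  // $.ajax({ url: "/api/items" })  ->  fetch("/api/items")
  const data: unknown = await fetch("/api/items").then((r) => r.json());
  console.log(data);
}

demo().catch(console.error);
```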
So far, but it looks like Samsung is giving it a go.
"Unibus timeout fatal trap program lost sorry" - An error message printed by DEC's RSTS operating system for the PDP-11