I would expect this article not to spread misinformation. This isn't a new trend. In January 1993, unemployment for all workers was 48% higher than for recent grads, but by January 2003 the gap was down to 17%. It spiked again after the Great Recession, but unemployment for recent grads has been the same as or higher than for all workers since late 2018, before the pandemic or ChatGPT's release (the main culprits in the article).
This has little to do with anything on your list. It is the simple result of too many of the jobs created over the past decade being low-wage jobs that don't require a college education or much experience. That has dropped the unemployment rate for all young workers by 15% over the past 30 years while unemployment for recent grads has risen 30%. The economy overall has gotten better for the bottom 20% over the past few decades (more low-wage jobs and a larger social safety net), but for the middle 60% it has been slowly declining for about 50 years.
I wouldn't hold my breath, because this entire article is rubbish. This isn't some new phenomenon. Unemployment rates for recent college graduates and for all workers have been neck and neck since 2014, and unemployment for recent grads was slightly higher almost every month from late 2018 to mid-2022, when it started to grow noticeably higher than all-worker unemployment. This isn't because of a post-pandemic cooling-off period or AI; it is a trend that started in the 1990s, with a hiccup just after the Great Recession that corrected itself in about five years.
The more interesting research on this topic that I've read claimed that smarter people make better decisions because they are better at deferring decisions: they are comfortable with the uncertainty because of their confidence in their abilities. There is always a tradeoff between making a decision later, when you have more information, and making it earlier, so you have more time to plan your execution. If you need more time to plan the execution, you have to decide earlier, and your decisions suffer for it.
I don't think this research qualifies as "no shit." I think it qualifies as very poorly done and potentially wrong, or at least the summary makes it seem that way. They asked these individuals a prediction question that existing knowledge significantly helps with, knowledge that many if not most well-educated people will have and people with less education probably won't. I know what the average life expectancy is for someone in the US because I have read it. I know that your life expectancy is different when you are 40 because you have already avoided dying for 40 years, and I understand this in large part because of the math education I've had and because I've read actuarial tables before.
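To make that actuarial point concrete, here is a minimal sketch with invented decade-by-decade survival numbers (not real life-table data); even this crude decade-level math shows why someone who has already reached 40 has a later expected age at death than the at-birth figure:

```python
# Rough sketch of conditional life expectancy, using made-up survival
# probabilities (NOT real life-table data).

# Hypothetical probability of surviving each decade of life (0-10, 10-20, ..., 90-100).
decade_survival = [0.99, 0.995, 0.99, 0.98, 0.96, 0.92, 0.84, 0.70, 0.45, 0.15]


def extra_years(start_decade: int) -> float:
    """Crude expected additional years, crediting 10 years per decade survived."""
    alive = 1.0
    years = 0.0
    for p in decade_survival[start_decade:]:
        alive *= p
        years += 10 * alive
    return years


print(f"Expected age at death, from birth:   ~{extra_years(0):.0f}")
print(f"Expected age at death, given age 40: ~{40 + extra_years(4):.0f}")
```

With these invented numbers the at-birth figure is around 71 and the conditional-on-reaching-40 figure is around 74, purely because the deaths before 40 have already been excluded.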
If you want to test predictive power that is based on innate intelligence instead of education, you need to ask for predictions that a well-read adult wouldn't have a significant advantage in making.
Not hiring 30 staff is not the same as firing 30 staff.
It is not exactly the same because of the personal disruption a firing inflicts on someone, but they are pretty close. I have been laid off once, but it was in a strong job market, so I found another position with a significant pay increase within a month. Being laid off right now is far worse because it isn't as easy to find an equivalent position.
When looking at the economy overall, not hiring 30 people is the same as firing 30 people. Both result in 30 fewer jobs in the market. Considering the US population aged 18-65 is growing very slowly right now, technologies that slow the growth of new jobs could be very beneficial. But a shrinking job market, whether from layoffs or from a lack of hiring after employee attrition, would still cause significant damage to people's lives.
You two must be using different terminology, because there is no chance you have gone 10 years without seeing a single gas pump at a gas station down for maintenance. Gas stations around me have about 8-16 pumps, and the average gas pump is down for maintenance 2-4 hours per week. That means at least one pump is down for maintenance roughly 20% of the time (not accounting for times when more than one pump is down at once), which aligns with what I would have guessed before googling it (I guessed 25%).
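As a rough sanity check on that figure, here's a minimal sketch assuming 12 pumps and 3 hours of maintenance per pump per week (mid-range values from above, not measured data):

```python
# Back-of-the-envelope check of the pump-downtime estimate above.
# Assumed values: 12 pumps, ~3 hours of maintenance per pump per week.

pumps = 12
down_hours_per_week = 3.0
hours_per_week = 168.0

p_pump_down = down_hours_per_week / hours_per_week  # ~1.8% of the time per pump

# Expected number of pumps down at any instant (ignores overlap between pumps).
expected_down = pumps * p_pump_down

# Chance at least one pump is down, assuming pump downtimes are independent.
p_at_least_one_down = 1 - (1 - p_pump_down) ** pumps

print(f"Single pump down:                {p_pump_down:.1%} of the time")
print(f"Naive sum, overlap ignored:      {expected_down:.1%}")
print(f"At least one down (independent): {p_at_least_one_down:.1%}")
```

The naive sum comes out around 21% and the independence-based estimate around 19%, so either way you land near the 20% figure.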
I’ve been in charge of hiring and have seen supposed mechanical engineering majors with a PhD and 10 years of experience fail to explain, with high-school-physics simplicity, how a hammer works.
My favorite example of this was a mechanical engineering graduate student, focused on thermodynamics, whom I lived with in college. He bought a large box of FlaVorIce and put the whole thing in the freezer, expecting the pops to freeze overnight. I told him they wouldn't freeze quickly if he didn't spread them out (as my farmer father taught me as a kid), and he sarcastically stated that a f*ing thermodynamics engineer could figure out how to freeze water. Fast forward to the morning, and the 10 freeze pops I had separated from the box were the only ones frozen. The rest of the box was barely even cold.
I don't know the history of IoT devices, and I don't know a clean inflection point like the release of ChatGPT against which to compare the speed of adoption, but I agree it's a good candidate. My gut says IoT took a lot longer to gain adoption than gen-AI technologies did after December 2022, but I'm not sure how to really compare the two. IoT is certainly more ubiquitous today than ChatGPT is, but that doesn't say anything about how rapid the adoption was; IoT predates gen-AI by about 25-30 years. It would be like comparing the number of ChatGPT users with the number of Internet users.
I don't agree the level of hype is unprecedented. I'd point to flying cars, nuclear-powered household appliances, virtual reality in the 90s, and cold fusion as a few technologies, off the top of my head, that had a similar level of ridiculous hype. I'd agree the hype is more widespread this time, but I think that's just because we are further into the information age, so public discourse on everything is more widespread.
As for the impact on my life and job, though, this feels pretty similar to the advent of the PC, the Internet, and smartphones. It isn't quite there yet, but AI keeps getting more useful for me every few months. The PC was mostly a toy for a couple of decades, the Internet took over a decade to really become essential, and smartphones took about 5-10 years to become ubiquitous. Each time another radically impactful digital technology is introduced, its adoption is quicker than that of the one that preceded it, so it's not surprising the same is true for the current class of AI technologies.
Which of those technologies do you think had the same rapid adoption as recent Gen-AI technologies like ChatGPT? Many of them were released at a time when less than 5% of the world's population had a personal computer, so how could they possibly have the same level of adoption as a technology released when over two-thirds of the world's population has access to the Internet?
A big reason AI is being adopted so quickly is the decades of advances and infrastructure that make its adoption possible, but that doesn't change its adoption rate. It just explains it.
The cost of living hasn't affected its popularity.