Where's the upstart?
If vibe coding is the way of the future, why isn't software shipping noticeably faster? Jensen Huang is promising that an LLM can do in a weekend, with a few prompts, what takes me a year, and that no one will ever need to learn programming. So...where's the proof?
Why aren't my favorite publishers pushing out releases so much faster? OK...maybe they're the dinosaurs who can't keep up...so where are the upstarts disrupting the market? If an LLM can replicate an existing program, where's the salesforce.com of AI-generated apps? Where's the upstart that will generate your custom ERP for you in a fraction of the time it takes to implement PeopleSoft, SAP, or salesforce.com?
I (think) I want AI to succeed, but I am personally disappointed because there are SOOOOOO many obvious use cases for a working LLM. For example, my favorite game...why can't an LLM generate new levels for me? OK, maybe it requires supervision...so why isn't my favorite publisher pushing out a new level every weekend, or even every month, for $5-10 per level? Why isn't some upstart releasing a tool that can take my sloppy legacy code written in COBOL, or even Python/Java/JavaScript, and port it to Rust that runs faster than I could have ever imagined? Why isn't it optimizing my legacy code? Or...where's the service that promises to patch all my vulnerabilities for a hefty monthly fee? It would be SOOOO profitable and so easy if these things actually worked.
I think you and I know the answer...there are no useful LLM services tackling real-world problems because they don't work. It would be a fucking license to print money if they did. For now, all we get are shiny tools that may or may not work...with no guarantee. 3 years is an eternity in tech. We went from Netscape in 1995 to a massive commercial internet by 1998...another example is the iPhone launching in 2007 and how much the market had exploded by 2010. There were many revolutionary mobile apps by 2010 that disrupted existing markets and created new ones. Also, AI is ANCIENT. People were talking about this in the 60s. I had machine learning textbooks in the 90s. Google had a working LLM in 2018.
You mentioned legal recognition. That's based on theoretical AI. Theoretical AI is AMAZING. Real-world AI? Not sure...definitely not useless...but, on the flip side, what is it accomplishing? I can see some disruption in creative industries...but entertainment is a small industry that tolerates mistakes...a Java compiler doesn't. The frustration is that we're getting promises, and so many people are using their imagination to picture what AI COULD bring...but then I try it out and see that it falls far short of what I imagine (and I have very realistic, low-ambition expectations) and far short of what every tech executive tells me it can do.
As much as people are frustrated at Apple over Apple Intelligence, I respect them. They put the cart before the horse and announced it...assuming ChatGPT could keep up with their promises of innovation...then they tried it out and saw it fail hilariously...and let's face it, the stuff they promised wasn't very ambitious, and Apple has an atrocious history with software, so their standards are not very high...first-rate hardware and AWFUL software (think iTunes on Windows). Even Apple, with their billions and a partnership with the industry leader, couldn't make a useful product with it...so yeah, I am skeptical until I see it solve real-world problems, not just sell me toolkits and tell me to figure out a way to make them useful.