
Comment What's the problem? (Score 1) 80

With all of GitHub's great new AI features, it writes all your code for you! It doesn't matter whether the site is up at any given moment; just download your newly completed app whenever it happens to be online. You're free to kick back, relax, and scroll your social feeds, because you don't actually have to do anything anymore. This is truly a golden era!

Comment Re:We need humility, not arrogance (Score 1) 172

Formal verification mathematically proves that code implements a specification. It does not catch bugs that are in the specification itself.
There are entire classes of bugs (logic bugs) that LLMs can find and that formal verification doesn't even try to find.

So you prompt the LLM to "find all the bugs".

Even if the LLM can find every last bug (which in turn assumes that this type of problem isn't NP-hard, or undecidable in the way Gödel would point out), just defining to the LLM exactly what a "bug" is seems to be pretty much the same thing as writing those formal specifications that you just convincingly dismissed as inadequate.

I don't think that there's anything magical about LLMs that would let them get around fundamental mathematical roadblocks.
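To make the spec-vs-bug distinction concrete, here's a toy sketch (the function names are made up for illustration): an implementation can provably satisfy its formal specification while still being wrong, because the bug lives in the spec itself.

```python
def spec_abs_diff(a, b, result):
    # Formal spec, written as a checkable predicate: result equals a - b.
    # The spec itself is buggy -- the *intent* was |a - b|.
    return result == a - b

def abs_diff(a, b):
    # "Verified" implementation: it provably satisfies the spec above,
    # yet it is wrong with respect to the actual intent.
    return a - b

# Verification passes, because code and spec agree...
assert spec_abs_diff(3, 5, abs_diff(3, 5))

# ...but the intended answer |3 - 5| == 2 is not what we get.
assert abs_diff(3, 5) == -2
```

Asking an LLM to "find all the bugs" here requires telling it what the *intended* behavior is, which is exactly the work of writing the specification in the first place.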

Comment Re:The purpose of art (Score 3, Interesting) 90

It makes more sense as a dialogue if we think of it not so much as a one-to-one conversation, but more like an ongoing, global discourse. After all, movies are not made in a vacuum, and they are--generally speaking--not made for a single specific individual to watch. The artist is informed and shaped by their experiences.

I frame it this way because I want to move away from the "maker"/"viewer" framework--this dichotomy of the creator of an experience versus those who experience the creation. There is a kind of feedback at play that is intrinsic to the ability to create art and to enjoy it. We even see this in cinema--the works of actors (which roles they choose, how they play those roles) are invariably influenced by the culture and sentiments that surround them.

In a strict sense, you are right--it's not as if the artist is directly engaging in a back-and-forth literal conversation. But I think that a more encompassing point of view is useful for contextualizing why generative AI being propped up as "art" is so offensive to some. It doesn't feel "real" to us, and it isn't because the tool is "artificial"--we have computer animated films, for instance. It's because it feels disengaged from that feeling of human connection.

Comment The purpose of art (Score 5, Insightful) 90

is not, as many would have you believe, to be found solely in its consumption or appreciation.

Art is a dialogue. It is a conversation between humans--those who feel joy and pain, sorrow and hope; and it is the embodiment of creative expression in which the artist, for all their imperfections and struggle, brings into being something that marks existence--as if to say, "I was once here, in this space that you now observe."

And that is not necessarily pretentiousness or egocentrism. Art is born from a desire to connect with others, across space and time.

The intrinsic problem of "generative AI" as it is presently utilized as a vehicle of artistic expression is that, overwhelmingly, it fails to create a true dialogue, in much the same way that using a chatbot amounts to speaking with nobody but yourself. There may be a director and other humans who are prompting the AI and exerting control over the output, but the lack of human actors and cinematographers means that the result can only ever be a simulation of art, not art itself. It is not until we can create artificial consciousness--machines that experience human emotions and a concept of self--that we can say that their status transcends that of mere tools and their product might become art. To be clear, I am not suggesting we should attempt to do so. But what we have today is very, very far away from this.

Maybe a simulation is enough for most people, who think of popular media as nothing more than transitory stories to consume, discard, and forget. That the audience may not have the capacity to respect art as a process, by failing to distinguish what it is and is not, does not invalidate the artist, any more than someone who doesn't understand mathematics or computer programming can decide that it is not worth learning or doing.

The reason there is so much pushback against AI has to do with the preposterous notion that it can (and therefore, should) serve as a substitute for human creativity. Of all the things that such sophisticated computational models could be used for, the last things I would want them to do for me are my thinking and feeling. We should be using technology to make our lives easier and give us more freedom to express ourselves creatively, not less. People who are using it to simulate art have entirely missed the point of why we make art in the first place. Creative expression is not a chore like washing my dishes and scrubbing my toilet bowl. Yes, making art is sometimes painful and difficult and challenging. But that struggle is not something to be eliminated. It is meant to be overcome.

AI apologists--at least, nearly all of those I have met--are, in my view, nearly entirely lacking in understanding of what makes living worthwhile; and those who do understand are intentionally and cynically promoting AI because they stand to gain financially from this position.

Comment Re:Intel's political marketing has always been bad (Score 4, Insightful) 23

If you read this post, it shows that AMD stole Intel's design and reverse engineered it.

If you dig deeper, you'll find that AMD originally reverse engineered the *8080*, not the 8086. The two companies had entered into a cross-licensing agreement by 1976. Intel agreed to let AMD second-source the 8086 in order to secure the PC deal with IBM, who insisted on having a second source vendor.

There would have been no Intel success story without AMD to back them up.

(That actually would have been for the best. IBM would probably have selected a non-segmented CPU from somebody else instead of Intel's kludge.)

Comment Re:Clean room? (Score 5, Interesting) 125

Even if you use an AI to extract an extremely condensed specification out of the source code, it's hardly a clean room if the LLM was pre-trained on that source code anyway.

I once worked at a place that had a clean room process to create code compatible with a proprietary product. Anybody who had ever seen the original code or even loaded the original binary into a debugger was not allowed to write any code at all for the cloned product. The clone writers generally worked only off of the specifications and user documentation.

There were a handful of people who were allowed to debug the original to resolve a few questions about low-level compatibility. The only way they were allowed to communicate with the software writers was through written questions and answers that left a clear paper trail, and the answers had to be as terse as possible (usually just yes or no). Everyone knew that these memos were highly likely to be used as evidence in legal proceedings.

I highly doubt that any AI tech bros have ever been this rigorous, and I'd bet that most of these AIs have been trained on the exact same source code that they are cloning.
