Grace Slick said the music of "White Rabbit" was inspired by Miles Davis' "Sketches of Spain." Ergo, not art. Copyright denied.
art is made by artists, not robots
Can a cyborg be an artist? Can photography be art, or does using a camera disqualify it?
I don't consider myself an artist, but I suppose I could be. Like a lot of other computer dorks my age, back in the day I played around with ray-tracing and the classic mirrored sphere floating above a checkerboard plane. (You too, huh?)
Then I tilted the camera a little bit, changed the checkerboard into a colorful 'Brot. Then multiple mirrored spheres, and a sun-like light source floating above it all (actually many light sources, slightly offset, to give the shadow edges more of a diffuse look), and gradually shaded the sky to look like a winter sunset (I remember many January evenings walking home and looking at Albuquerque's evening western horizon, and thinking about parametric functions based on the angle, to recreate that blue-to-green-to-red look), then added more complex solids as I got a little better at the math, sent 4 or 9 rays through each pixel and anti-aliased, and
.. then my focus moved away from the composition to performance, where I had a whole NetWare network of machines at my workplace (shh, sneaking in there at night) to draw in parallel, using record-locks to control which y values were done/undone. And some of the machines were 486s with floating point hardware(!!) (OMG so fast!), and then
.. ok, and by the time I got bored and moved on to the next thing, I'll admit that what I had was still a cliche pastiche that few people would call art. It was crap, but it was damn fun to make, and that was the whole point. And so ends my story (but not my rant!).
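(For the fellow dorks: here's roughly what I mean by the offset-lights and extra-rays tricks. This is a little from-scratch toy, not the original program, which is long gone; every name and number in it is made up for illustration. One sphere over a ground plane, a 3x3 cluster of slightly offset lights so the shadow edge comes out soft, and a 2x2 grid of sub-pixel samples for the anti-aliasing.)

    /* Toy sketch: soft shadows from a cluster of offset point lights,
     * plus 2x2 supersampling per pixel.  Build with: cc -o soft soft.c -lm */
    #include <math.h>
    #include <stdio.h>

    #define W 64
    #define H 32
    #define NLIGHTS 9   /* 3x3 cluster of offset lights -> penumbra */
    #define NSAMP   4   /* 2x2 sub-pixel rays -> anti-aliasing      */

    /* one sphere hovering over the ground plane: x, y, z, radius */
    static const double sphere[4] = { 0.0, 1.5, 0.0, 1.0 };

    /* Does the segment from ground point p toward the light hit the sphere? */
    static int shadowed(const double p[3], const double light[3])
    {
        double d[3], oc[3], b = 0, c = 0, len = 0;
        for (int i = 0; i < 3; i++) d[i] = light[i] - p[i];
        for (int i = 0; i < 3; i++) len += d[i] * d[i];
        len = sqrt(len);
        for (int i = 0; i < 3; i++) d[i] /= len;
        for (int i = 0; i < 3; i++) oc[i] = p[i] - sphere[i];
        for (int i = 0; i < 3; i++) { b += 2 * oc[i] * d[i]; c += oc[i] * oc[i]; }
        c -= sphere[3] * sphere[3];
        double disc = b * b - 4 * c;           /* a == 1, d is normalized */
        if (disc < 0) return 0;
        double t = (-b - sqrt(disc)) / 2;
        return t > 1e-6 && t < len;            /* occluder between p and light */
    }

    /* Brightness of a ground point = fraction of the offset lights that see it. */
    static double shade(double x, double z)
    {
        double p[3] = { x, 0.0, z };
        int lit = 0;
        for (int i = 0; i < NLIGHTS; i++) {
            double light[3] = { (i % 3 - 1) * 0.8, 6.0, (i / 3 - 1) * 0.8 };
            if (!shadowed(p, light)) lit++;
        }
        return (double)lit / NLIGHTS;
    }

    int main(void)
    {
        const char *ramp = " .:-=+*#";          /* dark .. bright */
        for (int row = 0; row < H; row++) {
            for (int col = 0; col < W; col++) {
                double sum = 0;
                for (int s = 0; s < NSAMP; s++) {   /* 2x2 jitter within the pixel */
                    double x = (col + (s % 2 + 0.5) / 2.0) / W * 8.0 - 4.0;
                    double z = (row + (s / 2 + 0.5) / 2.0) / H * 8.0 - 4.0;
                    sum += shade(x, z);
                }
                putchar(ramp[(int)(sum / NSAMP * 7.999)]);
            }
            putchar('\n');
        }
        return 0;
    }

The same idea just scales up: more lights for a softer edge, more rays per pixel for cleaner edges, and all it costs you is render time (hence the midnight NetWare farm).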
But what if I had stuck with it? What if I had something to say? (Which I didn't.) I didn't draw those pictures, but I "drew" the thing that drew them. I specified them, and there was no limit to the complexity that could have been taken on. If I had kept with it and had made something good (which I didn't), and then someone said I hadn't been the creator of my images, or that they were unfit for copyright whereas someone's freehand-drawn picture was fit, I think I would have resented that!
Wouldn't you?
The guy in the story didn't write Midjourney, but if he had, I would totally support his claim.
And waitaminute, so what if I wrote the program? That part of my work was just in getting it to work, and then getting it to work faster, and that's when I got bored because Dammit Jim, I'm a programmer, not an artist. But the other part of the work was the composition, the arrays of "objects" (this was straight C and nothing about the program was OO) and their positions and properties. What if someone else took my program but then modified the arrays to model the scene to their specification? Would their work be unfit for copyright?
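To make that concrete, the "scene as data" part looked something like this (a reconstruction of the general shape from memory, not the real thing; all the names and numbers here are invented):

    /* The whole composition lives in plain C arrays.  Edit the
     * initializers and you've specified a different picture without
     * touching a line of rendering code. */
    #include <stddef.h>
    #include <stdio.h>

    struct sphere {
        double center[3];
        double radius;
        double color[3];      /* r, g, b in 0..1 */
        double reflectivity;  /* 0 = matte, 1 = perfect mirror */
    };

    struct light {
        double pos[3];
        double intensity;
    };

    static const struct sphere scene_spheres[] = {
        { { -1.5, 1.0, 0.0 }, 1.0, { 0.9, 0.9, 0.9 }, 0.8 },  /* mirrored   */
        { {  1.5, 1.0, 1.0 }, 1.0, { 0.2, 0.4, 0.9 }, 0.1 },  /* matte blue */
    };

    static const struct light scene_lights[] = {
        /* a cluster of slightly offset "suns" for the soft shadow edges */
        { { -0.3, 8.0, -0.3 }, 0.34 },
        { {  0.3, 8.0, -0.3 }, 0.33 },
        { {  0.0, 8.0,  0.3 }, 0.33 },
    };

    static const size_t n_spheres = sizeof scene_spheres / sizeof scene_spheres[0];
    static const size_t n_lights  = sizeof scene_lights  / sizeof scene_lights[0];

    int main(void)
    {
        /* stand-in for the renderer: it only reads the scene data */
        printf("scene: %zu spheres, %zu lights\n", n_spheres, n_lights);
        return 0;
    }

The person who edits those initializers is doing the composing. The renderer is just the brush.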
The coin should show his Mercedes in a handicapped spot.
I suspect phone manufacturers will attempt to find ways to block installing it on their devices
I suspect they'll just ignore it because no one will want to actually use it.
The problem with these ideology-based projects is that they have no mass appeal. No one out there in userland gives a flying fuck about free software. They want the latest apps. They want a seamless experience. Especially Gen Z who were raised on mobile devices. Tell them that they should give up iPhones and Android phones because "Free as in Freedom is the right way", and they're going to look at you like you're a tentacled thing from Mars. There simply aren't enough nerds on the fringe to make a "free" phone system work.
A 27 year old college dropout who decided that crypto was too boring so he started a gambling website and became a billionaire?
Gambling has been lucrative since a bunch of cavemen got together and started rolling rocks in the back of the cave for the best cut of Wooly Mammoth. That's never going to change. The dropout will more than likely die a rich man.
I was prepared to sacrifice PC gaming altogether in the process. Turns out Steam runs Windows games in my library using Proton quite nicely.
I've been skeptical about the "Windows games on Linux" thing, but I'm hearing nothing but good things about Proton. If it is indeed as good as advertised, it could truly be a way for young men to finally get out of the Windows world.
I would have moved my parents to Google's Chrome OS Flex, but while it's super fast and does a few things extremely well, its lack of support for things like DVD playback is a killer in the "Upgrade for Grandma" department.
It may never be a better time, but this is a huge reason why there will never be a "year of the Linux desktop". When Microsoft cuts support, most people will just knuckle under and buy new machines, even if they're happy with what they've been using. They bitch. They threaten. They shake their fist and tell Microsoft they'll go to something else. But most just give in and write the check.
Hope you're up on your Sumerian antivirals because I'm gonna Snow Crash your ass.
You're still alive, I see. Yes, it's true, the lethal payload mentioned in the above video isn't actually included within it. I knew there was little danger in linking to this video, but don't you realize it could have been much worse?
Doesn't removing the artist's signature usually reduce the value of a work?
That this isn't the case for Sora 2 tells me something about Sora 2's reputation.
Indeed. Most readers won't be ancient enough to remember stenographer pools, mechanical typewriters, and telegrams. They'll have seen video, but that cannot convey lived experience. They won't have experienced the transition between manual machine tools and vastly more capable CNC machining, but we all live in the outcomes.
The critical difference was that those old machines, and the software that replaced them, were created to make human workers more productive. To grow company profits through increased worker output. AI is designed to increase profits by flat out replacing those workers, not making them more productive. AI is intended to kill two birds with one algorithm: create software that does human work better and faster than any human could, and then eliminate the costs of human employment.... salaries, insurance and other benefits, training, et al. That's the crucial difference, the intent to replace people, period.
"As a European"...
You have zero room to talk. France has just collapsed. Again. France, Spain, Italy, and Greece all have debt exceeding 100% of their GDP. And you can't even defend your own shores from an army of military-age North African men coming in waves specifically to sponge off of your welfare systems. Europe is a pressure cooker right now, and you're doing nothing to release any pressure.
"Now there are far, far more kids with degrees than are needed in the economy". I found my Degree enriching in many more ways than in $$ terms.
I heard philosophy grads say the same thing. They were still always short on money.
The Physics departments have been made obsolete by the Engineering departments. I already noticed the trend in the 1980s.
Engineers have always made more money than the pure-science grads, and this accelerated in the '60s. Even the mathematicians jumped over, largely because if you have a talent for math, it's fairly easy for you to slide into engineering, which is mostly math anyway. Just math with a real-world purpose.

It's funny because, at the end of WWII, there was a big debate about where US science research funding should go. One camp wanted a practical research focus with real-world goals... "Build me a generator with twice the output", etc. Lyndon Johnson famously summed up this approach with the question "What will it do for Grandma?". The other side argued for instead funding pure science research based on curiosity, and argued that practical advances would trickle down from those results.

The pure science camp won for a short while, but what killed it was the Space Race. The US needed specific machines with specific capabilities on a specific deadline. "Pure science for the principle of it" fell by the wayside to "We need that rocket to have a 60% thrust efficiency increase, next year". And it's been that way ever since. In the marketplace, and especially in the marketplace of ideas, practical engineering won.

And what research we still did tended to be dominated by hyper-expensive physics projects that had practically no commercial applications at all. I think the death of the Superconducting Super Collider in Texas was the death knell of big pure science projects in the US. As a result, engineers are actually doing a good bit of our basic research now. It's just folded into their commercial projects.
Engineering spacecraft modules will get you a high income with steady, reliable pay. Choosing to look for particles that may never be found will not.
I do believe that AI will lead to significant dislocation of workers.
But the committee's asking AI to assess AI is GIGO. AI is trained to foster AI, generate additional interaction, etc. Not exactly a dispassionate assessment.
I believe AI is in the overhype part of the tech cycle, and we will see some moderating of expectations as many of these AI companies are shattered by not being able to deliver on their over-promises.
"AI" (which isn't really AI, but)... is indeed being overhyped. But it's also still going to kill millions of jobs that won't be replaced by new jobs. Both things can be true at the same time. And while AI will indeed create some new jobs "caring and feeding" for AI, it'll kill off far more in other fields that will never be made up, unlike, say, when the Model T largely replaced the horse and buggy. A major reason for what we're calling AI is to replace human jobs in order for companies to save money on human expenses. It's why these companies backed AI in the first place. Shareholder Value Uber Alles.
God made machine language; all the rest is the work of man.