Comment Re:Bubbles are strange. (Score 0) 70
"Crypto" (you mean crypto coins) has not been a bubble, but bullshit. Everyone with some tech knowledge knew that from the beginning.
"Think of it this way: I didn't read a math textbook for the suspenseful plot."
Speak for the reflexive relation with yourself, not the forall.
If the vendor committed the fraud, that's very bad news for everyone providing software. Your GitHub profile may become a liability if a user can use your programs to do something illegal and a court rules that the creator of the software is responsible.
I think it is a bad idea to let an AI shop in a (semi-)automated way. But if Amazon can disallow users from using certain programs to access their site, then they (and other sites) can also start making rules about ad blockers, supported browsers in general, and other details of what software may be used with their site.
I wouldn't rule out that companies who demonstratively create experimental AI videos currently make sure they look like AI and not perfect. You want to show people you're trying something new.
One can also wonder whether people will start liking images/videos with AI artifacts, the way they like MP3 artifacts and (minor) JPEG artifacts in "blind" tests.
They probably just pulled it because Gemma is their open-weight model for nerds. It isn't important to provide it to end users; the purpose is to let nerds explore what cool things can be done with it, so Google can take the best ideas for Gemini. They probably only had it in the app because it didn't hurt, and now removed it because it hurt after all.
Of course it does. Maybe the word is not clear, let's give a few examples:
- cat, dog
- red, blue, pink, black, white
- upside down
- smiling
- horror style
- line art
While some are subjects and others are style elements, these are all concepts the model knows and can arbitrarily combine to create something new. It only knows them as concepts, which means you don't get the one horror style: every new seed produces a new horror style (unless you combine enough concepts to clearly pin down a unique style).
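The concepts-plus-seed point can be illustrated with a toy sketch. This is not a real image model; `toy_generate` is a made-up stand-in whose only purpose is to show that the prompt fixes *which* concepts appear while the seed fixes *how* they are concretely realized:

```python
import random

def toy_generate(concepts, seed, dim=3):
    """Toy stand-in for a generative model: deterministic in (concepts, seed),
    but each new seed yields a different concrete realization of the same concepts."""
    rng = random.Random(f"{seed}|{'+'.join(sorted(concepts))}")
    return [round(rng.random(), 3) for _ in range(dim)]

a = toy_generate(["horror style", "cat"], seed=1)
b = toy_generate(["horror style", "cat"], seed=2)
# Same concepts, different seed -> a different "horror style" each time,
# yet the same (concepts, seed) pair always reproduces the same output.
```

So "horror style" is not one fixed look stored in the model; it is a concept that gets re-instantiated per seed.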
Huh, I thought it was always versatile. Anyway, in the end it mostly tests for an authentic browser. Some are just doing proof of work, which doesn't even confirm a browser, just a compliant JS/WASM engine and that you're willing to invest some CPU power into visiting the site.
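For anyone unfamiliar with how such proof-of-work challenges typically work, here is a minimal sketch in Python (the function name and challenge string are mine; real challenges run in JS/WASM in the browser, but the hash-search idea is the same):

```python
import hashlib
import itertools

def solve_pow(challenge: str, difficulty: int = 4) -> int:
    """Find a nonce so that sha256(challenge + nonce) starts with
    `difficulty` hex zeros. Expensive for the client..."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

nonce = solve_pow("visit-token-123", difficulty=4)
# ...but the server verifies it with a single hash.
assert hashlib.sha256(f"visit-token-123{nonce}".encode()).hexdigest().startswith("0000")
```

Note that nothing here proves you're a human or even a browser, only that something spent CPU time on the challenge.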
The term "exactly" is wrong, but the idea is that it is legally the same, and in particular to debunk the idea that "AI is just photobashing".
AI learns concepts, humans learn concepts.
AI models are simple, which is why concepts are represented simply and counting fingers is a challenge. Human brains are complex, which allows them not only to grasp the concept "a hand has five fingers" but also the concept "let's give the alien hand a different number of fingers" and finally the concept "even the alien has the same number of fingers on both hands if the body is otherwise symmetrical".
Both are rarely able to reproduce their reference material. You can overfit an AI model and you can have an eidetic memory, but both are rare exceptions.
The problem with "exactly" is that it is meant to prove one point but can be refuted on another, and many people don't get that refuting "it works exactly like this" doesn't refute "the way it relates to copying source material (or not) is conceptually the same".
But they are no longer Turing tests either.
Like and subscribe to show you're a human.
You know that you're seriously citing the ad everyone made fun of?
I only find the mention of "biological beings" in the summary, but not as a quote.
The part they may have (mis)understood is probably this one:
"Our physical experience of pain is something that makes us very sad and feel terrible, but the AI doesn't feel sad when it experiences 'pain', it's a very, very important distinction. It's really just creating the perception, the seeming narrative of experience and of itself and of consciousness, but that is not what it's actually experiencing. Technically you know that because we can see what the model is doing."
But that part does not imply that it needs to be biological. He says current systems don't feel sad or terrible ("terrible" doesn't seem to be a well-defined term here) when they experience 'pain', which is a reasonable take. But that still doesn't prevent you from building another artificial system that is more driven by experiencing 'pain' or positive stimuli.
We all know what will happen to the marketing department.
There are enough John Does without any university affiliation who don't have much to lose when arXiv (or even publishers) bans them. If everyone with "I have an opinion and ChatGPT wrote an article about it" uploads only three papers before being banned, that's still a lot of papers.