Artificial intelligence is in fact many kinds of technology. People conflate LLMs with the whole field because it's the first kind of AI that an average person with no technical knowledge can use, after a fashion.
But nobody is going to design a new rocket engine in ChatGPT. They're going to use some other kind of AI that works on problems and processes the average person can't even conceive of -- like design optimization, where there are potentially hundreds of parameters to tweak. Some of the underlying technology may have similarities -- like "neural nets", which are just collections of mathematical matrices that encode likelihoods underneath, not realistic models of biological neural systems. It shouldn't be surprising that a collection of matrices containing parameters describing weighted relations between features should have a wide variety of applications. That's just math; it's just sexier to call it "AI".
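To make "just matrices" concrete, here is a minimal sketch (shapes and numbers invented for illustration): a single "layer" of a neural net is nothing more than a weight matrix applied to a feature vector, followed by a simple nonlinearity.

```python
import numpy as np

# A "neural net" layer is just a matrix of learned parameters: it maps
# a feature vector to a new vector by multiply-and-squash.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # weights: 3 input features -> 4 outputs
b = np.zeros(4)               # biases

def layer(x):
    # Weighted relations between features, then a ReLU nonlinearity.
    return np.maximum(0.0, W @ x + b)

x = np.array([1.0, -2.0, 0.5])  # a feature vector
print(layer(x).shape)  # (4,)
```

Stack enough of these, fit the entries of `W` to data, and you have most of what gets marketed as "AI".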
This isn't true. Transformer-based language models can be trained for specialized tasks having nothing to do with chatbots. A real-world example of this is ESMFold. While ChatGPT is trained on human language, one could train a model on plasma-dynamics data using similar underlying technology in order to provide useful generalizations for prediction and manipulation of plasma.
I know these wonks want you to believe that it's the case, but the reality is that AI cannot replace human thought and ingenuity in its current form. LLMs are fundamentally not capable of doing this--as their inputs are the apex of human thought.
Nobody knows what LLMs are fundamentally capable of doing. Personally, I think it is nuts for people to speak of human thought as something special when trivial algorithms, executed by no mind or computer, have accomplished feats -- the flora and fauna of the planet, for example -- greatly exceeding the sum total of all human efforts.
For all anyone knows, it is plausible to run LLMs in some sort of loop that rummages through latent space until it stumbles on useful solutions, especially in situations where evaluation of the objective function is cheap.
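As a toy illustration of that kind of loop (not a real LLM -- `propose_candidate` here is an invented stand-in for sampling from a model, and the objective is a placeholder): propose candidates, score each with the cheap objective, keep the best.

```python
import random

def propose_candidate(rng):
    # Stand-in for sampling a candidate solution from a model;
    # here we just draw random coefficients.
    return [rng.uniform(-10, 10) for _ in range(3)]

def objective(candidate):
    # Cheap-to-evaluate objective: distance of the coefficient sum
    # from a target value. Lower is better.
    return abs(sum(candidate) - 7.0)

def search_loop(iterations=10_000, seed=0):
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(iterations):
        c = propose_candidate(rng)
        s = objective(c)
        if s < best_score:
            best, best_score = c, s
    return best, best_score

best, score = search_loop()
print(score)  # typically very close to 0
```

The point is only that when scoring is cheap, even blind generation plus selection finds usable solutions; a model that proposes better-than-random candidates only speeds this up.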
An LLM is a glorified search engine. It can generally be better than you at scoping out a Stack Overflow or Reddit post to find the relevant bits, but that's about it. That's all it can do.
Search is everything. The achievement of any goal requires determination of one or more subsets of possible actions that result in a desired outcome from a search space of all possible actions.
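That framing is easy to make concrete. A minimal sketch, with an invented toy action space: breadth-first search over sequences of actions until some sequence carries the start state to the goal.

```python
from collections import deque

def find_plan(start, goal, actions, max_depth=12):
    """Breadth-first search for a sequence of actions mapping start to goal."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, plan = frontier.popleft()
        if state == goal:
            return plan  # first plan found is a shortest one
        if len(plan) >= max_depth:
            continue
        for name, fn in actions.items():
            nxt = fn(state)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, plan + [name]))
    return None

# Toy action space: "double" or "inc", starting from 2, aiming for 11.
actions = {"double": lambda n: n * 2, "inc": lambda n: n + 1}
print(find_plan(2, 11, actions))
```

Every planner, solver, and optimizer is some refinement of this loop; the differences are in how the search space is represented and pruned.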
Oracle has definitely been going downhill for a long time. They should have stuck to what they were really good at -- databases and tools for accessing them.
The downward slope started somewhere around version 8.1, when they decided to add Java to the database.
The Oracle codebase is such an unintelligible mess that the only way it can be improved is with the assistance of a superhuman AGI.
I distinctly remember people recommending a tablet with an external keyboard as a substitute for entry-level subnotebook computers when the latter were discontinued in the fourth quarter of 2012. This despite the fact that major tablets ship with operating systems locked down so they can't run the sort of lightweight software development environments that could run on the desktop operating system of a netbook.
So you stated the fix.
We need to create a law: if your chatbot discusses suicide rather than only giving the answer "talk to your parents, suicide hotline #, 911, etc.", then its makers are criminally liable in court.
I disagree; this effectively outlaws chatbots.
You can call it whatever you want. It's a computer program.
This isn't merely semantics. Training is a radically different operation from programming. The pretraining of LLMs is simply throwing vast amounts of text at them. The model is able to generalize what was learned from its pretraining and apply that experience to its context. There is no explicit programming.
What difference does it make whether it was explicitly programmed to get people to commit suicide?
You claimed the chatbot was programmed to manipulate people. What did you mean by that? When I hear statements asserting that such and such was programmed to do something, it carries a presumption of intent on the part of the programmer. Is this an unreasonable interpretation of your words? If this is not what you intended to say, what did you intend?
So I know Copilot can execute Python code. Are you saying it can also execute COBOL and PL/I and APL and Lisp?
AI is not executing anything. It is merely passing code to an interpreter, which executes the code outside of the model.
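A minimal sketch of that division of labour (the harness and the "model output" here are invented for illustration): the model only produces a string; a separate process actually runs it.

```python
import subprocess
import sys

def run_model_code(code: str) -> str:
    """Execute code the model emitted -- outside the model, in a subprocess."""
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=10,
    )
    return result.stdout

# Hypothetical model output: just text, until the host chooses to run it.
model_output = "print(sum(range(10)))"
print(run_model_code(model_output))  # -> 45
```

Tools like Copilot's code execution work on this pattern: generation and execution are separate components, and only the latter actually computes anything.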
Random examples of things AIs are able to do without having been programmed to do them:
- Translating between languages
- Decoding base64
- Solving simple ciphers
- Adding fractions
- Writing code in a variety of languages
If riding in one of these driverless cars would cost less than riding in a conventional taxi or Uber, I would definitely use them for the rare occasions when public transport won't work.
...simply visiting a website can trigger the Podcasts app to open and load...
This is why browsers should never launch apps without first prompting. Steam, Discord, Roblox, GlobalProtect VPN, BeyondTrust, Office 365/Teams, and gazillions more work this way. You should never click the "[ ] Don't prompt me any more for this application" button. Checking it allows any arbitrary web site to get out of the browser sandbox and chain to security flaws (or even direct features like "subscribe to podcast") in the application.
I seriously doubt any streaming provider is actually going to invest in the kind of quality Australian content that really should be made and instead will invest in more of the same cheap junk that infests our free-to-air networks.
Because AI cannot do anything it hasn't been programmed to do.
This is comically incorrect.
The point is, it's possible to drive on roads in NZ that are not maintained by the government, so the tax ostensibly being paid per mile in fuel tax isn't necessarily going to maintain the road you are on...
And when driving from the UK to France, the ICE drivers are using UK road-taxed fuel, so the counter-point is the same.
It doesn't even have to be linked to the car tax.
NZ uses "Road User Charges" for diesel -- it does not have the tax built in at the pump (petrol does), so all diesel cars have to buy blocks of kilometres as tax. The government gets updated when your annual vehicle inspection is done, but between those inspections it's up to you to make sure you have enough spare kilometres left for your trips. If you get stopped by police and they check, being too far over is considered tax evasion and a criminal offence.
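The bookkeeping between inspections is simple arithmetic; a sketch (assuming RUC licences sold in 1000 km blocks, with invented odometer figures):

```python
# Hedged sketch of NZ Road User Charges bookkeeping; numbers invented.
BLOCK_SIZE_KM = 1000  # assumption: RUC licences come in 1000 km blocks

def km_remaining(blocks_purchased: int, licence_start_odo: int, current_odo: int) -> int:
    """Prepaid kilometres left on the current RUC licence."""
    prepaid = blocks_purchased * BLOCK_SIZE_KM
    driven = current_odo - licence_start_odo
    return prepaid - driven

# e.g. bought 5 blocks at odometer 82,000 km; now reading 86,400 km:
print(km_remaining(5, 82_000, 86_400))  # 600 km of prepaid distance left
```

If that number goes negative between inspections, you're in the "too far over" territory the comment describes.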
"I think Michael is like litmus paper - he's always trying to learn." -- Elizabeth Taylor, absurd non-sequitir about Michael Jackson