Comment Seeing through the Emperor's clothes. (Score 1) 62

ChatGPT and its rivals are the latest incarnations of programs going back 50 to 60 years, such as ELIZA (1964-66) and PARRY (1972). Then came the 1990s: Andrew C. Bulhak's Postmodernism Generator, built on the Dada Engine, a system for generating random text from recursive grammars; Jason Hutchens' MegaHAL, which used, in part, hidden Markov models to produce often lucid text; and, of course, others.
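
To make the recursive-grammar idea concrete, here's a toy sketch in Python; the grammar is invented purely for illustration and is nothing like Bulhak's actual rule files:

```python
# Toy generator in the spirit of the Dada Engine: pick a random production
# for each nonterminal and expand recursively until only words remain.
import random

GRAMMAR = {
    "S": [["NP", "VP", "."]],
    "NP": [["the", "N"], ["the", "ADJ", "N"]],
    "VP": [["V", "NP"]],
    "N": [["discourse"], ["narrative"], ["paradigm"]],
    "ADJ": [["postmodern"], ["recursive"], ["dialectic"]],
    "V": [["subverts"], ["deconstructs"], ["reifies"]],
}

def expand(symbol):
    if symbol not in GRAMMAR:  # terminal: an actual word
        return [symbol]
    production = random.choice(GRAMMAR[symbol])
    return [word for part in production for word in expand(part)]

print(" ".join(expand("S")))
# e.g. "the postmodern narrative deconstructs the paradigm ."
```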

ChatGPT has the advantage of a huge corpus of text in its training data, tools for decoding natural language, and (probably) more computing power than the entire world had in the 1970s.

While journalists keep saying that ChatGPT is artificial intelligence, the reality is that it is a clever implementation of an LLM. Provided the answer to a question is well represented in its training data, it has a good chance of producing a factual, and preferably useful, answer. Ask for something that has limited representation in its training data and it can "hallucinate". Ask for an artistic response (poetry, screenplays, etc.) and it will oblige, but the answers seem somehow flat.

When I first started exploring ChatGPT, I set it a number of tasks: writing a short stand-up comedy set on a given topic, writing poems in given styles on given topics, rewriting the endings of Shakespeare's Romeo and Juliet and Hamlet to make them happy ones, and, given the first half of the script for the first scene of my satirical fantasy play The Grin Reaper: The Bunny's Tale, completing the scene. In each case it produced something compliant, but the "comedy" was dead flat, the poetry lacked emotion, and, as for the Shakespeare, nah!

Does this mean that AI can't compete in the arts? No, it means that 2023's LLMs can't, but the technology is improving, and if their creators see a business advantage in improving them in that direction, we can expect something a lot better.

Comment Re:Why is the fine listed in USD? (Score 1) 84

The article is in Bloomberg, which, besides being generally a producer of ultra-low-quality articles (which somehow seem to get posted here all the goddamned time), is also based in the USA. And Slashdot is also based in the USA, and the editors are lazy. Now you know, I guess?

US news site CNN reported it in euros and gave a conversion: "Meta has been fined a record-breaking €1.2 billion ($1.3 billion) by European Union regulators for violating EU privacy laws by transferring the personal data of Facebook users to servers in the United States."

Comment Re:2 years or more may be to long can we cut down (Score 1) 52

I can't understand why people don't just move to a different country and study there.

You first have to find a country that will allow you to move there, although if you are going as a student it's usually easier to get a student visa than a work visa ... as long as you are enrolled at an accredited tertiary institution.

Unless it's an English-speaking country, you'll probably have to learn the language in which education is conducted there.

Very few countries offer free education to foreign students, so you'll need the cash to pay for your courses.

Submission + - Stability AI Launches StableLM, An Open Source ChatGPT Alternative (arstechnica.com)

An anonymous reader writes: On Wednesday, Stability AI released a new family of open source AI language models called StableLM. Stability hopes to repeat the catalyzing effects of its Stable Diffusion open source image synthesis model, launched in 2022. With refinement, StableLM could be used to build an open source alternative to ChatGPT. StableLM is currently available in alpha form on GitHub in 3 billion and 7 billion parameter model sizes, with 15 billion and 65 billion parameter models to follow, according to Stability. The company is releasing the models under the Creative Commons BY-SA-4.0 license, which requires that adaptations credit the original creator and share the same license.

Stability AI Ltd. is a London-based firm that has positioned itself as an open source rival to OpenAI, which, despite its "open" name, rarely releases open source models and keeps its neural network weights—the mass of numbers that defines the core functionality of an AI model—proprietary. "Language models will form the backbone of our digital economy, and we want everyone to have a voice in their design," writes Stability in an introductory blog post. "Models like StableLM demonstrate our commitment to AI technology that is transparent, accessible, and supportive." Like GPT-4—the large language model (LLM) that powers the most powerful version of ChatGPT—StableLM generates text by predicting the next token (word fragment) in a sequence. That sequence starts with information provided by a human in the form of a "prompt." As a result, StableLM can compose human-like text and write programs.
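
As a rough sketch of what that next-token loop looks like in code, here's a minimal generation example using Hugging Face's transformers library; the model ID follows the alpha release naming described above, so check the StableLM GitHub repo for the current names:

```python
# Minimal next-token generation sketch with a StableLM alpha checkpoint via
# Hugging Face transformers. The model ID is taken from the alpha release
# naming; verify it against the StableLM repo before relying on it.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-base-alpha-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Open source language models matter because"
inputs = tokenizer(prompt, return_tensors="pt")

# Generation is repeated next-token prediction: score the vocabulary,
# sample one token, append it, and loop until max_new_tokens is reached.
outputs = model.generate(**inputs, max_new_tokens=50,
                         do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```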

Like other recent "small" LLMs such as Meta's LLaMA, Stanford Alpaca, Cerebras-GPT, and Dolly 2.0, StableLM purports to achieve similar performance to OpenAI's benchmark GPT-3 model while using far fewer parameters—7 billion for StableLM versus 175 billion for GPT-3. Parameters are variables that a language model uses to learn from training data. Having fewer parameters makes a language model smaller and more efficient, which can make it easier to run on local devices like smartphones and laptops. However, achieving high performance with fewer parameters requires careful engineering, which is a significant challenge in the field of AI. According to Stability AI, StableLM has been trained on "a new experimental data set" based on an open source data set called The Pile, but three times larger. Stability claims that the "richness" of this data set, the details of which it promises to release later, accounts for the "surprisingly high performance" of the model at smaller parameter sizes in conversational and coding tasks.
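
Some back-of-the-envelope arithmetic shows why the parameter count matters for local use; this assumes 2 bytes per weight (fp16) and ignores activations and KV-cache memory:

```python
# Rough weight-memory estimate: parameters * bytes-per-parameter.
# Assumes fp16/bf16 weights (2 bytes each); real usage is higher once
# activations and the attention KV cache are counted.
for name, params in [("StableLM-7B", 7e9), ("GPT-3 (175B)", 175e9)]:
    gb = params * 2 / 1e9
    print(f"{name}: ~{gb:,.0f} GB of weights at fp16")
# StableLM-7B: ~14 GB of weights at fp16
# GPT-3 (175B): ~350 GB of weights at fp16
```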

Comment Re:No biggie (Score 1) 108

Except they won't be wiped out; the sea level will rise gradually over many years and they will be displaced, moving inland or north. They will have lost their homes and businesses, and many will end up on the streets or working minimum wage, but they will be alive.

I find it hard not to see a parallel with the displacement of people that was the basis of Steinbeck's The Grapes of Wrath eighty-some years ago.

Comment Re:Access to chips? (Score 1) 31

Broadcom are a fabless semiconductor company, so I can't see why they should be the reason for the chip shortage; more likely the foundries they subcontract to are the bottleneck. Yes, Raspberry Pi have created the RP2040 dual-core 32-bit ARM chip used in the Pi Pico, but scaling from being able to do that to designing a quad-core 64-bit chip would be a massive job, and they would still be fabless, at the mercy of others.
