'Hallucinate' Chosen As Cambridge Dictionary's Word of the Year (theguardian.com)

Cambridge Dictionary's word of the year for 2023 is "hallucinate," a verb that took on a new meaning with the rise in popularity of artificial intelligence chatbots. The Guardian reports: The original definition of the chosen word is to "seem to see, hear, feel, or smell" something that does not exist, usually because of "a health condition or because you have taken a drug." It now has an additional meaning, relating to when artificial intelligence systems such as ChatGPT, which generate text that mimics human writing, "hallucinate" and produce false information. The word was chosen because the new meaning "gets to the heart of why people are talking about AI," according to a post on the dictionary site.

Generative AI is a "powerful" but "far from perfect" tool, "one we're all still learning how to interact with safely and effectively -- this means being aware of both its potential strengths and its current weaknesses." The dictionary added a number of AI-related entries this year, including large language model (or LLM), generative AI (or GenAI), and GPT (an abbreviation of Generative Pre-trained Transformer). "AI hallucinations remind us that humans still need to bring their critical thinking skills to the use of these tools," continued the post. "Large language models are only as reliable as the information their algorithms learn from. Human expertise is arguably more important than ever, to create the authoritative and up-to-date information that LLMs can be trained on."



Comments:
  • With a word like that, at least we have confirmation as to how clickbait became the source of 'news' revenue.

    Shocker. /s

    • by ls671 ( 1122017 )

      Furthermore, kids nowadays!

      The word "hallucinate" already became the word of the year several decades ago when LSD began to rise in popularity.

  • by narcc ( 412956 ) on Thursday November 16, 2023 @05:58PM (#64010881) Journal

    LLMs are neat, but they're probably not going to transform the way we live and work. As the Cambridge dictionary word of the year reminds us, they're far too unreliable to be more than a novelty for most applications.

    • It seems to be a problem of incompleteness. The LLM is an interesting pile of math that can produce output in response to prompts. But even though the human brain does something similar, there are far more elements to human cognitive processing than are at work in LLMs, at least in the current generation of them.

      Maybe someday we will be able to introduce a hallucination-prevention mechanism, though I suspect that simply building bigger LLMs is not going to be the way that problem is solved.

      This

      • by narcc ( 412956 )

        even though the human brain does something similar

        It is extremely unlikely that there are structures in the brain that are similar to transformer networks. You can say that NNs in general are 'inspired' by brains, but there is no reason to believe they share anything other than the most trivial similarities. In fact, we can even abandon the analogy completely if we want, and often do for performance reasons.

        Maybe someday we will be able to introduce a hallucination-prevention mechanism

        The problem is that so-called 'hallucinations' are exactly the kind of behavior we should expect, given how models of this kind function internally (a toy sketch of this point follows the comments below).

    • by geekmux ( 1040042 )

      If you intend to convince Greed that good-enough machines operating 24/7 aren't worth the replacement investment for those good-enough meatsacks always bitching about more time off, more money, and more benefits... being arrogant enough to demand sleep every 18 hours or so, you're gonna have to speak a lot LOUDER than that.

      I'd suggest you speak in money with a metric fuckton amount of brogue. It's the only recognized language and dialect.

      • by AmiMoJo ( 196126 )

        Looking at the source website they seem to be quite worried that people will just ask AI to define words and give example sentences in future, taking business away from dictionaries.

        https://dictionary.cambridge.o... [cambridge.org]

        I suspect though that much of that business has already gone away because you can just google a word to get a definition. The only dictionary I ever use now is a Japanese to English one, all the data for which is free (I pay for the app because it's good).

        • Cambridge being worried about business revenue from dictionary lookups? Ranks right up there with Harvard poor-mouthing.

          There's an entire University wrapped around that dictionary with 500M+ in cash on hand, and a few billion in assets.

    • You forgot to say that LLMs are nevertheless "powerful."

  • From this February:
    "In the United States, there is the state of New Guinea. This state is located in the southeastern corner of the country and is bordered by Georgia, South Carolina, and North Carolina. New Guinea is known for its beautiful beaches, mountains, and forests, and is home to the Appalachian Trail."

  • AI is the new NFT, may it follow the same path!
  • Hallucination refers to a sensory effect. The psychological term that matches this well-known phenomenon best is actually âoeconfabulationâ which refers to making up stuff while believing it.

    âoeHallucinationâ is the word that caught, though.

    • Hallucination refers to a sensory effect. The psychological term that matches this well-known phenomenon best is actually âoeconfabulationâ which refers to making up stuff while believing it.

      âoeHallucinationâ is the word that caught, though.

      LOL - Am I hallucinating, or did slashdot mangle your word?

      • No, it mangled his quote marks. Welcome to Slashdot and its complete lack of support for Unicode. Whenever somebody types in text that includes "smart quote" (as some apps helpfully do automatically), we get this.

    • Or delusion. (Though, AFAIK, it doesn't have a verb form.)
      • "Or delusion. (Though, AFAIK, it doesn't have a verb form.)"

        To delude. Although if you want to indicate someone having the delusion, you want the passive voice: to be deluded.

        • Yes but while you can say "Joe is hallucinating," you can't say "Joe is deluding" and have it be in the same sense. The former means Joe is the one experiencing the hallucination whereas the latter means Joe is misleading others, not experiencing the delusion himself.
          • "you can't say "Joe is deluding" and have it be in the same sense."

            As I said, you have to use the passive voice: "Joe is deluded." The verb form of hallucination, hallucinate, means "having a hallucination" while the verb form of delusion, delude, means "*inflicting* a delusion." They both have verb forms, the verb forms simply have different uses.

  • Word of the Decade, and not just for AI.
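
A toy sketch of the point raised above, that 'hallucinations' are the expected behavior of this kind of model: at its core, a language model is a next-token probability distribution, and the generation loop simply samples whatever continuation looks most plausible. Nothing in that loop consults facts. The table and probabilities below are invented for illustration and are not taken from any real chatbot; a real LLM replaces the hand-written table with a neural network, but the loop is conceptually the same.

import random

# Invented next-token "probabilities" standing in for a trained model's output.
NEXT_TOKEN_PROBS = {
    "New":    {"Guinea": 0.6, "York": 0.4},
    "Guinea": {"is": 1.0},
    "York":   {"is": 1.0},
    "is":     {"a": 1.0},
    "a":      {"state": 0.8, "city": 0.2},
    "state":  {"in": 1.0},
    "city":   {"in": 1.0},
    "in":     {"the": 1.0},
    "the":    {"United": 1.0},
    "United": {"States.": 1.0},
}

def generate(token, max_tokens=10):
    """Repeatedly sample a plausible next token; plausibility is the only criterion."""
    out = [token]
    for _ in range(max_tokens):
        dist = NEXT_TOKEN_PROBS.get(token)
        if dist is None:  # no known continuation; stop
            break
        token = random.choices(list(dist), weights=list(dist.values()))[0]
        out.append(token)
    return " ".join(out)

print(generate("New"))  # may print: New Guinea is a state in the United States.

Run a few times, this happily emits fluent nonsense like "New Guinea is a state in the United States," much like the February example quoted in the comments, because factual accuracy never enters the sampling step.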
