Breakthrough in AI Bridges Transformers and Neuroscience Without Sacrificing Performance (arxiv.org)
An anonymous reader writes: Ever wondered if AI could think more like a human brain while still matching today's top models on language tasks? Enter Dragon Hatchling (BDH), a new large language model architecture from researcher Jan Chorowski and colleagues. Inspired by the brain's efficient, scale-free networks, BDH uses a swarm of locally interacting "neuron particles" to process data, blending biological realism with Transformer-level performance. It matches GPT-2's results on translation and language benchmarks at the same parameter counts (from 10 million to 1 billion), all while following brain-like rules such as synaptic plasticity and Hebbian learning: neurons that fire together wire together, in real time. What sets it apart? Built-in interpretability: its activations are sparse, making it easier to peek inside and understand how it reasons about concepts. Plus, it's GPU-friendly, offers a biologically plausible account of human language processing, and scales up smoothly, potentially unlocking more universal AI reasoning. A fresh take on merging machines and minds that's as practical as it is intriguing.
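For readers unfamiliar with the phrase, "fire together, wire together" is the classic Hebbian rule: a synapse strengthens in proportion to the co-activation of the neurons it connects. The sketch below is a minimal, generic illustration of that rule, not BDH's actual update; the function name, learning rate, and toy activity vectors are all made up for the example.

```python
def hebbian_update(w, pre, post, lr=0.01):
    """Strengthen synapse w[i][j] in proportion to the co-activation
    of presynaptic unit j and postsynaptic unit i (Hebbian rule)."""
    return [
        [w[i][j] + lr * post[i] * pre[j] for j in range(len(pre))]
        for i in range(len(post))
    ]

# 4 presynaptic units feeding 3 postsynaptic units, weights start at zero.
w = [[0.0] * 4 for _ in range(3)]
pre = [1.0, 0.0, 1.0, 0.0]   # sparse presynaptic activity
post = [0.0, 1.0, 1.0]       # sparse postsynaptic activity

w = hebbian_update(w, pre, post)
# Only synapses between co-active pairs change; the rest stay zero,
# mirroring the sparse activations the summary describes.
```

With these toy vectors, only the four weights linking active pre/post pairs become nonzero (e.g. `w[1][0]` becomes 0.01), which is why sparse activity yields sparse, inspectable weight changes.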
now on HuggingFace (Score:2)
Now #1 trending on https://huggingface.co/papers/... [huggingface.co]