Comment Re:Dumbass puts huge money late into obvious bubbl (Score 1) 51

I agree that "the path" is absolutely oversold- it's potential, not a known quantity- but to say there isn't a path is just as absurd.

The fact is- we don't really know.
LLMs- absurdly large language models- are as close as we've come to something that exhibits intelligence.

Functionally, at the core, brains and ANNs are both big ass networks of threshold logic.
Training an ANN to have simple behaviors is ridiculously trivial- you've been able to do it with open source tools for decades.
You can train a model to drive a car in a video game better than any human alive can ever hope to. You. On your own computer.
Modern understanding of double descent and the advent of transformers have allowed us to build absurdly large networks that can run on GPUs.
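The "networks of threshold logic" point above can be made concrete with a toy example: a single threshold unit trained with the classic perceptron rule. This is an illustrative sketch, not from any particular library- all names here are made up for the example.

```python
# A single threshold unit (perceptron) trained with the classic
# perceptron rule: a toy instance of "threshold logic" learning a
# simple behavior (here, the AND function).

def step(x):
    """Threshold activation: fire (1) if the weighted input is non-negative."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights w and bias b so that step(w.x + b) matches each label."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = step(w[0] * x1 + w[1] * x2 + b)
            err = label - pred          # perceptron rule: nudge toward the label
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# AND: fires only when both inputs are on.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])  # → [0, 0, 0, 1]
```

Scale that idea up by many orders of magnitude, swap the threshold for a smooth activation, and train with gradient descent instead of the perceptron rule, and you have the same family of machine modern ANNs belong to.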

Why is it such a stretch to think that a good language model is one of the secret ingredients to what we call general intelligence?

Comment Re:Dumbass puts huge money late into obvious bubbl (Score 2) 51

A bubble does not imply that the asset is somehow valueless, or that investing in it is folly.

This may come as a surprise to you, but the World Wide Web still exists.
The bubble popping is not a certainty- real value can eventually catch up to inflated value. But even if it pops- AI isn't going away. It's just going to be valued correctly, which is definitely not zero.

Diminishing returns do not mean that value stops increasing. We have yet to reach "zero returns".

Comment Re: Good job (Score 1) 37

Too bad it can't, you know, go find me some facts and all

Tool-enabled LLMs can. This is old tech.

or at least tell me when it can't find any.

Can do that too.
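The tool-use loop behind this is simple: the model either asks for a tool call or gives a final answer, and the runtime feeds tool results back in until it answers. Here's a toy sketch of that loop- `fake_model` and `search` are made-up stand-ins, not any vendor's actual API.

```python
# Minimal sketch of the loop behind "tool-enabled LLMs".
# fake_model simulates a chat model; search simulates a retrieval tool.

def fake_model(messages):
    """Pretend model: asks for a search on the first turn, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "search", "query": "population of Japan"}}
    fact = next(m["content"] for m in messages if m["role"] == "tool")
    if fact != "NO_RESULTS":
        return {"content": f"Found it: {fact}"}
    return {"content": "I couldn't find any facts on that."}

def search(query):
    """Stub retrieval tool; a real one would hit a search index."""
    kb = {"population of Japan": "about 124 million (2024 est.)"}
    return kb.get(query, "NO_RESULTS")

def run(user_question):
    messages = [{"role": "user", "content": user_question}]
    while True:
        reply = fake_model(messages)
        if "tool_call" in reply:    # model wants a fact: run the tool, loop again
            messages.append({"role": "tool",
                             "content": search(reply["tool_call"]["query"])})
        else:                       # model has its final answer
            return reply["content"]

print(run("How many people live in Japan?"))
```

Note the `NO_RESULTS` branch: the same loop is what lets a tool-enabled model say "I couldn't find any facts" instead of making something up.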

It generates text that looks statistically like text it has seen.

This bullshit again.
It's not remotely true.

An LLM is trained to give responses that match existing text; that does not mean it "generates text that looks statistically like text it has seen."

Comment Re:Like GPU benchmarks (Score 1) 37

LLMs can count letters in unseen phrases.
The fact that one particular model could not does not mean that LLMs cannot.

gpt-oss-120b:
Me:

How many times does the letter "z" appear in the following word: "zlkjxcvlzkjxlcvzzZZZzlskdfjasdzz"

GPT:

So:

- **Lowercase ‘z’: 7 occurrences**
- **Uppercase ‘Z’: 3 occurrences**

**Total (case-insensitive) = 7+3 = 10**.
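And that tally is easy to verify mechanically- a couple of lines of Python confirm the model got it right:

```python
# Sanity-check the tally above with plain string counting.
word = "zlkjxcvlzkjxlcvzzZZZzlskdfjasdzz"
lower = word.count("z")
upper = word.count("Z")
print(lower, upper, lower + upper)  # → 7 3 10
```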

Comment Re:Like GPU benchmarks (Score 1) 37

and tell you confidently that "there are 3 'r's' in strrawberrrry," because, as you point out, they can't count letters in token sequences.

While there's certainly no guarantee that it will be- there's no reason such a capability can't be generalized.

gpt-oss-120b:
Me:

How many times does the letter "v" appear in the following sentence:
```
Verily I say, it's very nice to invoke a vehement interaction with your visage
```

GPT:
Counting them:

1. V (in *Verily*)
2. v (in *very*)
3. v (in *invoke*)
4. v (in *vehement*)
5. v (in *visage*)

So the letter “v” appears **5 times** in the sentence (including the initial capital V).

*(If you count only lowercase “v”, there are 4 occurrences.)*
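Again, checkable mechanically- and the word-by-word breakdown the model gave matches too:

```python
# Verify the word-by-word tally above.
sentence = "Verily I say, it's very nice to invoke a vehement interaction with your visage"
hits = [w for w in sentence.split() if "v" in w.lower()]
print(hits)                          # → ['Verily', 'very', 'invoke', 'vehement', 'visage']
print(sentence.lower().count("v"))   # case-insensitive → 5
print(sentence.count("v"))           # lowercase only → 4
```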

Comment Re:My only demand for AI is "please stop" (Score 1) 56

I don't disagree with your sentiment on it at all. But the fact is- its use is prolific, and expanding rapidly.
The older generations will probably always have limited adoption; the younger generations will probably continue to expand it. For better or worse (worse, probably).

Comment Re:Japan's high speed trains (Score 1) 218

That would be the Sanyo Shinkansen then, and it's 100% grade separated, and always has been.

If it was near the Marine Corps base it would have been the Sanyo Main Line. I don't know what the historic speeds were, but these days the maximum is 130 kph. The Shinkansen line is some distance from there.

130 kph (about 80 mph) might not seem like much, but it can appear pretty quick when you are very close to the train at a crossing. The Shinkansen line was around 250 kph when you were there, I think; it's now up to 300.

Comment Re: It could (Score 2) 218

Somehow the Japanese are building new HSR lines right into the centre of their two biggest cities, Tokyo and Osaka. Grade separated. 90% tunnel through mountains, elevated in urban areas.

They have earthquakes and even more densely built up areas to contend with. Somehow they manage it, regularly. And not just for HSR, the Tsukuba Express line is another example that is not high speed but is fully grade separated and runs right into the centre of Tokyo. Partially underground, partially viaduct.

Comment Re:High Speed Rail in China seems Phenomenal (Score 1) 218

More high speed rail than the rest of the world combined, all built in the last 15 years. It's some of the fastest too, with peak speeds exceeding those in Japan (where they are limited due to noise concerns rather than safety or the capabilities of the trains).

They are also now building new maglev lines, starting in Beijing, which will be the fastest in the world, and are going to be the longest and most extensive in the world as they rapidly expand.

They also have more underground rail than the rest of the world combined, all built in the last 20 years.
