Comment Re:Diversifying supply chain from single source (Score 1) 22

It is insane that major US companies are making trillion-dollar bets that a single-source provider will remain operational.

You are aware TSMC has built two chip fabs in Arizona and is building a third one, right? You are also aware that Samsung built a chip fab in Texas over 20 years ago and is building a second one, right?

Comment Re:Second sourcing, multiple suppliers, etc. (Score 1) 22

Except that, presumably, they would have to reengineer the old silicon for Intel's process, which kind of defeats the purpose of reusing old designs to save money, I would think.

Or more likely, Intel will have to adapt its processes to Apple's designs, and to other companies' designs, if Intel wants to do business as a chip foundry.

Apple also uses CPUs in things like the Apple Watch

Off the top of my head, here are the other processors Apple uses: Apple Watch and HomePod (S series), AirPods (H series), modems (C series). Apple will need millions to tens of millions of these processors each year.

Comment Re:Second sourcing, multiple suppliers, etc. (Score 1) 22

Not at all. Apple used Intel CPUs for 15 years. The great "PC vs Mac" debate is about the user experience, not the hardware architecture of the CPU behind it, and certainly not what foundry a CPU comes from.

The main reason Apple left Intel was entirely on Intel, for making no progress on chips for years. This was the same reason Apple left IBM. Apple thought that by using Intel they would not end up in the same situation again. Little did anyone know that Intel would struggle at 10nm for years. It is unlikely that Apple will ever go back to x86 for their main processors, though.

Intel these days is more open than before to being a chip foundry like TSMC. Apple using Intel as a secondary supplier to fabricate its chips makes business sense for everyone.

Comment Re:Need SuperKendall's take (Score 1) 22

How great is it that Trump requires Apple to do business with Intel? The spin will be delightful.

And why is this "great"? The main reason Apple stopped sourcing chips from Intel had nothing to do with politics. It was due to Intel's stagnation in making chips. Intel was stuck for years while AMD passed them by. Apple finally had enough. Some would call that just business.

Please, please, please let it be Apple's main processors. A hysterical black eye to Intel and a kick in the balls to Apple fanboys. Win win!

Again, the issue was entirely Intel's incompetence at making progress for years. Apple would probably still be buying chips from Intel if they were good chips. After all, Apple bought Intel's entire modem business. More than likely, Intel will make other chips for Apple first: every AirPod requires a chip, every Apple Watch requires a chip, and Apple's C1 modem chip could be fabricated by Intel.

Comment Re:What I don't like about Dawkins (Score 1) 368

You absolutely can, though. There is nothing stopping you from seeding an LLM's run with a single fixed value, or even substituting the function definition of random() with:

random() {
    // determined by fair dice roll
    return 5;
}

We can trivially do this.

And further, it seems you are now suggesting that if we substitute the above random function with this one:

random() {
    // determined by a user-supplied dice roll
    input = ask_user_for_fair_dice_roll();
    return input;
}

and you now sit there rolling dice and inputting the results, the computer program gains consciousness?

really?

Comment Re:What I don't like about Dawkins (Score 3, Interesting) 368

The difference, of course, is that we currently DO actually know EXACTLY how an LLM works. We can snapshot the model and seed the random number generator to make it generate exactly the same output from exactly the same input every single time. We can pause it, set breakpoints, inspect and dump data structures.
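A minimal sketch of the point, using Python's random module as a stand-in for a model's sampler (the function and vocabulary here are illustrative toys, not any real LLM API):

```python
import random

def generate(seed, vocab, n):
    """Toy stand-in for sampled text generation: a seeded PRNG
    makes the entire 'sampling' process exactly reproducible."""
    rng = random.Random(seed)                    # fixed seed = fixed stream
    return [rng.choice(vocab) for _ in range(n)]

vocab = ["the", "cat", "sat", "on", "mat"]
run1 = generate(42, vocab, 8)
run2 = generate(42, vocab, 8)
assert run1 == run2   # same seed + same input -> same output, every time
```

Swap the toy sampler for a snapshotted model and the principle is the same: fix the weights and the seed, and the output is fully determined.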

It IS simply a program running on a CPU, and using RAM.

Is it possible that's all humans are in the end? Sure, it's possible; I can't prove otherwise. But we are not remotely in a position to assert that it's the case.

You invoke philosophy, which is entirely appropriate. There are fairy tales, for example, of artists painting things so realistic that they come to life. And it poses an interesting question here: is there a difference between a simulation and a real thing? Can a simulation of life be "alive"? Or must it forever remain a simulation?

And a related, and perhaps ultimately simpler, question: can a *Turing machine simulation of life* be "alive"?

A lovely illustration of the question:
https://xkcd.com/505/

Can what you and I perceive as our lives, the universe around us, and everything REALLY be underpinned by some guy pushing pebbles around in a big desert somewhere?

Can the arrangement of stones in a desert, and some guy moving them around in a pattern he interprets as representing the information that describes our universe, actually "BE" our universe?

Or is the pattern of rocks JUST a pattern of rocks? Is the guy moving them around JUST moving them around? Is the interpretation of the pattern as a representation of the state of a universe just that, a representation?

Or do you truly think there is a galaxy with a planet with people on it having a conversation on Slashdot, "frozen in time", waiting for some guy to move the rocks into the next pattern, and that somehow results in the experience we are sharing right now?

Or, put more succinctly: can an abstract representation of a thing be the thing, be it bits in a DRAM module or pebbles arranged in the sand? Can it be the thing it represents? Can a painting of a zebra, if it's done skilfully enough, be a zebra?

Comment Re:Microsoft part right, part wrong (Score 1) 106

You fail to actually detail how passwords that exist in a password manager can be compromised in Chrome. A keylogger can only record what the user types. If Windows Hello or biometric authentication is turned on, the user types in neither the authentication password nor the site password. Also, you seem to forget that while company equipment is not yours, compartmentalization is put in place to keep one bad actor from infiltrating multiple systems. An IT support person should not be able to access HR, Payroll, etc.

Comment Re:I'd love to trash Edge, but... (Score 1) 106

Er, I meant if they have enough control to dump RAM. Thinko because what I was thinking is that if they can dump RAM they can dump your password database, too (unless user authentication is in the loop and that authentication relies on secrets not in the device).

By default, authentication is required. So dumping the password database dumps encrypted passwords, as authentication is a separate process. Also, Chrome requires authentication each time for each password; only Edge requires authentication once to load the entire database. Only Edge has the very flawed design you are assuming applies to all browsers. And that does not factor in additional hardware protections. On some Android phones and on iPhones/Macs, the password database does not exist on any hard drive; it is stored in a separate security chip. So dumping the password database would be extremely difficult, as access to the chip is not direct.

Comment Re:Motherboards are useless without RAM (Score 1) 66

Yes, the main problem is that many people who want to upgrade their motherboards now will generally be upgrading from a prior generation that used DDR4. The current DDR5 shortage makes that difficult. Whereas consumers could re-use their existing SSDs, case, GPU, etc., they will need DDR5 RAM. I have seen DDR4 conversion kits being sold now, where a user can put DDR4 memory into an adapter that fits DDR5 slots. I have no idea how well they work.

Comment Re:Fraction inflation? (Score 1) 66

I think one nuance is a bad comparison in the wording. Asus sold 15M in 2025. As of today in 2026, they have shipped 5M. Those are not the same metrics: shipped is not the same as sold. It might be that the reporter only has sales numbers for 2025 and inventory numbers for 2026 as reported by Asus; however, Asus could also be conflating the two numbers.

Comment Re:What I don't like about Dawkins (Score 4, Insightful) 368

The parent poster acknowledges this; they are saying the randomization is *introduced artificially*.

The same as any dice rolling app. All you have to do is seed the pseudorandom number generator the same for each run, and it will roll the same dice, in the same order, every time.

Likewise, if it wants to spit out the next word/phrase and 2 of them have 33% probability, and two have 17% ...

Then if you seed the random number generator with the same seed for every instance / run, you'll get the same output from the same input on the same model.

The system is entirely deterministic, the same as any other software, from the ghosts in Pac-Man to the bots in Quake Arena to a chess engine. We introduce "randomness" to make it more enjoyable, but it's pseudorandomness that we artificially insert. We could just as easily seed the random number generator the same way every time, and then it would do the exact same thing every time. None of these are actually thinking and making decisions.
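To make the 33%/17% case above concrete, here is a sketch using Python's random.choices as a stand-in for a model's next-token sampler (the token names are made up for illustration; only the weights come from the comment):

```python
import random

tokens  = ["alpha", "beta", "gamma", "delta"]
weights = [33, 33, 17, 17]        # two options at ~33%, two at ~17%

# Seed two PRNGs identically and the "dice rolls" replay identically.
rng1 = random.Random(0)
rng2 = random.Random(0)

picks1 = rng1.choices(tokens, weights=weights, k=10)
picks2 = rng2.choices(tokens, weights=weights, k=10)
assert picks1 == picks2           # same seed -> same sequence of picks
```

The weighted draw looks "random" run to run only because the seed normally varies; pin the seed and the whole sequence is fixed in advance.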
