Comment There will never be a Quantum Internet (Score 1) 19
Quantum "bandwidth" decays exponentially as a function of distance.
Quantum "bandwidth" decays exponentially as a function of distance.
AI works well if you know what you are doing and you use it to take away the tedium. Say coding a 500-line routine that you know how to code, know what it should do and have the ability to tell a shit result from a good one.
Where I come from, the cost of coding is irrelevant. The only cost that matters is the liability incurred by the code's existence.
This is like a doctor telling a nurse exactly what drug to administer. If you are going to use it to actually diagnose the problem and come up with solutions, current LLMs are pretty shit. It's too bad most people who are using LLMs think they can replace actual domain-specific knowledge just because LLMs can fake it so well.
This actually seems much closer to telling a nurse to fabricate the drug before administering it.
Bursting of the AI bubble can't come soon enough. Far too much tech bro nonsense, DRAM / GPU prices way too damn high while unhealthy levels of collateral damage accumulate.
There is too much value in corporations being able to custom-train models on internal datasets for the "whole world revolves around our centralized AI service" thing these AI corporations are dreaming about to have ever worked.
To make matters worse, the generative AI space is stagnating. Going forward, the cost of local inference keeps declining while the capability gap between centralized corporate models and publicly available weights keeps shrinking. Regulatory power plays have stalled as the public tires of AI scaremongering, so they are out of options.
What is going on currently with the Colossus and Stargate buildouts is a destructive, desperate hail mary to brute-force their way out of the massive hole the industry keeps digging for itself.
Any article talking about qubit counts should reference the computer's capabilities in terms of logical qubits.
Any article talking about error correction should provide a scaling law for the required fanout of correction circuitry as a function of logical qubit count.
Any article failing to do either of these things is a waste of time.
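For a sense of what that overhead looks like, here is a toy estimate using common surface-code rules of thumb; the threshold, prefactor, and ~2d² physical-qubits-per-logical figures are assumptions for illustration, not any vendor's numbers:

```python
# Toy estimate of physical-qubit overhead per logical qubit using common
# surface-code rules of thumb. The threshold, prefactor, and 2*d^2 overhead
# figure are assumptions for illustration, not measured values.
def required_distance(p_phys: float, p_target: float,
                      p_threshold: float = 1e-2, prefactor: float = 0.1) -> int:
    """Smallest odd code distance d such that the estimated logical error
    rate prefactor * (p_phys / p_threshold) ** ((d + 1) / 2) <= p_target."""
    d = 3
    while prefactor * (p_phys / p_threshold) ** ((d + 1) / 2) > p_target:
        d += 2
    return d

def physical_qubits(n_logical: int, p_phys: float, p_target: float) -> int:
    d = required_distance(p_phys, p_target)
    return n_logical * 2 * d * d   # roughly 2*d^2 physical qubits per logical qubit

for n in (100, 1000):
    print(f"{n} logical qubits -> ~{physical_qubits(n, p_phys=1e-3, p_target=1e-12)} "
          f"physical qubits (assumed p_phys=1e-3, target logical error 1e-12)")
```

Under those assumptions, a thousand useful logical qubits already implies hundreds of thousands of physical ones, which is why raw qubit counts on their own say very little.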
But everything I said was a fact.
You wouldn't know the facts if they hit you on the head. This is the source:
https://searchepsteinfiles.com...
From this you somehow manage to infer the following:
"So apparently Jeffrey Epstein wrote that Donald Trump sucked Bill Clinton's dick and Vladimir Putin has the pictures.
Stop, read that again, Google it, and just Jesus fucking Christ what the hell is wrong with our world?"
NONE of the shit you spouted above is anything even remotely resembling a "fact". FFS you even manage to get the speaker wrong. It was Jeffrey's brother who made the statement.
If that makes you uncomfortable what the hell does that say about you?
What makes me uncomfortable is coming to obviously absurd conclusions based on shit information and then proceeding to wonder "what the hell is wrong with our world". Look in the goddamn mirror.
So apparently Jeffrey Epstein wrote that Donald Trump sucked Bill Clinton's dick and Vladimir Putin has the pictures.
Stop, read that again, Google it, and just Jesus fucking Christ what the hell is wrong with our world?
What the hell is wrong with YOU? Get some mental help or something dude... your rantings are completely unhinged.
What too many people do not seem to understand with LLMs is that everything it spits out is simply a probability matrix based on the input you gave it. It will first attempt to deconstruct the input you provided and use statistical analysis against its trained knowledge base to then spit out letters, words, phrases and punctuation that statistically resemble the outputs it was trained to produce in its training materials.
LLMs are simple feed forward networks run in a loop. They make no use of "statistical analysis" nor is there a "knowledge base".
The "just statistics" statements are as useful as saying "just autocomplete" or "just deterministic". These are completely meaningless statements that in no way address the capabilities of the underlying system.
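For what "run in a loop" means concretely, here is a minimal sketch of greedy autoregressive decoding; `model` and `vocab` are hypothetical stand-ins, not any particular library's API:

```python
# Minimal sketch of greedy autoregressive decoding: one forward pass per step,
# append the chosen token, feed the whole sequence back in, repeat.
# `model` and `vocab` are hypothetical stand-ins, not a real library's API.
from typing import Callable, Sequence

def generate(model: Callable[[Sequence[int]], Sequence[float]],
             vocab: dict[int, str],
             prompt_ids: list[int],
             eos_id: int,
             max_new_tokens: int = 50) -> str:
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        logits = model(ids)  # single feed-forward pass over the current sequence
        next_id = max(range(len(logits)), key=lambda i: logits[i])  # greedy pick
        ids.append(next_id)
        if next_id == eos_id:
            break
    # "detokenization" is just a lookup from integer ids to text fragments
    return "".join(vocab[i] for i in ids)
```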
Until this version, ChatGPT obviously suffered from a lack of training materials within it's trained neural network to have it overcome the English language's typed grammar rules for it to be able to discern that em dashes are not typically used in everyday conversations and/or that the input to not use them needed to change it's underlying probability network to be able to ignore the English language's grammar rules and adopt it's output without the use of the em dash.
"It's" is shorthand for "it is".
This is a very difficult concept to train into a neural network, as it needs to have been trained on specifically this input/output case long enough to have that training override the base English grammar language model, which is a fundamental piece of knowledge an LLM requires to function and one of the very first things it is trained to handle.
This is gobbledygook. You are guessing and have no actual clue how the technology works.
Russia has already bled out NATO's military capability (barring the US openly jumping in, which is unlikely), so there's little that NATO can do except covert operations.
More unhinged nonsense from a clueless clown.
https://en.wikipedia.org/wiki/...
I think there was no legit reason to move Maxwell.
I don't think Trump is a pedo, because that doesn't square with his tossing out Epstein because he was a creeper
How do you square these two things? For what other reason would Trump pull strings to furnish a pedo like Maxwell with a luxury prison cell while publicly floating the possibility of pardoning her? Why do it?
I can see you've not used one. The Quest 3 has graphics virtually indistinguishable from a wired headset.
Any ideas why the Internet is full of complaints about shit compression and lag? Is everyone using their Quest 3 incorrectly? I distinctly remember the complaints rolling in at the time. Checking back they are still there.
Just because it's standalone doesn't mean you can't use the GPU power on your PC. Actually I lie, since the last wired headset I used had a lower resolution, so wireless streaming with the Quest 3 was a leap up in graphics quality compared to my previous wired headset.
The problem with wireless HMD displays is latency and bandwidth. Years ago I remember seeing, and perhaps even shipping for a short time, an add-on product (for the Vive, I think it was) that at least used high-bandwidth 60 GHz radios. Now people are going with standard WiFi and bolting on crappy compression schemes to make it work. This is a step backwards.
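A back-of-the-envelope estimate shows why that compression has to be so aggressive; the per-eye resolution, refresh rate, and usable WiFi throughput below are assumed figures:

```python
# Back-of-the-envelope: raw (uncompressed) video bandwidth for a standalone HMD
# versus what a WiFi link can realistically carry. All figures are assumptions.
PER_EYE_W, PER_EYE_H = 2064, 2208   # assumed per-eye panel resolution
EYES = 2
REFRESH_HZ = 90
BITS_PER_PIXEL = 24                 # 8-bit RGB, no chroma subsampling

raw_bps = PER_EYE_W * PER_EYE_H * EYES * REFRESH_HZ * BITS_PER_PIXEL
usable_wifi_bps = 1.2e9             # assumed real-world WiFi 6 throughput

print(f"Raw video: {raw_bps / 1e9:.1f} Gbit/s")
print(f"Usable WiFi: {usable_wifi_bps / 1e9:.1f} Gbit/s")
print(f"Compression needed: roughly {raw_bps / usable_wifi_bps:.0f}:1")
```

Under those assumptions the raw feed is around 20 Gbit/s against roughly 1 Gbit/s of usable WiFi, hence the lossy real-time compression and its artifacts.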
The Quest 3 at launch was smaller than any other HMD on the market; it was the opposite of bulky. You want bulk, buy a tethered headset. Untethered ones aren't bulky in the slightest. Literally every other headset released is setting small-size records.
So an HMD with an internal battery and onboard CPU/GPU weighs less than a similar HMD without any of those things? Yea sure that makes.... total.... sense.
Generates more heat than the sun? You have that backwards. Heat is generated by my GPU. The headset doesn't generate any perceptible heat on my face and the front of it is lightly warm when I take it off after a gaming session. On the other hand when I use PCVR I walk back into the room with the PC, that room is very warm.
Do you ever read what you write before clicking submit? You can't have it both ways. Either the work is done on the onboard GPU, which then generates the heat, or the same work is done somewhere else, generating the heat there.
If you are wireless streaming and complaining about heat from the PC then as a basis of comparison you need to do the same shit on the onboard GPU and compare the results.
If you run something on a remote GPU that makes the whole room heat up, then running that same thing on the HMD (if it were even possible, which it is not) would result in the HMD melting into a pile of goo.
If you run the same workload as the HMD's GPU on a remote GPU, then it isn't going to heat up the room, because the remote GPU will be sitting mostly idle.
Yeah I'm one of those people, and I will not buy a tethered headset. There are just so many upsides to non-tethered, and you've yet to point out a downside that wasn't simply the result of your ignorance.
Quest 3 is a shit experience compared to tethered HMDs for seated SIMs.
Run any PC game directly on the headset, with emulation if only an x86 version is available
This is literally an advertised feature, one that was part of their hands-on demonstrations. Valve has contributed heavily to FEX (a user mode emulator, so you're not emulating system libraries), which is integrated with Proton in SteamOS. Yes, it's subject to any potential compatibility limitations, but I don't see how it's "absurd nonsense".
I have no doubt it can run PC games. Yet the device is not physically capable of running any PC game directly on the headset. I could not, for example, run MFS, DCS or Elite. My PC is barely capable of doing that, and it has huge fans dissipating hundreds of watts. If the HMD could do that, it would need the same kind of power and cooling.
I'm sure it runs smartphone VR games just fine yet this is all quite a far cry from running any PC game.
And you can play a Quest 3 as long as you like if you plug it in to your computer. You can play games on your computer through the Quest 3 so that it is reliant on the power of the machine connected to it and not the device itself - and you can do the same wirelessly. There are also some games that you can install on the Quest 3 itself and play natively without a computer. Honestly, it sounds like you aren't really up to speed on more modern VR offerings.
While I've never used a Quest 3, there is no shortage of complaints about the shit quality of the compression.
That's a strange way of saying it will run any game on a store that 99.9% of PC gamers use and already have.
This statement doesn't make much sense. VR is an outlier 99.9% of PC gamers don't care about. 99.9% of software on the game store does not support *ANY* VR HMD. All the HMD can do at best is run low spec flat games in flat mode.
Even my dad has a Steam account, and the last time he played any computer game Obama was still president. Your complaint is just not a thing that anyone gives an iota of a shit about.
I have no tolerance for app store monopolies and dependencies, DRM, etc. If it isn't available on GOG, as open source, or directly from the vendor, I'm not interested. If an HMD requires a fucking account on Steam to work at all, I'm not interested. This is just as stupid as saying that to use a monitor I need a Samsung account, or that to use a mouse I need a Razer account. No fucking way, no thank you.
These issues have in fact drawn significant attention and criticism over the years. I reject the disgusting appeal to popularity you are trying to invoke to cover for fundamentally anti-consumer and indefensible behavior.
There are actually several, including language translation layers in some models.
You are misinformed. There is ZERO NLP processing in LLMs. The only "translation" is a simple lookup table that translates integer tokens to word fragments.
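As a toy illustration of that lookup step (the fragment table here is invented; real vocabularies run to tens of thousands of entries):

```python
# Toy illustration: detokenization is a plain id -> text-fragment lookup,
# not language translation. The table below is invented for the example.
vocab = {0: "quant", 1: "um", 2: " comput", 3: "ing", 4: "."}

def detokenize(token_ids: list[int]) -> str:
    return "".join(vocab[t] for t in token_ids)

print(detokenize([0, 1, 2, 3, 4]))   # prints "quantum computing."
```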
It has to be at least a little inconvenient, though. The end users DO NOT READ. No matter how scary the warnings are, whatever told them to do it is scarier and they ignore these warnings.
They read, they just don't have infinite attention to dedicate to vendor nonsense. The source of software isn't relevant; what software is allowed to do is what matters. The misplaced focus, especially given that Google's own app store is full of malware, only contributes to fatigue, swaying attention away from what is important toward what is not.
If an OS vendor really cared about what was in the best interest of the user, they would never place the user in a situation where they face take-it-or-leave-it demands for privileges from app vendors. They would not allow blanket privilege assertions so broad and nebulous that their use is effectively ubiquitous.