Ancient scrolls of dubious provenance hint darkly that DDR4 was not the first inhabitant of the RAM slots we consider so permanent. Debased cultists still sometimes mutter chants mentioning "PC100", or even uncouth syllables such as "korr"...
I'm not sure what a historic timeline of these ratios (not "differences", please) would gain you.
These ratios can have a big impact on what algorithms and implementations you choose to maximize performance. I suppose if, say, the ratio of RAM to disk speed increased by a factor of 10 over the decade before last, then decreased back to its original ratio in the last decade, it might be worth trawling through some old papers (or old source trees) to revisit lessons learned in the earlier period -- but that seems like a bit of a stretch.
If you're just curious, it shouldn't be too hard to generate timelines of CPU cycle speeds, cache and RAM latencies and bandwidths, disk performance, and so on. But really, each of those has enough factors that a simple "ratio" would probably conceal more than it illuminates.
Water saturated with perchlorates? No, I would not want to be that first human.
...to show everyone that you're a hack writer.
To be fair, when Ken Thompson gave his Turing Award lecture, he didn't have access to Slashdot anonymous cowards to explain the errors in his reasoning. He did the best he could with what he had.
Thirty-one years later, that lecture is still worth reflecting on.
Apparently I've been neglecting Chrome on this old image for quite a long time. Chrome 21, Mac OS 10.6.8. No crash observed.
Are you proposing to pay people to do this sort of thing, instead of feeding and housing them in the prison system, where the prison-industrial complex can siphon off most of the money that goes toward their support?
That's just crazy talk.
When you talk about "thermal infrared", you're making assumptions about operating temperature. A computational system running on stellar material could radiate thermally well above our visible bands.
I'm sorry that you misinterpreted my post as a "scientific argument", rather than snark.
It seems to me conceivable that the conditions within a star might support high-speed, highly dense computation. I propose no details of the implementation, nor any way to demonstrate or falsify the conjecture. But if it is happening, it may well be that we wouldn't observe it as anything different from the stellar behavior we already do observe. Heck, it could be happening eight light-minutes away from us.
I think this supposition is no more silly than one scaling our current "industrial" "civilization" to something that spans a galaxy.
The firmament is peppered with huge concentrations of high-density plasma, supporting computation and communication far beyond the capacity of low-temperature, low-energy, solid-state matter. The byproducts of all that computation and communication look to us like thermal and optical noise because, being advanced, the minds running on them do so efficiently. Why leak information out into the vast, cold universe before you've taken full advantage of your substrate's Shannon capacity?
But, no, you're probably right. If there are other civilizations out there, why aren't we seeing the smoke from their cook-fires?
If only the Phone Cops had been able to call in the Phone Firemen...
I'm detecting a certain sameness to the stuff that theodp has been posting. Anyone else notice it?
One aspect of the Fermi Paradox is the assumption that "civilization as we know it" necessarily broadcasts a huge amount of information-bearing electromagnetic radiation, and that more advanced civilizations will broadcast more. From a modern perspective, this seems silly.
A signal recognizable across interstellar distances represents waste. It's energy radiated past the intended receivers, spent to no purpose. One aspect of "advancement" is reducing that waste -- improving modulation schemes, encoding efficiency, and transmission techniques to minimize wasted power.
A signal recognizable across interstellar distances also represents lack of diversity, or wasted capacity. If you're using a certain chunk of spectrum to broadcast a signal recognizable across light-years, you're not getting as much capacity out of that chunk as you could by using it for a bunch of geographically localized broadcasts -- for example, by broadcasting separate programs to each of 100 individual square miles within a 10-mile square, rather than one program for the entire 100-square-mile area. Take this idea a bit further, and you see our current cellular networks. From space, their signals would sound like noise.
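The spatial-reuse arithmetic is easy to sketch with the Shannon-Hartley formula, C = B * log2(1 + SNR). All the numbers below (bandwidth, SNR, the idealized assumption of perfect interference isolation between cells) are made up for illustration, not measurements of any real network:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity of one AWGN channel: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 5e6    # hypothetical 5 MHz chunk of spectrum
snr = 100  # hypothetical 20 dB SNR at each receiver

# One area-wide broadcast: the whole 100-square-mile area shares a
# single program, so aggregate capacity is just one channel's worth.
broadcast = shannon_capacity(B, snr)

# 100 one-square-mile cells reusing the same chunk of spectrum
# (idealized: each cell's transmitter is too weak to interfere with
# its neighbors, which is the whole point of localized broadcast).
cells = 100 * shannon_capacity(B, snr)

print(f"single broadcast: {broadcast / 1e6:.1f} Mbit/s aggregate")
print(f"100 local cells:  {cells / 1e6:.1f} Mbit/s aggregate")
```

Same spectrum, a hundredfold more aggregate capacity -- and because each cell's transmitter only has to cover one square mile, the power that leaks out toward an interstellar eavesdropper is far lower and far less coherent.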
It seems to me that the natural signal of a civilization like ours is a pulse of EM broadcast, lasting perhaps a few decades, then going silent or becoming indistinguishable from noise as we move to more localized and more efficiently encoded transmissions. If nobody happens to be listening in our direction during the right interval -- brief compared to a technological civilization's lifespan -- they could easily miss us completely.
"You must have an IQ of at least half a million." -- Popeye