Just think about all the great old Amiga/Commodore-64/etc. games you could sell using something like this. I'll pay good money for Bard's Tale 1/2/3 and Raid on Bungeling Bay.
Mathematics, especially in this context, is just a language for expressing ideas. So I think that in some sense it is possible that some particular string theory really does describe what's going on. I don't know if it's very likely, but the idea of a "final theory" is that it is, in some sense, a complete and accurate description of our universe's mechanics. (The holographic principle is nice because it states that two seemingly very different theories can actually be equivalent.)
I think there are two main caveats here. One is that you never really know when you're "done", and have a theory that is indeed final. We know now that we are not done today because of inconsistencies, but science also does not have even the capability of perfect validation. The other is that a microscopic, reductionist description of physics is not useful or even the correct language for describing more macroscopic effects since basically "more is different." Chemists don't use quantum field theory because it's just not helpful.
And while it's great (even important) to consider philosophical ramifications of theoretical work like this, we have to remember that it's all still conjecture and it will probably always be conjecture. The philosophical spin-offs, so to speak, should never be taken as a way of either supporting or condemning the theory.
I think I'm basically summarizing some of what Weinberg describes in his book Dreams of a Final Theory (1993). That seems old, but I highly recommend it!
We wrote our own parallel filesystem to handle just that. It stores a checksum of the file in the metadata. We can (optionally) verify the checksum when a file is read, or run a weekly "scrubber" to detect errors.
We also have Reed-Solomon 6+3 redundancy, so fixing bitrot is usually pretty easy.
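For a flavor of what the read-time verification and the weekly scrubber reduce to, here's a minimal Python sketch. The xattr name, the hash choice (SHA-256), and the /data root are illustrative assumptions, not how the real filesystem actually stores its metadata:

    import hashlib
    import os

    XATTR = "user.sha256"   # hypothetical attribute name; the real FS keeps this in its own metadata

    def file_sha256(path, chunk=1 << 20):
        # stream the file so multi-GB files don't blow out memory
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    def scrub(root):
        # walk the tree and report files whose contents no longer match the stored checksum
        bad = []
        for dirpath, _, names in os.walk(root):
            for name in names:
                path = os.path.join(dirpath, name)
                try:
                    expected = os.getxattr(path, XATTR).decode()
                except OSError:
                    continue                    # no checksum recorded; skip
                if file_sha256(path) != expected:
                    bad.append(path)            # candidate for repair from parity
        return bad

    if __name__ == "__main__":
        for path in scrub("/data"):
            print("checksum mismatch:", path)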
I made a $3 mistake on my income tax return (Scottrade updated my tax info *after* I'd sent mine in, but they didn't notify me).
The IRS apparently took that as an excuse to torment me for most of a year. I got audited for the above $3 claim, as well as for "falsely claiming that I was due a tax deduction for student loans" (I took some night classes at the local community college). Apparently that $3 claim was justification for a fishing expedition.
First time, I take an entire day off to redo my taxes, discover that I have made a $3 error, cut them a $3 check, and send them the 1098-T from the college to prove that the other claim is false.
Couple months later, they send me the exact same form. I again take another day off to recompute my taxes (I was correct), and again send them the same 1098-T info that they requested.
Third time, I'm told that I'll be taken to court because I haven't provided the proof required. I take yet *another* day off to go to the local IRS office in Nashville and sit down with a lady to explain that I've already sent the 1098-T form in.
She logs into her computer, turns it toward me, and starts hitting page-down. "We don't have any record that you sent it in." I see it flash by and tap on the screen. "Yes you did, it was just on your screen a second ago." She pages up and stares at it in silence for 2-3 minutes. "Well I just don't understand that."
Great. So now that the IRS knows I've sent it in, we can put this whole misunderstanding behind us, right? "I'm sorry, but there's nothing I can do to fix this." My choices were to pay it off, send an appeal to the IRS and hope that they suddenly grow a brain after the **4th** time, or go to tax court, lose yet another day's salary, and hope the judge was smarter than the IRS. So I paid.
The IRS's excruciating, devastating, mind-numbing incompetence cost me roughly $1000 in lost salary over a $3 difference. And the whole collective IRS can go pleasure itself with a saguaro cactus.
I manage a couple of petabytes worth of disks (consumer, not enterprise) for the HPC center at Vanderbilt University, and they get absolutely hammered by CMS-HI users 24/7/365. At scale, you will daily see problems that you would never even think of.
The firmware on consumer hard drives is often crap. Very few of them support TLER; we have ~400 drives (Seagates) that needed a firmware fix to prevent sudden death, but the fix wouldn't work in bulk over the SAS controller, so we had to yank/flash/replace/repeat; and drives will occasionally lock up hard and require a power-cycle.
Don't believe for a second that Linux doesn't need a defrag utility. We were mystified by a sudden influx of permanent drive *slot* failures. After *much* investigation, it turned out that our users were filling the drives 100% full, erasing 5%, refilling, erasing 5%, etc., until the average file (~100 MB) had thousands of extents. The vibration from the head frantically scanning the disk to read a file was enough for the SATA connector to destroy the connector on the backplane (Supermicro chassis, would *NOT* buy again, Chenbro is the way...). We wrote a simple defrag script that just copied the worst files to a different location and then moved them back.
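The whole "defrag script" is conceptually just this (a minimal Python sketch of the idea; selecting the worst offenders from extent counts, e.g. filefrag output, is left out here):

    import os
    import shutil
    import sys

    def rewrite_in_place(path):
        # Copy the fragmented file and atomically swap the copy back in.
        # The freshly written copy is usually laid out far more contiguously.
        tmp = path + ".defrag_tmp"      # temp copy on the same filesystem
        shutil.copy2(path, tmp)         # preserves mode and timestamps
        with open(tmp, "rb") as f:
            os.fsync(f.fileno())        # make sure the data really hit the disk
        os.replace(tmp, path)           # atomic rename over the original

    if __name__ == "__main__":
        # paths of the worst files are passed in; finding them is the other half of the job
        for p in sys.argv[1:]:
            rewrite_in_place(p)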
RAID5 isn't nearly sufficient at this point because you will eventually have two or more simultaneous failures just due to the number of disks. We wrote our own filesystem to offer Reed-Solomon 6+3 redundancy.
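To put rough numbers on the "two or more simultaneous failures" point, here's a back-of-the-envelope binomial estimate in Python. The 5% annual failure rate and one-day repair window are illustrative assumptions, not our measured numbers:

    def p_at_least_two(n_disks, p_single):
        # 1 minus the probability of zero or exactly one failure in the window
        p0 = (1.0 - p_single) ** n_disks
        p1 = n_disks * p_single * (1.0 - p_single) ** (n_disks - 1)
        return 1.0 - p0 - p1

    afr = 0.05               # assumed annual failure rate per disk
    p_day = afr / 365        # crude per-day failure probability
    for n in (100, 1000, 5000):
        print(n, "disks, P(>=2 failures in one day) =", p_at_least_two(n, p_day))

With a few thousand disks, a double failure in the same day stops being a freak event and becomes something you plan for.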
I'd love to know if you guys have any similar "WTH" horror stories.
A couple of years back at one of the Supercomputing conferences (I think in Phoenix), Fermilab had a cloud chamber in their booth, and you simply *would* *not* believe the amount of ambient radiation passing you at all times. I can easily believe that altitude would have an effect.
Another interesting idea would be to do the same experiment by latitude. Does the Arctic Region Supercomputing Center have a higher rate than the Maui Supercomputing Center? What happens during an aurora?
I have a MBA from a top 25 school, but I also have 4 years towards a Ph.D. in theoretical physics and 12 years experience in academic high-performance computing, so I hope I have street cred when I say this...
Saying you can get a "12 Hour MBA" is like saying you can get a Ph.D. in astrophysics by reading Carl Sagan's "Cosmos". It's Dunning-Kruger made manifest.
I found my MBA to be just as challenging as my physics degree. Strategy, game theory, operations, and economics aren't exactly powder-puff courses. And there's a reason they hand out Nobels in economics.
Don't confuse the body of knowledge with the person seeking it. There's a difference between someone who reads a popular summary of a field and someone who actually does the work to master it.
That depends on what you mean by "tangible benefits." One argument I've heard for practical, what's-in-it-for-me-today benefits is that the technology produces spin-offs such as techniques to mass-produce rare-earth magnets, the world-wide web, etc. But that's honestly a weak argument because there's a lot of research going on that has similar chances to produce spin-off tech.
For particle physics, the feeling is that we are on the verge of some kind of revolution! Admittedly it's been that way for a few decades now, but the current working theory (the standard model) has a number of deep problems (thanks wikipedia!). Most new theories, and there are a whole lot of them, predict new phenomena just at the edge of our experimental reach. Part of that is because well-meaning theorists prefer to propose theories that are either presently or soon-to-be testable. But part of it is because the experimental frontier has advanced to energies at roughly the electroweak unification point and lots of theories have interesting behavior to predict at this point, broadly speaking.
So it's not just a more-is-better kind of effort that won't stop until we build solar-system-sized accelerators. There really is a sense that a major shift, possibly even a philosophically-challenging development, is nearly within our grasp, within our lifetimes. This is not a "practical" argument for basic science, but only history can tell us what has had short and long-term practical benefits. History does tell us that this sort of pursuit has in the past been enormously beneficial. Maybe we are in a whole new era where new physics will be completely impractical, but that would honestly be surprising if true.
You hit numerical problems if you calculate it that way. Wikipedia gives a series expansion that works well for large values of gamma:
v (in units of c) ≈ 1 - (1/2) γ^(-2)

v ≈ c (1 - 1.8e-10), or 0.99999999982 c
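If you want to see the cancellation problem and the fix side by side, here's a small Python sketch (the gamma values are just illustrative):

    import math

    def one_minus_beta_naive(gamma):
        # compute beta = v/c directly, then subtract from 1:
        # once 1/(2*gamma**2) drops near machine epsilon this returns noise (or exactly 0)
        beta = math.sqrt(1.0 - 1.0 / gamma**2)
        return 1.0 - beta

    def one_minus_beta_series(gamma):
        # first two terms of the expansion: 1 - beta ~ 1/(2 gamma^2) + 1/(8 gamma^4)
        return 1.0 / (2.0 * gamma**2) + 1.0 / (8.0 * gamma**4)

    for gamma in (1e4, 1e7, 1e10):
        print(gamma, one_minus_beta_naive(gamma), one_minus_beta_series(gamma))

The series version gives you 1 - v/c directly, so you never have to subtract two nearly-equal doubles.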
I don't think in these cases you have multiple labs bidding for the job. You have multiple countries wanting to host the lab, but that's a different story.
The biggest problem for high energy physics is establishing multi-year funding. The US government cannot promise anything beyond a single year of funding. If, say, $8B has been spent over 10 years and one year Congress says "but I promised to cut spending", then that's the end of the road for that lab. This happened to the SSC in 1993, but it has also happened plenty of times since then to lower-profile experiments, some in the $500M range, that were, yes, already under construction.
Now say 15 years later the $10B has been spent, but it's not quite done, and another $2B would let you finish the project. Do you really throw away $10B to save $2B? There is no fraud, just a mis-estimation of the costs of building a beyond-state-of-the-art machine and slightly larger technical problems than were expected.
Most of the cases I'm familiar with, including the SSC, were not actually budget overruns even though they were politicized that way. If you're a politician who wants to (a) publicly demonstrate how fiscally conservative you are and (b) not actually cut spending on items that might affect the bulk of your constituency, then you cut big science every time. Even if the budget grows on the whole, you've made a statement and some headlines.
I also wanted to mention the failed SSC in Texas, cancelled in 1993. That would have been running at double the LHC's energy about a decade earlier. In 1993, congressional seats were won by senators promising budget cuts, and Big Science had a large target painted on its back. Killing the SSC was a high-profile way of appearing to reduce spending while not touching anything that many people understood or cared about.
Since that time, the US has proved time and time again that they are incapable of sustaining funding for a long-term science project. All of the high-energy accelerators in the US are operationally shut down, and almost no proposals in the past 20 years or so have survived all the way to producing results before getting scrapped by some budget shortfall in a particular fiscal year. The LHC survives because the US is not such a major (or critical) contributor.
Well, many of these tunnels, including the one the LHC uses, have been refurbished multiple times already. CERN's main ring was built to be somewhat future-proof, but that was a long time ago. A Google search turned up "The history of CERN", which dates the groundbreaking to 1954.
In accelerators you have two basic designs: linear and circular(ish). In linear accelerators each boosting element (RF cavity or whatnot) gets one chance to give the beam particles a kick, so the energy is limited to how hard you kick (limited by technology) and how many elements / how long (limited by budget).
In circular accelerators you are limited by synchrotron radiation. At some point the energy pumped into the beam matches the energy lost via synchrotron radiation. To move in a circle you have to accelerate inwardly, and an accelerating charged particle radiates light. At particle accelerator energies, this radiation is in the x-ray spectrum. You can reduce the loss by using a larger ring -- a smaller curvature requires less centripetal acceleration and hence less radiation loss. You can also of course build stronger boosting elements, but the radiation also heats the beamline and surrounding superconducting magnets, so it's not "that simple."
The other thing to vary is the kind of particle accelerated. Electrons have a very small mass and lose a larger fraction of their momentum to synchrotron radiation. SLAC and KEK are linear accelerators that use electrons. (Cornell's CESR is a ring that accelerates electrons too, but at lower energies compared to these others.) Protons are the other obvious choice, which is what Fermilab and CERN's LHC (after the upgrade) are accelerating. Being much more massive, protons slough off less of their momentum to synchrotron radiation and can be accelerated to higher energies in a ring of the same size. The disadvantage of protons is that the energy of the proton is shared among its three quarks (and gluons, I think), whereas the electron is truly singular as far as we can tell.
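The standard formula for the energy lost per turn is U0 = e^2 β^3 γ^4 / (3 ε0 ρ); since γ = E/mc^2, at fixed beam energy and ring radius the loss scales like 1/m^4, which is the whole electron-vs-proton story in one line. A quick Python sketch with illustrative numbers (100 GeV beams, ~4.3 km bending radius, roughly LEP/LHC-like):

    import math

    E_CHARGE = 1.602176634e-19     # elementary charge, C
    EPS0     = 8.8541878128e-12    # vacuum permittivity, F/m
    M_E      = 0.000510998950      # electron mass, GeV
    M_P      = 0.93827208816       # proton mass, GeV

    def loss_per_turn_gev(energy_gev, mass_gev, radius_m):
        # synchrotron energy loss per turn: U0 = e^2 * beta^3 * gamma^4 / (3 * eps0 * rho)
        gamma = energy_gev / mass_gev
        beta = math.sqrt(1.0 - 1.0 / gamma**2)
        u0_joules = (E_CHARGE**2 * beta**3 * gamma**4) / (3.0 * EPS0 * radius_m)
        return u0_joules / E_CHARGE / 1e9    # joules -> eV -> GeV

    for name, mass in (("electron", M_E), ("proton", M_P)):
        print(name, loss_per_turn_gev(100.0, mass, 4300.0), "GeV lost per turn")

The electron comes out around 2 GeV per turn (roughly what LEP was fighting at those energies), while the proton loses only a tiny fraction of an eV.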
I've been out of touch lately, but as of at least 8 years ago three proposals were being discussed: the VLHC, a big ring accelerating protons; the Next Linear Collider (NLC), a long linear accelerator for electrons; and a muon collider, a smaller ring (actually with straight sections, like a track-and-field track) that produces and accelerates muons. Muons are just like electrons, only 200 times more massive, and are unstable with a lifetime of about 2.2 microseconds. The muon collider was thought to be an ideal Higgs factory, but with a lot of design challenges. One of the main challenges is to not only accelerate the muons before they decay, but also collimate, or "cool", the beam very fast so that you can create as many head-on collisions as possible.
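To see why the speed matters: the muon's clock is dilated by γ, so after a lab-frame time t the surviving fraction is exp(-t / (γ τ)). A toy Python calculation (the 1 ms window and the γ values are made-up illustrative numbers, not a real machine design):

    import math

    TAU_MUON = 2.2e-6    # muon mean lifetime at rest, seconds

    def surviving_fraction(t_lab_seconds, gamma):
        # time dilation stretches the effective lifetime to gamma * tau
        return math.exp(-t_lab_seconds / (gamma * TAU_MUON))

    # fraction of the bunch still around after 1 millisecond in the lab frame
    for gamma in (10, 100, 1000):
        print("gamma =", gamma, "->", surviving_fraction(1e-3, gamma))

Unless you get the muons to high γ almost immediately, there's essentially nothing left to collide.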
So the news that the VLHC design is currently in favor is interesting, but this is hardly the first time the issue has been discussed and I doubt it will be the last. Several years ago the NLC design seemed most favorable, but this would, by its length, be limited to a specific design energy and probably be built to produce Higgs, Higgs, and more Higgs. It seems to me like a VLHC would have more discovery potential for more massive Higgs particles, signs of supersymmetry, or whatever else might exist.
I hear there's petrol in Krokodil. That's pretty addictive and dangerous.
Are you referring to nicotine and alcohol? I don't believe these are either the most addictive or the most dangerous on an individual usage basis. You may be right if you count aggregate effect (total number of people addicted to smoking or drinking) but this makes a good argument against legalization.
It's hard to use absolute arguments on this kind of subject. Take your argument to one extreme and you get something like "The problem isn't having easily-available nuclear bombs, the problem is those people who would use them." That statement goes too far in at least one way: while drug use causes some collateral damage, so to speak, it's nowhere near as much as nukes would.
Another problem with legalized (free market) drugs is that many are physically addictive, so users are quickly unable to make free choices regarding their use. Marijuana is less addictive than most, which allows room for debate on legalization. (I voted "yes" on legalization in CO, though I'm willing to admit that may have been a mistake. I'm not sure yet but it makes an interesting experiment in any case.)
I felt like some of DPR's arguments supporting his website were delusional. Particularly this: "Let us assume you have a son who is in his teenage years and you knew they were going to do drugs, what as a parent, would you do? Would you let them go to their friends’ friends’ dealer or would you help them buy from Silk Road?"