Then some crazy people started using it blindly as an object serialization language.
Translation: they wanted JSON, but it hadn't been invented yet. They needed to "get the work done", so they went with what they had.
In use, power plants which can't be throttled back for times of low demand are as much a problem as power plants which vary their output during the day.
Yeah, but they handle that by varying demand. To wit: most coal plants here have an aluminium smelter paired with them, which gets its power for near free and soaks up the excess supply. It's not a total solution, because the price of power here goes negative most nights (i.e., the coal generators PAY others to take their power) - so they are offloading some of it onto the rest of the grid as well. But to me that's fair, as ultimately the coal and nuclear power plants are paying the price for their inability to follow the load by giving away the energy. Currently wind and solar are offloading the cost of not being able to supply when needed onto the rest of the grid. Clearly they will have to pay that cost one day - probably by giving their excess power away to pumped storage operators, who then get to sell it later.
How we pay for the peak-demand pumped storage, which still costs $5/watt but is only used a couple of days a year, is an interesting question. But we have exactly the same issue with transmission lines - we pay a huge amount extra to cope with demand that arises on just a few days a year. We managed that, so I guess we will manage it with pumped storage too.
In reality they have just crossed another milestone: they are cheaper when they are generating. That will do for now while there is substantial fossil capacity to back them up, but if we are to phase out fossil fuels entirely the figure you have to compare nuclear to is generation plus storage.
The cheapest by far is pumped storage. In countries with plenty of hydro it's effectively free. For the rest of us it's about $1/watt generation capacity. Nuclear comes in at $8/watt or so. Wind comes in at $4/watt and solar is hitting parity with that, so even with storage renewables are cheaper. Nuclear is already history.
An argument I often see here is that there are no sites available for pumped storage. Turns out that's wrong. Here in Australia (which is mostly flat desert) we did a survey recently. All you need is a hill where you can build a dam about 500m in diameter, with a valley about 400m below within 3km or so. Turns out the country is littered with literally tens of thousands of sites like this.
None of this is free of course - you still have to spend the $5/watt or so. Australia's energy consumption is 50GW, so that totals AU$250 billion. That's a metric fuck ton of money to a small country like Australia. But as it happens our coal generation facilities are near retirement, so we would have to spend it anyway.
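The arithmetic above can be sketched in a few lines, using only the per-watt figures quoted in these comments (rough estimates from the discussion, not authoritative market data):

```python
# Back-of-envelope comparison using the per-watt figures quoted above.
# All numbers are the comments' rough estimates, not real market prices.

AUD_PER_WATT = {
    "pumped_storage": 1.0,  # per watt of matching generation capacity
    "wind": 4.0,
    "nuclear": 8.0,
}

demand_watts = 50e9  # Australia's ~50 GW consumption, per the comment

# Renewables must be costed as generation PLUS storage; nuclear as quoted.
renewables_plus_storage = (AUD_PER_WATT["wind"]
                           + AUD_PER_WATT["pumped_storage"]) * demand_watts
nuclear_only = AUD_PER_WATT["nuclear"] * demand_watts

print(f"wind + pumped storage: AU${renewables_plus_storage / 1e9:.0f} billion")
print(f"nuclear:               AU${nuclear_only / 1e9:.0f} billion")
```

The $5/watt combined figure (wind plus storage) times 50 GW is where the AU$250 billion total comes from.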
OK, I'll try again. YOU DON'T. All you see is electrons being "consumed" as they hit the screen, forming the interference pattern.
If you DO look for them as they pass through the slits -- where one can do this without "consuming" them by e.g. putting a conducting loop around the slit that will experience a voltage pulse as the electron passes THROUGH it or by illuminating the volume right behind one of the slits with intense light that can scatter off of the moving electron and hence detect the slit the electron passed through -- then the interference pattern goes away.
This is a fundamental sort of "goes away". It isn't just that we gave a small extra push to the electron, as in principle you can CLASSICALLY make the detection so weak that it wouldn't affect the classical trajectory. It is that the detection itself shifts the PHASE of the electron and hence destroys its coherence with the electron(s) passing through the other slit, so there is no longer any interference.
Also, you can read:
which both walks through all of this and provides you with references to at least some of the actual experiments that verify it.
It is. The point is that if you measure which slit it passes through, the interference pattern that implies that it passed through both AS A WAVE, not a particle, disappears. The rule for this sort of thing is that if you measure wavelike properties, you don't get a definite particle position/state. If you measure a definite particle position/state, you don't get wavelike behavior any more. This is called complementarity and is the basis of the uncertainty principle. Electrons and photons alike behave the same way. Even at very low intensities where you can effectively observe single photons or electrons AT THE DETECTOR, if you don't look to see which slit they go through you get interference, implying that the electron passed through both as a wave (function). If you do look and measure the slit the photon or electron passes through, no more interference pattern, no more waves.
The electron in some sense isn't in two places at once -- it is "everywhere" at once, but with a low probability, if it has a very well defined momentum and hence wavelength. If you measure it or confine it to some specific location, you do so at the irreducible expense of having a well-defined momentum (and hence wavelength).
I'm actually teaching this stuff right now at the most elementary level. There aren't a lot of good intro level books on it, but I'm using Harris's "Modern Physics", which is at least pretty readable and has some of the real math in it. Beyond intro modern physics books, you can try Wikipedia (often has surprisingly good articles on this sort of thing) or a real quantum textbook. Or two. Or three or four. It takes years to not quite understand quantum mechanics, and an important step along the way is to UNlearn all of the nonsense you "learned" about it in English statements and to concentrate on the consistent mathematical and conceptual formulation of the theory.
Hey, somebody had to do it. Using English with embedded classical logic to describe quantum phenomena is a waste of time. And even most physicists have never read Schwinger or studied the Nakajima-Zwanzig equation and hence have little idea of how to formally obtain the classical measurement projection in an open system interacting with a classically described statistical bath when the combined closed system is in a stationary state and has no probabilities at all. And then there is relativity and time reversal invariance.
I'm just sitting here, wondering if the back of the envelope computation is dead. When did we get to the point where we could resolve 30+ orders of magnitude effects in the lab? We haven't even -- as far as I know -- experimentally verified whether normal matter gravitation attracts or repels antimatter, which seems like it would be a pretty important first step in building a QFT with gravity or GR, but even that seems beyond us so far.
"I do not want your cheap brainburning drugs. They are useless for work. And I am a working man today."
So, only really expensive designer brainburning drugs for you, eh? Or is it cheap brain enhancing drugs (he says, sipping his coffee...:-)
Mod +1. This is the second or third time I've seen summaries of the press announcement, and the first time it has been even obliquely acknowledged that the so called "repeller" is nothing more than a localized lack of PULL, not any sort of actual gravitational "push". -1 to the article itself for being misleading bullshit and creating a "dipole" like an electron and an electron hole create a "dipole" in a uniform neutral metal, no more.
Surely there is nothing surprising about this. People have been doing cosmological simulations for a LONG time with a large number of pointlike objects interacting via GMm/r^2 attractive forces, to simulate galactic evolution and universal evolution from the big bang. The interesting point is that at the center of a uniform mass distribution there is no net force, but nevertheless, with 1/r^2 forces, any inhomogeneity in the underlying mass distribution tends to accrete matter in some places and abandon it in others, especially if the matter can interact inelastically and clump together into bound subsystems. This must have been seen in simulations pretty much every time, and should come as no surprise in nature.
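The "no net force at the center, net force everywhere else" point is easy to check numerically with exactly the pointlike GMm/r^2 picture described above (a minimal sketch with G = m = 1 and a ring of equal masses; units are illustrative):

```python
import numpy as np

# N equal point masses on a uniform ring, Newtonian 1/r^2 attraction
# (G = m_i = 1, illustrative units). The pulls cancel at the centre;
# displace the test point and a net force appears.

def net_force(test_point, masses_xy):
    """Sum of 1/r^2 attractions on a unit test mass from unit point masses."""
    d = masses_xy - test_point            # vectors from test point to masses
    r = np.linalg.norm(d, axis=1)
    return np.sum(d / r[:, None] ** 3, axis=0)   # (d/r) * 1/r^2 per mass

theta = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
ring = np.column_stack([np.cos(theta), np.sin(theta)])

print(np.linalg.norm(net_force(np.array([0.0, 0.0]), ring)))  # ~0 at centre
print(np.linalg.norm(net_force(np.array([0.3, 0.0]), ring)))  # nonzero off-centre
```

Any slight displacement breaks the symmetry and yields a net pull, which is the seed of the clumping behavior the simulations show.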
But frankly, what's wrong with smoking in a bar?
... Nobody forces you to go to my bar
As others have pointed out the staff can't go elsewhere. Non smoking bar staff have successfully sued their employers after getting lung cancer.
I wouldn't worry overly about it. I'm an Australian, and it looks like this is the end of the line for Australia's actions on smoking. The two things that annoyed voters were kids starting smoking due to peer pressure and slick ads, and the mess smokers left around with 2nd-hand smoke and butts. The kid problem has been cured by making cigarettes expensive and the packs so ugly it isn't cool to be seen with one (seriously: no one looks cool with a picture of a gangrenous foot near their mouth), and the 2nd-hand smoke was cured by banning smoking in public places.
If it does stop here it will be one of those rare successes in public policy. It leaves people free to do whatever they damned well please in their private lives, while stopping them from affecting others with their unhealthy habits.
I'm hoping our nanny state government will notice the success and apply the same techniques to the illegal social drugs. Making them legal, putting high taxes on them, and regulating the purity would solve a myriad of problems. Stopping people dying from injecting bad shit is one. Using those taxes to pay for their rehab down the track is another. Taking the money away from the illegal gangs is another. Win. Win. Win. It is a nanny state, so I guess it won't happen. But I can dream.
Having used the "sweet rm" trick back in the '80s somewhere (with much more limited space, and a cron FIFO groomer), I can say it also doesn't protect you from a wide variety of file corruption issues and overwrites. Remove a file, recreate it, remove it again? Delete two files from different parts of your tree -- e.g. README -- that have the same name? Original file gone (unless you don't just alias rm but write a very complicated script). If you run out of space and have an alias/script like "flush" to take out the trash and make room for more, it just moves the problem one notch downstream.
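For the curious, here is a minimal sketch of that style of "reversible rm", with one fix for the name-collision problem just described: each deletion gets a unique suffixed name, so trashing two READMEs (or the same name twice) never overwrites an earlier copy. The trash location and naming scheme are this sketch's own choices, not anyone's actual script:

```shell
#!/bin/sh
# Sketch of a "reversible rm": move files to a trash directory instead of
# unlinking them. Suffix with timestamp + counter so repeated names never
# clobber an earlier trashed copy.

TRASH="${HOME}/.trash"
mkdir -p "$TRASH"

trash() {
    for f in "$@"; do
        dest="$TRASH/$(basename "$f").$(date +%s)"
        i=0
        while [ -e "$dest.$i" ]; do i=$((i + 1)); done
        mv "$f" "$dest.$i"
    done
}

# Demo: the same filename trashed twice keeps both copies.
echo one > README; trash README
echo two > README; trash README
ls "$TRASH" | grep -c '^README\.'   # both copies preserved under distinct names
```

It still doesn't solve the out-of-space "flush" problem, of course -- that moves downstream exactly as described above.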
With that said, it did save my ass a few times. Then I learned personal discipline, started using version control (SCCS at the time, IIRC) onto a reliable server to not just back up any files of any importance I create but to save reversible strings of revisions back to the Egg, and stopped using my reversible rm altogether after one or two of the disasters it still leaves open.
Moral: Version control with frequent checkins usually leaves your working image itself on your working machine. Keeping the repository on a different machine is already one level of redundancy. Keeping it on a server class machine in a tier 1 or tier 2 facility with reliable, regular backups and RAIDed disk is suddenly very, very, very reliable. As the current incident shows, not perfectly reliable. Human error, multiple disk failures in an array, nuclear war, internal malice or incompetence or just plain accident can still cause data loss, but in this case what is being reported isn't disaster -- they had 6 hour backups! Even though I'm sure there will be some folks who are inconvenienced, MOST of the users will still have usable, current working copies and be out anywhere from zero to a few hours of work. I've been on both sides of the sysadmin aisle in data loss server crashes, and -- they happen. Wise users use a belt AND suspenders to the extent possible lest they find their pants gathered around their ankles one day...
Also the fact that your eyes can see because of the index of refraction of the lens...
Because of dispersion (different frequencies) inside dynamically polarizable materials. Not in a vacuum. In a vacuum, the speed of light is predicted to be -- the speed of light.
Light can be bent by gravitational fields, but the thought is that the bent trajectories are geodesics in bent spacetime, not actual lenses which bend light by slowing it down due to the susceptibility of space.
OK, have to step in here. The map is not the territory, and the idea of a thing is not a thing. If you are saying "God is not a thing, it is an idea" I'd agree with you. But ideas are not in any necessary one-to-one correspondence with the Universe of "things that actually exist", and ideas to the very best of our experience a) are highly complex phenomena contingent on all sorts of material stuff and do not just float around like quantum particles that permeate and surround the Universe (h/t to Terry Pratchett); b) cannot and do not "create" anything, ever. In fact there is no evidence that anything, ever, has been created. The laws of physics are all pretty much constrained by conservation principles (consistent with observation) that state that nothing is ever created, it is all just existing stuff changing form and moving around.
The second thing I'd object to is the idea that anyone at all can "reason" about God in a meaningful or useful way. The first step in such a reasoning process is to choose one's premises, or axioms, or postulates -- the basis for one's eventual "consistent" conclusions. This is precisely the same whether one is reasoning about mathematics, the Universe of stuff that actually exists, or the enormous metaphysical space of pure speculation -- reasoning about pink unicorns, trying to decide if Santa likes hot chocolate with or without a splash of peppermint Schnapps on Christmas eve, how many angels can dance on the head of a standard shirt-packing pin. The premises themselves cannot be proven -- they are PREMISES -- so all reasoning contingent upon the premises is Bullshit in the precise sense that there is (as noted) no necessary one-to-one correspondence between the pattern of consistent results one derives with the very best of intentions and the real world.
The second step in USEFUL reasoning is to seek out objective correspondences between those contingent results AND the real world. To the extent that they are discovered to exist, we strengthen our degree of belief in the conclusions, and by Bayesian reasoning, the premises that led to the conclusions in good correspondence. To the extent that they are contradicted, we at least weaken our degree of belief in the conclusions, and again by inheritance in the premises that led to the contradiction. This is a slight oversimplification as multiple premises contribute to most nontrivial conclusions and it is not necessarily clear which one(s) fail, but there is no doubt that REASON requires reduction of belief in the conclusion itself rather than amplification when there is either no evidence supporting it (but there is evidence supporting competing ideas and arguments) or if the evidence contradicts it.
And here's the rub. The very first step about any reasoning process about God has to begin with the pure assertion that God exists. This is because we have no direct and usable sensory data, no direct "experience" of God the way we have experience of toast, or things falling down when dropped. We have built powerful apparatus that extends the range and sensitivity of our senses and none of it reveals God. We have conducted careful statistical analyses of human experience contingent on things like belief and prayer and behavior and -- outside of obvious stuff that behaving "well" is more likely to make one happy than being a butt in human society -- no phenomena or statistical anomalies are observed that require supernatural explanation. One cannot predict one single thing about the world and how it behaves or outcomes based on religious belief or the asserted premise "God exists for some useful meaning of the word `exists'". To paraphrase, the rain falls on Saint and Sinner alike.
What we CAN do is examine the consequences of BELIEF ITSELF. Believing in something has an enormous impact on human existence. In a sense, our society (or societies!) are defined by their beliefs, their memetic structure, their history, their evolution -- including religious beliefs. Religious beliefs make an enormous set of untestable, empirically unsupportable assertions, assertions that are blatantly internally inconsistent. Contradictions abound. One can, as everybody SHOULD know, "reason" one's way to any conclusion one likes from contradictory premises, so it comes as no real surprise that humans are constantly manipulated and manipulate others on the basis of these absurd contradictory beliefs. Since all major religions assert a special exception for ordinary reasoning processes when it comes to reasoning about the religions themselves as a necessary step in getting people to continue to believe in the absurdity, they persist, and humans who accept them make monumentally poor decisions, choices that they would never make if they were actually reasoning correctly and optimally in and about the real world.
Religion is arguably the number one killer of humans active on the planet at this very moment. It is directly responsible for some of the largest and longest running armed conflicts in our mutual history. It enslaves and distorts the judgment of some 3/4 of the human population -- literally enslaves perhaps a billion women in the Abrahamic faiths. It causes the redirection of a huge fraction of the global production of the human species into the "service" of the priesthood(s) of the various religions, who spend most of it supporting themselves without an actual job that actually produces something useful, like toast or Schnapps flavored hot chocolate. The religions that persist after a brutal memetic evolution process involving world conquest and domination at the point of a sword are almost without exception socially engineered at this point to make the poor and disadvantaged human content enough with their lot to avoid revolution against the prevailing powers that keep them poor and disadvantaged by promising them eternal pleasures in an imaginary afterlife if only they behave themselves and are good little proles in this one.
Sure, this too is an oversimplification -- some people, in some religions, also do some good things. But that is more because they are good people than because the religion itself is good, and good or not it isn't likely to be TRUE. Reasoning from FALSE premises isn't all that great a thing to do, or to base a sane society on.
Just kidding. Not so much.
Personally, I think we evolved without it when we took to walking upright. A penis bone would have kept all male penises pointing up at the angle of optimum intromission. This would have forced all males to urinate in long rainbow arcs that got piss all over the place in a highly conspicuous way and would have made the penis, sticking out and up right up front, highly vulnerable to all sorts of weapons as tribal man fought one another. Hard to tuck the junk back and out of risk when you are standing if you have to break a bone to do it. Humans are also enormously mutually fertile (roughly 10% of the time) and live a very long time, so long intromission, short intromission, neither one is going to be effective at ensuring "monogamy" and of course arguing that human culture is monogamous even today is pretty much to make a RELIGIOUS argument as the best that can be supported empirically is some mix of serial monogamy, serial polygamy, serial polyandry, and just plain fucking around with a smattering of true "lifetime exclusive" monogamy mixed in, maybe 10 or 20%. Swans may mate for life, but humans are lucky if they mate for dinner, if one follows overt statistics, and even that is probably driven more by religious memes than by "nature". The memes are rather at war with the genes, and different cultures follow different patterns for optimizing mate selection worldwide.
As many above have pointed out, there is little reason to read the entire series "like a novel" from cover to cover, in addition to the fact that yeah, it would take a while to WORK through it like a textbook as opposed to read through it quickly to see what is there. And yeah, there are better books now in profusion on many of the topics covered, although AFAIK there is no book or book series that is as encyclopedic on the subjects he covers.
However, many people will find some of the sections very useful. I personally found "Seminumerical Algorithms" useful indeed when learning about random number generators and testing random number generators. It isn't the last word, and it certainly isn't the latest word as we move into a 64 bit world and beyond, but it is an excellent starting point. In other parts of the series there are other gems or nuggets well worth studying or reading, even if you move on to actual research papers or better books afterwards.
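To give a small taste of the Seminumerical Algorithms territory mentioned above, here is a classic linear congruential generator (the "minimal standard" LCG, a = 16807, m = 2^31 - 1) fed through a simple chi-square frequency test. This is an illustrative sketch in the spirit of that volume, not one of Knuth's actual test batteries:

```python
# Minimal-standard LCG plus a naive chi-square equidistribution check.

def lcg(seed, n, a=16807, m=2**31 - 1):
    """Yield n values in [0, 1) from the minimal-standard LCG."""
    x = seed
    for _ in range(n):
        x = (a * x) % m
        yield x / m

def chi_square_uniform(samples, bins=10):
    """Chi-square statistic of the samples against a uniform histogram."""
    counts = [0] * bins
    n = 0
    for u in samples:
        counts[min(int(u * bins), bins - 1)] += 1
        n += 1
    expected = n / bins
    return sum((c - expected) ** 2 / expected for c in counts)

# With 10 bins (9 degrees of freedom) a healthy generator should land
# near 9; wildly larger values flag a broken generator.
stat = chi_square_uniform(lcg(seed=12345, n=100_000), bins=10)
print(f"chi-square statistic: {stat:.1f}")
```

Real testing goes far beyond a frequency check (spectral tests, serial correlation, and so on, all covered in the book), but this is the shape of the exercise.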
To sum up, it is a useful thing to own if you are doing a lot of very widely spread code development and need to acquire literacy quickly in subjects it covers, even if you are going to end up looking for an O'Reilly text on some of those subjects to get a more modern perspective. Those O'Reilly books are probably going to reference, rewrite, and augment Knuth.
Note well that I'm an Old Guy (tm) and actually did write a lot of code in Fortran once in the long ago before abandoning it for C and Unix and beyond. TAOCP was one of the ONLY really good encyclopedic references for people who were NOT CPS majors and who needed to learn about algorithms of one sort or another or some aspect of coding covered in one of the many CPS courses they never took. They (I) didn't need a course with the best textbook of the day -- we needed to get started. Once started, we knew how to learn and go beyond the start. 1.5 cubic feet of shelf space wasn't too high a price to be able to learn something about everything or anything to get started.
"Pascal is Pascal is Pascal is dog meat." -- M. Devine and P. Larson, Computer Science 340