A quick search says either Germany or Belgium: http://en.wikipedia.org/wiki/N...
It's not a programmer thing; just look at the comments on the Wall Street Journal article and you'll find the same complaints. I find that pedantry is mostly a class issue. The educated upper classes (and those who see themselves as such) use pedantry to place themselves above others they view as lower class and uneducated ("begging the question" being a perfect example). You will never hear complaints about Bostonians who don't pronounce "r" ("Pahk the cah in Hahvahd Yahd."); you will hear endless complaints about black people who say "ax" instead of "ask" (even though "ax" is actually the original pronunciation). The Boston accent is perceived as cosmopolitan and part of a historic American tradition. African-American vernacular is saddled with poverty and ghetto stereotypes by those outside those communities.
By definition, "improper" English is how poor people speak.
This depends on the possible quality and size of a universe simulation. Is it possible to simulate the entirety of a universe using only a finite subset of that universe?
If yes, then there are (at maximum) an infinite number of simulated universes and an infinite number of recursively simulated universes. Thus the probability of us being the root/real universe is zero ("of measure zero" if you ask a mathematician). Perhaps the holographic principle comes into play to allow the entire universe to be simulated without using the resources of the entire universe.
If no, then there can be only a finite number of simulations in the observable universe. Also, each of the simulated universes is a smaller and/or less-precise version of the simulating universe. In this case, there are (at maximum) a finite number of simulated universes and a finite number of recursively simulated universes capable of hosting intelligent life (a cellular automaton with only one cell could hardly be called intelligent). In this scenario, there is a non-zero probability that we live in the root/real universe.
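A toy model can make the counting in the finite case concrete. This is purely illustrative, assuming each universe hosts a fixed number of simulations, each a fixed fraction of its host's size, and that below some minimum size nothing intelligent fits (every parameter here is invented):

```python
def count_universes(size, branching=2, shrink=0.1, min_size=1.0):
    """Count a universe plus everything it can recursively simulate.
    Each hosted simulation is `shrink` times its host's size; a universe
    below `min_size` can't host intelligence and isn't counted.
    All parameters are invented for illustration."""
    if size < min_size:
        return 0
    return 1 + branching * count_universes(size * shrink,
                                           branching, shrink, min_size)

total = count_universes(1e6)  # arbitrary "root" universe size
print(total, 1 / total)       # a finite count, so a non-zero root probability
```

With these made-up numbers the recursion bottoms out after seven levels, so the tree of universes is finite and the chance of being the root is small but non-zero.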
I lean towards no, but I don't have any evidence, just a bias for thinking myself real.
There is no Doppler effect for a single photon, unless that photon is emitting other photons.
True, in a contracting universe, photons gain energy. Noether's theorem says that energy conservation is a consequence of time translation symmetry (t -> t + constant), not reversal symmetry (t -> -t), so conservation of energy isn't required. The "energy imbued by the creation of the universe" seems ill-defined. If you believe Hawking and Krauss, this energy is zero.
Read the blog post I linked to above. There's no way to consistently assign an energy density to spacetime curvature. Quoting Prof. Carroll:
[U]nlike with ordinary matter fields, there is no such thing as the density of gravitational energy. The thing you would like to define as the energy associated with the curvature of spacetime is not uniquely defined at every point in space. So the best you can rigorously do is define the energy of the whole universe all at once, rather than talking about the energy of each separate piece. (You can sometimes talk approximately about the energy of different pieces, by imagining that they are isolated from the rest of the universe.) Even if you can define such a quantity, it’s much less useful than the notion of energy we have for matter fields.
Consider the region of space that contains the photon. If each dimension of the universe doubles in size, the photon loses half its energy (its wavelength doubles). But the vacuum energy increases by a factor of 8, since the volume grows eightfold in three-dimensional space. This process can't keep energy constant.
You can also reason that different photons will lose different amounts of energy depending on the energy they started with. There's nothing to keep these changing energies balanced with the vacuum energy in expanding or contracting space.
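A back-of-the-envelope sketch of that bookkeeping, with every constant set to 1 in arbitrary units (illustration only, not a cosmological calculation):

```python
# Toy bookkeeping: as the scale factor a grows, the photon's energy
# falls as 1/a while the vacuum energy in the region grows as a**3.
h_nu0 = 1.0    # initial photon energy (arbitrary units, assumed)
rho_vac = 1.0  # vacuum energy density, constant as space expands (assumed)
V0 = 1.0       # initial volume of the region containing the photon (assumed)

def total_energy(a):
    """Total energy in the region when each dimension has grown by factor a."""
    photon = h_nu0 / a             # cosmological redshift: E proportional to 1/a
    vacuum = rho_vac * V0 * a**3   # constant density times volume (volume ~ a^3)
    return photon + vacuum

for a in (1.0, 2.0, 4.0):
    print(a, total_energy(a))  # the total grows: energy is not conserved
```

No choice of constants keeps the sum fixed for all values of a, since the two terms scale with opposite powers of the scale factor.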
Note that the linked blog post was in response to another Arxiv Blog article that makes the same mistake.
It has been known for quite some time that energy is difficult to define rigorously in General Relativity. A good explanation can be found in this post by Caltech physicist Sean Carroll. Key point:
The point is pretty simple: back when you thought energy was conserved, there was a reason why you thought that, namely time-translation invariance. A fancy way of saying “the background on which particles and forces evolve, as well as the dynamical rules governing their motions, are fixed, not changing with time.” But in general relativity that’s simply no longer true. Einstein tells us that space and time are dynamical, and in particular that they can evolve with time. When the space through which particles move is changing, the total energy of those particles is not conserved.
As a simple example, imagine a photon traveling through an expanding universe in a region with no other matter or energy (dark or otherwise). The expansion of space stretches the wavelength of the photon (cosmological redshift, which is distinct from Doppler redshift), causing it to lose energy. The photon loses energy, and nothing around it gains any. Energy is lost because spacetime itself is changing, so Noether's theorem doesn't apply.
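Carroll's point about time-translation invariance shows up even in a toy mechanical analogue: a harmonic oscillator on a fixed "background" (constant spring constant) conserves energy, while one whose background changes with time (a time-dependent spring constant, loosely standing in for expanding space) does not. This is a hand-rolled sketch, not anything from the linked post:

```python
def evolve(k_of_t, steps=200_000, dt=1e-4):
    """Leapfrog-integrate x'' = -k(t)*x from x=1, v=0 (mass = 1);
    return the final energy E = v**2/2 + k(t)*x**2/2."""
    x, v, t = 1.0, 0.0, 0.0
    for _ in range(steps):
        v += -k_of_t(t) * x * dt / 2   # half kick
        x += v * dt                    # drift
        t += dt
        v += -k_of_t(t) * x * dt / 2   # half kick
    return 0.5 * v**2 + 0.5 * k_of_t(t) * x**2

# Time-independent background: energy stays at its initial value of 0.5.
E_static = evolve(lambda t: 1.0)
# Time-dependent background (stiffening spring): energy drifts away from 0.5.
E_driven = evolve(lambda t: 1.0 + 0.5 * t)
print(E_static, E_driven)
```

With the static spring the final energy matches the initial 0.5 to high precision; with the time-dependent one it ends up well above 0.5, exactly because the rule governing the motion changes with time.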
"Copacetic" is a perfectly fine word. It's just very rarely used. A native speaker would only use it to show off that they know the word. Other examples: peripatetic and callipygian.
How do you let people know what the compiler does? Unless there's a human-readable spec, people can't plan for future code/contracts. Writing random code/contracts and seeing if it "compiles" is not a great way to program/negotiate.
Doing the same thing every time is only a prerequisite for being correct. What if most people don't like what the current compiler does? After editing, how do you let people know what changed? How do you even know the compiler is correct without a human-verifiable document of expected behavior?
If a new version of the compiler comes out, does that mean that all previous versions were interpreting code incorrectly? Is there any existing compiler that behaves correctly?
When will the questions stop?
I second this. KPhotoAlbum was specifically made for fast tagging. Tagging 1,000 images with places, people, interesting things, etc., in half an hour is pretty normal for a session. As for lock-in, your image tags are kept in a plain-text XML file. I run it under Gnome and XFCE and haven't noticed any problems.
No exception. Rule #1 is broken by reviving old forms and rules, usually with a Neo- prefix. The first example that comes to mind is Stravinsky's Neoclassical period. Compare The Rite of Spring (which gave birth to the Modern period of classical music) to his Italian Suite, which was the first of his Neoclassical pieces, written seven years later.
It was skipped; that's a comment in Python. The return statement is all the code needed.
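A minimal illustration (the function is hypothetical, not from the code under discussion):

```python
def answer():
    # This line is a comment: the Python interpreter skips it entirely.
    return 42  # the return statement is the only code executed

print(answer())  # → 42
```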
Rotating the nut clockwise always moves it away from you; rotating it counter-clockwise always moves it towards you. Switching your viewpoint from top to bottom flips one of your position axes but leaves the direction of motion unchanged, so the perceived direction of rotation flips.
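The axis-flip argument can be checked numerically. Here a point circles counterclockwise as seen from above; viewing the same motion from below flips one position axis, and the perceived sense of rotation reverses (a minimal sketch, coordinates chosen arbitrarily):

```python
import math

# The same physical motion: a point circling counterclockwise, as seen
# from above (+z), sampled at two nearby times.
t0, t1 = 0.0, 0.1
p0 = (math.cos(t0), math.sin(t0))
p1 = (math.cos(t1), math.sin(t1))

def signed_angle(p):
    """Angle of point p measured counterclockwise from the +x axis."""
    return math.atan2(p[1], p[0])

# Seen from above: the angle increases, i.e. counterclockwise.
from_above = signed_angle(p1) - signed_angle(p0)

# Seen from below, one position axis flips (x -> -x) while the motion
# itself is unchanged; the angle now decreases, i.e. clockwise.
def flip(p):
    return (-p[0], p[1])

from_below = signed_angle(flip(p1)) - signed_angle(flip(p0))

print(from_above > 0, from_below < 0)  # → True True
```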