
Comment Re: Wow, the UK is even more screwed up than the U (Score 1) 238

The U.K. Constitution is now almost wholly written in light of decades of law reform in the area, but it's not consolidated into a single document, and is not especially well codified. (See the final paragraph below).

Since 1997 there has been a significant roll-back of the personal prerogative by successive Parliaments legislating in areas where it was routinely exercised. As a recently relevant example, the Lascelles Principles (set out in the "Senex" letters) became obsolete with the Fixed-term Parliaments Act 2011 (which controls events upon the demise of a government by expiration, resignation or loss of confidence), with several recent Representation of the People Acts, and with guidance documents such as the Cabinet Manual, which outlines the continuity of government in the event of the demise (death or resignation) of the Prime Minister. The Queen did not have a right to reject Theresa May as Prime Minister, even if Cameron's careful announcement of his plan to first tender his resignation and then recommend his successor described the actual order of events; her personal prerogative in the area is obsolete, and the (advised) prerogative vested in Cameron is now strictly controlled by statute.

For better or for worse, what's missing in the UK Constitution is a clear and standard entrenching mechanism. Practically every Constitutional scholar and lawyer in the country would agree that there *is* entrenching, and the concept that one Parliament cannot bind the next has been obsolete as a legal reality for some time, and as a political one for even longer. Ignoring entrenchings coupled to international treaties (e.g. the Single European Act), and entrenchings in the terms of statutes themselves, there are entrenchings controlled by terms in other Acts of Parliament. The tower of legislation needed to remove the Scotland Act would be enormous, for example. And the new Secretary of State for Exiting the European Union and the Attorney-General will be busy for years trying to identify what legislation will need to be included in the effective repeal of the European Communities Act 1972. A misstep will indeed risk the whole enterprise being declared defective by UK courts, which would effectively require Parliament to legislate to correct the defect or outright overrule the ruling (which in turn would need to be done carefully, otherwise "ping pong" results).

In effect, the major difference between Parliamentary supremacy in the UK in 2016 and Constitutional supremacy in Canada or Australia in 2016 is that the Queen of Canada and the Queen of Australia have prerogative powers that are *protected from* the federal parliaments by the Constitution. In the UK, any royal prerogative can be removed by Parliament, and the only protection today is the Queen's Consent (or the Prince's Consent where relevant), which is implemented as a standing rule of the House of Commons and subject to change at the will of the majority of MPs. In Canada, for example, an enumerated power of the executive (e.g. the power to appoint Senators where there are vacancies) cannot be removed without the consent of some or all of the provincial legislatures. Several non-enumerated powers have been held by the courts in Canada to require provincial consent too. In the UK, by contrast, the power to summon members of the House of Lords has been modified by Parliament (and in some cases the House of Commons) acting alone several times in the past century, even in the years since the effective federalism of devolution arrived as a constitutional reality.

(A clear example is the difference between the Fixed-term Parliaments Act 2011 in the UK, which withdrew a prerogative power altogether (preventing the executive from dissolving Parliament and calling an election at will), and the Canada Elections Act 2000, which, in establishing fixed election dates, had to accommodate the fact that dissolution cannot be removed without a formal constitutional amendment involving provincial consent, and which therefore preserved the power of dissolution at will by the Governor-in-Council. Likewise, it would take all ten provinces and the federal Parliament to allow Parliaments to operate longer than five years, whereas the UK Parliament could do this unilaterally.)

One can debate whether the lack of a formal standard entrenching mechanism has provided much-needed flexibility in the evolution of federalism in the UK (and indeed in participating in the evolution of a federation of which the UK itself has been a member), or whether it will turn out to be a source of bitter and extensive litigation because of the lack of clarity over the non-standard and informal entrenchings. (Remember that two of the constituent nations of the UK have their own legal personalities and thus can enter into contracts with the Queen of the United Kingdom through her agents and ministers, and have done so in many areas involving devolved powers and agencies; it is probable that all the rules of contract-making and treaty-making will apply in the Supreme Court of the United Kingdom, including the entering into enforceable verbal agreements (an area where English and Scots law diverge, possibly relevantly... yikes!). So it's possible that undoing the Scotland Act unilaterally would be forbidden by the courts unless preceded by general legislation adjusting the law of contracts. And it's hard to imagine how that would not be a recipe for either marketplace chaos or damage to the UK's attempts to form trade and other agreements with other countries in implementing Brexit.)

Finally, there have been references to the Supreme Court of Canada (and a couple of other Commonwealth courts have had similar cases) seeking to define what, outside the central document, is part of the constitution and thus subject to its entrenching formulae. There has not *YET* been a case before the Supreme Court of the United Kingdom or its predecessors that produced a similar list; however, it seems clear that the Scottish executive and possibly the executive of Northern Ireland are likely to cause that to happen in due course in light of "Brexit Means Brexit". The list is likely to strongly resemble the Canadian one, and you can bet that the parties will quote from the (persuasive) sources that determined the list. https://www.wikiwand.com/en/Li...

Comment Re:Small black holes, right? (Score 1) 220

Semiclassical gravity gives a perfectly consistent picture right to the limit of strong gravity -- that's where the radius of curvature is on the Planck scale. For stellar black holes, that's well inside the horizon. At the horizon of supermassive black holes, the curvature can be arbitrarily flat, and microscopic physics just inside the horizon is no different from well outside the horizon. (This is the basis for the "no drama" conjecture frequently discussed in the aftermath of the Polchinski et al. (AMPS firewalls) paper).

There is no reason at all to conclude that semiclassical gravity is incomplete in weak gravity. The problem with GR vs QFT making incompatible predictions arises not at the horizon, but much nearer to the singularity, and with the presence of the zero-radius singularity itself (you're right that most gravitational physicists hope that the singularity vanishes in a quantum theory of gravity).

You'd be right if you said we aren't certain about what happens at and very near the singularity. But very near, for a stellar black hole, is on the order of light-picoseconds to light-nanoseconds, and that distance doesn't climb nearly as quickly with its mass as the Schwarzschild radius does. For instance, you can cram a large large large number of high energy particles obeying the spin statistics for bosons into a compact area; near that seething mass you can also convert a large number of particles obeying the Pauli exclusion principle into ones that do not; the problem is in the details about the mechanisms for shortening of particle wavelengths that are already very short versus degeneracy pressures, which nobody has a clear answer for at this time. (GR has no such mechanism, so the particle wavelengths go to zero. Ideally a quantum theory of gravity has a mechanism that keeps those wavelengths from becoming arbitrarily short.)

Comment Re: Why? (Score 1) 220

Yes, but they'd have to be either only a little weaker or a lot lot closer.

LIGO is sensitive to the amplitude of gravitational waves. Wave amplitude falls off linearly with distance. It's wave energy-density that falls off with the square of distance, and LIGO can detect the range of amplitudes to which it's sensitive at very tiny energy-densities.
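As a toy numerical illustration of that scaling (the helper names here are invented for the example, not anything from LIGO's actual pipeline):

```python
# Toy illustration: strain amplitude falls off as 1/d, while
# energy-density falls off as 1/d^2. An amplitude-sensitive
# detector's reach in distance therefore scales directly with
# its strain sensitivity.

def strain_amplitude(h_source, d):
    """Strain amplitude at distance d (arbitrary units): h ~ 1/d."""
    return h_source / d

def energy_density(f_source, d):
    """Energy flux at distance d (arbitrary units): falls off as 1/d^2."""
    return f_source / d**2

# Move the same source 10x farther away:
near, far = 1.0, 10.0
print(strain_amplitude(1.0, far) / strain_amplitude(1.0, near))  # amplitude down 10x
print(energy_density(1.0, far) / energy_density(1.0, near))      # energy down 100x
```

So a 10x improvement in strain sensitivity buys a 10x deeper survey, even though the wave energy arriving from the farthest sources is 100x smaller.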

Comment Re:Why? (Score 1) 220

Ohhh, close!

In practically all theories with gravitons that take GR as the unremovable background, gravitons are the quantizations of the classical spin-2 gravitational wave.

Sure, there are lots of gravitational waves passing through you, but they're very very very low amplitude; also, the weak interaction of these waves is such that even the smallest wave must be made up of enormous numbers of gravitons. So the GWs being shed by the Earth-Moon orbital system result in many many many more gravitons passing through you than solar neutrinos. Good luck unambiguously measuring _just one_, which is pretty much what one will almost always mean by "detecting a graviton".

Additionally, gravitational waves aren't what's keeping you in your chair on the surface of the Earth -- that's mostly static curvature sourced by the Earth's mass-energy. There are theories which quantize that, but they're not gravitons (in some forms of quantized gravitoelectromagnetism they can be virtual gravitons, though, along the lines of virtual photons in normal electromagnetism.)

The Earth *does* shed very low amplitude gravitational waves though, because it is not exactly spherically or rotationally symmetric. GWs are never shed by bodies exactly symmetrical about their rotational axes, or spherically symmetrical. Consider that there are non-binary bodies in the sky which are very very close to axisymmetric: many of the suspected compact massive objects like regular millisecond pulsars and the X-ray and gamma-ray black hole candidates must be much closer to axisymmetric than the Earth or the Sun or the pretty damn spherical slowly-rotating shells inside ordinary stars. By the shell theorem, most ordinary stars have better rotational symmetry at their surfaces than the Earth does. Yet they each source a lot more curvature than the Earth does, even though they shed a lot fewer gravitational waves (after subtracting Earth-Moon and star-large_planet GWs).

However, ordinary stars, or neutron stars, or black holes that are in binary systems shed (a) more gravitational waves, and depending on the distance between the bodies (b) gravitational waves at much higher amplitudes. So binary systems throw off more gravitons. The two LIGO events from merging black hole binaries put out more (spin-2 massless gauge boson) gravitons than the solar system produces on its own over the course of billions of years. Yet these LIGO gravitons have little effect on you sitting in your chair, right?

Comment Re:ALIENS. (Score 1) 220

"c" (small letter) is the sole free parameter of the Poincaré group, which is the isometry group of Minkowski spacetime, which is the spacetime of Special Relativity. Locally, in our universe, spacetime far from black holes and the big bang is such an excellent approximation of Minkowski spacetime that Special Relativity passes all known tests (including those done by nature in pretty extreme conditions). The boundary of "locally" is determined by the radius of curvature, and for all practical purposes there is nowhere in the known universe where that is less than light-microsecond scale except very near black holes (near the singularity, that is, not necessarily near the horizon, where curvature radii can be arbitrarily large) and very near the big bang (and typically curvature radii are many orders of magnitude larger).

The parameter corresponds physically to the speed of a particle whose mass is always zero under Poincaré group transformations. Those particles are said to be "massless" or to have zero rest mass or to have zero invariant mass.

The Poincaré group cannot be the isometry group of a spacetime in which objects travel on spacelike geodesics. Although one can _deform_ Special Relativity to incorporate the behaviours of objects moving faster than "c", the known deformations come at a cost, typically of making some physical observers see manifestly unphysical things, and we have substantial evidence against quite a few of those observations at any but the tiniest energy-densities in the known universe. (Deformed SR has been a topic of study for about sixty years.)

Finally, in Special Relativity, light is assumed to be massless. (There is ample experimental evidence making the possible upper bound on the photon's mass extremely tiny, and all tests are compatible with light being actually massless.)

However, in General Relativity we don't talk about light being "massless" at all, or as having a particular velocity; light simply has the behaviour that anywhere in the manifold a particle of light's motion is constrained to the surface of a nonempty, open, convex cone of tangent vectors at that point, as determined by the values of all the fields at that point. The cone we use in practice is called the "light cone" or the "causal cone", because as far as we can tell experimentally, light ALWAYS travels on the surface of that cone and nowhere else inside or outside it, and all other matter travels within the cone and never outside it. All known matter behaves this way. Additionally, (classical) gravitational waves travel on the cone (there is good evidence for that), and a quantization of them into gravitons would too (so in SR terms they would be "massless").

That said, in GR you can use a solution to the Einstein Field Equations of General Relativity that admits a hyperbolization AND also a causal cone structure whose slope is wider than that of light. General Relativity admits what Geroch calls "a democracy of causal cones". (He will also tell you that there is zero evidence for this in our universe.) However, when you take that approach, you develop an initial values surface and evolve the solution to the EFE forward by doing the calculations. You fall into trouble if you try to set up an initial and a final values surface and try to match the two using your solution, as the results do not tend to match intuitions. Indeed, using just the standard light cone, Miguel Alcubierre found enough "bugs" in the wishful-thinking/surface-matching approach that he has been able to write a book about the topic in the setting of numerical relativity.

So, your intuition is that for something whose causal cone lies outside the standard one, Cherenkov radiation will result. There is a non-easy but robust way of checking that intuition: develop a solution to the EFEs that is a slight deformation of e.g. the Minkowski, de Sitter or Schwarzschild electrovacuum (with possibly more field content than just the electromagnetic one if you feel very ambitious) and solve the equations for your faster-than-light test particles.

Go for it, it has value as an exercise.

Comment Re:Cue the millenials... (Score 1) 391

Uh, wow, while your comments on Canaris are reasonable, your last paragraph is really really garbled.

Austria-Hungary declaring war on Serbia started WW I, and within days war with Russia followed. Sure, the German Empire of the day was also busy in the same week (and drew in Britain and France), but so were the Ottomans.

Note the absence of Italy and Japan (which ultimately allied themselves with Britain, France and Russia).

Compare WW II, which started with the Japanese invasion of China (or arguably, but less clearly, with the Italian invasion of Ethiopia), acting against British, French and Soviet interests, and trying to draw in Nazi Germany (in particular, Italy ceased objecting to the absorption of Austria and the other former bits of the fragmented Austria-Hungary in exchange for German diplomatic, financial and military support in the Abyssinian conflict, and Japan/Manchukuo started border clashes with the Soviet Union).

On top of that, Turkey stayed almost entirely out of World War II (opposing Germany), although it ultimately did join the Allies late in the war. Several of the bits of the fragmented Ottoman and Austro-Hungarian empires actively opposed Nazi Germany at various stages during the war.

And this is without having to deal with the question of whether Nazi Germany and the German Empire were fundamentally the same player.

So really hardly "the same players", unless you look only at northwestern Europe and ignore some of the details about the constitutional natures of several of the states there during each war.

Even if you argue (and you can, reasonably) that nationalism and nation-states versus multi-national empires was the key factor in both world wars, WW I was much more about the disintegration of empires run by one dominant ethnic group into several states run by locally dominant and formerly repressed ethnic groups, whereas WW II was much more about a handful of ethnic groups re-establishing dominance over many others in new multi-national empires. That is, thematically, the wars were not so much re-runs as rather mirror-images: old empire -> new republics vs new republics -> new empires.

Comment Re:STARTTLS broken, like UUCP maps (Score 1) 129

I'm past the UUCP days, I don't want to maintain a map of who can do things and who can't

Let me refresh your memory about some aspects of what you're "past", on the perfectly reasonable assumption that I did a lot more UUCP routing than you (and really, more than practically everyone).

UUCP maps were just an out of band and not-very-frequent node-adjacency flood, which let one construct a graph from which one could extract directed acyclic subtrees using Dijkstra's SPF. Just like OSPF. Just like ISIS. The difference was in some details. Like ISIS, communication could be completely out of band (some sites got the maps newsfroups ON TAPE). Unlike ISIS or OSPF, maps could be flooded entirely arbitrarily and signed with PGP, even by multiple parties, including but not limited to the newsfroup admin and the site admin. The leaf connection syntax "site (SCALAR METRIC)" was also highly useful.

Later developments included the ".dom.ain" wildcarding system, which allowed for abstraction across large numbers of sites (whole countries were in the later maps; for example, the Canadian maps aggressively sought to have sane gatewaying information for things under .ca, in NetNorth (part of BITNET) and in their ean X.400 research system), although this was hardly a unique practice.

Finally, the maps were for routing hop-by-hop UUXQT messages, which were in actuality RPCs. Mail transport happened to invoke rmail as the remote procedure. News transport happened to invoke rnews as the remote procedure. Both were typically single-hop, but were not restricted to be (multihop uux/uuxqts was sometimes actually even used, rather than pulling things up a layer). There were experimental uses of arbitrary RPCs; at least one site sent DECNET mailer data via uux/uuxqt, and I know of one that sent SLIP via uux/uuxqt as well (SLIP over uux over uucp 'x' protocol over X.25, in fact. Gross things happened in the late 80s!).

The major deficiencies in UUCP maps are shared in the Internet's routing system: an advertised adjacency is not authority to use the adjacency in arbitrary ways; and scalar metrics are not especially informative (even with the macro system that let one specify HOURLY instead of an integer value).

However, if the UUCP map update-and-distribute system were made more dynamic, it would be superior to what we have in the Internet today in just about every conceivable way. The only piece missing is a canonical representation of networks, and there is no need for that on day one. On day one you could write something like:


#... metadata with authority, validity, canonical location, etc.
AS65535 10.0.1.0/24(1), 10.0.2.0/24(2)
# ... metadata with authority, validity etc
AS65534 10.0.3.0/24(1)
#
AS65535 AS65534(10)
#Comment: no routing to ASes beyond AS65535 via AS65534
AS65534 <AS65535>(10)

This would be pretty trivial to automate, and it's easy to translate this sort of thing in and out of BGP4. Policy (preferred exit selection etc.) is inevitably easier to encode in maps (or a database that is a map in all but name, e.g. the PRDB) than in a protocol like eBGP, which has been known since, well, since forever.
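To show just how trivial the automation is, here's a Python sketch that parses the toy map above (the format and AS numbers are the hypothetical ones from the example, not any real map syntax; the exclusion line is omitted for brevity) and runs Dijkstra's SPF over the AS adjacencies:

```python
# Parse the toy map format above and compute shortest paths
# over the AS adjacency metrics with Dijkstra's SPF.
import heapq
import re

MAP_TEXT = """
# ... metadata with authority, validity, canonical location, etc.
AS65535 10.0.1.0/24(1), 10.0.2.0/24(2)
AS65534 10.0.3.0/24(1)
AS65535 AS65534(10)
"""

def parse_map(text):
    prefixes = {}   # AS -> {prefix: metric}
    adjacency = {}  # AS -> {neighbour AS: metric}
    for line in text.splitlines():
        line = line.split('#')[0].strip()   # strip comments and blanks
        if not line:
            continue
        node, rest = line.split(None, 1)
        for item, metric in re.findall(r'([\w./]+)\((\d+)\)', rest):
            target = prefixes if '/' in item else adjacency
            target.setdefault(node, {})[item] = int(metric)
    return prefixes, adjacency

def spf(adjacency, source):
    """Dijkstra's shortest-path-first over the AS adjacency graph."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float('inf')):
            continue
        for nbr, metric in adjacency.get(node, {}).items():
            nd = d + metric
            if nd < dist.get(nbr, float('inf')):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

prefixes, adjacency = parse_map(MAP_TEXT)
print(prefixes['AS65535'])        # {'10.0.1.0/24': 1, '10.0.2.0/24': 2}
print(spf(adjacency, 'AS65535'))  # {'AS65535': 0, 'AS65534': 10}
```

Translating the parsed structure into BGP4 announcements (or back) is just a projection of the same graph, which is the point: the hard part of such a system is freshness and authority, not representation.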

Comment Re:Canada fully Independent (Score 1) 295

The advantage with Canada's independence is that we got it by asking nicely and without anyone having to die

Lots and lots of Canadians died prior to the Imperial Conference of 1930 (and the subsequent Statute of Westminster, 1931), with hundreds dead in actions specifically to free Canada from control by the British Cabinet. It is precisely on the basis of those deaths that Mackenzie-King (cf. Mackenzie, several paragraphs below) was able to lead the Conference to the principle that all the Dominions should have both legislative and foreign policy independence and control of their own militaries.

After 1931, even the formal ties were effectively cut: the Judicial Committee of the Privy Council had a subcommittee of Canadian judges who had *exclusive* appellate jurisdiction, and the British government agreed that it would pass without amendment any Constitutional legislation Canada required, provided the federal and provincial governments were in agreement (an agreement conspicuously lacking until no earlier than 1982, and even then at least one province claims to have withheld critical agreement on the formalization of the amending formula and the entrenchment of specific Acts).

1931 also marked the final time when Canadian troops would be summoned by the Imperial government to fight in wars directed by the Cabinet in London.

Compare with the various post-Boer War mutinies by Canadian troops condoned by the Canadian government. An example: conscripted Canadian troops were held in awful conditions in Wales due to a British government vs Canadian government conflict over demobilization and repatriation policy (the British government was fairly plainly trying to keep the Canadians in service, in part because they were cheaper and less politically connected than English troops):

"In all, between November 1918 and June 1919, there were thirteen instances of disturbances involving Canadian troops ... The most serious of these occurred in Kinmel Park on 4th and 5th March 1919, when dissatisfaction over delays in sailing resulted in five men being killed and 23 being wounded. Seventy-eight men were arrested, of whom 25 were convicted of mutiny and given sentences varying from 90 days' detention to ten years' penal servitude." [Nicholson, Official History of the Canadian Army in WW I]

This sort of thing led to a lack of conscription in Canada during the first part of WW II, and later on a plebiscite/referendum on the question of conscription late in the war (in April 1942) led through a series of compromises to the result that few conscripts actually left Canada and fewer still ended up on the front lines of the war (less than three thousand) -- the Canadian conscripts were mostly deployed to free up volunteers (and British conscripts...) from non-combat posts. It is entirely possible that had the Canadian government caved in to British demands for troop numbers and introduced conscription early in the war, Canada would have exited WW II before Pearl Harbor. Indeed, it is mainly Pearl Harbor and the entry into the war of the United States that led to the passage of the referendum at all.

Earlier in Canadian history there were even small-scale uprisings -- one might even call them revolutionary or civil wars -- that led to deaths and reprisals. Among them were the rebellions of 1837-1838 (William Lyon Mackenzie, Mackenzie-King's grandfather, declared a Republic of Canada and led armed skirmishes in what's now southern Ontario; Papineau, Storrow Brown, Chenier, Oklowski and the Nelsons led armed uprisings in and around Quebec City and Montreal -- a couple hundred dead altogether, thousands wounded, and scores of executions and deportations to Australia). There were occasional low-level disturbances of the peace in Ontario, Quebec, New Brunswick and Nova Scotia more or less until the end of the U.S. Civil War, at which point the anti-Republican parties that controlled the governments that confederated to form Canada, together with the British government, agreed that self-rule, full representation-by-population and universal adult male suffrage (the latter two of which Britain itself still lacked in spite of the (Great) Reform Act of 1832) were the only way out of an eventual second North American colonial revolution, with Britain and the United Empire Loyalists, and more specifically the supporters of the British Tories, again on the losing side.

Comment Re:What is Solaris good for? (Score 1) 99

OpenSolaris is old and discontinued. OpenIndiana is a CDDL fork of OpenSolaris, rebased onto what's now called Illumos (http://illumos.org/), and is one of several Illumos "distros".

OpenIndiana was meant to be an answer to desktop Linux. It did not do especially well in terms of uptake, for much the same reasons desktop Linux itself has struggled. However, there are a variety of other distros which are more server-oriented, and they are fairly popular.

They include, for example, SmartOS (used by http://joyent.com/ for multitenant hosting and for their own software development), OmniOS (used mainly for single-tenant hosting, and for software development at http://omniti.com/), Nexenta (used for building large storage systems), and Delphix (a data storage service).

They all rely on the debuggability of Illumos (mdb, dtrace), virtualization (zones, now including Linux branded zones, crossbow, kvm), services (NFS and iSCSI in particular, also various others like SMB), OpenZFS, and a variety of other useful features, such as making enormous use of threading for parallelism and concurrency even under light use (and the threading systems scale well; OpenZFS alone typically uses a couple thousand threads, hundreds of thousands of mutexes, and many condvars, and all will go higher with load; other kernel subsystems can be similar).

It's fairly common for computer services departments in universities and laboratories and so forth to use e.g. an OmniOS server in front of a large storage pool, offering up iSCSI, NFS and other shares to clients, or alternatively SmartOS in front of a large storage pool, offering up lightweight VMs to clients.

Oracle's Solaris has diverged from Illumos (and vice-versa). The key features are similar, but Oracle has been targeting much higher-end applications -- much larger and busier storage pools, especially ones which are very heavily random-access (big Oracle databases are one such application). Like Illumos, it can run very well on hardware with huge numbers of cores (including hyperthread-like cores). Unlike Illumos, it's not developed in the open (and is not open source), but it is well-supported enough that expensive contracts get you fixes and sometimes features quickly. Illumos has been slower until fairly recently, for reasons including the lack of a fully self-hosted build (it relied on nonstandard build tools) and an idiosyncratic source code repository, both of which have been changed in the past few weeks.

Comment Re:The kilogram is based on a chunk of metal? (Score 1) 278

The metre and the second were closely related from the start; the relationship is through the hydrostatic equilibrium of an object of Earth's mass and angular momentum, both of which were fairly precisely understood in the late *17th* century, at least a hundred years before the SI metre was officially adopted.

There is a deep connection between the metre as the ten-millionth part of a quarter-great-circle of the Earth and the metre as the length of a pendulum arm with a half-period of one second, which is unfortunately distorted by gravitational anomalies arising from crustal mass concentrations, tidal effects, and nonuniformities of the Earth's rotation that make physical realization of a pendulum-based metre awkward (it can be done, but requires corrections based on time and location; it's a bit harder to do on a ship in less-than-calm conditions, however...).

The longitudinal survey definition won out because its errors at the time were smaller and the corrections were easier to generate and tabulate in almanacs.

The gravity-arm_length-time relationship is described here:

https://en.wikipedia.org/wiki/...
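A quick back-of-the-envelope check of that relationship, assuming standard gravity (the location- and time-dependent corrections described above are ignored here):

```python
# Length of a pendulum whose half-period is one second (full
# period T = 2 s), from the small-angle formula T = 2*pi*sqrt(L/g).
# With standard gravity this lands within about 0.7% of the
# surveyed metre.
import math

g = 9.80665          # standard gravity, m/s^2
T = 2.0              # full period in seconds (half-period = 1 s)
L = g * (T / (2 * math.pi))**2
print(round(L, 4))   # ~0.9936 m
```

That residual fraction of a percent is exactly the sort of discrepancy that local gravitational anomalies swamp, which is why the survey definition was easier to tabulate.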

In any event, the metre is now mostly deparochialized in that it doesn't directly depend on the gravitational field sourced by the Earth or its rotation or orbit. You can make a practical realization of a metre anywhere in the universe where you can measure the speed of a massless particle.

Most people who grow up with exposure only to SI have no problem in using decimal fractions of SI units, including metres. Indeed, people who can deal with fractions have no problem applying them to SI units casually. Half a kilometre. Three quarters of a litre. Just like people who grew up with U.S. customary units will say things like half a gallon or a quarter of a mile.

People who are exposed only in adulthood to a different system of measurements are possibly frightened that they will look ignorant or, worse, appear mentally unable to learn the new-to-them system.

Comment Re:This is huge (Score 1) 214

ER=EPR is designed to avoid superluminal representations of the Poincaré group (which is the symmetry group of Special Relativity, and which has "c" as its sole free parameter, corresponding to the speed of a massless particle; photons are expected to be massless).

Avoidance of non-locality even gets an explicit mention in section 3.1 of the Maldacena & Susskind paper http://arxiv.org/abs/1306.0533

So, no, ER=EPR does not satisfy non-locality.

(It's mostly designed to try to preserve AdS/CFT in the face of the AMPS paradox, which strongly suggests that not all of AdS/CFT gauge/gravity duality, semiclassical gravity as an EFT outside the horizon, unitarity, and the "no drama" conjecture (and thus the strong Einstein Equivalence Principle) can be simultaneously valid. However, the introduction of a truly huge number of wormholes to a model of the universe is not calculationally attractive, and does not really help with intuiting the internal state of physical black holes any more than AdS/CFT has done so far. Additionally, it requires a modification of QFTs such as the Standard Model for at least some infallers (cf. p 36 at Polchinski's http://www.slideshare.net/joep... ).)

Comment Re:So now we have a new paradox... (Score 1) 172

No, it still vanishes, however an imprint of the egg persists on the floor (but in the short run invisible even in principle to anything not actually in or under the floor) such that it interferes with the thermal radiation the floor produces on a cold cold cold day in the far future. Careful examination of that thermal radiation will show the mass-energy-momentum of the egg reached the floor at some point in the past, but is insufficient to reconstruct the egg.

(Additionally, there's the interesting point that the dropped egg is in free-fall until it hits the floor at which point it experiences a dramatic acceleration. The "no drama" conjecture holds that the dropped egg would pass through the event horizon without experiencing an acceleration, and it in turn is based on the (strong) Einstein Equivalence Principle. One of the reasons Hawking is even interested in this is the question about whether the EEP is preserved in a resolution of the AMPS (Polchinski et al) paradox, and his and his collaborators' solutions rely upon the BMS symmetry (and in particular supertranslations). Their argument suggests that the whole spacetime outside the black hole biases the Hawking radiation when the black hole evaporates, but this raises a number of so-far-unanswered questions (presumably this will form part of a future paper).

The biased Hawking radiation means that the entanglement energy of the swallowed half of entangled pairs ultimately escapes to infinity (they claim that this is background-independent, but that's something else which will have to be demonstrated in a forthcoming paper), and so there will be no firewall at the (inner) event horizon (of a Kerr black hole).

So there's no splattered egg. It may be (sort-of) splattered under the floor. (Both GR and semiclassical gravity predict this, but also that the splattering will be unseen above the floor, and also that the precise behaviour of the microscopic components of the egg depends on the behaviour of strongly curved spacetime and quantum fields, and theories describing those presently tend to make inconsistent or even incompatible predictions).)

Comment Re:Of course it never gets past the event horizon. (Score 1) 172

As you point out later in your post, an event horizon is mostly a concept relevant to external observers, not someone falling into a black hole.

To reiterate what AC says, no, an event horizon is *especially* relevant to an infaller, because it's the boundary formed by the set of points surrounding a region of spacetime at which all null geodesics lead inside that boundary and ONLY inside that boundary (or outside it in the case of a cosmological event horizon).
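For the simplest case, that boundary has a definite coordinate radius fixed by the mass alone. As a purely illustrative sketch (these numbers are mine, not from the thread; constants are rounded SI values), the Schwarzschild radius r_s = 2GM/c^2:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass_kg: float) -> float:
    """Coordinate radius of the event horizon of a non-rotating,
    uncharged (Schwarzschild) black hole of the given mass."""
    return 2.0 * G * mass_kg / c**2

print(schwarzschild_radius(M_sun))        # ~2.95e3 m for one solar mass
print(schwarzschild_radius(4e6 * M_sun))  # ~1.2e10 m for a Sgr A*-scale mass
```

Inside that radius, every future-directed null (and timelike) geodesic leads further inside, which is exactly the causal statement above.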

Hawking's argument about apparent black holes effectively says that [a] no such point exists if the black hole can evaporate and [b] that enough quantized properties of infallen matter can be recovered from the configuration of all the fields local to the black hole (including the gravitational field) that conserved quantities stay conserved. (That's especially relevant for entangled pairs, and much less relevant for things like baryon number, in the Wilson sense of relevant).

It's kinda interesting to think of the picture for a cosmological event horizon in a universe that transitions from expansion to contraction -- the galaxies that are being carried across our Earth-centric cosmological event horizon could, with a suitable evolution of dark energy (e.g., if the metric expansion can accelerate, why can't it decelerate and reverse?), wind up being carried back in. They would still look like ordinary galaxies when they popped back into view, just like they did when they faded out, because conditions just outside the observable universe are almost certainly very similar to those just inside. (That's also true for Schwarzschild black holes, for a very careful definition of the spacetime region "just inside" the horizon, assuming that the "no drama" conjecture is correct -- but that definition doesn't practically save an infaller that is self-gravitating (like a star freefalling through a supermassive black hole horizon) or bound together electromagnetically (like a person freefalling through a stellar black hole horizon).)

So our cosmological event horizon is probably real (since the metric expansion is unlikely to reverse) -- maybe black hole event horizons are too (possibly for the same reason: background dependence, e.g. \rho_{crit} in the standard model of cosmology).

Comment Re:Of course it never gets past the event horizon. (Score 1) 172

The black hole per se is not an emitter. Unruh radiation depends on the observer-dependent aspects of the horizon (any horizon; it's also true for the cosmological horizon) and is very dim and very similar to a very cold blackbody; while some observers will see it brighter and warmer than others, this is not true for any observer free-falling towards (and through) a horizon, because the Unruh radiation is also in free-fall. You can reverse the picture and ask why a free-falling infaller does not get ionized by the light of distant stars as she approaches the horizon, and arrive at the same answer: the starlight is also in free-fall.
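Just how dim and cold can be made concrete with the standard Unruh temperature formula T = ħa/(2πck_B). A quick numerical sketch (my numbers, for illustration only -- not from the thread):

```python
import math

hbar = 1.0546e-34  # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
k_B = 1.381e-23    # Boltzmann constant, J/K

def unruh_temperature(a: float) -> float:
    """Temperature of the near-thermal bath seen by an observer
    with constant proper acceleration a (m/s^2)."""
    return hbar * a / (2.0 * math.pi * c * k_B)

# Even at 1 g the bath is absurdly cold:
print(unruh_temperature(9.81))    # ~4e-20 K
# You'd need an acceleration around 2.5e20 m/s^2 just to see ~1 K:
print(unruh_temperature(2.5e20))  # ~1 K
```

A free-faller, of course, has zero proper acceleration and sees none of it, which is the point made above.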

The picture is a little different for non-freefalling infallers: one that sees a big dipole redshift for distant starlight (ignoring the big Einstein lens in front) will see a shift for the horizon radiation as well.

A "true" horizon is not observer-dependent -- if a boundary exists at which all available null (and by extension timelike) geodesics only lead inside that boundary, it's a true event horizon. Hawking has argued that such horizons do not exist in any background (that's contentious), and instead says that only apparent horizons exist. The geodesics that exist at an apparent horizon are awfully messy, and the topic for him and his collaborators for the next couple of days will be explaining them for all backgrounds. As of now, I think nobody but them can do anything but guess about their explanation.

(Recovering information from a black hole that only has a (possibly long lived) apparent horizon and no real event horizon involves supertranslations, which form part of the Bondi-Metzner-Sachs symmetry group, of which the Poincaré group (the symmetry group of Minkowski spacetime) is a subgroup; there's lots of headscratching about *how* one does this exactly, but certainly the BMS symmetry group provides lots of (even infinitely many) extra degrees of freedom into which one can move information about a particle in curved spacetime. (But doing so classically is both hard and maybe not complete, because you can set up pretty realistic toy models in which you cannot actually extract the metric sourced from quantum particles)).

Comment Re:Of course it never gets past the event horizon. (Score 2) 172

The experience of a classical infaller (or an observer of a classical infaller) is not really relevant in this story (but please see my final paragraph). Hawking is trying to deal with the AMPS (Polchinski et al) firewall paradox, wherein an entangled (quantum) pair has one pair partner fly off towards infinity while the other remains gravitationally bound to the compact dense object that has a horizon.

AMPS strongly suggests that at least one of the following must be false: semiclassical gravity as a valid EFT right to the horizon, gauge/gravity correspondence (in particular AdS/CFT as a useful tool in probing energies higher than the EFT limit), unitarity, and the "no drama" result from General Relativity (which is pretty solidly rooted in the EEP).

Hawking is attempting to preserve all of the above by arguing that the non-escaping pair member ultimately escapes to infinity. An expanding universe in which the non-gravitational field content dilutes away (and consequently cools) leads to a relatively warm horizon temperature for most observers at a distance from the dense compact object. All horizons are observer-dependent (a standard result from General Relativity); all horizons emit a very nearly thermal spectrum (an accepted result from semiclassical gravity, an area in which Hawking did a lot of work, leading to the term Hawking radiation); that spectrum lifts energy away from the dense compact object (an accepted conjecture -- that's black hole evaporation); and when that spectrum is warmer on average than the temperature of the local non-gravitational field content, that evaporation is relevant.

Even in an expanding universe there are local configurations of field content in which dense compact objects persist forever, by exchanging evaporation energy with each other, directly and indirectly; the evaporation energy heats up local diffuse field content, which is then ingested by the black hole, which decreases its horizon temperature (black hole horizon temperature being inversely proportional to mass). An eternal configuration of "dark grey holes" is a possible result, and thus Hawking's proposal is incomplete, since it only resolves the 4-way conflict in particular configurations of an expanding universe. That such configurations are physically reasonable (or even more probable) does not really matter.
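The inverse proportionality mentioned above is the standard semiclassical result T = ħc³/(8πGMk_B). A hedged numerical sketch (illustrative constants and masses of my choosing, not figures from the thread) shows just how cold large holes run:

```python
import math

hbar = 1.0546e-34  # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23    # Boltzmann constant, J/K
M_sun = 1.989e30   # solar mass, kg

def hawking_temperature(mass_kg: float) -> float:
    """Semiclassical horizon temperature of a Schwarzschild black
    hole; note T is inversely proportional to the mass."""
    return hbar * c**3 / (8.0 * math.pi * G * mass_kg * k_B)

print(hawking_temperature(M_sun))          # ~6e-8 K for a stellar-mass hole
print(hawking_temperature(6.5e9 * M_sun))  # ~1e-17 K for an M87*-scale hole
```

So feeding a black hole even diffuse warm gas lowers its temperature further, which is the mechanism behind the "dark grey holes" configuration above.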

Your post correctly captures several aspects of the problem. You expect no drama as you reach an event horizon (specifically the point at which all available timelike geodesics lead inside the horizon), because horizons depend on the details of the configuration of events (including those of the infaller and things that can interact (say, electromagnetically or gravitationally) with the infaller). That is, while the definition of an event horizon is sharp, its coordinate location is observer dependent. As you say, a (classical) infaller crossing the real horizon may not even notice it. However, what about an entangled pair-partner?

Breaking an entanglement transfers mass-energy-momentum (in flat spacetime one would say it releases energy) and in a local theory, that must be sourced by one or both pair partners. If we have lots of such breaking pairs, we have a large amount of energy just inside the horizon -- a firewall.

Hawking tries to step around that by saying that there is no place in the universe where all timelike geodesics point inside a small region of spacetime. That is, all black holes ultimately fully evaporate. And, even if half of a pair is local to a compact dense object for a lonnnnnnng time (many trillions of years), there is no breakage of entanglement, and so no release of entanglement energy. Thus there is no conflict with "no drama", there is no breakdown of semiclassical gravity in the low energy limit (because you don't get probably-unphysical superpositions of the metric sourced by each half of the pairs), gauge/gravity remains useful (because you can still focus on the black hole surface area), and quantum fields evolve unitarily (because nothing stays local to the dense compact object forever).

But if even one black hole anywhere in the universe refuses to evaporate (for instance, because it is so large that it is always colder than the *cosmological* horizon, which also produces a very nearly thermal spectrum), Hawking's argument falls apart. The likely accelerating expansion of the universe already imperils his solution for really super super massive black holes, which are not forbidden.
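The thresholds involved can be sketched numerically (my illustrative numbers, not the commenter's): inverting the Hawking temperature formula gives the mass above which a Schwarzschild hole is colder than its surroundings, and the de Sitter horizon temperature for roughly today's Hubble rate gives the far colder floor the cosmological horizon would eventually set:

```python
import math

hbar = 1.0546e-34  # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23    # Boltzmann constant, J/K

def crossover_mass(T: float) -> float:
    """Mass at which a Schwarzschild hole's Hawking temperature equals
    the ambient temperature T; invert T = hbar c^3 / (8 pi G M k_B)."""
    return hbar * c**3 / (8.0 * math.pi * G * k_B * T)

T_cmb = 2.725  # K, today's CMB temperature
M = crossover_mass(T_cmb)
print(M)            # ~4.5e22 kg -- less than a lunar mass
print(M / 5.97e24)  # under 1% of Earth's mass

# De Sitter horizon temperature at roughly today's Hubble rate
# (H0 ~ 2.2e-18 s^-1), T = hbar H / (2 pi k_B):
H0 = 2.2e-18
print(hbar * H0 / (2.0 * math.pi * k_B))  # ~2.7e-30 K
```

On these rough numbers, every astrophysical black hole today absorbs more than it radiates, and a hole heavier than the crossover mass against the ~1e-30 K cosmological floor would be the non-evaporating counterexample the paragraph above worries about.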

Your analysis is pretty good (for classical infallers); you might want to think about your last long paragraph in terms of an entangled infaller (whose entanglement partner is far away from the black hole), or a classical object made up of entangled particles (again with the entanglement partners far away from the black hole). Additionally, think of the case where the temperature of the CMB alone is always equal to or hotter than the horizon temperature of the (really massive) black hole, both for classical infallers like the one your post thinks about, and for entangled infallers. (You can also consider entangled infallers that "somehow" (there are various mechanisms) appear right at the horizon, with one half going inside the horizon (for some period of time; think about short and really really long periods) and the other half going to infinity right away -- it may help to think of neutrino/antineutrino pairs as they are not likely to interact much with any accretion disk material).
