Comment Re:Go back .... (Score 1) 326

Bit harsh on the Welsh, that. They also have a lot of pride in their national identity.

(Actually in all seriousness I slightly threw my American PhD supervisor when he first came to Britain and I told him about the differences between English, British and the UK. It's not that he didn't know the differences, but he didn't know the depth of feeling caused by calling a Scot or a Welshman "English".)

Comment Re:Go back .... (Score 1) 326

Yeah but people who belong to the United Kingdom of Great Britain and Northern Ireland are not normally called Irelandish. That might be taken as provocation.

(Even calling them "British" is inaccurate, given that the Northern Irish do not, in fact, live in Great Britain. If they did it would be the United Kingdom of Great Britain.)

Comment Two comments (Score 4, Insightful) 2219

I think I've got reasonable karma on here and the very few who recognise my login probably think I don't post total drivel *all* of the time, so I'd like to put in my two bob's worth. I don't like the beta as it is at the minute. The front page looks fine to me: lots of white space, but I can live with white space and it's no different from other websites, although I could very much do without the constant targeted videos from advertisers. It's the comments pages that are distinctly compromised compared to the present setup: it's far harder to close an entire thread; it's far harder to close some sections, leave others open, and see quickly which comments have been added since the last refresh; far less content is onscreen at one time; and the comments pane is far too narrow, which compounds the previous issues. I'm sure that with more reflection I could think of other issues with the comments, but those are probably my greatest complaints.

Over the last few days the comments pages have been increasingly dominated by childish anti-beta messages. I understand these are probably born out of frustration and irritation (even anger in some quarters), but they've made the website far less usable than if the beta had been rolled out without argument. This is the flipside of it: no redesign is worth fucking up a website over, and it certainly doesn't justify the sheer amount of petulant whining the boards have been filled with.

And that said, over the last couple of days, when I've had mod points I've tended to use them to at least reverse the modding down of people protesting the new beta, since there seem to be no other avenues for people who genuinely care about how the comments sections of slashdot are presented. I have no issue with a redesign, but diminishing the usability of a service is a pretty hamfisted way of increasing its profitability.

Comment Re:It was on the rise... (Score 1) 125

"Or do you mean something as strong as being unable to predict the evolution of some arbitrary lower-dimensional hypersurface matching a real observation?"

This may be the case. If nothing else, in cosmology we know we are not in a globally hyperbolic spacetime: the universe is riddled with geodesic crossings. That in itself makes a 3+1 split, and any evolution based on it, problematic. Cosmology is inherently built on a 3+1 split of some form, whether that's with respect to an observer comoving with the CMB, or with respect to some timelike vector later associated with conformal time, or whatever. We also have, in principle, the issue that we don't have a good Cauchy surface to put initial data on. In reality these issues may or may not be all that significant. Certainly on the scales we're typically considering I'm not very concerned. I've spent most of my career in perturbation theory, and I set my initial data on a constant-redshift surface at about z=100,000 and integrate up to roughly megaparsec scales; I don't really expect that we're going to hit horrible issues on such scales.

Except in principle, which might or might not be a point that interests you.
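
(For anyone who wants that "3+1 split" in symbols: the standard move is an ADM-style decomposition of the metric into spatial slices labelled by a time coordinate,

ds^2 = -N^2\,dt^2 + \gamma_{ij}\,(dx^i + N^i dt)(dx^j + N^j dt),

with N the lapse, N^i the shift and \gamma_{ij} the metric on the slices. "Evolving" the universe then means putting data for \gamma_{ij} and its time derivative on an initial slice and integrating forward in t -- which is exactly the step that the lack of global hyperbolicity, or of a good Cauchy surface, makes delicate. That's just the textbook form of the split, for orientation, not anything specific to my own setup.)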

"Or alternatively, that predictions work but nevertheless we are liable to discover that the manifold is not smooth or there exists some field or fields that take real values at each point (or both)?"

We already know the manifold is not smooth; that part is without issue. The existence of a single black hole breaks any assumptions of smoothness. More importantly, the presence of any structure at all breaks it. What we don't know is whether the manifold can be assumed to be smooth for cosmological purposes. This is at the heart of the issues I'm talking about -- we derive cosmology assuming first that we can foliate spacetime with maximally-symmetric 3-surfaces (which it seems may very well be valid, although this is yet to be proven given we lack an averaging procedure that can be applied to tensors in a covariant manner), and then more damningly by assuming that the dynamics we recover by perturbing the resulting Einstein equations are equivalent to those that describe a perturbed matter distribution. This is categorically not the case. The Einstein tensor is non-linear, so it can't be the case.

Of course, the correction may be minuscule. Using spatial averaging and a mixture of different toy models has certainly suggested that the error is small, on the order of 10^{-5}. (This has its drawbacks: it's an unphysical procedure. Spatial averaging involves averaging over spacelike surfaces, which are obviously separated from an event by spacelike geodesics and therefore unobservable. If one takes the further step -- as some have -- and feeds the results of this averaging back into the evolution, then one is breaking causality horrifically.)

To be clear, I'm talking entirely about matters of principle here. In principle, Robertson-Walker cosmology is ill-defined unless and until we have clearly specified a consistent, coherent way to map from local solutions -- of the order of kiloparsecs if not parsecs -- up to cosmological scales. What emerges from such a procedure may or may not even behave like general relativity. We simply don't know. It seems unlikely that the result of a sane averaging *will* behave just like GR, and the best approaches we have to it -- such as Zalaletdinov's averaging -- certainly introduce extra terms that crop up in the "Friedmann equations". (These typically appear as a "curvature" of spacetime.) There is no physical basis to them and they cannot be interpreted as though they were in a genuine Friedmann equation, for the simple reason that they're not in a Friedmann equation, because the universe is not Robertson-Walker.
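
To make those "extra terms" a bit more concrete: the ordinary Friedmann equation that falls out of a Robertson-Walker metric is

H^2 = \frac{8\pi G}{3}\rho - \frac{k}{a^2} + \frac{\Lambda}{3},

whereas even the simplest spatial-averaging scheme (Buchert's, which is cruder than Zalaletdinov's but easier to write down) gives, for a comoving domain D, something schematically like

3\left(\frac{\dot{a}_D}{a_D}\right)^2 = 8\pi G\,\langle\rho\rangle_D - \frac{1}{2}\langle\mathcal{R}\rangle_D - \frac{1}{2}\mathcal{Q}_D,

where \langle\mathcal{R}\rangle_D is the averaged spatial curvature and \mathcal{Q}_D is built from the variance of the expansion and the shear. That \mathcal{Q}_D is exactly the sort of term that gets read off as a "curvature" if you insist on treating the result as a genuine Friedmann equation.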

In practice, linear perturbation theory around a flat Robertson-Walker spacetime has been so successful that I fully expect it to continue being so. While -- again, in principle -- the parameters of the model are entirely phenomenological and it is dangerous to rush to ascribe physical meaning to them, I ultimately expect that the errors we're introducing doing so are relatively minor. For instance, it is not consistent to state that the dark matter energy density appearing in the Friedmann equation is at all related to that deduced on cluster and, even more so, on galactic scales. The latter in particular is observed from dynamics in a Newtonian framework (which is itself a phenomenological description), while the former is simply a parameter describing something that appears in the equations as a pressureless fluid interacting only by gravity. On the other hand, the densities deduced match closely enough that there's probably some deeper relationship between them. The same applies to baryon density, and so forth.

Comment Re:It was on the rise... (Score 1) 125

Definitely the bottom-up approach. It's almost intractable, but it's still easier than attempting to decompose a metric into constituents. A major issue is that general relativity is not linear, so if you've got two Schwarzschild metrics (which describe a spherical mass alone in the universe, edging towards flat spacetime infinitely far from the mass) you can't simply add them together and get a new metric describing two spherical masses. In the weak-field regime this will work OK, but it's still only an approximation. Away from the weak field it simply doesn't work. While this does make it harder to connect up smaller-scale metrics to get to a larger one, it makes it basically impossible to go the other way.
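
To spell the superposition point out (in units with G = c = 1): a single Schwarzschild metric is

ds^2 = -\left(1 - \frac{2M}{r}\right)dt^2 + \left(1 - \frac{2M}{r}\right)^{-1}dr^2 + r^2\,d\Omega^2,

and in the weak field you can get away with writing g_{\mu\nu} \approx \eta_{\mu\nu} + h^{(1)}_{\mu\nu} + h^{(2)}_{\mu\nu} for two well-separated masses, because to first order in small h the Einstein tensor is linear. Once h isn't small, the quadratic and higher terms in G_{\mu\nu} matter, and the sum of two solutions is no longer a solution.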

So what we can have are models such as Swiss-cheese models, where the universe is built from LTB (or, increasingly, Szekeres) metrics, each stitched onto a region of Robertson-Walker. LTB patches model spherical local voids and spherical local clusters, depending on whether the patch is underdense or overdense, while if you use Szekeres you're modelling quasi-spherical voids and clusters, at the cost of computational complexity. Or you can use the extra complexity of a Szekeres, or related metrics, to build more realistic models of local structure than the LTB allows.
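
For reference, the LTB metric these patches are built from is the spherically symmetric dust solution, which in the usual synchronous, comoving coordinates (and G = c = 1) reads

ds^2 = -dt^2 + \frac{[R'(r,t)]^2}{1 + 2E(r)}\,dr^2 + R^2(r,t)\,d\Omega^2,

with R(r,t) obeying a Friedmann-like equation \dot{R}^2 = 2M(r)/R + 2E(r) (plus \Lambda R^2/3 if you want a cosmological constant in there). Choosing the free functions E(r) and M(r) underdense or overdense relative to the background is what turns a patch into a void or a cluster; Szekeres drops the spherical symmetry at the cost of more free functions.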

An alternative approach (as in papers by Clifton, Rosquist and Tavakol, http://uk.arxiv.org/abs/1203.6... and http://uk.arxiv.org/abs/1309.2..., or recently by Korzyski, http://arxiv.org/abs/1312.0494) is to literally take a universe composed of an arbitrary number of black holes. Surprisingly, you can solve this system exactly. It's not a viable model of the actual evolution, but it is a very good testbed for whether the type of effect I'm talking about is actually significant. The answers appear to be "A universe composed of a large number of black holes looks like Robertson-Walker" but "It is not clear that the dynamics are Robertson-Walker".

"Should choice of metric be kept maximally free at each step?"

In principle, yes, in practice, no. Without constraints all you'd end up with are the Einstein equations split into different scales, with no simplifications. That's not a realistic approach, much as it would be lovely.

Computationally we're definitely going to head towards numerical cosmology as a subset of numerical relativity. One of the major issues is dynamical range: theoretically, and using current perturbative techniques, we can cover a range of momenta from k -> 0 (edging towards infinite wavelength) all the way to around k ~ 1 Mpc^{-1}, which is cluster scale. That can then be stitched onto N-body codes, which can now map down to kiloparsec scales with a significant overlap with perturbation theory. Attempting to solve the whole problem directly, you have to somehow cover at least that same five or six orders of magnitude, throwing the full weight of GR at each step, which is extremely costly. But it will definitely come, and sooner rather than later.
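
(For the conversion: a comoving wavenumber k corresponds to a comoving length \lambda = 2\pi/k, so k \sim 1\ \mathrm{Mpc}^{-1} is a scale of roughly six megaparsecs -- cluster territory -- while the N-body end of the stitching sits at kiloparsecs, i.e. 10^{-3} Mpc.)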

Comment Re:It was on the rise... (Score 1) 125

"Floundering around continuing to wonder what mechanisms generate the metric is pretty much the job description of a physical cosmologist, isn't it ?"

Certainly one part of the job description, yes :) Most of them would disagree with almost everything I say about how to approach it, of course, which is what makes the whole thing great. In about ten years, when all the data is in, we'll be drowning in it, and I more or less expect more funding in cosmology to be channelled into genuine theorists, along with the statisticians and data miners we'll be needing. And this is something I'm very much looking forward to.

Comment Re:It was on the rise... (Score 1) 125

"Is only *significant* on large scales, and in particular in the weak-field limit, no?"

Actually no, in the hard interpretation of what I'm saying -- that the acceleration predicted in cosmology is a result of assuming a negative pressure in a Robertson-Walker metric... but the universe is composed of a conglomeration of untold billions of different metrics -- there is no weak-field limit. In the softer interpretation where structure forms and "disconnects" in some way or another from the universal expansion -- basically, is modelled with something like an LTB patched onto a Robertson-Walker -- the acceleration still has little or no impact, although this depends on the exact model for acceleration, of course.

Something that was in vogue for a good few years was the idea that local structure could account for the observed dark energy, without the need for any negative pressure at all, by whacking Earth somewhere near the middle of a void around a gigaparsec across. This doesn't seem particularly plausible anymore, although that's partly because the models studied so far have been staggeringly over-simplified -- not because people are resistant (though some are) but because the problem rapidly grows impossibly difficult. Literally impossibly.

"do you think that a parsimonious everywhere-the-same negative pressure ... [is] dead for DE already"

I'm certainly not going to state that this is dead. The simplest model is a cosmological constant and there's no reason to state that there isn't a cosmological constant, and some good reasons to say that something that acts as a constant exists. (If nothing else, the low-energy limit of many generalised theories of gravity manifests in the action as basically the first few terms in a Taylor series for the Ricci scalar -- so where general relativity has R, generalised theories can be written as C + R + (1/2)*c*R^2 + ... where C and c are constants, and those three dots hide a world of unhappiness.) Interestingly, C here would act as a cosmological constant, and the R^2 term acts exactly as inflation and, in fact, is both the earliest studied inflationary model (Starobinsky in the late 1970s, a good few years before Guth, albeit with very different motivation) and slap in the middle of the allowed parameter space. No other simple model is in as good agreement (though it must be pointed out that other simple models are within one sigma, so this isn't, strictly speaking, necessarily at all important).
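
Written slightly more formally, the expansion above is just an f(R)-type action (a sketch in standard notation, nothing exotic):

S = \frac{1}{16\pi G}\int d^4x\,\sqrt{-g}\,f(R), \qquad f(R) = -2\Lambda + R + \frac{R^2}{6M^2} + \dots

The constant term plays the role of the C above (a cosmological constant), the linear term is ordinary general relativity, and the R^2 term -- with M some new mass scale -- is precisely Starobinsky's inflationary model.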

The point then is that the low-energy limit of a generic theory of gravity has a cosmological constant, and therefore acts to accelerate the expansion of the universe. More significantly, this is a *screened* constant, meaning that there may well be a fundamental constant in nature, which we can call L, and a constant coming out of an approximate description of a better model of gravity, which we can call, I don't know, A. The observed constant in the action is C = A - L. (That negative sign arises for entirely tedious and unimportant reasons.) So even the naturalness problem, which asks why the observed constant is so small, is, if not removed, at least bounced elsewhere.

What's more, the low-energy limit of a generic theory of gravity also gives us a theory of inflation that fits the data perfectly. *And* it gives us normal gravity. And it does all of this without any scalar fields.

Brilliant, right? Well, half-convincing at least, but there are the usual caveats and problems, not least that this is very definitely a phenomenological description - we don't have the theory this is meant to emerge from. Bummer.

Anyway, this digression was to point out that there's no way I can say that an everywhere-the-same negative pressure doesn't exist... because I think it *does* exist, in the form of a screened cosmological constant. There may also be a negative pressure that is almost the same everywhere, coming from some scalar field -- probably an effective field rather than a genuinely physical field.

The presence of these negative pressures then changes the smaller-scale solutions I was talking about -- so you have a LambdaLTB or a phiLTB (Lambda with a cosmological constant, phi with a scalar field), which behave slightly differently to the usual dust-only LTB. Likewise with Schwarzschild -- Lambda Schwarzschild is normally known as Schwarzschild-de Sitter. And so forth. But even so, the influence of a negative pressure on these metrics is very different to its influence on a Robertson-Walker, and the observed acceleration is attributed to a feature of Robertson-Walker, which is only valid on the very largest scales.
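
To make the Schwarzschild-de Sitter example concrete (G = c = 1 again):

ds^2 = -f(r)\,dt^2 + f(r)^{-1}dr^2 + r^2\,d\Omega^2, \qquad f(r) = 1 - \frac{2M}{r} - \frac{\Lambda r^2}{3}.

For a realistically tiny \Lambda the extra term is utterly negligible near the mass and only becomes comparable to the 2M/r term at enormous radii, which is the sense in which a negative pressure affects these local metrics very differently from the way it enters the Friedmann equations of a Robertson-Walker model.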

Basically what I'm saying is that the universe is made up of untold billions upon billions of metrics that, near to the objects that made them, are close to Schwarzschild (or Kerr, or Kerr-Newman), and when you look at these billions upon billions of metrics overlaid on one another they "average out" (in some vague, unspecified and actually appallingly ill-defined manner -- doing this is an absolutely unsolved problem) to a Robertson-Walker. The fundamental issue in cosmology I'm talking about isn't really whether or not there is negative pressure, since there almost certainly is, but that we're modelling its effects with a Robertson-Walker metric on the unproven (and, fundamentally, unfounded) assumption that not only do the metrics of the matter in the universe "average out" to Robertson-Walker but also that the *dynamics* governed by those metrics are the same as the dynamics of Robertson-Walker. Neither of these statements can possibly be exactly true, although the former seems likely to be at least a very good approximation, and the extent to which the latter is wrong is very much open to debate: while I'm certainly right that this is a fundamental issue and that cosmology is in principle totally wrong, it may very well prove that the error introduced is minimal. And speaking emotionally, I think unfortunately that that's going to be the case, and we're going to be left floundering around continuing to wonder what the hell is producing such a large negative pressure.

Comment Re:Neutrinos? (Score 1) 125

1987A was also in one of the Magellanic Clouds, which are climbing all over us. It seems unfortunately unlikely we'll get many neutrinos from this, although the improvement in technology since the 80s might mean we get roughly the same number as from 1987A. (Less than 20 detections -- 17 or so, if memory serves me, which these days it rarely does.)

Comment Re:LOL ... (Score 1) 125

Most of my career has been taken up in cosmology. To me, a megaparsec is the smallest smidgeon that I'll even consider looking at. (Well, until more recently, when I've deigned to look at scales as small as a few thousand parsecs.) For context, a megaparsec is a bit over three million light years -- roughly the scale of a cluster of galaxies.
