Actually he did average them. The paper is at http://arxiv.org/abs/1409.5830... and he discusses this issue in section 3.2.
"GNU Fortran is widely used even in high-performance computing because the commercial compilers aren't really better at all (and generally more buggy)."
While I've certainly encountered (and notified Intel of) bugs in ifort, and more than I'd expect in something that cost the university a pretty penny, I wouldn't even begin to pretend that gfortran is as good for high-performance computing as ifort. Unless you're triggering an ifort bug - and I haven't hit a genuinely serious one since a weird memory cap back in 2005 - or using ifort on some esoteric architecture, you are *never* getting better performance out of gfortran-optimised code than out of ifort-optimised code. They may be equivalent for a lot of operations, but for others ifort is simply a lot better, particularly when tuned to a particular Intel chipset. I use gfortran a lot and I don't have any serious complaints about its optimisation, but ifort's is better.
Actually I still use gcc for gfortran - I can't personally afford to use the commercial license, and I have code I want to maintain and develop in Fortran. But on Windows I develop C++ in either MSVC, due chiefly to the fact I like its debugger better than any other I've worked with, or in Clang, and on Linux I develop C++ in Clang. I'm glad gcc is there but my default these days is certainly Clang.
Thankfully for our aching heads, unless our understanding of the laws of physics is very wrong, probably not. The structure of the universe being fractal in nature doesn't imply that *everything* in reality is fractal -- it implies that gravity will tend to construct fractal structures when dealing with a large enough number of objects. Down at our level, there's too much competition from other forces (primarily electromagnetic, though in some compact objects such as neutron stars the forces get more exotic) for things to be purely governed by gravity, and so a different framework takes shape. A fundamental interconnectedness of everything is a nice idea, but it would make my head hurt, and isn't justified by our current understanding of physics. (Which, of course, may change - and which also in itself doesn't rule out there being at least some level of self-similarity between systems dominated by electromagnetic forces and systems dominated by gravitational forces, given the similarity in their behaviour.)
You're not the only one to start thinking along these lines. You might be interested in this somewhat random and unrepresentative set of papers:
I know very little about this area myself but it seems relatively settled that the fractal dimension of the universe - if such can be defined and has a meaningful interpretation - is between 2.5 and 3.
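If anyone fancies playing with the idea, the box-counting estimate that sort of measurement relies on is easy to sketch. This is an illustrative toy of my own, not anything from the papers above - the function name and the test point sets are mine - but it shows how a fractal dimension is actually measured: count the boxes a point set occupies at a range of scales, then fit the slope of log N against log(1/eps).

```python
import math

def box_counting_dimension(points, scales):
    """Estimate the box-counting dimension of a point cloud: the slope of
    log N(eps) against log(1/eps), where N(eps) is the number of boxes of
    side eps that contain at least one point."""
    dims = len(points[0])
    lo = [min(p[d] for p in points) for d in range(dims)]
    span = max(max(p[d] for p in points) - lo[d] for d in range(dims)) or 1.0
    # normalise into [0, 1) so box indices stay in a fixed range
    norm = [tuple(min((p[d] - lo[d]) / span, 1 - 1e-12) for d in range(dims))
            for p in points]
    xs, ys = [], []
    for eps in scales:
        occupied = {tuple(int(c / eps) for c in p) for p in norm}
        xs.append(math.log(1.0 / eps))
        ys.append(math.log(len(occupied)))
    # least-squares slope of log-count against log-inverse-scale
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Feed it a uniform sheet of points and the slope comes out near 2; a filament gives you near 1; a galaxy catalogue, on the scales where clustering matters, lands somewhere in between.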
It would certainly change the stability analysis, yes, but the universe does not have a significant angular momentum. That would leave a characteristic signature on the CMB, a preferred direction, and it's been hunted for, with each new and improved dataset, and we still haven't found it. The hunt has turned up other interesting anomalies such as the appallingly-named "axis of evil", but those signatures are also (probably) not due to an intrinsic net angular momentum. That doesn't say there isn't one, just that it's not particularly significant and is, as yet, unobservable. (In case you're interested, a universe with a net angular momentum could be described by one of the Bianchi models rather than a (Friedmann-Lemaître-)Robertson-Walker model. A simpler case that might interest people is the Goedel universe, notorious because it contains closed timelike loops, which allow for the possibility of time travel. The Goedel universe wouldn't form a realistic model of the universe but it's an interesting spacetime.)
Well, it violates the strong energy condition, although it does satisfy (just) the weak energy condition. The strong energy condition says that density + three times pressure is greater than zero, and that's pretty easy to violate if you've got a scalar field. The weak energy condition requires that density + pressure is greater than zero, which is far harder to violate; dark energy is right on the border of it, and a cosmological constant *is* the border (pressure = - density).
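To put numbers on it: for a perfect fluid with equation of state p = w rho (and rho >= 0), the two conditions pin down w as follows (standard textbook statements, sketched here for reference):

```latex
% Weak (null) condition vs strong condition for p = w\rho, \rho \ge 0:
\rho + p \ge 0 \;\Longleftrightarrow\; w \ge -1, \qquad
\rho + 3p \ge 0 \;\Longleftrightarrow\; w \ge -\tfrac{1}{3}
% A cosmological constant has w = -1: it sits exactly on the first
% border and comfortably violates the second.
```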
Of course, you're totally right - none of this means that they violate conservation of energy. That's built in to the theory...
"What they've proved mathematically as that at the event horizon of a black hole the math fails. It falls apart and no longer makes any sense because the numbers get too large on one side of the equation."
Not so. The maths dies at the singularity at the centre of the hole, but not at the event horizon except in a badly-chosen coordinate system. Alas, the usual coordinate system we'd present the Schwarzschild solution in is indeed badly-chosen and has an apparent singularity at the horizon, but this is not an actual singularity, as can be seen quickly by calculating a scalar curvature invariant - the Kretschmann scalar is the immediate choice, the full contraction of the Riemann tensor with itself (the more familiar Ricci scalar is no use here, since it vanishes identically in a vacuum solution like Schwarzschild) - and seeing that it's entirely well-behaved except at the centre of the hole. So we look for a coordinate system well-behaved at the horizon and quickly come across Painleve-Gullstrand coordinates, in which spacetime is locally flat and perfectly behaved at the horizon. The implication is the poor sod wouldn't be able to tell that he'd got to the horizon, except through tidal forces (which depend on the size of the hole), and then he'd struggle to navigate before slamming into a singularity.
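Concretely, the invariant in question for Schwarzschild is (in units with G = c = 1, M the mass of the hole):

```latex
K \;=\; R_{abcd}R^{abcd} \;=\; \frac{48\,M^{2}}{r^{6}}
% Perfectly finite at the horizon r = 2M; it only blows up as r \to 0,
% which is why the horizon "singularity" is a coordinate artefact.
```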
Even more confusingly, for a *realistic* hole, the insides are rather different. A Schwarzschild hole has a singularity inevitably in the future - the singularity is "spacelike", and all future-directed paths one can travel on, or light can travel on, end at it. That's a bit of a bummer if you happen to be in a Schwarzschild hole. But a Schwarzschild hole is not physical; it is a non-rotating, uncharged hole, and that's not a realistic setup. In a charged (Reissner-Nordström) or a rotating (Kerr) or, come to that, a charged rotating (Kerr-Newman) hole the singularity is instead "timelike" -- there exist paths on which we could, in principle, travel, that avoid the singularity. In the case of a Kerr(-Newman) hole it's even smeared out into a ring, the edge of a disc. In reality, good luck navigating in there, but the singularity is not inevitably in the future in there.
A bit closer to the point, you're right that speed doesn't really have anything to do with it. Instead it's the type of path you can travel on, and where *they* go. An event horizon can be defined as the surface on which outgoing "null" geodesics, the paths light travels on, remain at a constant distance from the hole. If you travel, as massive particles do, on a "timelike" geodesic then you're fucked; you're never going to be able to accelerate enough that you even travel on a null geodesic, let alone a "spacelike" geodesic along which you can basically access anywhere. On a spacelike geodesic you could get out of a hole no problem. You could also travel in time, and you could break causality fifteen times before breakfast. I'd like to travel on a spacelike geodesic - it would be fun. Though managing to get back to a timelike geodesic might be significantly less so.
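If the timelike/null/spacelike jargon is unfamiliar, it just classifies intervals by the sign of ds^2. In flat spacetime with signature (-,+,+,+) and c = 1:

```latex
ds^{2} = -dt^{2} + dx^{2} + dy^{2} + dz^{2}
% ds^2 < 0 : timelike   (massive particles)
% ds^2 = 0 : null       (light)
% ds^2 > 0 : spacelike  (would need faster-than-light travel)
```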
"Another obvious but often overlooked theory is that our universe IS a black hole inside a larger universe."
That's an extraordinarily strong statement. Our universe might be indistinguishable from a black hole from the outside, yes, but there's a big "might" in there, and an "outside" that doesn't necessarily make much sense either. It all depends on the setup you're assuming. Sure, we could end up finding that the universe is "inside" a black hole for a given definition of "universe", "inside" and "black hole", or we might find that that statement doesn't make any sense. I wouldn't want to say anything stronger than that, frankly, not least as I'm aware of models of cosmology that are observationally indistinguishable from a standard, infinitely-extended, flat universe, which are also flat, but which have finite extent. One way to do that is simply to put the universe into a toroidal topology. Since GR is a local theory it says nothing about topology, and it would be hard to argue that a universe extended on a torus would look like a black hole from the "outside", since that torus would be the entire extent of spacetime.
Also, more to the direct point, there's this snippet [p5] in the article we've kind of moved away from discussing:
He [Einstein] notes that some theoretical attempts have already been made to explain the new observations:
“Several investigators have attempted to account for the new facts by means of a spherical space, whose radius P is variable over time”.
Once again, no specific citations are made, so we can only presume that Einstein is referring to works such as those by Lemaître, Eddington, de Sitter and Tolman (Lemaître 1927; Eddington 1930; de Sitter 1930a,b; Tolman 1929, 1930). Indeed, the only specific reference in the entire paper is to Alexander Friedmann’s model of 1922:
“The first to try this approach, uninfluenced by observations, was A. Friedman, on whose calculations I base the following remarks”
This not only provides a few references to papers that Einstein -- in 1931, writing before the derivation of the Einstein-de Sitter model -- may well have based his work on, including the Friedman and Lemaitre models that were proven in the 30s by Robertson and Walker (working independently) to be the unique dynamical, homogeneous and isotropic models, but also shows that Einstein was aware of the Friedman model. (If you're interested, Tolman's work was on spherically-symmetric models that are isotropic but not homogeneous, and are subsets of what are now known as the Lemaitre-Tolman-Bondi models. The de Sitter papers, I think, established de Sitter and anti-de Sitter space, and I wasn't aware of Eddington's paper before but reading through the article here it turns out to have been on the instability of the Einstein static universe.)
(That was my reply, from a computer that I wasn't going to log onto Slashdot through.)
The EdS model is not "the workhorse of modern cosmology", no matter what the author of this summary wants you to think. If any model could be described thus it would be the Friedman-Lemaitre-Robertson-Walker model, which was already known (thanks to Friedman and Lemaitre, who developed it in the 20s) by 1931. The EdS model is a specialisation of the FLRW model to a universe containing pure pressureless matter, in which the expansion is necessarily decelerating. As such it can describe neither the early universe, when the existence of the CMB and the expansion of the universe together imply a period where the universe was instead dominated by radiation, nor the late universe, where observations imply that the expansion is instead accelerating. EdS was used as an approximation to the late-time universe until the 90s, when it became obvious that it was in conflict with observation. It's sometimes still used for rough approximations thanks to the simple solutions one can find for linear perturbations, but those are only valid down to redshifts of about 1, and no later.
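For the curious, the EdS solution itself is one line of algebra: a flat FLRW universe containing only pressureless matter has rho scaling as a^-3, and the Friedmann equation then forces a decelerating power law:

```latex
H^{2} = \left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho,
\qquad \rho \propto a^{-3}
\;\Longrightarrow\; a(t) \propto t^{2/3}, \qquad H = \frac{2}{3t}
% \ddot a < 0 at all times: no radiation era, no late-time acceleration.
```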
Swapped to Feedly and kept going as before.
Interesting page they link to on there: http://vimcolorschemetest.goog...
With ABP enabled, that soaks up 1.96G on my machine.
Without, it soaks up about 540M. That's impressive.
This is on OSX, by the way.
Similar - adblock, noscript and flashget. 4gig RAM and Firefox routinely taking about 900meg since the update to 29. I tend to only have three or four tabs open, and usage was around 500meg in 28.