
Comment Re:decompression: 800 Mbps (Score 1) 122

This would make sense if HTTP requests were typically bandwidth-limited. Almost none of them are; most are far too short to ever get TCP going at line rate. HTTP is most often latency-bound, not bandwidth-bound, and header compression is meant to help with latency (by reducing the number of request packets), not bandwidth.
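A back-of-the-envelope sketch of why small transfers are latency-bound (all numbers here are assumptions for illustration, not measurements):

```python
RTT_S = 0.100            # assumed 100 ms round trip, e.g. cross-continent
BANDWIDTH_BPS = 100e6    # assumed 100 Mbit/s link

def fetch_time(bytes_transferred, round_trips):
    """Seconds spent: pure round-trip latency plus time on the wire."""
    serialization = bytes_transferred * 8 / BANDWIDTH_BPS
    return round_trips * RTT_S + serialization

# A 20 kB response that needs 2 round trips: the wire time is a
# rounding error next to the latency.
total = fetch_time(20_000, 2)
print(f"{total * 1000:.1f} ms total; 200.0 ms of that is pure latency")
```

Shaving a round trip saves 100 ms here; doubling the bandwidth saves under a millisecond.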

Comment Re:Header Compression + Binary Headers (Score 1) 122

It is actually surprisingly complicated.

It turns out that a typical HTTP/1.1 request needs multiple TCP packets to get all the headers across. With TCP slow start this takes a long time, because only one packet is transmitted per round trip at the beginning. It gets even worse when you browse to a server on another continent, with 100 ms+ round-trip times.

HTTP/2 manages to fit most requests into one packet, assuming a reasonable MTU. Doing that requires both a binary protocol encoding and header compression. Without them you need two packets, which can double the time it takes to get the request out.
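As a rough sketch of why the packet count matters under slow start (the window and header sizes below are assumptions; modern stacks often start with a window of 10 segments, which softens the effect):

```python
import math

MSS = 1460  # typical TCP payload per packet at a 1500-byte MTU

def round_trips(request_bytes, initial_window=1):
    """Round trips to push request_bytes through classic slow start,
    doubling the congestion window every round trip."""
    packets = math.ceil(request_bytes / MSS)
    window, sent, trips = initial_window, 0, 0
    while sent < packets:
        sent += window
        window *= 2
        trips += 1
    return trips

# A cookie-heavy HTTP/1.1 request can reach ~4 kB of headers, while a
# compressed HTTP/2 request usually fits in a single packet.
print(round_trips(4096))  # 3 packets -> 2 round trips
print(round_trips(1200))  # 1 packet  -> 1 round trip
```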

Of course you could argue that this is all because TCP is a stupid ancient protocol which no one sane should be using in 2014.

Comment Re:Profitable, if self-contradictory (Score 1) 549

It is difficult to imagine a destroyed Earth that is less hospitable than Mars. Not impossible, but difficult. In almost all cases it is easier to terraform Earth than Mars.

It would even be easier to build deep underwater communities on Earth. They are unlikely to be destroyed by climate change or ecosystem collapse, and resources are vastly easier to come by there.

Comment Re:Gratuitous LIGO Slam (Score 1) 25

One of the challenges is that modern experiments simply produce so much data. The LHC, for example, throws a lot of irrelevant data away before it even hits the hard drives. To a layman like me it seems likely that some potentially useful data gets thrown away in the process.

You cannot fault the LHC designers for doing this; the data handling and storage there is awe-inspiring. Transporting and storing orders of magnitude more (probably useless) data is simply not feasible.

Comment Re:ironic (Score 1) 260

That is the trick: you do not need to pump water anywhere. You just avoid using water when other sources are delivering. If there is a power surplus even with the hydro turbines off, you try to export it somewhere, and if no one wants it even for free, you curtail the wind turbines or solar plants or whatever. If you happen to have pumped storage available for the few hours a year with low or negative prices, great, but otherwise it is no great loss.
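The priority order described above can be sketched as a toy dispatch rule (all names and numbers here are made up for illustration):

```python
def dispatch(demand_mw, intermittent_mw, export_capacity_mw):
    """One time step: hydro only fills the gap left by intermittent
    sources; any surplus is exported, and the remainder is curtailed."""
    hydro = max(0.0, demand_mw - intermittent_mw)
    surplus = max(0.0, intermittent_mw - demand_mw)
    exported = min(surplus, export_capacity_mw)
    curtailed = surplus - exported
    return hydro, exported, curtailed

# Windy hour: hydro stays off, 200 MW exported, the rest curtailed.
print(dispatch(1000, 1500, 200))  # (0.0, 200.0, 300.0)

# Calm hour: hydro fills the 400 MW gap, nothing wasted.
print(dispatch(1000, 600, 200))   # (400.0, 0.0, 0.0)
```

No pumping appears anywhere; "storage" is just water left behind the dam.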

Comment Re:ironic (Score 1) 260

The neat thing about reservoir-based hydro power is that you can multiply its output by pairing it with pretty much any intermittent power source. If the rainfall is cut in half, you "just" build enough wind turbines or solar cells to supply the missing half of the yearly energy output, and save up water whenever you can. At a large enough scale you can boost hydro power many times over, assuming the hydro generators themselves are large enough. Luckily hydro generators are reasonably cheap to upgrade, whereas reservoir capacity is hard to come by.
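The "missing half" arithmetic works out like this (every figure below is an assumption for the sake of the example):

```python
HYDRO_ANNUAL_TWH = 10.0   # assumed output of the hydro system in a normal year
INFLOW_FACTOR = 0.5       # rainfall, and thus reservoir inflow, cut in half

# Wind has to cover whatever the reduced inflow no longer can.
wind_needed_twh = HYDRO_ANNUAL_TWH * (1 - INFLOW_FACTOR)

# Installed wind capacity at an assumed 35% capacity factor.
HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 0.35
wind_gw = wind_needed_twh * 1000 / (HOURS_PER_YEAR * CAPACITY_FACTOR)

print(f"Wind energy needed: {wind_needed_twh:.1f} TWh/year")
print(f"Wind capacity needed: {wind_gw:.2f} GW")
```

The reservoir then acts as the battery: water is saved on windy days and spent on calm ones, so the combined annual output stays at the original 10 TWh.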
