I found this article to be rather long-winded in order to create a story with suspense. The moon has a side facing away from Saturn which is darker than the side facing Saturn. It seems to be due to collecting dust from a larger ring that is on the border of its orbit.
Done, saved you a long and pointless narrative.
Actually, that's not quite correct. You've got two errors there, and you're missing the real mystery - although the article itself never explicitly spells out the solution either.
The darker side is actually the leading hemisphere, not the far or outer side (from Saturn). Dust doesn't fall onto the far side; the moon plows through it in places, getting dust on the leading side. No mystery there for quite a while, though - telescopes have long been able to make out that "the dark patterns look a lot like dust". The Phoebe ring itself was only detected about 10 years ago, but dust coming in from the outer moons had been expected well before that.
The thing is, if the only process happening was that dust was being swept up by Iapetus, then every time the dark side faced the Sun, the dark coating would heat up, cause the ice underneath it to sublime (think evaporate, if that doesn't mean anything - it's close enough) and freeze again over the dust, leaving behind a light surface again. But we see a dark surface. Why? Mystery!
The solution (which the article doesn't really explain fully) is that initially dust from the ring caused ice to turn to gas, leaving behind a dark residue that we now see (and that the Cassini probe has been able to measure). But instead of the vapour just floating around above the (relatively) warm, dark surface until that side faces away from the Sun and cools down, much of it refreezes on the light side as it passes over, thanks to the lower temperature there.
The dark residue (not the original dust) now causes further heating each orbit, repeating the cycle. Over time, a large amount of ice evaporates away from the leading side, leaving that side to get darker and darker from the residue, while a certain amount of the ice migrates to the light side and refreezes (as light-coloured ice), keeping it nice and bright.
TLDR: Mystery! Dust doesn't explain the dark leading side of Iapetus! Ice would cover it in a shiny coat each orbit. Planetary detectives trace the culprit to dark residues left behind as heated ice moves to a new neighbourhood on the cooler side of the moon. More dark areas means more solar heating, and more ice migrating away in a self-perpetuating cycle. Mystery solved! Good job, planetary scientists!
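If you like, the runaway can be sketched as a toy feedback loop. Every number below is made up purely to show the shape of the cycle - this is not a physical model of Iapetus:

```python
# Toy model of the runaway-darkening feedback described above.
# All constants are invented for illustration; not real physics.

def run_feedback(orbits, dust_seeding=0.001, residue_per_melt=0.05):
    """Track the dark fraction of the leading hemisphere over many orbits."""
    dark = 0.0  # fraction of the leading side covered by dark residue
    history = []
    for _ in range(orbits):
        # A darker surface absorbs more sunlight, so more ice sublimes
        # away, exposing more dark residue (the self-perpetuating cycle).
        newly_darkened = dust_seeding + residue_per_melt * dark
        dark = min(1.0, dark + newly_darkened)
        history.append(dark)
    return history

history = run_feedback(200)
# The dark fraction grows slowly at first (only ring dust drives it),
# then accelerates once the residue itself starts doing the heating,
# until the whole leading side is dark.
```

The point of the toy: per-orbit darkening starts tiny and dust-driven, but because each orbit's darkening feeds the next orbit's heating, the growth compounds instead of staying linear.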
Yet one of Saturn's moons, Iapetus, is unique
Aren't they all unique?
Yes, and they're all special too.
And they can grow up to be any kind of planet they want.
AS LONG AS IT'S A DWARF PLANET, RIGHT PLUTO? HA HA HA, LOSER!
Viewed from which side? Counterclockwise does not apply here.
Viewed when looking down from the north pole. This is mentioned in TFA, per
Rather that [sic] (looking down from the north pole) orbiting counterclockwise around its parent planet, which all the other moons do, Phoebe revolves clockwise around Saturn.
Two things to note:
First, they're not "teleporting" the photon. It's a quantum property (such as spin - which, by the way, doesn't involve the photon rotating the way you'd picture, say, a ball spinning, but I digress) that is being "teleported" and applied to the target photon.
Second, when you send a fax, the "picture" (or the information to recreate it) is sent along the wires (or optic fibres, radio waves, etc.) through space from one location to the other. With entangled particles, the effect that alters the target particle's property doesn't travel through space. It just affects it directly. That's why it's instantaneous (as opposed to travelling at the speed of light or less, like your fax signal) and called teleportation. The downside is that this teleportation effect cannot, by itself, convey any information - you still need an ordinary, light-speed-limited channel to make use of it.
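For anyone who wants to see the bookkeeping, here's a minimal state-vector simulation of the textbook teleportation protocol (not anything specific to the experiment in TFA; the amplitudes 0.6/0.8 are an arbitrary example state). Note the two classical bits: that's the light-speed-limited part.

```python
# Three qubits: qubit 0 holds the state to teleport, qubits 1 and 2 are
# the entangled Bell pair, with qubit 2 as the distant target.
import numpy as np

rng = np.random.default_rng(0)

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
P0 = np.array([[1, 0], [0, 0]])  # |0><0| projector
P1 = np.array([[0, 0], [0, 1]])  # |1><1| projector

def kron3(a, b, c):
    return np.kron(a, np.kron(b, c))

# The state to teleport: alpha|0> + beta|1> (any normalised pair works).
alpha, beta = 0.6, 0.8
psi = np.array([alpha, beta])

# Qubits 1 and 2 share the Bell pair (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)  # 8 amplitudes, qubit 0 most significant

# Sender's side: CNOT (control qubit 0, target qubit 1), then H on qubit 0.
cnot01 = kron3(P0, I2, I2) + kron3(P1, X, I2)
state = kron3(H, I2, I2) @ (cnot01 @ state)

# Measure qubits 0 and 1: pick an outcome with the right probability
# and collapse the state of qubit 2 accordingly.
probs = np.array([np.sum(np.abs(state[k*2:k*2+2])**2) for k in range(4)])
outcome = rng.choice(4, p=probs)        # two classical bits: m0, m1
m0, m1 = outcome >> 1, outcome & 1
collapsed = state[outcome*2:outcome*2+2] / np.sqrt(probs[outcome])

# The two bits must travel over an ordinary channel; the receiver then
# applies X^m1 followed by Z^m0 to recover the original state exactly.
result = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ collapsed
```

Until the classical bits arrive, the receiver's qubit is in one of four equally likely scrambled states and tells you nothing about alpha and beta - which is exactly why no information travels faster than light.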
There is no 'bufferbloat because RAM is getting cheaper'. What he is seeing is what happens when you want to saturate your link.
Yes, and if a link is saturated, there should be packet drops, which TCP senses; it then automatically throttles back to reduce the required bandwidth and avoid saturation. But what is happening is that these huge buffers are holding packets that would otherwise be dropped, so TCP doesn't get the feedback it needs to detect saturation. It continues transmitting at full speed, believing it has uncongested pipes, which in turn keeps filling the buffers, and so on.
Because of the buffers, most of these packets are eventually getting through, but maybe in seconds instead of tens or low hundreds of milliseconds. Thus you're getting huge latency.
Jitter is caused by the buffers eventually filling or TCP timing out (registering packet loss), dropping the rate for a little bit, the buffers draining, then TCP upping the rate again as the buffers refill, hiding the saturation, until they're full again. Rinse and repeat.
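That rinse-and-repeat cycle is easy to see in a toy model: an AIMD (roughly TCP-like) sender feeding a fixed-rate link through a FIFO buffer. All constants below are invented for illustration:

```python
# Toy discrete-time model of the cycle described above: the sender only
# backs off when the buffer finally overflows, so an oversized buffer
# keeps queueing delay persistently huge.

def simulate(buffer_capacity, ticks=2000, link_rate=10):
    rate = 1.0            # sender's current rate (packets per tick)
    queue = 0.0           # packets waiting in the buffer
    delays = []           # queueing delay seen at each tick
    for _ in range(ticks):
        queue += rate
        if queue > buffer_capacity:
            queue = buffer_capacity    # overflow: packets dropped...
            rate = max(1.0, rate / 2)  # ...sender finally sees loss, halves
        else:
            rate += 0.5                # no loss seen: keep ramping up
        queue = max(0.0, queue - link_rate)  # link drains the buffer
        delays.append(queue / link_rate)     # ticks to drain the backlog
    return delays

small = simulate(buffer_capacity=20)      # modest buffer
bloated = simulate(buffer_capacity=2000)  # "RAM is cheap" buffer
```

With the small buffer, drops happen early, the sender stays honest, and delay never exceeds a couple of ticks. With the bloated one, the queue sits near full almost permanently, so every packet waits behind a huge backlog - the latency balloon - and the halving/refilling around the overflow point is the jitter sawtooth.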
It's related to the "bloat" of buffering (due to the increasing affordability of RAM and the "more of a good thing must be better than a little of a good thing - QED" mindset). If the size of the buffer is kept below a certain point, related to the pipe bandwidth and the number of traffic streams, it acts only as a temporary "buffer" against spikes in the traffic (the intended purpose of buffering), and can't cause the scenario above, since the buffer contents alone are insufficient to overload the bandwidth. Above this threshold, the latency issues and back-and-forth thrashing noted above occur. The bigger the buffers, the worse the effect.
And it's not just a "well, keep your traffic below x mbit if you're on ADSL2" issue, because it happens anywhere a high capacity pipe interfaces with a low capacity or otherwise congested (of any capacity) pipe. This might be your ISP's backbone which is getting hit by several thousand people downloading the latest WOW patch simultaneously, causing your 300kbps Skype call to go to hell through latency and jitter. If the ISP's equipment had smaller buffers, the servers would be throttling back as packet loss occurred. You'd probably still be losing packets, but they'd be detected and re-transmitted pretty quickly and you possibly wouldn't notice the latency or have jitter.
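The latency side of this is just arithmetic: a full buffer adds a delay of (buffer size) / (drain rate). With some illustrative figures (these are made-up sizes, not any particular modem's):

```python
# Queueing delay of a full FIFO buffer is its size divided by the rate
# at which the link drains it. The figures below are illustrative only.

def full_buffer_delay_ms(buffer_bytes, link_bits_per_sec):
    return buffer_bytes * 8 / link_bits_per_sec * 1000

# A 256 KB buffer in front of a 1 Mbit/s ADSL uplink:
adsl = full_buffer_delay_ms(256 * 1024, 1_000_000)    # ~2097 ms
# The same buffer in front of a 100 Mbit/s link:
fast = full_buffer_delay_ms(256 * 1024, 100_000_000)  # ~21 ms
```

Same buffer, same hardware cost - but on the slow link it turns every queued packet into a two-second wait, which is exactly the "seconds instead of tens or low hundreds of milliseconds" above.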
What he is seeing is what happens when you want to saturate your link.
So, no, what you get with appropriate buffers is your TCP connection moderating itself to the appropriate link capacity and availability, and latency remaining approximately the same (relative to what you're seeing in bufferbloat, but worse than an uncongested link, obviously).
With bufferbloat, your bandwidth appears to remain about the same, but your latency balloons massively and you get jitter effects as above.
Why won't sharks eat lawyers? Professional courtesy.