The one scientists use today: invariant mass aka rest mass.
As I understand it, the problem is that the batteries must be at 0 degrees C to accept charge. The limited sun it's getting now isn't enough to heat the batteries (surface temperatures are about -70C IIRC).
That would depend heavily on the radioactive material used, no? For example, Wikipedia lists Pu-238 as having just a ~16% drop in output after 20 years.
I'm guessing cost and weight were the key factors for picking solar over an RTG.
Time to audit the books to see who pocketed the money a more robust design would have addressed.
I think you guys should play some Kerbal Space Program and see just how much more fuel you need at every stage just to put an extra kg of equipment on that lander.
There are several things at play here.
One is the latency as mentioned, which is very important for VR. Heck, even playing with a mouse and a regular monitor I can feel the difference between 60 Hz and 120 Hz, not to mention 30 Hz. At 30 it feels like my mouse is submerged in honey. At 60 it's decent, but if you switch suddenly to 120 you do notice that 120 is quite responsive in comparison.
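The frame intervals behind those refresh rates make the difference concrete; as a rough lower bound, an input event can wait up to a full frame before it shows on screen:

```python
# Rough frame intervals for the refresh rates mentioned above.
# Worst case, an input waits a full frame before being displayed,
# so the refresh interval is a lower bound on added input latency.
intervals = {hz: 1000 / hz for hz in (30, 60, 120)}
for hz, ms in intervals.items():
    print(f"{hz:>3} Hz -> {ms:.1f} ms per frame")
```

So going from 30 Hz to 120 Hz shaves up to ~25 ms off the wait for each new frame, which matches the "honey" feeling disappearing.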
Then there's also motion blur. Due to the way most LCDs currently operate, they introduce a lot of motion blur. This is because the image stays lit for almost the entire frame ("sample and hold"). The eye tracks the motion and expects to find the object has moved, but gets conflicting data because the monitor is still displaying the same old frame. This causes perceived motion blur.
This is unlike "modern" CRTs for example, where the image faded quickly (within 2ms or so). For the rest of the frame the monitor was effectively black. That is much better for the eyes, and results in smoother perceived motion at the same framerate.
Even at 120 Hz, motion isn't completely smooth if you use sample and hold. Newer LCD monitors can strobe the backlight to get an effect similar to CRTs, reducing motion blur.
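A rough model of that eye-tracking smear: blur width is roughly on-screen speed times how long the static frame stays lit. The 960 px/s speed here is an assumed figure (an object crossing half a 1920-pixel screen per second), just to put numbers on it:

```python
# Sketch of perceived eye-tracking blur: sample-and-hold vs strobing.
# Blur width ~ on-screen speed * time the static frame stays lit.
speed_px_per_s = 960  # assumed: half a 1920-px screen per second

def blur_px(persistence_s):
    """Approximate smear width in pixels for a given lit time."""
    return speed_px_per_s * persistence_s

print(f"60 Hz sample-and-hold:  {blur_px(1/60):.0f} px of smear")
print(f"120 Hz sample-and-hold: {blur_px(1/120):.0f} px of smear")
print(f"2 ms strobed backlight: {blur_px(0.002):.1f} px of smear")
```

Note how a 2 ms strobe beats even 120 Hz sample-and-hold by a wide margin, which is why the CRT-like strobing helps so much.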
The LHC has about 115 billion protons in each bunch, and was designed for almost 3000 bunches at a time (I forget how many they ran before the shutdown).
Only a fraction of these protons collide, so there'll be plenty left when the beam quality is low enough for them to dump the beam and get a fresh one.
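Putting the design numbers together (115 billion protons per bunch, 2808 bunches, 7 TeV per proton at design energy; actual runs used fewer bunches and lower energy) gives a sense of scale:

```python
# Back-of-envelope for the LHC design beam. Design values assumed:
# ~1.15e11 protons per bunch, 2808 bunches, 7 TeV per proton.
protons_per_bunch = 1.15e11
bunches = 2808
ev_per_proton = 7e12           # 7 TeV design energy
joule_per_ev = 1.602e-19

total_protons = protons_per_bunch * bunches
stored_energy_mj = total_protons * ev_per_proton * joule_per_ev / 1e6

print(f"protons per beam:       {total_protons:.2e}")
print(f"stored energy per beam: {stored_energy_mj:.0f} MJ")
```

That's on the order of 3e14 protons and a few hundred megajoules per beam, which is why dumping the beam safely is a serious engineering problem in its own right.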
At work we have a 100/100mbit internet connection (fiber), "business class" from a very solid ISP. We're 10 people here. Not long ago internet was horribly slow to the point that it took literally a minute to load my usual news site. Ping was up in the 1-2 second range.
Turned out one of my coworkers was downloading some Windows ISOs from Microsoft.
If we didn't have QoS for VoIP, I'm pretty sure we would have noticed quite quickly.
I'm in Norway on a 100/10 connection (which is plenty for me), and when browsing more obscure music videos on YouTube, for example, I definitely notice when I hit non-cached content. Even 480p can take up to a minute to start playing, and often has to pause to catch up.
For cached content 720p or 1080p is just there, instantly.
While Firefox is fast when it's fast, unlike Chrome a single tab can bog down your entire browsing session since it's only using a single process.
I strongly dislike Chrome for other reasons and have stuck with Firefox for ages, but they really should put more effort into their Electrolysis project if they don't want to be left in the dust. Heck I'm finding myself using IE11 for a lot of stuff these days.
Here's from my router:
IPv6 Connection Type: Native with DHCP-PD
WAN IPv6 Address: 2a02:fe0:c400:1:95d2:656f:...
WAN IPv6 Gateway: fe80::219:2fff:fee6:73d9
LAN IPv6 Address: 2a02:fe0:c411:a960:da50:e6ff:.../84
My PC gets a fe80 address, but I can ping the "LAN IPv6" address above.
My ISP supports IPv6, my router supposedly supports IPv6 (Asus RT-N66U), I can see the router getting an IPv6 address from my ISP, I can see my PC getting an IPv6 address from my router yet when I test it out on the various "do I have IPv6" pages it's failing.
After spending a couple of hours mucking around I gave up. I'll deal with it when it matters. Hopefully it's less painful then.
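One thing worth checking in a setup like this is whether the PC actually has a global address or only the link-local fe80 one, since link-local addresses never reach the internet. Python's standard ipaddress module can classify them; the ::1 suffix on the 2a02 address below is made up for illustration, since the full address above is truncated:

```python
# Classify IPv6 addresses: fe80::/10 is link-local only and won't route
# to the internet; internet access needs a global address (e.g. 2a02:...).
# The "::1" host part of the second address is invented for this example.
import ipaddress

for addr in ("fe80::219:2fff:fee6:73d9", "2a02:fe0:c400:1::1"):
    ip = ipaddress.ip_address(addr)
    kind = "link-local" if ip.is_link_local else "global" if ip.is_global else "other"
    print(addr, "->", kind)
```

If the PC only ever gets fe80 addresses, the router isn't handing out (or the PC isn't accepting) the delegated prefix, which would explain the "do I have IPv6" pages failing despite everything looking fine on the router side.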
If I listened to di.fm on my way to and from work, which takes me about an hour each way, that would be about 4.6 GB per month just there.
Now if I paid for an "unlimited" plan, I would expect such casual usage to be perfectly within the bounds of "unlimited".
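The 4.6 GB figure is easy to reconstruct, assuming a 256 kbps stream (a common quality tier, not stated above), 2 hours of commuting per day, and ~20 working days per month:

```python
# Reconstructing the ~4.6 GB/month figure. Assumptions: 256 kbps stream,
# 2 hours of listening per workday, 20 workdays per month.
bitrate_bps = 256_000
seconds_per_day = 2 * 3600
workdays = 20

gb_per_month = bitrate_bps / 8 * seconds_per_day * workdays / 1e9
print(f"{gb_per_month:.1f} GB/month")
```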
Indeed. Only a small fraction of the collision events are kept, otherwise the amount of data would be overwhelming.
In particle physics, a trigger is a system that uses simple criteria to rapidly decide which events in a particle detector to keep when only a small fraction of the total can be recorded. Trigger systems are necessary due to real-world limitations in data storage capacity and rates. Since experiments are typically searching for "interesting" events (such as decays of rare particles) that occur at a relatively low rate, trigger systems are used to identify the events that should be recorded for later analysis. Current accelerators have event rates greater than 1 MHz and trigger rates that can be below 10 Hz.
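The core idea of a trigger can be sketched as a cheap cut applied to every event, keeping only the tiny fraction worth recording. Everything here (the event contents, the exponential energy distribution, the 100 GeV threshold) is invented purely for illustration:

```python
# Toy illustration of a trigger: keep only events passing a simple,
# cheap-to-evaluate cut, since only a tiny fraction can be recorded.
# Event model and the 100 GeV threshold are made up for this sketch.
import random

random.seed(1)
events = [{"total_energy_gev": random.expovariate(1 / 20)}
          for _ in range(100_000)]

THRESHOLD_GEV = 100  # hypothetical trigger cut
kept = [e for e in events if e["total_energy_gev"] > THRESHOLD_GEV]

print(f"kept {len(kept)} of {len(events)} events "
      f"({len(kept) / len(events):.4%})")
```

With a mean event energy of 20 GeV, only well under 1% of events survive the cut, which mirrors the MHz-to-Hz reduction described above (real triggers use multi-stage hardware and software filters, of course, not a single threshold).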
Well the problem is that when I rate a movie, Netflix has no idea why I rated the way I did. They don't know the context.
I recently rated a movie 1 star (I didn't even finish it) because the story was just so horribly written. But I liked just about everything else: the plot itself was great and my kind of genre, the cinematography was good, and so were the actors. That stupid story just killed it for me.
How's Netflix going to figure out why I rated that a 1 without asking me? I think they should ask follow-up questions to get some of that context if I rate a movie very different from their prediction.
But yes, if I've recently watched a movie... don't recommend it to me again for some time. That'd be a good start.
Dark matter is one of those as well. They've theorized dark matter and attributed each unexplained item in astrophysics to it but have no real evidence it exists.
No. You got it exactly backwards. It's entirely the opposite of string theory. String theory was born as a theoretical construct and they're trying to figure out how to make predictions with it so they can see if it matches the real world.
When it comes to dark matter, what they have is a ton of observations which do not match the predictions of our current theories. What they see is mass being affected by something we can't see. So they've given it a label until we figure out what it is: dark matter.
So, just to repeat, dark matter is just a label given to what we can see happening but which we cannot currently explain with our established theories (GR and Standard Model). Hence it absolutely is reality!
And no, having "enough" dark matter would not explain the big bang. However, certain dark matter theory-candidates give predictions which can explain the matter distribution in galaxies, which neatly solves another puzzle.
You can read up on some details here about the latter: http://www.illustris-project.org/about/#public