No they weren't. Cellphones were cool from the start, at least around here. Everyone wanted one. The problem with Glass is the same as with Bluetooth headsets: people wear them even when they're not using them... which makes you look like a douche. Once Google has these embedded in regular glasses, this will stop being an issue.
Agree with the first part, but on Bluetooth headsets - what's one supposed to do with them, take them off and pocket them? That risks losing them. I leave mine in place, even when it's turned off, when I'm out and about, 'cause I know I'd lose it otherwise.
Maybe it helps that I grew up in a household where hearing aids were worn by a family member, so having something in the ear was normal. On the other hand, I hated wearing ear buds for the longest time, 'til I recognized the usefulness of them.
I imagine the creation of a TCP packet mostly uses a very similar routine regardless of the platform's OS or hardware.
Or maybe the transfer of a packet.
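If it helps, here's a minimal sketch of what packing a bare-bones TCP header looks like in Python (field values are made-up placeholders and the checksum is skipped); the on-the-wire layout is fixed by RFC 793, which is why every stack ends up doing much the same thing:

    import struct

    def build_tcp_header(src_port, dst_port, seq, ack, flags, window):
        # Pack a 20-byte TCP header (RFC 793 layout, no options).
        # Checksum is left as 0 here; a real stack computes it over a pseudo-header
        # plus the segment, usually in the kernel or offloaded to the NIC.
        data_offset = 5                        # header length in 32-bit words (5 = 20 bytes)
        offset_and_flags = (data_offset << 12) | flags
        return struct.pack(
            "!HHIIHHHH",                       # network byte order
            src_port, dst_port,                # ports
            seq, ack,                          # sequence / acknowledgement numbers
            offset_and_flags,                  # data offset + reserved bits + flags
            window,                            # receive window
            0,                                 # checksum placeholder
            0)                                 # urgent pointer

    # Example: a SYN segment (flag 0x02) with made-up ports and sequence number
    header = build_tcp_header(54321, 80, seq=1000, ack=0, flags=0x02, window=65535)
    assert len(header) == 20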
A million miles per hour is not all that much.
All the galaxies in our neighborhood are also rushing at a speed of nearly 1,000 kilometers per second (roughly 2.2 million miles per hour) towards a structure called the Great Attractor, a region of space roughly 150 million light-years away.
I think they're calling them fast based on their speed relative to the galaxy that they're being ejected from / passing through.
Astrophysicists calculate that a star must get a million-plus mile-per-hour kick relative to the motion of the galaxy to reach escape velocity.
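To put those numbers side by side (the galactic escape speed near the Sun is an assumed ballpark from the literature, not from TFA):

    KM_S_TO_MPH = 3600 / 1.609344            # 1 km/s ≈ 2,236.9 mph

    print(1000 * KM_S_TO_MPH)                # the ~1,000 km/s Great Attractor flow ≈ 2.24 million mph
    print(1_000_000 / KM_S_TO_MPH)           # a "million mph" star ≈ 447 km/s

    # Galactic escape speed near the Sun is commonly quoted at around 500-550 km/s
    # (assumed ballpark), so a million-plus mph kick relative to the galaxy's own
    # motion is indeed roughly escape velocity.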
The diagram in TFA seems to indicate that these stars are not originating inside the galaxy, which to me raises the question: whence do they come?
This image makes it appear the stars are mostly passing through the disk of the galaxy. I may be reading too much into the length of the coloured lines though.
That court case did nothing of the sort - it was brought against a local US bank subsidiary, asking for records held by other subsidiaries in the Bahamas and Cayman Islands.
I came in here to address this issue.
An interesting quote (emphasis mine) from the linked-to case:
The nationality of the Bank is Canadian, but its presence is pervasive in the United States.[18] The Bank has voluntarily elected to do business in numerous foreign host countries and has accepted the incidental risk of occasional inconsistent governmental actions. It cannot expect to avail itself of the benefits of doing business here without accepting the concomitant obligations. As the Second Circuit noted years ago, "If the Bank cannot, as it were, serve two masters and comply with the lawful requirements both of the United States and Panama, perhaps it should surrender to one sovereign or the other the privileges received therefrom."
Overall, I do hope that more data is moved to Canada (hence more jobs here), and that the Canadian governments, federal and provincial, strengthen their determination (and regulations) to keep citizens' sensitive data out of the USA.
How about a nice, fat trans-Canada fibre optic cable, all within our borders? I imagine the spending on the advertisements for the "Canada Action Plan" would've paid for a good deal of it...
A projection is.
In my definition, it is "if I have x, y, z, and it continues on path q, I can project that it will continue to do so with a given accuracy". But as soon as I open my big fat mouth and say "q will be such", I've changed from a projection of a model to a prediction. And then ALL of those predictions turn out to be wrong and get revised.
That's where I think you're mistaken; they don't say, "q will be such", they state something more like, "if q continues to be such, we expect ___ with an X% level of confidence" (ya know, like scientists tend to do).
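To make that concrete, here's a toy sketch (made-up numbers and names throughout): a projection is conditional on an assumed path q, so you get one answer per scenario, not a single claim about which q will actually happen.

    def toy_projection(sensitivity_per_unit, scenario):
        # Project a response *given* an assumed future forcing path (a "scenario").
        return [sensitivity_per_unit * forcing for forcing in scenario]

    scenarios = {
        "q_low":  [1.0, 1.2, 1.4],           # forcing stays nearly flat
        "q_high": [1.0, 1.8, 2.6],           # forcing keeps rising
    }

    for name, path in scenarios.items():
        print(name, toy_projection(0.5, path))

    # Two answers, one per assumed path: "if q is such, we expect ___" is a projection;
    # asserting that one particular q WILL happen is the prediction.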
I found this IPCC glossary, which spells out the difference between a "climate projection" and a "climate prediction".
Finally, the IPCC is criticized for being, if anything, too conservative in its projections. Time and time again they've said X in Y years, and the effects of X show up in Y - Z years. And when something stupid does come out (Himalayan glaciers melting in 30 years), they correct it. Ya know, like scientists do.
Also, don't confuse media headlines with IPCC projections, just like you can't expect to see realistic scenes of IT in movies.
And please, check out the link a few posts above that points to the Ars Technica story where the comp sci prof has a look at the models - he was impressed - they're pretty good. Or, "all models are wrong, some are useful" and climate models are useful.
If climate models were accurate, their predictions would be accurate. All of the models have failed on their predictions. This means they are inaccurate and do not accurately reflect the real world.
They don't make predictions, they make projections; if you can't get that right, you're worse than the climate models.
Similar to confusing weather with climate.
Newtonian physics doesn't make accurate predictions (at relativistic speeds, for example), but it's still accurate (enough) for models. Or was Isaac Newton a "physicist" instead of a physicist because he didn't cover all cases?
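For a concrete sense of "accurate enough in its domain", compare Newtonian and relativistic kinetic energy at a few speeds (standard textbook formulas, nothing more):

    import math

    C = 299_792_458.0                        # speed of light, m/s

    def ke_newton(m, v):
        return 0.5 * m * v**2

    def ke_relativistic(m, v):
        gamma = 1.0 / math.sqrt(1.0 - (v / C)**2)
        return (gamma - 1.0) * m * C**2

    for label, v in [("Earth's orbital speed", 30_000.0), ("1% of c", 0.01 * C), ("half of c", 0.5 * C)]:
        err = 1.0 - ke_newton(1.0, v) / ke_relativistic(1.0, v)
        print(f"{label:>21}: Newtonian understates KE by {err:.3%}")

    # Negligible at everyday (even planetary) speeds, badly off at relativistic ones:
    # "wrong in some regime" and "useful in its own regime" are not contradictory.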
Replying to self: broken link == missing link:
But I can't prove anthropogenic climate change with anything but a computer model... and I've made too many computer models in my day for that to be very convincing.
Have you seen climate models, or do you just deny the ones that you don't like due to your standard of "truthiness"?
Ars Technica covers climate models nicely: (see page 2)
Steve Easterbrook, a professor of computer science at the University of Toronto, has been studying climate models for several years. “I'd done a lot of research in the past studying the development of commercial and open source software systems, including four years with NASA studying the verification and validation processes used on their spacecraft flight control software,” he told Ars.
When Easterbrook started looking into the processes followed by climate modeling groups, he was surprised by what he found. “I expected to see a messy process, dominated by quick fixes and muddling through, as that's the typical practice in much small-scale scientific software. What I found instead was a community that takes very seriously the importance of rigorous testing, and which is already using most of the tools a modern software development company would use (version control, automated testing, bug tracking systems, a planned release cycle, etc.).”
“I was blown away by the testing process that every proposed change to the model has to go through,” Easterbrook wrote. “Basically, each change is set up like a scientific experiment, with a hypothesis describing the expected improvement in the simulation results. The old and new versions of the code are then treated as the two experimental conditions. They are run on the same simulations, and the results are compared in detail to see if the hypothesis was correct. Only after convincing each other that the change really does offer an improvement is it accepted into the model baseline.”
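A hedged sketch of what that "change as an experiment" loop might look like in a test harness; all of the names, metrics, and thresholds below are mine for illustration, not the actual tooling of any modelling centre:

    def run_model(model, scenario):
        # Stand-in for running a full simulation; returns a dict of summary metrics.
        return model(scenario)

    def change_is_accepted(old_model, new_model, scenarios, metric, expected_improvement):
        # Hypothesis: the new code version improves `metric` (an error score, lower is better)
        # by at least `expected_improvement` on every reference scenario, with both versions
        # run on exactly the same inputs.
        for scenario in scenarios:
            old = run_model(old_model, scenario)[metric]
            new = run_model(new_model, scenario)[metric]
            if old - new < expected_improvement:
                return False
        return True

    # Toy stand-ins for the two code versions under comparison:
    old_version = lambda s: {"rmse_vs_observations": 1.00}
    new_version = lambda s: {"rmse_vs_observations": 0.93}
    print(change_is_accepted(old_version, new_version,
                             scenarios=["20th_century_hindcast"],
                             metric="rmse_vs_observations",
                             expected_improvement=0.05))   # True: change gets merged into the baseline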
"The following is not for the weak of heart or Fundamentalists." -- Dave Barry