The speed of light in a medium, such as air, however, is not.
Aha, someone is wrong! Even if they died a while ago. Let me correct that.
The big bang theory suggests there's a finite amount of past, but that doesn't mean the universe "was created a finite time ago". Time itself is part of the universe; it's a dimension, which differs from the other three only in how you calculate distances in space-time. (It's r^2 = x^2 + y^2 + z^2 - t^2, not all pluses as you might expect.)
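A quick sanity check of that sign convention, in units where c = 1 (the function name is just for illustration):

```python
def interval_squared(t, x, y, z):
    """Squared space-time 'distance' between the origin and an event,
    in units where c = 1. Note the minus sign on the time term."""
    return x**2 + y**2 + z**2 - t**2

# Events separated mostly in time: negative interval (timelike;
# one can causally affect the other).
print(interval_squared(2, 1, 0, 0))   # 1 - 4 = -3
# Separated mostly in space: positive interval (spacelike; no causal link).
print(interval_squared(1, 2, 0, 0))   # 4 - 1 = 3
```

That single sign flip is what separates timelike from spacelike separations.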
This is literally the only difference between time and space.
At this point you may be asking where causality comes from. Ah, but that's a different question, you see. If you're very curious, I would suggest reading The Clockwork Rocket (fiction, *very* well made, set in a universe where it's plus instead of minus..), or visiting http://www.thegreatcourses.com/tgc/courses/course_detail.aspx?cid=1257 if you'd rather hear directly about this universe.
tl;dr: People tend to conflate time (causality) and time (the dimension). This works well enough for normal life, but not well enough for edge cases such as, oh, the big bang. Or some laboratory experiments. The universe is bounded in time, but that doesn't mean it was created, any more than the existence of a north pole means you can ask what's north of the north pole. "Why does the universe exist" is a valid question, but not one that can be solved by adding a creator, and one that would apply just as well to an eternal universe.
The screen is a sticking point for me.
I want a 1920x1080 (or 1920x1200 or higher, but that seems like wishful thinking) screen, and I want it in a 15.6" or smaller form factor.
Apple won't give me that, so I won't go to Apple. Otherwise I would.
Actually, I'm getting ~200Mb/s from 802.11n. It does happen.. if you're lucky, and pick the right card/AP combination.
(Intel 4965 and DAP-2553, for the curious)
The light gets shut down, and it goes dark?
That's.. poetic, I guess, but it's not irony.
Do tell. I expect that was an amusing scene, unless the kids started crying enough to ruin your enjoyment. It could be worse, though - I sat through the same situation, with *Ghost in the Shell*.
For kids. Right.
That's silly. There's nothing in Honeycomb that would require a dual-core CPU.
It may be that it'd run faster on a dual-core (doh?), but I'm not sure how you would even go about making an open-source OS that would genuinely fail to run on single-core machines. Not to mention that it's actually running Linux, which definitely has no such restrictions..
So you check the answer and, if it's wrong, try again until it isn't.
The probabilistic correctness isn't an issue outside of toy problems, especially since, in the limit, you could just repeat the operation until the chance of it going wrong is smaller than the chance of the operator going wrong.
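Miller–Rabin primality testing is the textbook instance of this repeat-until-confident pattern; a minimal sketch (function names mine, not any particular library's API):

```python
import random

def miller_rabin_round(n):
    """One round of Miller-Rabin. May wrongly answer 'probably prime'
    for a composite n, but with probability at most 1/4."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    # Write n - 1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    a = random.randrange(2, n - 1) if n > 4 else 2
    x = pow(a, d, n)
    if x in (1, n - 1):
        return True
    for _ in range(s - 1):
        x = pow(x, 2, n)
        if x == n - 1:
            return True
    return False

def is_probably_prime(n, rounds=40):
    """Repeat independent rounds until the error bound is negligible."""
    return all(miller_rabin_round(n) for _ in range(rounds))
```

After 40 independent rounds the error bound is 4^-40, around 10^-24 - comfortably below the rate at which your hardware itself makes mistakes.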
Once the expected future time cost of a spammer exceeds a human lifetime, it becomes possible to consider killing him.
This guy costs probably several hundred lifetimes a year.
By that logic there's no difference between swimming in shark-infested and jellyfish-infested waters - you could die in either one, so who cares that the sharks are roughly a thousand times less likely to attack you?
(I wanted to make an analogy with some kind of *safe* waters, but.. um... today's internet. Shark-infested it is.)
As people keep saying, you *may* use this optional standard.. as a replacement for an older standard where startup is *even slower*.
Odd how that goes.
This would be why we use extents rather than pure bitmaps, these days.
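The win is easy to see in a toy sketch (a hypothetical helper, not any particular filesystem's code): a contiguous run of allocated blocks collapses to a single (start, length) pair instead of one bit per block.

```python
def blocks_to_extents(blocks):
    """Collapse a sorted list of allocated block numbers
    into (start, length) extents."""
    extents = []
    for b in blocks:
        if extents and extents[-1][0] + extents[-1][1] == b:
            start, length = extents[-1]
            extents[-1] = (start, length + 1)
        else:
            extents.append((b, 1))
    return extents

# A 1000-block contiguous file is one extent, not 1000 bitmap bits.
print(blocks_to_extents(list(range(100, 1100)) + [5000, 5001]))
# [(100, 1000), (5000, 2)]
```

Since most files are mostly contiguous, extents keep allocation metadata small and sequential, where a pure bitmap scales with the size of the disk regardless.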
Don't remind me. I still have nightmares of trying to implement rename() for a distributed filesystem.
Do you think this is a good or a bad thing? Why?