Link to Original Source
It's your father's microkernel. A more elegant weapon for a more civilized age.
That's not what your mom said.
> * This guy is dead.
I get the distinct feeling that there are quite a few in the US government and elsewhere who would like to help Snowden achieve the same status.
An adaptive program (in the sense the previous poster was attempting to describe) would be one that is able to figure out on its own how to do things that its programmers had not anticipated in advance.
It's all a matter of levels. I can make a good argument that humans don't strictly fit the definition you've provided. After all, we're born with ready-made circuitry to do everything that we do. Learning a human language? Put two babies together without teaching them a language, and given enough time they'll come up with one for themselves. We're built to develop it; it's not something we figured out how to do but weren't built to do. Use of tools? It's only possible because our brain is hardwired to treat external objects as extensions of our body. For example, you can "feel" the tip of a pencil as you're writing. When you're driving a car, you "feel" the entire boundary of the car as the space *you're* taking up. Even when playing video games, you quickly start thinking in terms of what you want the object you're controlling to do; you don't think about the buttons you're pressing. That ability of our brain to integrate tools as extensions of ourselves, instead of as objects completely separate from us, is hardwired in; it's not something we learn.
Now, of course, I'm not going to argue that we're *not* intelligent, or that we're incapable of learning. I'm also not going to argue that machines are as intelligent as we are. That said, a lot of what they do is most certainly intelligence, and it's most certainly learning. After all, we're programmed to learn languages, but not English specifically. We're programmed to use tools, but have to be taught to write or type. In the same way, we've made some pretty good progress in AI. My Android phone "learns" what my face looks like and how to differentiate it from other people's faces. Yes, it's pre-programmed with a facial-recognition algorithm, but so are you. If that circuitry is defective, you end up with face blindness.
Yes, because anyone who cannot afford to pay for a baby sitter should forego ever eating out or watching a movie.
And you find more babies out these days for a few reasons:
1. Families are smaller, and grandma and grandpa are less likely to live around the block. As such, you are left with no family help.
2. Economic realities make childcare extremely expensive, even for double-income families.
3. Single parents are also a lot more common, and the single parent already has someone taking care of the kid during the day. They can't magically "leave" the kid behind for everything that they do, just because other assholes in public find them to be an inconvenience.
If I can't get a sitter, I'll do my best to calm my baby when I'm out in public. If you don't like it, you can bugger off.
You know, I cannot understand the recent cultural backlash against babies.
Yes, babies cry. They cry at night, they cry in restaurants, and they cry on airplanes. They cry when they are hungry, when they are tired, when they're pooping, and when they need a diaper change. And often, they cry for no apparent reason at all.
As the father of a four-month-old, I can tell you that we parents aren't exactly pleased to hear our babies cry, either. We don't want our kids to be in pain, and we want them to be happy. We are acutely conscious of bothering others, and we feel helpless about the whole thing.
But you know what's worse? Assholes who cannot stop complaining about crying babies. Guess what? It's how human beings are. You cried too. So did every human being who's ever lived.
So, get over it. Babies cry. Live with it. If you don't like it, find a place without any humans who procreate. And show some empathy, for crying out loud.
Take your Prozac and walk away slowly from the keyboard...
Yes, clearly I was unaware of this fact when I made this comment. Because, you know, it's an all-or-nothing world where people offering product features tell their users to do it their way or stick it.
If you cannot offer a helpful suggestion when someone questions something they aren't comfortable with, perhaps you should cut down the snark and just ignore the comment.
Indeed. That is a great idea. Thank you.
Really? Some of us really enjoy our books -- as someone who has a personal library with ~4,000 books, I would be appalled if I had to write on any of their pages with a pen.
Not because I am planning on selling any of them, but because to me, I just see it as damaging the book.
A good many of them are autographed or antiquarian books, and the last thing I'd ever want to do is sign them with a *pen*.
I find the whole deal oddly disturbing -- maybe it's just me as a bibliophile, but writing on a book sounds like a sacrilege.
The "commands everywhere, hit Enter to execute them" interface existed back then in the Macintosh Programmer's Workshop (MPW).
The Commodore interface was like that too.
How does spacetime know how fast something is going through it? If there is nothing else other than spacetime and a single photon, what regulates the photon's speed? What is the speed relative to?
The easiest way to answer that is by saying you're thinking about it incorrectly. Our everyday experience leads us to believe that distances are absolute, and that time intervals are absolute. And if we're talking about relative speeds on a human scale, that's a very, very good approximation.
Turns out reality is weirder. It's not that spacetime "knows" how fast something is traveling through it; it's that space and time don't behave the way our senses lead us to believe. So, from our perspective here on Earth, if the energy used to accelerate a ship weren't a problem and we watched it head someplace 10 light-years away, the very earliest that ship could reach its destination is just over 10 years from now: no matter how much energy is used to accelerate it, it just asymptotically approaches c, but never reaches it.
However, the thing is, it's not really a speed limit in the way you would think of one. From the perspective of the people on the ship, if energy isn't a problem, you can get to your destination as quickly as you want. Under 10 years? Sure. You can get there in under a second (we'll assume we've figured out how to keep everyone alive while accelerating from 0 to close to c, then back to 0 again, all within a second). You can get there in a millisecond. If you put more energy into acceleration, you can always get there faster. However, you still never go faster than light. It just so happens that, as you accelerate, you disagree with the people on Earth about how far away your destination is. It used to be 10 light-years away, but now, after that huge acceleration, it's only 100 meters away, and you can cover that small distance really quickly at near the speed of light. To you, an infinitesimally small amount of time has passed. To the people on Earth, over 10 years.
TL;DR: It's not that spacetime is preventing you from going faster; it's that spacetime is a 4D Minkowski space, and in everyday life we approximate it as 3D Euclidean space plus time. That's a great approximation, until you start going really fast. Then it just isn't good anymore.
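The numbers in that explanation follow from the standard Lorentz-factor formulas. Here's a minimal sketch (the function name and the example speeds are mine, not from the original comment; the 10-light-year trip is the one discussed above):

```python
import math

def trip(distance_ly, gamma):
    """For a one-way trip of distance_ly light-years at the constant
    speed corresponding to Lorentz factor gamma, return
    (earth_years, ship_years, contracted_ly)."""
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)   # v/c recovered from gamma
    earth_years = distance_ly / beta           # time measured on Earth
    ship_years = earth_years / gamma           # proper time on the ship
    contracted_ly = distance_ly / gamma        # distance the ship measures
    return earth_years, ship_years, contracted_ly

# The harder you accelerate (larger gamma), the shorter the trip looks
# to you, while Earth always measures a bit more than 10 years.
for g in (2, 10, 1000):
    e, s, d = trip(10.0, g)
    print(f"gamma={g:5}: Earth {e:9.4f} y, ship {s:9.5f} y, "
          f"ship-measured distance {d:8.4f} ly")
```

At gamma = 10, Earth clocks about 10.05 years while the ship experiences about a year, and the ship measures the 10-light-year gap as only 1 light-year, which is the length contraction the comment describes.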
Oh, thought of another one, just to mess with other admins:
# chattr +i
Wouldn't notice until kernel upgrade time:
$ ls -d
Visualization is also great for evaluating randomness; remember the images of broken RNG implementations a few years ago? http://lcamtuf.coredump.cx/new...
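As a sketch of the idea (this is not the demo from that page; the tiny LCG below is a deliberately contrived weak generator), rendering a bit stream as a bitmap makes a broken RNG's structure jump out, where a good one just looks like noise:

```python
import random

def bitmap(bits, width=64):
    """Render a flat sequence of bits as rows of '#'/'.' characters."""
    rows = []
    for i in range(0, len(bits) - width + 1, width):
        rows.append(''.join('#' if b else '.' for b in bits[i:i + width]))
    return '\n'.join(rows)

def weak_bits(n, seed=1):
    """Low bit of a tiny linear congruential generator (period <= 16),
    so the output bit actually just alternates 0,1,0,1,..."""
    x, out = seed, []
    for _ in range(n):
        x = (5 * x + 3) % 16
        out.append(x & 1)
    return out

good = [random.getrandbits(1) for _ in range(64 * 16)]
bad = weak_bits(64 * 16)

print("good RNG:\n" + bitmap(good))
print("\nweak LCG low bit:\n" + bitmap(bad))
```

The weak generator's bitmap comes out as identical striped rows, while the Mersenne-Twister bits show no obvious pattern; the same trick scales up to rendering raw RNG output as an image.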