
Comment Re:why do athiests love to hate belivers so much? (Score 4, Insightful) 1293

> Electrons move around a nuclei the same way planets move around suns

Not even remotely. This idea was proposed back when humans had no understanding of subatomic behavior, and they were drawing analogies to things they did know, like the solar system. If you want to actually know how electrons and nuclei behave, try to wrap your mind around quantum mechanics. It's almost impossible, as it bears little resemblance to anything else you might be familiar with.

It's an interesting example, though, because it illustrates how whenever humans don't know what they're talking about, they fill in the gaps with things that are familiar. Like chariots carrying fire through the sky and an anthropomorphic God creating the universe.

From there your comment just goes further off the rails. Nobody thinks they're "smarter than everyone else". But observation and reason let us learn about the world, and we've learned over and over that mankind's notion of God is always several steps behind our observational understanding. Everything that has improved in the past two centuries has been at the hands of man. We're slowly figuring out ways to improve our lot in life. God's word was around for thousands of years before the Enlightenment and didn't improve anything.

The universe is amazing, and every facet fills me with awe. But that doesn't mean there needs to be a personality behind it. I can take it for what it is without having to project my ideas of meaning onto it.

Comment Re:USB sucks (Score 1) 280

As evidence of USB being an improvement I submit the explosion of USB devices, from cameras, to mp3 players, to thumb drives, to midi/digital audio interfaces, to webcams, to wifi adapters, to external CD and DVD for laptops, to...

You get the idea. The ubiquity of such devices could never have happened under the old system. The interfaces of yesteryear were a stumbling block for innovation in many ways. Thank god they're gone, and thank god we've got a brain-dead simple (from a user perspective) interconnect that is also cheap and easy enough to implement (from a developer perspective) to have allowed the world to move on.

Remember, you can always roll back to the DIN/COM/LPT/FDC/IDE setup. Plenty of parts still around. Heck, I fire up a C64 emulator once in a while myself ;)

Comment Re:Misinterpretation *By Linux* (Score 2) 280

Sorry, but that's not how it works. If you want to say there's a consensus to operate outside the spec, that's fine, but according to the spec, 10ms is the minimum amount of time before you can communicate - which is another way of saying it's the maximum amount of time a device has to get ready for communication. The Intel engineer who claimed it meant devices could take longer than that is an idiot - reading the spec that way makes it meaningless.

I think it is probably wise that other systems give a longer delay. In fact, it seems a little naive to assume that every device will operate on such tight tolerances. There is also a chance, as the previous poster said, that the bus may not really be giving 10ms, depending on how various signals get queued - if that's the case, then Linux was wrong, as opposed to the devices.

Whatever the case, it doesn't have to be an OS pissing match. Linux seems to have implemented the spec with no wiggle room beyond the spec. Other systems left wiggle room beyond the spec, which may have been wise. It's like browsers and quirks mode - rendering tag soup is kind of wrong, but you're not going to get a popular browser if you fail to render a page every time someone mis-nests their tags.
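The two readings of the spec can be sketched in a few lines. This is purely illustrative - the function name and the margin value are made up, and real host controllers don't look like this:

```python
# Hypothetical sketch (not any real kernel API): a host-side reset-recovery
# delay that treats the spec's 10ms as a floor, not an exact budget.

SPEC_MIN_RESET_RECOVERY_MS = 10  # spec: the device gets AT LEAST this long


def reset_recovery_delay_ms(safety_margin_ms=40):
    """How long the host waits after reset before addressing the device.

    A strict host ("no wiggle room") waits exactly the spec minimum;
    a tolerant one adds margin for devices, or queued signals, that
    run a little late.
    """
    return SPEC_MIN_RESET_RECOVERY_MS + safety_margin_ms
```

Under this reading, a strict implementation calls it with a margin of 0 and waits 10ms; a tolerant one waits 50ms and shrugs off marginal hardware.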

And I'm on a Mac, so this isn't an ego thing for me. I just like to keep the facts straight.

Comment Re:USB sucks (Score 2) 280

As someone whose computer experience predates the birth of USB by many years, I find all the criticism of USB to be a hoot. I mean, sure, it's a mess compared with an ideal system, but oh my lord, it's so much better than the mess we had before that I don't even know what to say. When peripheral interconnects are so good that we resort to complaining about USB, it's a better world than I could have dreamed of 25 years ago.

Comment Re:2000's called... (Score 2) 123

I'm all for a nice and straightforward means to compare chips, but you're wrong that clock speed was a good way to do that - at least after 1998 or so. There are just too many other ways in which a chip can be faster or slower: cache size, cache speed, cache prediction, instruction size, data path latency, pipelining, hyperthreading, multiple cores, and so on.

The Pentium IV really put a nail in the coffin of comparing by clock speed, because Intel actually made it do less work in each cycle so it could run more cycles per second - the same amount of work, just sliced thinner. They intentionally inflated the clock speed just to fool people. That's one of many reasons my 2.4 GHz Core smokes any 2.4 GHz Pentium IV.

To summarize: the world is complicated and clock speed is a lousy metric. It's fine for comparing chips of the same architecture, or across architectures when the clock speed difference is enormous. But no, returning to clock speed as our speed metric is not the end of stupid marketing, and the fact that you thought it was just means you bought into the previous round of stupid marketing.
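A toy model makes the point: effective throughput is roughly clock rate times instructions retired per cycle. The IPC numbers below are illustrative guesses, not measured figures:

```python
def throughput_gips(clock_ghz, ipc):
    """Billions of instructions retired per second, in a crude linear model."""
    return clock_ghz * ipc


pentium4 = throughput_gips(3.0, 0.7)  # deep pipeline: high clock, low IPC
core = throughput_gips(2.4, 1.5)      # lower clock, much higher IPC

# Despite a 600 MHz clock deficit, the "slower" chip wins on work done.
```

With these made-up numbers the 2.4 GHz part retires over half again as many instructions per second as the 3.0 GHz part, which is exactly why the single clock number misleads.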

Comment Re:Market research (Score 1) 178

I have no horse in this race (never played Minecraft or Roblox), but why do you assume it was luck that let Minecraft take the lead over Roblox, as opposed to it being better?

And I'm not looking for some subjective response like "Roblox has way better XYZ!" I'm wondering why people flocked to Minecraft. There must be a reason. If the answer is "luck" that probably just means Minecraft is better in ways that are hard to quantify.

Comment Re:AI has a high burden of proof (Score 1) 277

I agree it doesn't have to be the same way people do it. Could be an entirely different system. I'm just saying it has to be a lot more powerful than language. AI that focuses on language as the bottom layer will always be parlor tricks, not intelligence.

And while you're right we don't need to understand how people achieve intelligence to make an AI, it would sure help if we at least had a definition of what intelligence was, which we don't. Or rather, every time AI meets the definition we realize that it was a lousy definition. I predict the article's suggested tests will be more of that.

To carry your analogy further - I'd say that if we had so utterly failed to cut wood for so long, and in fact couldn't really even understand how wood was cut, we might want to take a peek at how that crosscut saw works before flailing around too much longer.

Comment Re:Missing the point as usual (Score 1) 277

> the atheists seem hell bent on the idea that intelligence and self awareness are illusions or somehow not real.

Sorry you've only met such people. I agree that is where most of them end up, but it's kind of sad, and ironic: denying the existence of something that is the fundamental basis for your ability to deny anything.

So I'm an atheist and a materialist, and I am thrilled by my observation of intelligence and self-awareness. I agree with your view and feelings on the mind. On the mundane-sounding side, I'd say it's like software - the value is in the arrangement of matter, not the matter itself. On the more poetic side, I'd say it's an absolutely stunning example of the wonders of the universe, and the fact that matter can become information, and that information can become self-representing and bring about a fantastic new force of nature called "awareness", blows my mind... right after making my mind.

I think Hofstadter does a great job explaining how these things arise from "inanimate" matter. It's worth reading his book GEB if you find this stuff interesting.

Comment Re:AI has a high burden of proof (Score 2) 277

Correct. I'd go a bit further.

The questions Levesque proposes are questions that will test a language processing system, not intelligence. Language is not required for intelligent behavior and is insufficient (as various language parsers and knowledge-web systems have shown).

I don't believe any system that has language as its primary tool can be intelligent. Language is far too blunt an instrument. Anything we would be likely to call intelligence has to rest on a modelling system which is far more subtle and detailed than language. To get a flavor for how lacking language is, try encapsulating everything about a person you know well into words, then have someone who has never met them read it. Do you think they understand that person as well as you do?

Language is our most powerful tool for transmitting ideas. But even all the tools taken together are insufficient to transmit the actual concept models in our head in sufficient depth and resolution. Any system that is intelligent needs to base its intelligence on more fundamental units of thought than words. It needs to build these models on the fly and adapt them to new information as opposed to being programmed in. And back to the top of this thread, we don't really understand how that works in natural intelligence yet, so it's unlikely AI is going to pull it off anytime soon.

Comment Re:300 MPH flesh sacks of water (Score 1) 333

For several years I managed a team of software engineers split across two sites. We were all very comfortable with communicating electronically, be it email or chat. We all hung out in an irc channel as we worked. We shared a wiki for documentation and a collaborative project list, did conference calls as needed, and sent out regular digests of what was going on with each team.

Yet there were still things that seemed to require working in person to go smoothly. It probably depends on the type of work - our software had to integrate with physical systems and workflows at a warehouse (one of the sites, the other was corporate hq) - but sometimes all the remote communication in the world did not provide enough clarity and you just had to be there. I was as surprised as anybody.

I eventually became disillusioned with the ideal of completely remote work and interaction. Issues came up with outside companies too. Without presence on site, poorer decisions would get made and things would progress more slowly. There's just so much more bandwidth interacting in person. Also, some of it might be related to the way we triage and prioritize - presence gives us urgency cues that telepresence does not. Yes, there are better and better ways to communicate, but until we can't tell the difference from being there... sometimes you benefit from being there.

And yes, I'd have loved to have made that trip in half the time.

Comment Re:That's so sad. (Score 1) 625

Other people here are going to disagree, but you're right.

Anyone who wants the joy of life to end is not living well. Unless you've got a mental illness of some kind, the promise of another day with all the love, adventure, and opportunity to do great things and be a comfort and inspiration to others, is a gift. The cessation of that gift has no benefit. There is no practical reason for death unless we limit ourselves to the idea that we can't change the way our world works.

And if you think that, all of recorded history would like to disagree with you. Keep in mind that the world could only support a few million people before the advent of agriculture, and yet we've found a way to support 7 billion, the vast majority in better conditions than almost anyone enjoyed a thousand years ago. So stop being so limited in your ambitions. We can be better than we are now. And part of that would come from extending lifespan.
