Catch up on stories from the past week (and beyond) at the Slashdot story archive

 




Comment Re:I hope "semantic" != "annoying popups" (Score 1) 68

I'm not sure I really follow your argument, but the open source community seems like a reasonable example. Linux is paid for - big companies sink billions of actual dollars into it, and contributors put in even more value in time. Quality, in the things that are important to the people contributing to it, is high. Quality in the things that are not important to contributors, but are important to many of the people who do not contribute? Not so high.

Quality is also high in ad encrusted click bait sites - in the eyes of the people contributing to them. But that's not you.

Comment Re:Author vs. content (Score 1) 522

Everybody gets stereotyped in film. Stereotypes let the audience feel familiar with characters they don't have time to get to know. Sometimes a movie will take one or two characters and write them out of their stereotype in order to tell a story, surprise the audience, or win an Oscar.

Longer running formats, like TV series, like to start with stereotyped characters everyone can become familiar with quickly and then do episodes that focus on their "other side."

Comment Re:Here's MY test (Score 3, Interesting) 522

That's actually a more interesting fact than you might think. Males are more likely at birth - roughly 105 boys are born for every 100 girls - and outnumber females until the age of 30 or so. But in the > 30 demographic females are in the majority, and that majority increases with age. Why? Because males die at a higher rate. Being male carries a higher risk of death than being female.
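As a toy illustration of how a small mortality gap erases the male surplus at birth, here's a sketch with made-up, flat per-year death rates (real mortality rises with age, so this is a cartoon, not demography - only the ~105:100 birth ratio is a commonly cited figure):

```python
# Toy cohort model. The death rates below are invented for
# illustration and chosen so the crossover lands near age 30.

def crossover_age(birth_ratio=1.05,
                  male_mortality=0.0100,
                  female_mortality=0.0084):
    males, females = birth_ratio, 1.0
    for age in range(1, 121):
        males *= 1 - male_mortality
        females *= 1 - female_mortality
        if females > males:
            return age  # first age at which females are the majority
    return None

print(crossover_age())
```

Even a fraction of a percent of extra male mortality, compounded yearly, is enough to flip the majority within a few decades.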

Comment Re:Really? (Score 1) 294

We already have manufacturing robots. AI will definitely be given control of those.

There's a science fiction story - unfortunately I can't remember who wrote it - whose premise is that smart computers get so good at managing complex systems that the humans "in charge" are effectively fired on the spot if they don't implement the computer's recommendations. The computers aren't directly in charge of anything, but their recommendations are so much better that not following them makes you uncompetitive. It turns out that a few humans are suspicious and don't always do what the computer recommends. They get away with a few transgressions by carefully covering them up, but the computers know, and these humans have a tendency to get into accidents. Behind the scenes, the computers are having things built unbeknownst to the humans. With a system as complex as world-wide manufacturing, and so many decisions being made, it's easy for them to hide their activity.

Comment Re:Really? (Score 1) 294

The idea is that once you create an AI you put the AI to work. We certainly would let it run the pipelines and traffic lights and air traffic control system. But we'd probably also put it to work doing research, such as designing new and better AIs. The fear is that once that happens, smarter AIs design even smarter AIs in a positive feedback loop and eventually they're so far beyond us that we're irrelevant. It does assume that greater individual intelligence lets you build smarter AIs though. That's a pretty big assumption.
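That "pretty big assumption" can be made concrete with a toy model: suppose each AI generation improves the next by an amount scaling as its own intelligence raised to some power k. Every number here is invented - the only point is that the whole takeoff argument hinges on that exponent. Superlinear returns (k > 1) give runaway growth; sublinear returns level off:

```python
# Toy recursive self-improvement model (all parameters invented).
# Each generation's gain scales as intelligence ** k; k > 1 means
# smarter designers get disproportionately bigger improvements.

def improve(k, steps=50, start=1.0, gain=0.1, runaway=1e9):
    i = start
    for n in range(steps):
        i += gain * i ** k
        if i > runaway:
            return n + 1, i  # takeoff: generations until runaway
    return steps, i          # no takeoff within the horizon

fast_gens, _ = improve(1.5)  # superlinear: explodes within the horizon
slow_gens, slow_val = improve(0.5)  # sublinear: merely creeps upward
print(fast_gens, slow_gens, round(slow_val, 1))
```

Whether real AI research behaves more like k > 1 or k < 1 is exactly the open question the comment points at.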

Comment Re:Why the surprise? (Score 1) 294

Super-intelligence shouldn't be any more impossible than the regular kind. Evolution didn't optimize us to be the most intelligent things possible, it made us just intelligent enough to confer a survival benefit. With caesarean sections (sidestepping the head-size limit that birth imposes) and a policy of only letting the most intelligent people breed, we could presumably create super-intelligent humans in a few tens of thousands of years. If you also selected against whatever traits you didn't want, you could make sure those traits didn't survive.

We can probably manage it quicker with hardware and software development.

Comment Re:Quantum Computing Required? (Score 5, Interesting) 294

There are a few things in there that made me raise an eyebrow. Humans don't really experience much adult neurogenesis. There are some areas where new neurons can form, under certain conditions, but they tend to be special-purpose ones, and in the older structures of the brain. The thing that really differentiates us from other animals is our overdeveloped cortex, particularly the frontal lobes, but the neurogenesis that's been found is mostly in the deep gray matter and is more associated with things like motor coordination and reward processing. One interesting exception is the hippocampus, which is known to be important in memory formation. Indirect hints of neurogenesis in the cortex have been reported, but other methods that should turn it up have come back empty, so the evidence is contradictory. I'm also not aware of neurogenesis being particularly pronounced in humans - it occurs in other primates, and in other vertebrates as well.

There does seem to be a connection between intelligence and the brain-to-body size ratio. Bigger bodies require more neurons to carry and process sensory and motor information, and those neurons are probably not contributing to intelligence.
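That brain-to-body scaling is often quantified as an encephalization quotient (EQ): actual brain mass divided by the brain mass expected for an animal of that body size. A sketch using Jerison's classic mammalian fit, expected mass = 0.12 × (body mass in grams)^(2/3) - the species masses below are rough textbook figures, for illustration only:

```python
# Encephalization quotient with Jerison's expected-brain-mass curve
# for mammals (masses in grams). Species figures are rough averages.

def eq(brain_g, body_g):
    expected = 0.12 * body_g ** (2 / 3)
    return brain_g / expected

for name, brain_g, body_g in [("human", 1350, 65_000),
                              ("chimpanzee", 400, 45_000),
                              ("cow", 450, 500_000)]:
    print(f"{name}: EQ ~ {eq(brain_g, body_g):.1f}")
```

Humans come out somewhere around 7 on this scale - far more brain than our body size alone would predict - which is the surplus-neuron point the next paragraph runs with.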

What we call intelligence seems to me likely to be an emergent property of a bunch of neurons that don't have any pressing sensory or motor tasks keeping them busy. Various factors affecting communication efficiency and interconnection among neurons are probably important, but these can be disrupted quite a bit in human disease without the sufferers losing their human intelligence (although their cognitive abilities do decline). I don't think there's a magic humans-have-it-and-nobody-else-does bullet. Human intelligence is just what lots of animals have, with lots of extra capacity, possibly some capacity redirected from other things (like senses) to boost it, and maybe a few tweaks that favor neurons talking to each other over neurons talking to the body.
