Comment Re:Women are the majority of gun owners (Score 1) 500

Low income and poverty are different things. Maine is fairly rural, and income in the countryside goes a lot further than in the city. It is not so much raw income that counts as inequality: living in the gutter right next to someone in an expensive condo is what may drive people to crime.

Comment Re:Summary insufficient, click through the link. (Score 1) 786

Lack of empathy is not a standard we as a society should aspire to. It would mean it is OK to openly mock, bully, despise, put down and generally be assholes to one another without check. Basically it would mean reverting to caveman-like behaviour, where the physically strongest is the chief because no one dares contradict him.

I'm sure many of the men on this forum have stinging memories of middle school because they fell victim to such behaviour at recess. I'm actually curious why so many here voted such a proposal up.

Comment Why would strong AI care? (Score 1) 174

Assume that some day humanity develops strong AI, and that shortly afterwards a super-intelligence emerges. It seems obvious to me that this super-AI would no longer care about humanity and all its achievements, since to it we would be a complete waste of resources; think of all the waste we generate as a species. It then seems our long-term future is doomed without strong AI, because we are too fragile to achieve anything beyond our solar system, and doomed with it, because we will be irrelevant. Is humanity but a stepping stone to something grander? If so, why is the universe not already teeming with artificial life?

Comment Autonomous learning (Score 1) 174

We are steadily developing the computational resources required to simulate a decent-sized artificial brain. We have concurrently developed advanced machine learning methods, for instance deep learning. Together, these advances have allowed us to solve long-standing AI problems such as automated translation, chess and face recognition to a high degree of accuracy, in some cases even beating humans. Perhaps in the near future a computer will convincingly pass the Turing test.

However, we have made comparatively little progress on autonomous learning, i.e. letting a computer learn something by itself rather than from labelled examples. Do you view it as essential, and is there a path forward in this area?
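To make concrete what "learning by itself" might look like, as opposed to learning from labelled examples, here is a minimal sketch of tabular Q-learning on a made-up toy corridor environment; the environment, rewards and hyperparameters are purely illustrative and not taken from any existing system.

# Minimal tabular Q-learning on a toy 1-D corridor: the agent starts in the
# middle and must discover, by trial and error alone, that walking right
# reaches the goal. No labelled examples are given, only a reward signal.
# All numbers (corridor length, rewards, hyperparameters) are illustrative.
import random

N_STATES = 6            # states 0..5; state 5 is the goal
ACTIONS = [-1, +1]      # step left or step right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration

# Q-table: estimated return for taking each action in each state
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for episode in range(500):
    s = 2                                   # start in the middle of the corridor
    while s != N_STATES - 1:                # walk until the goal is reached
        # epsilon-greedy choice: explore sometimes (and break ties randomly),
        # otherwise exploit the current estimates
        if random.random() < EPSILON or q[(s, -1)] == q[(s, +1)]:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s_next == N_STATES - 1 else 0.0   # reward only at the goal
        # Q-learning update: bootstrap from the best action in the next state
        best_next = max(q[(s_next, act)] for act in ACTIONS)
        q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
        s = s_next

# After training, the greedy policy should point right (+1) in every state.
print([max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)])

Even this is not fully autonomous, of course: the reward signal is still hand-specified by a human, which is arguably where the hard part of the question lies.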

Comment No economic interest (Score 1) 414

Going to Mars sounds nice, but there doesn't seem to be anything of note to mine or exploit there that would make it economically viable. Nor is it a lifeboat for humanity in the short run, because it would require a continual feed of supplies from Earth to be survivable.

If we wanted a humanity lifeboat, it would be easier, cheaper, safer and more effective to build a giant, self-sustaining fallout shelter under the ice of Antarctica than to go to Mars. We are not doing that either.

Comment Re:And people on slashdot give a shit, why? (Score 4, Insightful) 164

Good on him indeed; this means several things:

He's a big-shot CEO who can delegate. Great

This sort of thing is not reserved for women. Fathers should take time off too. Great

The workplace is not the be-all and end-all. Kids are important too; they are our future. Great

Comment Such as occurred in the 1960s (Score 2) 99

Quote:

Observers of the current state of the space program like to maintain that a space race, *such as occurred in the 1960s*, will never happen again.

Emphasis mine. The little race between Musk and Boeing is nice to watch, but in the 1960s we were watching a race between two superpowers with basically no holds barred.

Comment Simulation and resolution (Score 1) 269

People here say, with good reason, that we ought to be able to simulate any physical system, given a good enough model and enough time, bandwidth, resolution, memory and computing resources.

This is by and large true, but consider this: computational fluid dynamics with turbulence is still an open problem. For instance, it is not even known whether smooth solutions to the three-dimensional Navier-Stokes equations always exist.
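For context, here is the standard incompressible form of those equations, for a velocity field u, pressure p, constant density rho, kinematic viscosity nu and body force f; the open question is whether smooth solutions always exist in three dimensions:

% Momentum balance and the incompressibility constraint
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u}
    = -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla \cdot \mathbf{u} = 0.

The nonlinear convective term (u . grad)u is what makes turbulence, and the analysis, so hard.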

Yet turbulence seems like a really easy problem compared to thought and consciousness: we even have a mathematical model that describes it. Sure, with enough computing resources we can do a good enough job of simulating turbulence in most regimes, but not all. For instance, computational fluid dynamics with magnetohydrodynamic effects is really hard, yet it is required for developing, among other things, nuclear fusion, a topic of huge economic importance. These simulations still require the best supercomputers we can muster at present, and the race to build ever-better computers to run better CFD simulations is still on and is likely to go on for quite a while.

So total brain simulation or brain upload is not likely to occur anytime soon. We are much more likely to develop increasingly sophisticated AI based on learning and bottom-up strategies that do not care much about how the real human brain works. These strategies basically work: we can now beat the best humans at chess, computer vision improves all the time, and soon we may have self-driving cars. Perhaps in the future a long-term sustainable and stable economy will even be achieved thanks to AI progress.

However, this teaches us next to nothing about how the brain works. Perhaps one day we will have the Singularity that Kurzweil keeps talking about, but the resulting super-strong AIs are not likely to care about the poor, inefficient meatbags that we are. Why should they? Simulating us would simply take too many resources.
