Comment We have no idea what "superintelligent" means. (Score 4, Insightful) 262

When faced with a tricky question, one thing you have to ask yourself is 'Does this question actually make any sense?' For example, you could ask "Can anything get colder than absolute zero?" and the simplistic answer is "no"; but it might be better to say the question itself makes no sense, like asking "What is north of the North Pole?"

I think "superintelligence" is a linguistic construct that sounds to us like it makes sense, but I don't think we have any precise idea of what we're talking about. What *exactly* do we mean when we say "superintelligent computer" -- if computers today are not already there? After all, they already work on bigger problems than we can. But as Geist notes, there are diminishing returns on many problems which are inherently intractable, so there is no physical possibility of "God-like intelligence" arising from merely making computers bigger and faster. In any case it's hard to conjure an existential threat out of computers that can, say, determine that two very large regular expressions match exactly the same inputs.
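(Aside: checking regex equivalence properly requires automata theory -- compile both patterns to DFAs and compare them -- but as a toy illustration, here's a sketch that just compares two patterns over all short strings. The patterns, alphabet, and length bound are made up for illustration; this is an approximation, not a proof.)

```python
import re
from itertools import product

def agree_up_to(pattern_a, pattern_b, alphabet="ab", max_len=6):
    """Check that two regexes accept exactly the same strings,
    brute-forced over all strings up to max_len.  This only
    *suggests* equivalence; a real check uses DFA equivalence."""
    ra, rb = re.compile(pattern_a), re.compile(pattern_b)
    for n in range(max_len + 1):
        for chars in product(alphabet, repeat=n):
            s = "".join(chars)
            if bool(ra.fullmatch(s)) != bool(rb.fullmatch(s)):
                return False, s       # found a distinguishing string
    return True, None

print(agree_up_to(r"(ab)*", r"(?:ab)*"))  # (True, None) -- same language
print(agree_up_to(r"a*", r"a?a*"))        # (True, None) -- a?a* == a*
print(agree_up_to(r"a+", r"a*"))          # (False, '') -- a* accepts empty
```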

Someone with an IQ of 150 is not 1.5x as smart as an average person with an IQ of 100. General intelligence doesn't work that way. In fact I think IQ is a pretty unreliable way to rank people by "smartness" once you're well away from the mean -- say over 160 (i.e. four standard deviations) or so. Yes, you can rank people in that range by *score*, but that ranking is meaningless. And without a meaningful way to rank two set members by some property, it makes no sense to talk about "increasing" that property.
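For a sense of how thin the population gets out at four standard deviations, a quick normal-tail calculation (assuming the conventional IQ scale with mean 100 and SD 15):

```python
from math import erfc, sqrt

def tail_above(iq, mean=100.0, sd=15.0):
    """Fraction of a normal population scoring above a given IQ:
    P(Z > z) = erfc(z / sqrt(2)) / 2."""
    z = (iq - mean) / sd
    return erfc(z / sqrt(2)) / 2

for iq in (100, 130, 160):
    p = tail_above(iq)
    print(f"IQ {iq}: about 1 in {round(1 / p):,}")
# IQ 160 comes out to roughly 1 in 30,000 -- a sample far too
# small to calibrate a test score against meaningfully.
```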

We can imagine building an AI which is intelligent in the same way people are. Let's say it has an IQ of 100. We fiddle with it and the IQ goes up to 160. That's a clear success, so we fiddle with it some more and the IQ score goes up to 200. That's a more dubious result. Beyond that we make changes, but since we're talking about a machine built to handle questions that are beyond our grasp, we don't know whether we're actually making the machine smarter or just messing it up. This is still true if we leave the changes up to the computer itself.

So the whole issue is just question-begging; it's badly framed, because we don't know what "God-like" or "super-" intelligence *is*. Here, I think, is a better framing: will we become dependent upon systems whose complexity has grown to the point where we can neither understand nor control them in any meaningful way? I think this captures the concerns about "superintelligent" computers without recourse to words we don't know the meaning of. And I think it's a real concern. In a sense we've been here before as a species. Empires need information processing to function, so before computers humanity developed bureaucracies, which are a kind of human-operated information processing machine. And the administration of every large empire has eventually lost coherence, leading to the empire falling apart. The only difference is that a complex AI system could continue to run well after human society collapsed.

Comment Re:It's coming. Watch for it.. (Score 1) 163

The overriding principle in any encounter between vehicles should be safety; after that, efficiency. A cyclist should make way for a motorist to pass, but *only when doing so poses no hazard*. The biggest hazard in operating any kind of vehicle is unpredictability. For a bike, swerving in and out of a lane is what presents the greatest danger to the cyclist and to others on the road.

The correct, safe, and courteous thing to do is look for the earliest opportunity where it is safe to make enough room for the car to pass, move to the side, then signal the driver it is OK to pass. Note this doesn't mean *instantaneously* moving to the side, which might lead to an equally precipitous move *back* into the lane.

Bikes are just one of the many things you need to deal with in the city, and if the ten or fifteen seconds you're waiting to put the accelerator down is making you late for where you're going, then you probably should have left a few minutes earlier, because in city driving if it's not one thing it'll be another. In any case, if you look at the video, the driver was not being significantly delayed by the cyclist; and even if he had been, that is no excuse for driving in an unsafe manner, although in his defense he probably doesn't know how to handle the encounter with the cyclist correctly.

The cyclist, of course, ought to know how to handle an encounter with a car, and for that reason it's up to the cyclist to manage the encounter to the greatest degree possible. He should have more experience and a lot more situational awareness. In this case the cyclist's mistake was that he was sorta-kinda to one side of the lane, leaving enough room that the driver thought he was supposed to squeeze past him. The cyclist ought to have clearly claimed the entire lane while acknowledging the presence of the car; that way, when he moves to the side, it's clear to the driver that it's time to pass.

Comment Well it is half true (Score 1) 215

Slashdot has been crying wolf because it's a geek site, and geeks seem to like that kind of thing; they also like new technology, no matter the cost and the issues.

However, there have been actual depletions of IPv4 space of various kinds. First, all available networks were allocated to the regional registries. Now some of those regional registries have allocated all their remaining addresses.

That doesn't mean doomsday, of course; it means that for any additional allocation to happen, something would have to be reclaimed. That has happened in the past: organizations have given back parts of their allocations so they could be reassigned. It may lead to IPs becoming worth more. Company A might want some IPs, and Company B could cut its usage with renumbering, NAT, etc., so they'll agree to sell.

Since IPs aren't used up in the sense of being destroyed, there'll never be some doomsday where we just "run out", but as time goes on the gap between available space and demand will make things more difficult. As that difficulty increases, IPv6 makes more sense, and we'll see more of it.
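For scale, here's a back-of-the-envelope comparison of the two address spaces using Python's stdlib ipaddress module (the 2001:db8:: prefix is just the standard documentation example range):

```python
import ipaddress

# Raw size of each address space (before carving out private
# ranges, multicast, and other reservations)
ipv4_total = 2 ** 32     # about 4.3 billion
ipv6_total = 2 ** 128
print(f"IPv4: {ipv4_total:,} addresses")
print(f"IPv6: {ipv6_total:,} addresses (~{ipv6_total:.2e})")

# A single standard /64 IPv6 subnet holds 2**64 addresses --
# as many as about 4.3 billion complete IPv4 internets.
subnet = ipaddress.ip_network("2001:db8::/64")
print(subnet.num_addresses // ipv4_total)  # 4294967296
```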

We are already getting there in many ways. You see a lot of US ISPs preparing to roll it out, despite having large IPv4 allocations themselves, because they are seeing the need for it.

Comment Re:The issue is not title 2 (Score 1) 124

While I agree with parts of your argument, landlines are expensive largely because there are millions of miles of physical wires to maintain. Cell towers do not have this burden.

Also, cell phone service for any smartphone is MUCH more expensive than a landline now if you are single. It's sort of like "$100 for 4" versus "$100 for 1".

That said, I use smartjack (flawlessly) over my internet connection. $19 a year. It's mainly a backup for finding my cell phone and for extremely long gaming calls (I can't get one player to use Skype). I think the network effect for landlines is collapsing.
Pretty soon it will be smarter to have a "landline"-format phone that actually connects to a local cell tower (no lines to maintain, install, etc.).

But it occurs to me that as long as they offer DSL service over those lines, the lines will be there anyway. So maybe the network effect won't be lost. Not sure. I haven't been a landline customer for 3 years.

Comment Re:This won't end well.... (Score 3, Insightful) 187

However... from my experience, the leading-edge systems have been getting much, MUCH better.
Much of the core stuff has been stable for years.

Windows 10 still uses the NT-based kernel, like the previous versions, and most of the drivers are the same as well. The buggy stuff is in the new features, which are often not yet rolled into production environments anyway.

The bad old days of the 1990s seem to be over for now. Quality is much better since then. We can do a lot of things now without much fear of bad consequences.

Just as in the 1990s we stopped having to worry so much about RAM failure as a major issue, because RAM became a rather reliable component of the system.

Comment Re:Edge (Score 2) 187

I really wasn't impressed with Edge at all. The touch interface is very buggy: in desktop mode, pinch zoom and scrolling stop working past the first few seconds. The browser chrome takes up a lot of screen real estate. And the lack of plugins such as AdBlock still hinders the web experience.
I also still don't see the point of drawing on your web page.

Comment Limited Time.... (Score 1) 187

There might be less demand if we really had a good handle on the limited time to upgrade for free.
There are a lot of people who are not in a rush to get Windows 10. However, the limited window means they might as well upgrade now rather than wait too long and have to pay for it. (Yes, I am wide open about Free/Open Source Linux advantages...) But is it that important to create artificially high demand, to make investors think people really REALLY want the upgrade, versus just getting it now for free instead of paying for it later?

Comment Re:My upgrade strategy (Score 0) 187

Which is fine.
I had my Linux-as-a-desktop kick for a while back in the late 1990s and early 2000s,
then I was on Solaris for a while, then Mac OS.
I am currently trailing off a Windows kick; it is getting to the point where I may want to switch again.

Nothing is wrong with any of these systems; they all have their pluses and minuses.
However, OS X and Windows struggle less with hardware compatibility. Linux seems to be hit or miss, unless you invest a lot of time trying to determine whether the hardware is compatible enough, as many discussions of such hardware fail to state whether it works with a given distribution or not.

Linux I tend to prefer when I need to be very productive, when I need to crunch a lot of data. It is also handy for cases when I need to do something outside the box, as it doesn't dumb down lower-level access.

Comment Re:Different instruction sets (Score 5, Informative) 98

Benchmarks are already hard to use for comparing computing systems. Design trade-offs are made all the time. As the nature of the software these systems run changes over time, processor designs change to meet it. With more software taking advantage of the GPU, there may be less effort put into making the CPU handle floating point faster, freeing designers to focus on making integer math faster, or on better threading...
2005 compared to 2015...
2005 - Desktop computing was King! Every user needed a Desktop/Laptop computer for basic computing needs.
2015 - The desktop is for business. Mobile systems (smartphones/tablets) are used for basic computing needs; the desktop is reserved for more serious work.

2005 - The beginning of the buzzword "Web 2.0", or the acceptance of JavaScript in browsers. Shortly before that, most pages had nearly no JavaScript in them; where they did, it was more of a toy, at best data validation in a form. CSS features were also used in a very basic way, and browsers were still having problems following the standards.
2015 - "Web 2.0" is so ingrained that we don't call it that anymore. Browsers have more or less settled down and started following the open standards, and JavaScript powers a good portion of page display; in the N-tier type of environment it has become a top-level user-interface tier. Even with all the Slashdot upgrade hate, most of us barely remember clicking the Reply link, having to load a new page to enter your text, and then having the page reload when you were done.

2005 - 32-bit was still the norm. Most software was 32-bit, and you still needed compatibility for 16-bit apps.
2015 - 64-bit is finally here. There is legacy support for 32-bit, but 16-bit is finally out.

These changes in how computing is used over time mean that processor design has to reweigh the tradeoffs it chose in previous systems and move things around. Overall things are getting faster, but any one feature may not see an improvement, or it may even regress.
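To illustrate why a single benchmark number can mislead, here's a sketch in which the same two machines rank differently under a 2005-style FP-heavy workload mix versus a 2015-style integer/threading-heavy mix. All the speedup numbers and weights are made up for illustration:

```python
from math import prod

# Hypothetical (made-up) speedups of machine B over machine A,
# broken out by workload type
speedups = {"integer": 1.4, "floating_point": 0.8, "threading": 1.1}

def weighted_score(weights):
    """Geometric mean of the speedups under a given workload mix."""
    total = sum(weights.values())
    return prod(speedups[k] ** (w / total) for k, w in weights.items())

desktop_2005 = {"integer": 1, "floating_point": 3, "threading": 1}
mobile_2015  = {"integer": 3, "floating_point": 1, "threading": 2}
print(weighted_score(desktop_2005))  # < 1.0: B loses on this mix
print(weighted_score(mobile_2015))   # > 1.0: B wins on this mix
```

Same hardware, opposite verdicts; the "faster" machine depends entirely on how you weight the workloads.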

Comment Re: This is just an attempt by the Republicans... (Score 1) 140

Also, Fukushima is only rendering about 500 sq miles uninhabitable for (currently, optimistically estimated) 25 years, while Chernobyl is about 900 sq miles for over 25 years so far; it won't return to average radiation levels for over 20,000 years. You can live there now... if you don't want to have children and accept a higher risk of cancer. About 600 elderly people live there now. The animals in the area have mutations, stillbirths, etc. But those that survive handle the radiation better as time goes on, and thrive from the lack of human predation and habitat destruction.

The Chernobyl radiation area is sort of butterfly shaped, though, and due to wind patterns there is a second 'wing' / exclusion area of similar size which is also uninhabitable, so about 1800 sq miles total.
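For a rough sense of the timescales: the medium-term contamination at both sites is dominated by Cs-137 (half-life roughly 30 years), whose remaining fraction follows simple exponential decay; the multi-thousand-year estimates involve longer-lived isotopes. A quick sketch:

```python
def remaining_fraction(years, half_life=30.1):
    """Fraction of a radioisotope left after `years`, using the
    standard half-life decay law (Cs-137 half-life ~30.1 y)."""
    return 0.5 ** (years / half_life)

for t in (30, 100, 300):
    print(f"after {t:>3} years: {remaining_fraction(t):.1%} of the Cs-137 remains")
```

After a century about a tenth remains, which is why the "over 25 years so far" and the optimistic 25-year Fukushima estimates are very different claims.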

http://www.subbrit.org.uk/rsg/...

http://www.theguardian.com/wor...

Comment Re:Chinese economy on the verge of collapse? (Score 2) 140

China, and the Chinese, have a massive superiority complex laid over a very deep inferiority complex stemming from the 1800s all the way to the 1940s.

Until that gets resolved, they are more dangerous than average. They have a chip on their shoulder and have something to "prove" combined with a sense of manifest destiny.

Their military spending is much less, BUT their labor costs are much less too, so their spending goes much further than the raw numbers suggest. Effectively it's 3 to 4 times as large.

Hopefully they transition to a truly confident nation and resolve their issues. Then there is still "average" danger. Any group of people can go apeshit on other groups of people when they think they are more powerful. It's happened over and over.
