Comment Re:Sad but smart (Score 1) 500

Did you just say "Mac server"? ... That will eat away at your profits a whole lot faster, because you'll just buy a new "Mac server" or something :)

So you've evidently got no issue with a Windows server, but you take specific exception to Macs? That's odd, considering one of those two is a reliable Unix flavor, and the other is... well, Windows. What exactly is the issue with a Mac server?

Comment Re:Naive Question (Score 1) 196

If someone discovers a proof that P==NP, then even though we haven't found the practical solutions to some problems (factorization or whatever) yet, it means that there IS at least one "quick" solution.

Unfortunately, no, it doesn't imply a "quick" solution. All P=NP would mean is that these problems have polynomial-time solutions; it says nothing about how efficient those polynomial-time solutions would be. Your only guarantee is that for sufficiently large N, any polynomial bound is better than any exponential bound. The polynomial can still be ridiculously, phenomenally huge (like O(N^4000)), and "sufficiently large N" can also be a ridiculously large number. If you came up with a polynomial-time Traveling Salesman algorithm that doesn't start to beat exponential time until you're handling 1,000,000,000,000,000,000,000 cities, technically you've proven P=NP, but in a completely useless fashion.
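To put a number on how useless that can be, here's a quick back-of-the-envelope sketch (the N^4000 polynomial and a plain 2^N exponential are just my stand-in figures, nothing specific to any particular problem):

```swift
import Foundation

// Find the smallest N at which 2^N finally overtakes N^degree,
// i.e. the first N with N > degree * log2(N).
func crossover(degree: Double = 4000) -> Int {
    var n = 2
    while Double(n) <= degree * log2(Double(n)) {
        n += 1
    }
    return n
}

print(crossover())  // prints a value around 64,000
```

The two curves don't cross until N is somewhere around 64,000, and at that point both algorithms need on the order of 2^64,000 steps -- "polynomial" in the technical sense, and utterly useless in practice.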

Comment Re:If that is representative of watson's capabilit (Score 1) 164

Agreed, I crushed it. I am very impressed by its ability to answer some questions (it actually got "Black Death of a Salesman" and "Charlie Brown Recluse"), which shows it's doing some very sophisticated linguistic analysis, but if it can't beat some random shmuck on the Internet, I don't see how this will be an interesting event.

Comment Re:Been running a dev build for a few weeks now (Score 1) 212

If an app is linked against pre-4.0 libraries, pressing the home button kills it instead of putting it to sleep. Simply recompiling the app against the new libraries will enable multitasking support (there are some changes you might want to make, but I don't think any of them are required).

So far as I know, this isn't for any technical reason. The early 4.0 betas enabled multitasking for all apps, and it was only midway through the beta period that the behavior changed so that legacy apps quit instead of sleeping. I suspect this was done mainly for safety's sake -- prior to 4.0, you were constantly quitting apps and thereby making them regularly save their data. Post-4.0, an app might go for days without actually quitting, and a legacy app wouldn't receive any signal that it was being put to sleep (the APIs to tell it that didn't exist back when it was compiled), meaning that a crash could wipe out days of data. Prior to 4.0, you'd simply never run a single app for days straight, and multitasking-aware apps don't suffer from this problem because they know to save their data at convenient intervals.
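For what it's worth, the change needed on the developer's side is mostly just handling the new lifecycle callbacks. Here's a minimal sketch in modern Swift/UIKit terms (rather than the Objective-C you'd actually have written in the iOS 4 era; the saveUserData helper is hypothetical):

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    // Multitasking-aware path: called when the home button backgrounds the app.
    // Persist everything important now, because the app may sit suspended for
    // days and can be killed at any point without further warning.
    func applicationDidEnterBackground(_ application: UIApplication) {
        saveUserData()
    }

    // Legacy-style path: only runs when the app actually terminates.
    func applicationWillTerminate(_ application: UIApplication) {
        saveUserData()
    }

    private func saveUserData() {
        // Hypothetical helper: flush documents, preferences, and in-memory
        // state to disk so a crash or forced kill can't wipe out days of data.
        _ = UserDefaults.standard.synchronize()
    }
}
```

A pre-4.0 binary only ever gets the terminate path, which is exactly why letting it sleep for days without warning would have been risky.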

Comment Re:Physicists (Score 4, Informative) 166

So given Moore's law you will eventually end up with a single physical universe and hugely many simulated universes.

Moore's law is an observation about how fast technology is developing, not an incontrovertible law of physics. It will not hold forever, because eventually we will run up against physical limits preventing us from cramming more computing power into a given region.

In particular, it is impossible for a given amount of matter to perfectly simulate more matter than itself. If it were possible -- if you could e.g. use a ten kilogram computer to simulate twenty kilograms of matter -- then your ten kilogram computer could simulate two of itself, doubling its storage. Further, each of those simulated computers could then simulate two more, and so forth, leading to an obvious contradiction (unbounded storage would require unbounded entropy, and a finite physical system can only hold a finite amount of information). Note that this argument holds even if the simulation is slower than real time; no matter how long it takes to simulate, you can't store more memory than you had to start with.
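To spell the doubling argument out slightly more formally (my own sketch; α is the assumed simulation ratio, and the Bekenstein bound is the standard "finite system, finite information" result I'm leaning on):

```latex
% Assumed: a machine of mass m perfectly simulates mass \alpha m, with \alpha > 1.
\underbrace{\alpha^{k} m}_{k \text{ nested simulations}}
  \xrightarrow{\;k \to \infty\;} \infty,
\qquad \text{yet} \qquad
S \;\le\; \frac{2 \pi k_B R E}{\hbar c} \;<\; \infty
\quad \text{(Bekenstein bound, with radius } R \text{ and energy } E\text{)}.
```

Since a finite system's entropy is bounded, the nested blow-up is impossible, so α ≤ 1: you can never perfectly simulate more matter than you're made of.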

Now, of course this all hinges on the word "perfectly". There's no reason a computer can't simulate large amounts of matter with less-than-perfect fidelity, which is something we do all the time. But given that we can build working computers, nuclear reactors, particle accelerators, and all that -- let alone the vastly more complicated processes going on in each and every cell in your body -- we are clearly not living in some cut-rate simulation that hand-waves the laws of physics. We don't know how to model all of this stuff in a computer, but given that it takes supercomputers to simulate hydrogen atoms accurately, and we can't even solve the equations exactly by the time we get to helium, it seems safe to assume that no matter how sophisticated our technology becomes, it will always require a couple of orders of magnitude more matter than whatever you're trying to simulate. (If you doubt this, consider a computer trying to simulate itself: can you really picture a machine with 4GB of memory accurately simulating the behavior of 4GB of RAM at the subatomic level? It can't even emulate a different computer with 4GB of memory, let alone simulate one at the subatomic level.) So we're talking about a computer which is, at an absolute minimum, a couple of orders of magnitude bigger than the entire universe.

(For completeness, I will point out two possible "outs" for this problem. First, it's possible that there's some trickiness going on, and "the entire universe" isn't actually modeled -- maybe only a small portion of the universe is modeled accurately, and everything else is an easy low-grade simulation used to trick us. That's certainly possible, but it's also unfalsifiable, so I'm not sure it's worth seriously debating. Second, this assumes that the simulator and the simulation are operating under the same laws of physics. If the "real world" simulating our world has different laws of physics, which allow for vastly more powerful computers than anything we could possibly hope to build using our cheap low-grade physics, this scenario wouldn't be as ridiculous. And, really, quantum mechanics is so weird that "it was outsourced to the lowest bidder" may actually be a decent explanation for it.)

Regardless, though, I don't understand how the "it is much more likely that we exist in a simulated universe" idea is getting serious traction. No, it's not impossible, but "likely" is a hell of a stretch.

Comment Re:Brakes, please. Please? (Score 1) 226

I think you're on the wrong side of this one, obviously. Words and expressions change meaning over time, and at this point you might as well be upset over the fact that people use the word "computer" to mean "electronic calculating device" instead of its original meaning, "a person who performs tedious calculations by hand".

It's time to give up and accept that you have lost the fight. "Begs the question" now means "raises the question".

Comment Re:Cost? (Score 1) 157

And you're ignoring all of the work it takes to keep a pilot alive and at least reasonably comfortable. A UAV doesn't need a pressurized cockpit, comfortable air temperatures, a complicated and expensive ejection system (comprising not only explosives, rocket motors, and parachutes, but also survival and rescue gear such as flares, food and water, dye packs, and smoke grenades), or for that matter even seats. Nor does it need any of the input/output devices that a human pilot needs in order to actually fly the plane -- no display screens, gauges, joysticks, or anything of the sort.

I can't imagine that all of that doesn't VASTLY reduce the cost of such an aircraft. Not only do you save the incremental cost of cramming all of that gear into each plane, you also eliminate the R&D cost of developing it in the first place and running tons of tests to make sure it works reliably.

Comment Re:Steve responds (Score 1) 282

OS X is supposedly UNIX but when I want to save things off the internet it will only let me go down one directory.

Have you even USED Mac OS X?

First off, "saving things off of the internet" is an application feature, not an OS feature. Second, if we're talking about Safari, it allows you to save files in whatever directory you want. By default it uses "Downloads", but you can either use Save As... or Download Linked File As... and specify a target directory. And if you don't like Safari, you can use Firefox, or Chrome, or hell, Lynx or wget if you like, and use their particular means of specifying target directories.

Comment Re:50gb BR disc : 3$ - 16gb USB key : 30$ (Score 2, Insightful) 277

Why on earth are you quoting manufacturing costs in one case and retail costs in the other? Retail Blu-ray discs cost around $25-$30 -- right around the same as your quoted 16GB USB key price. As I don't know the manufacturing cost of flash memory, and evidently you don't either, we have no basis for comparison.

Comment Re:caveat (Score 3, Informative) 260

Primarily because they are cheap to breed and raise, take up very little space, reach maturity quickly, and people usually don't freak out about experiments on mice the way they would about experiments on (say) primates.

They are also an acceptable human analogue in that they generally respond to medication and treatments similarly to how a human would; there are certainly other animals which are better models, but there are logistical, economic and public relations issues with trying to keep hundreds of chimpanzees in order to punch holes in their ears.
