
Comment: Re:Was it taken out of context? (Score 0) 306

by egomaniac (#40749315) Attached to: Gartner Analyst Retracts "Windows 8 Is Bad" Claim

I'm curious how right click -> "Make Alias" is "one of the most counter intuitive things [you've] ever experienced in any UI."

I found an article on Microsoft's site describing how to do it on Windows, and I'm not sure how this:

1. Right-click an open area on the desktop, point to New, and then click Shortcut.
2. Click Browse.
3. Locate the program or file to which you want to create a shortcut, click the program or file, click Open, and then click Next.
4. Type a name for the shortcut. If a Finish button appears at the bottom of the dialog box, click it. If a Next button appears at the bottom of the dialog box, click it, click the icon you want to use for the shortcut, and then click Finish.

counts as more intuitive in any universe.

And while I certainly agree that struggling to figure out why a particular program isn't working can be very frustrating, surely you're not suggesting that this issue is unique to MacOS? Go dig through the technical support forums for, oh, any Windows game ever if you'd like to disabuse yourself of that notion.

Comment: Re:No null pointers (Score 1) 232

by egomaniac (#38808171) Attached to: Mozilla Releases Rust 0.1

I don't think I could possibly disagree with this more. Arguing that a null-free language doesn't buy you any safety is like arguing that type checking doesn't buy you anything; you are correct that you can still make the exact same mistakes, but the difference is that the compiler will catch most of them. And having the compiler catch most of your mistakes before they become hard-to-find bugs in your runtime is exactly why most serious development is done in type-safe languages. Null is really the last holdout of non-type-safety, and it needs to go.
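A minimal Rust sketch of the point (the function and data are made up for illustration): with Option<T> in place of null, "there might be no value" is part of the type, so forgetting to handle the empty case is a compile error rather than a latent runtime crash.

```rust
// Option<T> makes "no value" explicit in the type: the compiler
// refuses to let you touch the inner value without handling None.
fn find_even(xs: &[i32]) -> Option<i32> {
    xs.iter().copied().find(|x| x % 2 == 0)
}

fn main() {
    let xs = [1, 3, 4, 7];
    // `let y = find_even(&xs) + 1;` would not compile -- the type
    // system forces you to decide what happens when nothing matches.
    match find_even(&xs) {
        Some(x) => println!("first even: {}", x),
        None => println!("no even numbers"),
    }
}
```

The compile-time rejection of the commented-out line is exactly the "compiler catches most of them" behavior being argued for.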

Comment: Re:unprecedented heights of productivity (Score 5, Informative) 223

by egomaniac (#38597966) Attached to: Germans Increase Office Efficiency With "Cloud Ceiling"

It's not fair to say that the house is built in six weeks. Yes, a house can be assembled from finished materials in six weeks, but you're not counting the effort to cut down the trees, transport them to a lumber mill, turn them into boards, mine the gypsum, turn it into drywall, mine the iron, convert the iron into steel wire, turn the steel wire into nails, refine oil into the raw plastic for pipes, mold the plastic into pipes and pipe fittings, transport all of these products all of the way from the factory to the building site, and on and on and on.

You can only build a house in six weeks because an army of people is busily creating all of these finished materials for you, and if you add up all of the labor, it probably does come to somewhere in the neighborhood of twenty man-years of work to create a house.

Comment: Re:Sad but smart (Score 1) 500

by egomaniac (#35357248) Attached to: The Decline and Fall of System Administration

Did you just say "Mac server"? ... That will eat away at your profits a whole lot faster, because you'll just buy a new "Mac server" or something :)

So you've evidently got no issue with a Windows server, but you take specific exception to Macs? That's odd, considering one of those two is a reliable Unix flavor, and the other is... well, Windows. What exactly is the issue with a Mac server?

Comment: Re:Naive Question (Score 1) 196

by egomaniac (#35351656) Attached to: Will the LHC Smash Supersymmetry?

If someone discovers a proof that P==NP, then even though we haven't found the practical solutions to some problems (factorization or whatever) yet, it means that there IS at least one "quick" solution.

Unfortunately, no, it doesn't imply a "quick" solution. All P=NP would mean is that these problems have polynomial-time solutions; it says nothing about how efficient those polynomial-time solutions would be. Your only guarantee is that for sufficiently large N, any polynomial bound is better than any exponential bound. The polynomial can still be ridiculously, phenomenally huge (like O(N^4000)), and "sufficiently large N" can also be a ridiculously large number. If you came up with a polynomial-time Traveling Salesman algorithm that doesn't start to beat exponential time until you're handling 1,000,000,000,000,000,000,000 cities, you've technically proven P=NP, but in a completely useless fashion.
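To put rough numbers on that, here's a small Rust sketch (exponent values chosen purely for illustration) that finds the smallest N at which a polynomial bound N^k finally dips below an exponential bound 2^N. Comparing logarithms avoids overflowing on astronomically large values:

```rust
// Smallest N where the polynomial bound N^k beats the exponential
// bound 2^N. Compare logarithms to avoid overflow:
//   N^k < 2^N  <=>  k * ln(N) < N * ln(2)
fn crossover(k: f64) -> u64 {
    let mut n: u64 = 2;
    while k * (n as f64).ln() >= (n as f64) * 2f64.ln() {
        n += 1;
    }
    n
}

fn main() {
    // Even a modest O(N^100) bound doesn't beat 2^N until N is
    // roughly a thousand; O(N^4000) takes tens of thousands.
    println!("k = 100  -> crossover near N = {}", crossover(100.0));
    println!("k = 4000 -> crossover near N = {}", crossover(4000.0));
}
```

So an O(N^4000) algorithm is "fast" only for inputs far larger than anything you'd ever run it on, which is the whole point above.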

Comment: Re:Technological independence (Score 1) 88

by egomaniac (#35080518) Attached to: Russia Launches, Loses, Finds Military Satellite

So... did someone else launch a natural satellite?

Is that a serious question? The moon is a natural satellite.

Yes, it is natural. But it wasn't launched by man. That's sorta the point -- if a satellite was 'launched', you already know that it's artificial so it's redundant to say that.

Comment: Re:If that is representative of watson's capabilit (Score 1) 164

by egomaniac (#34546688) Attached to: 'Jeopardy!' To Pit Humans Against IBM Machine

Agreed, I crushed it. I'm very impressed by its ability to answer some questions (it actually got "Black Death of a Salesman" and "Charlie Brown Recluse"), which shows some very sophisticated linguistic analysis, but if it can't beat some random schmuck on the Internet, I don't see how this will be an interesting event.

Comment: Re:Been running a dev build for a few weeks now (Score 1) 212

by egomaniac (#34309606) Attached to: Apple iOS 4.2 Hands-On

If an app is linked against pre-4.0 libraries, pressing the home button kills it instead of putting it to sleep. Simply recompiling the app against the new libraries will enable multitasking support (there are some changes you might want to make, but I don't think any of them are required).

So far as I know, this isn't for any technical reason. The early 4.0 betas enabled multitasking for all apps, and it was only midway through the beta that the behavior changed so that legacy apps quit instead of sleeping. I suspect this was done mainly for safety's sake -- prior to 4.0, you were constantly quitting apps and thereby making them regularly save their data. Post-4.0, an app might go for days without actually quitting. Such apps wouldn't receive any signal that they were being put to sleep (since the APIs to tell them that didn't exist when they were compiled), potentially for days, meaning that a crash could wipe out days of data. Prior to 4.0, you would simply never run a single app for days straight, and multitasking-aware apps don't suffer from this problem because they know to save their data at convenient intervals.

Comment: Re:Physicists (Score 4, Informative) 166

by egomaniac (#33978098) Attached to: Fermilab To Test Holographic Universe Theory

So given Moore's law you will eventually end up with a single physical universe and hugely many simulated universes.

Moore's law is an observation about how fast technology is developing, not an incontrovertible law of physics. It will not hold forever, because eventually we will run up against physical limits preventing us from cramming more computing power into a given region.

In particular, it is impossible for a given amount of matter to perfectly simulate more matter than itself. If it were possible -- if you could e.g. use a ten kilogram computer to simulate twenty kilograms of matter -- then your ten kilogram computer could simulate two of itself, doubling its storage. Further, each of those computers could then simulate two more, and so forth, leading to an obvious contradiction (infinite storage requires infinite entropy, which has been proved impossible). Note that this argument holds even if the simulation is slower than real time; no matter how long it takes to simulate, you can't store more memory than you had to start with.
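The doubling argument above can be written out as a short contradiction (the Bekenstein bound is invoked here as the standard entropy limit; the rest follows from the stated assumption):

```latex
\text{Assume a machine of mass } m \text{ can perfectly simulate mass } \alpha m,\ \alpha > 1.
\text{Nesting the simulation } k \text{ times yields simulated mass } \alpha^{k} m \xrightarrow{k \to \infty} \infty.
\text{But the entropy (information capacity) of a finite region is bounded, e.g. }
S \le \frac{2\pi k_B R E}{\hbar c} \quad \text{(Bekenstein bound)},
\text{so unbounded storage from fixed mass is impossible; hence } \alpha \le 1.
```

Note that time never enters the argument, which is why it survives arbitrarily slow simulation.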

Now, of course this all hinges on the word "perfectly". There's no reason a computer can't simulate large amounts of matter with less-than-perfect fidelity, which is something we do all the time. But given that we can build working computers, nuclear reactors, particle accelerators, and all that -- let alone the vastly-more-complicated processes going on in each and every cell in your body -- we are clearly not living in some cut-rate simulation that hand-waves the laws of physics. We don't know how to model all of this stuff in a computer, and given that it takes supercomputers to simulate hydrogen atoms accurately, and we can't even solve the equations by the time we get to helium, it seems safe to assume that no matter how sophisticated our technology becomes, simulation will always require a couple of orders of magnitude more matter than whatever you're trying to simulate. If you doubt this, consider the practical example of a computer trying to simulate itself: can you really picture a computer with 4GB of memory accurately simulating the behavior of 4GB of RAM at the subatomic level? It can't even emulate a different computer with 4GB of memory, let alone simulate it at the subatomic level. So, we're talking about a computer which is, at an absolute minimum, a couple of orders of magnitude bigger than the entire universe.

(For completeness, I will point out two possible "outs" for this problem: First, it's possible that there's some trickiness going on, and "the entire universe" isn't actually modeled. Maybe only a small portion of the universe is modeled accurately, and everything else is an easy low-grade simulation used to trick us. That's certainly possible, but it's also unfalsifiable, so I'm not sure it's worth seriously debating. Second, this assumes that the simulator and the simulation are operating under the same laws of physics. If the "real world" which is simulating our world has different laws of physics, which allow for vastly more powerful computers than anything we could possibly hope to build using our cheap low-grade physics, this scenario wouldn't be as ridiculous. And, really, quantum mechanics is so weird that "it was outsourced to the lowest bidder" may actually be a decent explanation for it.)

Regardless, though, I don't understand how the "it is much more likely that we exist in a simulated universe" idea is getting serious traction. No, it's not impossible, but "likely" is a hell of a stretch.

Comment: Re:Brakes, please. Please? (Score 1) 226

by egomaniac (#33069436) Attached to: The Physics of a Rolling Rubber Band

I think you're on the wrong side of this one, obviously. Words and expressions change meaning over time, and at this point you might as well be upset over the fact that people use the word "computer" to mean "electronic calculating device" instead of its original meaning, "a person who performs tedious calculations by hand".

It's time to give up and accept that you have lost the fight. "Begs the question" now means "raises the question".

Comment: Re:Cost? (Score 1) 157

by egomaniac (#32888890) Attached to: Boeing, BAE Systems Show Off New Unmanned Planes

And you're ignoring all of the work it takes to keep a pilot alive and at least reasonably comfortable. A UAV doesn't need a pressurized cockpit, comfortable air temperatures, a complicated and expensive ejection system (comprising not only explosives, rocket motors, and parachutes, but also survival and rescue gear such as flares, food and water, dye packs, and smoke grenades), or for that matter even seats. Nor does it need any of the input / output devices that a human pilot needs in order to actually fly the plane -- no display screens, gauges, joysticks, or anything of the sort.

I can't imagine that all of that doesn't VASTLY reduce the cost of such an aircraft. Not only do you save the incremental cost of packing all of that gear into each plane, you eliminate the R&D cost of developing it in the first place and running tons of tests to make sure it works reliably.

