
Comment Re:ARM Windows (Score 1) 167

It's actually easier to recompile existing 32-bit x86 code for 32-bit ARM than for 64-bit x64, especially if Microsoft released an ARM backend for the Visual C++ compiler. As long as Windows-for-ARM came out before too many applications transitioned to 64-bit only, it's easy to imagine it succeeding.
If they're aiming at the tablet/netbook market, the lack of hardware drivers won't be a problem; they just need to support the on-board hardware and a few key applications (IE, Office, Flash). Ironically, if Apple's AirPrint takes off, they won't even need printer drivers. If they were able to run .NET, that would give them a lot of compatibility for free, even for in-house corporate apps.

Comment Re:60fps on a phone? Why? (Score 1) 105

Let's not forget latency. On modern 3D architectures, there can be several frames between when the game engine processes user input and when the result appears on the screen. Typically the CPU is filling out a display list for frame 3 while the GPU is rendering frame 2 and the display is showing frame 1. And this is on top of any additional latency in input processing. At 60 fps, 3 frames is under 50 ms, while at 30 fps it's 100 ms. For an amazing display of what low latency is like, try playing something like Kaboom on an Atari connected to an analog tube TV: no buffering, 60 fps, so only about 16 ms between moving the paddle and seeing it move on screen. For racing games, latency has a huge impact on the user's ability to control a vehicle without entering oscillation. Also, depending on how far you hold the phone from your eyes, an iPhone game could potentially take up any amount of the visual field. 60 fps is definitely noticeable on an iPhone. Whether the tradeoff of reduced graphical detail versus 30 fps is worth it is a very subjective choice, which is why even on home consoles there is no standard.
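The pipeline math above is simple enough to sketch directly. This is a rough back-of-the-envelope estimate, not a model of any particular GPU; `latencyMs` and `pipelineDepth` are names made up for the illustration:

```javascript
// Rough input-to-screen latency for a buffered rendering pipeline.
// pipelineDepth = frames between input sampling and display, e.g. 3:
// the CPU builds frame 3 while the GPU renders frame 2 and the
// display scans out frame 1.
function latencyMs(fps, pipelineDepth) {
  return (pipelineDepth / fps) * 1000;
}

console.log(latencyMs(60, 3)); // 50 ms at 60 fps with a 3-frame pipeline
console.log(latencyMs(30, 3)); // 100 ms at 30 fps with the same pipeline
console.log(latencyMs(60, 1)); // ~16.7 ms, close to the single-buffered Atari case
```

Note that halving the frame rate doubles the latency of every stage of the pipeline, which is why the difference feels larger than "twice as slow".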

Comment Re:In their defense (Score 1) 965

Don't forget that the primary reason for the existence of Apple Inc is to facilitate the orderly and systematic transfer of money from the bank accounts of bored yuppies to the account of Steven Jobs.

That was priceless... and probably hits a little close to home for me these days.
As a youth programming C64s in BASIC and a little ASM, part of the appeal was being able to make programs (e.g. games) that weren't too far off in complexity/polish from commercial offerings. Nobody was interested that you could fill the screen with "I LIKE BUTTS", but having a joystick controlled sprite character wandering around shooting things was kind of cool.
I wonder if a better equivalent today would be writing JavaScript/HTML in a web browser, or perhaps Flash. Much as when I was doing PEEK and POKE in BASIC while pro developers were doing crazy hand-coded assembly hacks to get ultimate performance, the same relationship could exist between JavaScript or ActionScript and C++. Just as cutting my teeth in BASIC helped lay a foundation to eventually learn C/C++ and become a professional software developer, this might be how the next generation will start out. For a kid, getting something interesting to happen when they code is probably the most important thing to get them hooked.
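The "joystick-controlled sprite" that felt so rewarding on the C64 takes only a few lines of modern JavaScript. A minimal sketch of the state update a beginner's game loop would call each frame (the canvas drawing is omitted, and the `makeSprite`/`updateSprite` names are invented for the example):

```javascript
// The modern equivalent of POKEing a sprite's position registers:
// a plain object plus an update function driven by joystick input.
function makeSprite(x, y) {
  return { x, y, alive: true };
}

// joystick: { dx: -1|0|1, dy: -1|0|1, fire: boolean }
function updateSprite(sprite, joystick, speed) {
  sprite.x += joystick.dx * speed;
  sprite.y += joystick.dy * speed;
  return sprite;
}

const player = makeSprite(160, 100);
updateSprite(player, { dx: 1, dy: 0, fire: false }, 2);
console.log(player.x, player.y); // 162 100
```

In a browser this would be wired to keyboard events and redrawn on a canvas each frame, but the immediate cause-and-effect is the same hook that BASIC offered.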
From what I recall, the iPhone/iPad restrictions would allow some sort of web-development app to be created as long as it used the WebKit JavaScript runtime. I'm not a web developer so I could be way off, but I suspect that even with Apple's restrictions there could be some pretty cool stuff for budding programmers on an iPad. Also, there's no reason Apple couldn't make an iPad version of Xcode that would have the same restrictions as the regular iPad SDK ($99/yr contract, only run on developer iPads, require App Store approval for public release). I think a $499 iPad plus $99/yr is still considerably cheaper in inflation-adjusted dollars than a C64 was in 1984 (>$1200 in today's dollars).

Comment Re:Not just software (Score 1) 306

In the previous era of console games, before they started supporting patches, games were treated much more like hardware in this sense: once you made the gold master and started printing copies, you couldn't change it. When you compare PC games of that era with console games, the rate of crashes and bugs was much higher in PC games. This was partly, of course, because they had to run on a zillion configurations and depend on buggy device drivers, but also because the console makers had fairly rigorous submission-testing requirements and could hold up a game from shipping if it didn't meet them. (In the case of first-party games there is an inherent conflict of interest, but typically the approval process was fairly independent of the console makers' publishing arms.) By contrast, PC games (like other PC software) have no such oversight, so developers/publishers do however much or little quality control they feel like.
The PC approach is to let the market reward or punish software companies for how buggy their products are. Unfortunately, so much software uses various means of lock-in to prevent users from switching that the focus ends up on getting users to pay for upgrades, sometimes merely to fix things that shouldn't have been broken in the first place. Even that incentive is being removed as more software packages try to force users to constantly upgrade (e.g., making different versions non-interoperable while not selling older versions, so that adding new seats to a company may force you to upgrade the whole office).
For software used in high-risk situations (e.g., medical software, aeronautics, space flight), the penalty for failure is high enough that people are willing to pay (and wait) for extensive quality control. For most other software, this is not the case.

Comment Re:Good thing it's a beta (Score 1) 496

I've built a couple of internal GUI tools at work, and I see this all the time among my less technical coworkers: they just click OK on anything that pops up without reading it, even if it's an error message, and then come ask me why something isn't working. Part of the problem is too many apps crying wolf with needless popups and confirmation boxes, which has trained people to dismiss them, but it's also just the nature of most people to keep clicking on different things semi-randomly until they get the result they wanted.

I think the only real solution is better UI design--make things work the way people expect them to, make doing the right thing seem easier and more obvious than doing the wrong thing, try to make dangerous things more buried away, etc.

Probably the only good time to use a popup box is a failure state where the app simply can't do what is requested: even if they close the box, they'll keep trying again and getting the box again, and eventually they might read it.
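That rule of thumb fits in a few lines. A sketch of the "quiet UI" idea in JavaScript, where `showModal` and `showInlineNotice` are hypothetical stand-ins for whatever the toolkit actually provides:

```javascript
// Escalate to a modal dialog only for hard failures the user cannot
// proceed past; everything else becomes a passive inline notice.
function reportProblem(problem, ui) {
  if (problem.blocksRequestedAction) {
    // They'll hit this again until they read it, so a modal is justified.
    ui.showModal(problem.message);
  } else {
    // Visible but non-interrupting; no wolf-crying.
    ui.showInlineNotice(problem.message);
  }
}
```

The point isn't the code itself but the single deliberate decision it encodes: every message must justify an interruption, and almost none can.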

This is one of the things I prefer about OS X, in that it seems much more "quiet" with far fewer popups, flashing task tray notifications, etc. The one exception is the way the System Update icon just keeps bouncing up and down if it has a new update--it would be nice if after ten seconds it would switch to a less annoying animation, or maybe just bounce every now and then, so that if you're in the middle of reading an article or something, you don't feel like you have a two-year-old tugging your pant leg.

Hardware Hacking

Submission + - Cold Fusion at room temperature possible after all?

StarfishOne writes: DailyTech reports the following interesting bit of news today:

" Cold fusion, the ability to generate nuclear power at room temperatures, has proven to be a highly elusive feat. In fact, it is considered by many experts to be a mere pipe dream — a potentially unlimited source of clean energy that remains tantalizing, but so far unattainable.

However, a recently published academic paper from the Navy's Space and Naval Warfare Systems Center (Spawar) in San Diego throws cold water on skeptics of cold fusion. Appearing in the respected journal Naturwissenschaften, which counts Albert Einstein among its distinguished authors, the article claims that Spawar scientists Stanislaw Szpak and Pamela Mosier-Boss have achieved a low energy nuclear reaction (LENR) that can be replicated and verified by the scientific community."

NewScientist is also running an article on this subject, but it is available only to subscribers.
United States

Submission + - Preventing Global Warming Costs Only 0.1% of GDP

reporter writes: "According to a report by "The Economist", the United Nations has just released a study demonstrating that preventing global warming is relatively inexpensive. The study states that the cost is only 0.1% of gross domestic product per year."

Submission + - The Principality of Sealand Finally gets new owner

An anonymous reader writes: After four weeks of negotiations and in a historic move, the founders of have purchased the Principality of Sealand for an undisclosed sum. The founders, represented at negotiations by Paul S. Gates, and the Bates family are both said to be pleased by the settlement figure, which will not be announced in keeping with the final agreement. Viva la revolution!

Feed Will there ever be a real 'Lie Detector'?

Polygraph Pollyannas

Column Lie detectors figure prominently in the sauciest dramas, like espionage and murder, but they deeply polarize opinion. They pit pro-polygraph groups like the CIA, the Department of Energy and police forces against America's National Academy of Sciences, much of the FBI, and now the US Congressional Research Service. The agencies in favor of lie detectors keep their supporting data secret or obfuscated. The critics have marshaled much better arguments.
