
Comment Wrong question (Score 1) 313

You don't want to make it even more cumbersome to change the code, as it sounds like you are already struggling with the 10 Mloc codebase. So forget about having humans "approve" the changes.

What you want to do is make it easy to submit good code and difficult to submit bad code. This means that you will need the capability to quickly assess the proposed patch, for some definition of "good" and "bad". Computers are fairly good at this. In other words: test-first development, with automated testing on several levels.

Bug comes in, you write an automated user acceptance test that FAILS. Then you try to find and fix the bug. When the newly added test passes, the bug is by definition fixed. Add a couple of unit/module tests, and a couple of UATs for good measure. Rinse and repeat.

There are plenty of tools for this. All major platforms support programmatic reflection. Use e.g. Cucumber for the acceptance tests, and the relevant unit test framework for module/unit tests. In addition to the functional tests, try static analysis.
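The red-green cycle described above can be sketched in miniature. The function and the bug here are hypothetical stand-ins, and the framework is Python's built-in unittest; substitute whatever your platform uses:

```python
import unittest

# Hypothetical function under repair. Bug report: parsing an empty
# line raised ValueError instead of returning an empty list.
def parse_csv_ints(line):
    if not line:          # the fix; before it, int("") blew up
        return []
    return [int(field) for field in line.split(",")]

class TestParseCsvInts(unittest.TestCase):
    # Step 1: a test that reproduces the bug report. It FAILS until
    # the fix above is in place; when it passes, the bug is by
    # definition fixed.
    def test_empty_line_returns_empty_list(self):
        self.assertEqual(parse_csv_ints(""), [])

    # Step 2: a couple of extra regression tests for good measure.
    def test_normal_line(self):
        self.assertEqual(parse_csv_ints("1,2,3"), [1, 2, 3])
```

Run with `python -m unittest`; wiring a CI hook that rejects patches with failing tests is what makes bad code hard to submit.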

Comment Re:Wait a minute (Score 1) 54

Is this an article about how the Windows 8 UI was designed?

Or about how they held the world's population hostage with Clippy the Paperclip? I mean, when they heard Clippy was going to be removed from the next version of Office, around 350 million people upgraded straight away.

Or is it about how Microsoft is paying 500 million (USD, EUR, whatever) in fines every couple of years, in order to keep doing business as a software monopoly? That is probably the most brilliant crime by the Microsoft Digital Crimes Unit ever!

Comment Juggling (Score 1) 279

Learn to juggle! Seriously! I learned to juggle with three balls during a particularly stressful software project some 15 years ago. Nowadays when I feel blocked, I pick up three round objects and go somewhere else to juggle for a while. I haven't progressed beyond three objects but then again I'm not doing it for the fame and the money. :)

Juggling activates other parts of your brain than you (as a software engineer or IT guy) normally use. You can juggle as long as you like, ten seconds or ten minutes. The materials (e.g. stress balls, tennis balls, apples, oranges, whatever) are cheap and small. It does you good to get up out of that chair and stop staring at the monitor for a moment. If someone asks what you are doing, say that you're taking a minute to think about some small creative problem, e.g. structuring the next three paragraphs you are writing, or looking for alternative ways to implement some feature. The learning curve is admittedly steep, but the Internet should be full of tutorial videos by now. Also, juggling is a nice party trick; kids especially are fascinated.

Think of it as an off-screen microhobby...


Comment Re:Oh, great, exactly what I don't want... (Score 1) 248

> It's a fact that OS/UI developers seem to believe that [something]

A seeming fact, then? Or an opinion?

> we generally need increasingly large displays [read: pixel counts] in order to restore focus on the application and to minimize the impact on screen and usability which the OS/UI claims.

Not at all. We generally need more and better modes of interacting with our phones and the apps running on them. You see, there is an upper limit to the physical size of a phone: it needs to fit into your pocket. And there is also a lower limit on the size of a UI element: your fingertip. Increasing the resolution only enables the device to show sharper text and more detailed pictures.

There are many "novel" or "trendy" interaction methods like swiping, pinching, multi-finger dragging, gestures and so on, that are not yet commonly used in mobile phones or mobile apps. Properly used, they can free up screen real estate. For example, an app that supports pinch-to-zoom doesn't need on-screen zoom buttons.

Many if not all of these interaction methods can profitably be reserved for the system UI. If you replace the most common system UI functions with gestures, you can remove the status and system bars and free up the top and bottom parts of the screen. The Android 2.x swipe-from-the-top action is a good example, as it allows the top status bar to become very narrow. However, there is no reason to stop there.

> Android 2.x seeks to minimize the UI impact and it does a nice job of it. A minimal row of buttons

No it doesn't. The iPhone minimizes the UI impact by simply NOT having that row of buttons. The Nokia N9 uses swiping creatively to do away with even the physical buttons and touch areas that you find on Androids, iPhones and Windows Phones.

All of these phones, even the N9, still have a slim status area at the top. Jolla's Sailfish aims to do away with even that.

> What Ubuntu-phone is proposing is unintuitive and seeks to infringe on how an app can live on a device. Do. Not. Want.

So you say. Now, in the real world, what Ubuntu proposes is really going to make some if not all of the system areas redundant and give apps more control over the screen real estate.

What is intuitive, by the way? In this context something is intuitive if you can pick it up and use it without training. It's not even remotely synonymous with "whatever I've gotten used to", as you seem to imply.

Have you actually tried a swipe UI? I have been using a swipe-based phone (the aforementioned N9) for a long time now, and it just works. You can either take my word for it, or go and try it yourself before commenting further.


Comment All models are wrong (Score 4, Interesting) 676

"Remember that all models are wrong; the practical question is how wrong do they have to be to not be useful." (George E.P. Box and Norman R. Draper, Empirical Model-Building and Response Surfaces (1987), p. 74)

"One of the most insidious and nefarious properties of scientific models is their tendency to take over, and sometimes supplant, reality." (Erwin Chargaff)

I think that says it all, really.


Comment The Celine Dion effect (Score 1) 89

One wonders if they'll be turned off by Celine Dion music — a new type of shark repellent perhaps?

Oh yes, the "Celine Dion effect" is well known. For example, playing My Heart Will Go On in railway stations late at night will magically keep the stations free of pickpockets, rapists and other miscreants, and indeed of any sentient beings except cockroaches. The high notes will first melt your earwax and then your brain. Cockroaches merely lose their orientation and walk into walls.

I don't see any reason why it wouldn't work underwater too. Unless of course the sharks fight back with lasers.


Comment Re:Sad, but we could see it coming (Score 1) 78

One element I pushed was that nobody was going to be interested in their kernel, regardless of what they did, and that conversion to Linux would eventually be necessary if they were not to continue to expend millions on re-inventing the wheel.

Not a good thing to push, because the kernel is the interesting part of Symbian. It's power-efficient and has real-time features, both of which are very valuable in a mobile communications device. Unfortunately it only runs on ARM. Linux, on the other hand, runs on everything. With Qt on top of both Symbian and MeeGo, there's nowhere Nokia can't go. (There's no guarantee they'll actually go there, but they *could*.)


Comment Re:If this were posted to photo.net... (Score 1) 114

Ouch - this is the best that Hubble can do? The images show serious chromatic aberrations, with significant red-blue fringing on edges. What's worse is that the effect gets more pronounced as the camera moves around.

Given that the camera moves at relativistic speeds, the chromatic aberrations are probably a relativistic effect and would of course get more pronounced the faster the camera moves. Another interesting side effect is that while for you the movie is over in a matter of minutes, someone observing you will feel that the movie takes too long, and incidentally also perceive you as significantly smaller.
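Both the slower movie and the shrinking camera fall out of the Lorentz factor; a toy calculation, with the speed and running time chosen purely for illustration:

```python
import math

def lorentz_gamma(beta):
    """Lorentz factor for speed v = beta * c (0 <= beta < 1)."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

# At 90% of the speed of light:
gamma = lorentz_gamma(0.9)                          # about 2.29
movie_minutes_for_camera = 3.0
# A stationary observer sees the camera's 3-minute movie dilated...
observer_minutes = movie_minutes_for_camera * gamma  # about 6.9 minutes
# ...and measures the camera contracted along its direction of motion.
contracted_length_fraction = 1.0 / gamma             # about 0.44
```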

Kids, don't try this at home!



Submission + - Red Hat, Novell issue back-to-back announcements (techtarget.com)

An anonymous reader writes: In the past 24 hours, the Red Hat Enterprise Linux (RHEL) beta became available through Amazon's Elastic Compute Cloud (EC2) service. While RHEL has been available in limited beta for a few weeks, this public beta announcement came on the heels of Novell's launch of SUSE Linux Enterprise Real Time 10, its latest enterprise OS offering. The launch of SUSE Linux Enterprise Real Time 10 is arguably the bigger announcement. Novell's new SUSE is a real-time operating system, allowing critical processes to take priority over other tasks for processor time.

Submission + - SPAM: Which certifications are worth your time?

alphadogg writes: For years, the key to jumpstarting a network professional's career was getting a Cisco, Microsoft or other technical certification. But now CIOs, IT recruiters and salary specialists say demand is waning for hardware- and software-oriented certifications. Instead, companies are looking for IT professionals with business-oriented certifications in such areas as project management and Six Sigma, a statistical quality improvement technique that is being adopted by more IT shops.
Link to Original Source

Submission + - Three-Way Music Software Shootout (extremetech.com)

ThinSkin writes: "Squabbling with bandmates is an old and busted tradition for rock stars. With games like "Rock Star" and "Guitar Hero," gamers are turning to media to unleash their inner rock star. Musicians a little more serious about crafting music can turn to computer apps to translate ideas into songs. Joel Durham Jr. over at ExtremeTech has a review of three popular music creation programs: Cakewalk SONAR Home Studio 6 XL, Sony ACID Music Studio 7.0, and MAGIX Music Maker 12 Deluxe. Each program has enough versatility for users to begin making music, but the right program depends on a person's skill level and budget."

Submission + - Is our Universe Somebody Else's Hobby? (hughpickens.com)

Pcol writes: "The New York Times is running a story on philosopher Nick Bostrom, Director of the Future of Humanity Institute at Oxford University, who has a web site on "The Simulation Argument" that contends that there is a pretty good chance that we are living in someone else's computer simulation.

Dr. Bostrom assumes that technological advances could produce a computer with more processing power than all the brains in the world, and that advanced humans, or "posthumans," could run "ancestor simulations" of their evolutionary history by creating virtual worlds inhabited by virtual people with fully developed virtual nervous systems. If civilization survived long enough to reach that stage, and if the posthumans were to run lots of simulations for research purposes or entertainment, then the number of virtual ancestors they created would be vastly greater than the number of real ancestors. There would be no way for any of these ancestors to know for sure whether they were virtual or real, because the sights and feelings they'd experience would be indistinguishable. But since there would be so many more virtual ancestors, any individual could figure that the odds made it nearly certain that he or she was living in a virtual world.
Robin Hanson, an economist at George Mason University, says that if you desire to live as long as you can in this virtual world, and in any simulated afterlife that the designer of this world might bestow on you, you should try to be as interesting as possible, on the theory that the designer is more likely to keep you around for the next simulation."
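Bostrom's odds argument reduces to simple counting; a sketch with illustrative, made-up numbers (the real argument does not depend on the exact figures, only on the ratio being large):

```python
# If posthumans run many full ancestor simulations, simulated minds
# vastly outnumber real ones. Illustrative assumptions only:
real_ancestors = 10**11          # rough order of humans who ever lived
simulations_run = 1000           # assumed number of ancestor simulations
virtual_ancestors = simulations_run * real_ancestors

# A randomly chosen mind's chance of being one of the real ones:
p_real = real_ancestors / (real_ancestors + virtual_ancestors)
# about 1 in 1001, i.e. roughly 0.1%
```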
