
Comment Re:The Problem: code not seeing the light of day.. (Score 1) 123

Ensuring that all developers in the industry are competent is a pipe dream. Take a look at the most exacting careers you can think of, and you'll find varying levels of competence in each.

People are imperfect - even the very best of us have bad days and let typos slip by from time to time. Additionally, the real software lifecycle is not like frozen water. It is more like all the states of water - solid, liquid, and gas - changing as its environment changes, on a continuum from birth to death.

I agree we should do something. I think that 'something' should be more than just training and hoping they use what they've learned.

Comment Re:The Problem: code not seeing the light of day.. (Score 1) 123

People are not perfect automatons, so you always run the risk of - and probably will see - new bugs and vulnerabilities. That is okay, in the sense that the work still resets the clock (assuming you caught the existing zero days in the process). Now the attackers have to start over, and it will take them another 6 to 12 months to figure out an exploit for any bugs you introduced - assuming those bugs are exploitable at all. It therefore makes sense to review and refactor code on a recurring basis; the benefits outweigh the costs.

Comment The Problem: code not seeing the light of day... (Score 1) 123

The real problem here is the willingness to fund what is necessary: refactoring all code used in critical systems to ensure it is secure, and maintaining that approach over time on an iterative basis.

We should touch code - at least to review it - every year, which research indicates is the sweet spot for zero-day exploits. We get more benefit if we also refactor the code, effectively resetting the clock for exploit writers to find a new zero day and develop applications to exploit it.

Working in IT today, I can tell you from experience that no one is willing to spend money continually refactoring code without delivering new functionality (read: 'revenue-generating functionality'). This approach also runs counter to the instincts of software engineers trained to value code reuse over rewriting or building new solutions.

Instead, they focus on cosmetic band-aids such as firewalls, antivirus, patch updates, and policy management. All of these things are important, but in the scheme of things they will not stop a zero-day exploit. A patch cannot exist until the zero day is discovered and reported, and then it takes the developer or company in question, on average, another 6 months to a year to put out a fix. Meanwhile the network is wide open to anyone who has figured the hole out - which tends to happen roughly 6 months to a year after a new piece of software is deployed, because the problem is related more to how humans learn systems than to any particular coding practice. Your refactoring efforts just need to fall inside that curve - leading rather than following.

Finally, the proposed fixes, such as more regulation, will not solve the problem, and will only serve to drive people out of the business at the precise time when we need more developers than ever to address it effectively. What would actually help:


1. Pay for what is needed in IT instead of being cheap. If this area gets more specific regulation, you might not have a choice (cf. Sarbanes-Oxley).

2. Let your developers as a whole spend some time evaluating code - the more eyeballs you have, the better.

3. Move away from expensive waterfall projects to more flexible agile methods, and adjust your funding protocols to match.

Comment Generation Generalizations (Score 1) 394

Reading the post and the associated comments reminds me that the uptake, continued use, and internalization of technology are influenced by generational 'norms'.

Of course the Millennial is flabbergasted that you don't have Facebook, LinkedIn, and 10 other flavor-of-the-month social media whatz-its, his or her life being focused both on a network of friends and on the assurance that anything about a newcomer can be found online.

The Gen X-er shrugs and goes back to trying to make money (earning on average 12% less than his parents did, and trending down - yet optimistic to the end), thinking about his next job hop and which fraternity brother can put him in touch with the hiring manager.

The Boomer rails at the dangers of social media and the lack of independence of the Millennial, counting email as sufficient for any contact less formal than a letter.

Realizing that all generalizations are less than useless, I sit here in the corner loathing you all.

Comment Re:I agree .. BUT .... (Score 3, Informative) 232

Every organisation needs a "not boring" slot of time for their developers. Not for product that needs to ship NOW.. but for stuff that may need to ship next year.


Except I would add: "may never ship at all."

The key point here is that you aren't betting the company on it, but you still should be doing it. Every company should encourage innovation, even when it isn't willing to bet much cash on it. Another way is to encourage your developers to spend some time on their own personal FOSS projects. What this buys you is experience - and from a risk-vs-reward perspective, success is attained not by how much working (boring) code you produce, but by how many times you try something that fails, get up again, and keep pushing with new or modified ideas, turning that experience into real value for your customers. Companies without this perseverance will fail, or at best be mediocre.

On the flip side, if your core business (the part where you are trying to show your customers you are innovative and a leader) becomes too boring - and by 'too boring' I mean it may 'work' but not do what a customer really wants or needs - then you run the risk of losing those customers to someone who will try, and who is willing to fail.

Like all oversimplified prescriptions, the article's concept does not take into account the nuances of business goals, risk-aversion levels, the people and skills available, and so on.

Comment Not completely useless... (Score 1) 486

They should have viewed this presentation on making a Python data-crunching application 114,000 times faster before they set off on their research project.

To summarize: there are a multitude of ways to optimize your application, including using the chip's onboard cache to avoid the overhead and delay of reaching across the bus to memory on the motherboard.
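I can't vouch for the talk's exact numbers, but the cache point is easy to demonstrate even from Python. Here is a minimal sketch of my own (using NumPy; the array shape and approach are my assumptions, not taken from the presentation): summing a large row-major array along its memory layout versus against it. Same arithmetic, very different memory traffic.

    import time
    import numpy as np

    # 4000x4000 float64 array (~128 MB), stored row-major (C order) by default.
    data = np.ones((4000, 4000), dtype=np.float64)

    start = time.perf_counter()
    row_total = sum(data[i, :].sum() for i in range(data.shape[0]))  # contiguous reads
    row_time = time.perf_counter() - start

    start = time.perf_counter()
    col_total = sum(data[:, j].sum() for j in range(data.shape[1]))  # strided reads
    col_time = time.perf_counter() - start

    print(f"row-wise:    {row_time:.3f}s (cache-friendly)")
    print(f"column-wise: {col_time:.3f}s (poor locality)")

On typical hardware the row-wise pass wins handily, because each cache line fetched from RAM gets fully used before it is evicted, while the column-wise pass touches one element per line fetched.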

Yes, as we try to eke out more performance from our applications, we'll need to consider the relationship between our applications and the underlying implementation and capabilities of the hardware they live on. Further, I would say we should also be considering how to make our tools do this sort of thing for us. Given the complexities in development today - virtualization, the need to do more with less both on the back end and on small handheld devices, and the need to build more, faster, while increasing the security of what we build - I consider it imperative.

Comment Re:Kinesis (Score 1) 452

When I started to have wrist tendon pain about 10 years ago, I got a Microsoft ergonomic keyboard, which helped a lot, and I used it for years. Eventually, though, the pain returned. Part of it was the non-adjustable nature of that keyboard; the other part was its short-travel rubber dome switches, which were hard on my fingers and forced me to slow down and use a lighter touch. (I tend to pound the keyboard when typing fast, having learned to touch type on a manual typewriter back in the day, followed by years of buckling-spring keyboards before commodity computers and keyboards took over in the 90s.)

Today I have two Kinesis Maxims (one for work, one for home), and I love the way they adjust: three different 'tent' levels and a fully adjustable split angle. Even as good as that is, I still have an old IBM buckling-spring keyboard (from an old RS/6000 - not a Model M, but identical for all intents and purposes), and I recently purchased a Corsair gaming keyboard with no number pad (I never use the keypad on 101-key boards anyway - I always type numbers on the number row), which uses Cherry MX Red switches. Not quite as clicky as the buckling spring, but very nice on my fingers compared to the pseudo-mechanical dome switches on the Maxim. So when working on different machines, I get different experiences.

For me, the key to healthy wrists and hands as a typist is to mix things up - use more than one type of keyboard. Don't get stuck in a rut.

Comment Re:quiet mechanical keyboard (Score 1) 452

This sounds a lot like the Corsair gaming keyboard with Cherry MX Red switches that I am using to type this.

I also have two Kinesis Maxim keyboards (an ergonomic split design with three height-adjustment levels), and one IBM keyboard with buckling-spring switches that I nabbed off an abandoned RS/6000.

I keep one of the Maxims at work and the rest at home. I like to move from one to the next periodically, because I have found that mixing up different types of keyboards (while avoiding rubber dome ones where possible) is better for my carpal tunnel than sticking with any one type.

My favorites, though, are the IBM buckling spring and the Corsair Cherry switches - very nice action, and I can type much faster and more accurately on them than on the Maxim or other rubber dome keyboards.

Comment Re:The solution being totally obvious .. (Score 1) 43

Bill Gates, Steve Jobs, and Steve Wozniak were part of the Digital Revolution where they wanted to decentralize data and put computers in the hands of the people.

Now it looks like we need a backlash.

No, the solution isn't centralization of our data systems. You can already see where that leads from today's high-profile breaches (Sony, Target, et al.). It is a fallacy to assume corporations have all the answers or will act in the general public's best interests; short-term profit is the only thing that has any meaning in that system.

By the same token, we can't keep going along as we are - that has already proven to fail.

The very thing that makes the internet useful for communication and commerce across large populations spread all over the globe is also at the core of its weakness: public key encryption. To be more specific, computers are designed to be deterministic rather than random, and the systems we've devised to get around this have limits that can be exploited. Paired with encryption, those limits open up potential exposure, and advances in computing technology make the exploits easier to use. For certain short-term transactions this level of exposure may be an acceptable risk - for data that is transient in nature and not useful to anyone at a future point in time. However, much of the data we trust to encryption could well be useful to a third party in the future.
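As one concrete instance of the randomness problem (my example, not anything from the article): Python ships both a general-purpose PRNG and a cryptographic one, and using the former for key material is exactly the kind of exploitable limit I mean.

    import random
    import secrets

    # The default `random` module is a Mersenne Twister: fully deterministic,
    # and its internal state can be reconstructed from observed outputs.
    # It must never be used to produce key material.
    rng = random.Random(1234)
    print(rng.getrandbits(128))   # anyone who learns the seed gets the same value

    # `secrets` draws from the operating system's CSPRNG (os.urandom),
    # which is what keys should come from.
    key = secrets.token_bytes(16)  # 128-bit key
    print(key.hex())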

We could ensure our systems (personal or corporate - doesn't matter) are completely secure from a remote attacker by placing them inside a Faraday cage and disconnecting them from the internet. While the data would be secure, it wouldn't be very useful in the broader context of communication and commerce. For some types of information that might be an appropriate approach, and I imagine it is what some sensitive government networks opt for with their classified systems. For all other systems it would be about as useful as throwing them into the deepest part of the Pacific Ocean - secure, but useless.

To communicate on the wider stage, then, we must accept a certain amount of risk. I think we are all in agreement that the current risks are unacceptable. I also think there is no single magic bullet. Instead, I think you will see teams focus on the following areas, assuming corporate interests are not overly impacted by the potential solutions:

Tools - tools need to be devised that don't let neophyte application programmers shoot themselves in the foot (a sketch of what I mean follows this list).

Training - training has to be developed around new approaches and made widely available.

Willpower - everyone, from corporations down to individual developers, must have the willpower to do things that may be hard at first (e.g. code reviews of all code, including libraries, and refactoring or rewriting it in light of security issues) until they become habit.
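To make the tools point concrete, here is a hypothetical sketch of mine (the `find_user` functions and the `users` table are invented for illustration): an interface that accepts only parameterized queries removes the SQL-injection foot-gun entirely, instead of relying on the programmer to remember it.

    import sqlite3

    def find_user(conn: sqlite3.Connection, name: str):
        # Safe by construction: the driver binds `name` as data, so input
        # like "x' OR '1'='1" cannot alter the structure of the query.
        return conn.execute(
            "SELECT id, name FROM users WHERE name = ?", (name,)
        ).fetchall()

    def find_user_unsafe(conn: sqlite3.Connection, name: str):
        # The foot-gun a better tool would refuse outright: string
        # interpolation splices attacker-controlled text into the SQL itself.
        return conn.execute(
            f"SELECT id, name FROM users WHERE name = '{name}'"
        ).fetchall()

A tool aimed at neophytes would simply not expose the second form; if the unsafe path doesn't exist, it can't be taken on a bad day.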

Whatever the outcome, there will be no silver bullet.

Comment This brings into question our theories... (Score 1) 157

The big bang theory is just that: a theory. It is not yet proven indisputably as a law of nature.

New ideas and observations, such as this article on new equations and this article on the lack of expected gravitational waves, put the theory to the test. Furthermore, the Pope declaring the big bang theory 'right' only increases the need to check our models and assumptions on this subject (and now that I think about it, wouldn't the church have a vested interest in a non-permanent universe, to mesh with end-times dogma?).

At least until we get some indisputable evidence, we need to continue to question our theories, record our observations - and try to see where the puzzle pieces fit. Being a dogmatic scientist is worse than being ignorant - the scientist should know better.

"Survey says..." -- Richard Dawson, weenie, on "Family Feud"