Oracle Exec Strikes Out At 'Patch' Mentality

An anonymous reader writes "C|Net has an article up discussing comments by Oracle's Chief Security Officer railing against the culture of patching that exists in the software industry." From the article: "Things are so bad in the software business that it has become 'a national security issue,' with regulation of the industry currently on the agenda, she said. 'I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."
This discussion has been archived. No new comments can be posted.

  • Of course (Score:5, Insightful)

    by Anonymous Coward on Monday May 29, 2006 @04:46AM (#15423660)
    Oracle are (rightly or wrongly) worried about competition from Open Source. Regulation of the software industry would be a major benefit to them in this. Anyone who didn't meet the regulators' criteria couldn't compete.
  • by Anonymous Coward on Monday May 29, 2006 @04:47AM (#15423663)
    In other words, you should make your choice not on merit, but on a short list of products from an exclusive club. There is a ring of corruption to this.
  • by Mikachu ( 972457 ) <burke.jeremiahj@ ... m minus caffeine> on Monday May 29, 2006 @04:51AM (#15423675) Homepage
    Of course the "patch, patch, patch" business plan is bad for consumers. But in truth, most software companies don't care about consumers. They care about making money. As it happens, most people really don't care enough about the subject to make the companies change.

    One of the examples in the article asks, "What if civil engineers built bridges the way developers write code?" and answers, "What would happen is that you would get the blue bridge of death appearing on your highway in the morning." The difference here, however, is that civil engineers couldn't get away with making rickety bridges. There would be public outcry if one broke while people were on it. In the software world, however, consumers scream, the companies just fix it with a patch, and that shuts the consumers up. It saves companies a lot of money and time in testing.
  • by pe1chl ( 90186 ) on Monday May 29, 2006 @04:57AM (#15423688)
    Another difference is that when you build a bridge and it collapses, you will be held liable for it.
    When you build software, you just attach a EULA that says "I shall not be held liable" and that's it.

    Once software makers, especially the large commercial companies, find themselves in the same boat as other industries and have to pay compensation when bad stuff is released, they will certainly step up quality control to the next level. Because it saves them money.
  • by zappepcs ( 820751 ) on Monday May 29, 2006 @04:59AM (#15423695) Journal
    Wow, really nice slice on the British... FTFA:

    She claimed that the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."

    It seems to me that the F/OSS industry has shown that fast and effective patches can be applied, and that software we pay for has less than reasonable responses to such threats. I use F/OSS and I'm quite happy with the response they have to software problems. I don't expect it to be of NASA quality, just to be good, and it is. For the amount that you have to pay for Oracle et al, you expect fast responses on problems. The problem is that they don't respond fast enough. There is NO bullet-proof software, though I give a hat nod to the guys that wrote the code for the Mars rovers. Certainly, Oracle isn't showing that they deserve the price they demand, at least not in this respect.

    I might be off topic, but all the F/OSS that I use, delivers what I pay for AND MORE. The software that I have to pay for is lacking. When you pay thousands of dollars, you expect patches in a timely manner, and before you get hacked. I think this is a big reason that F/OSS will continue to win hearts and minds across the world. Despite the financial differences, F/OSS actually cares, or seems to, and they do fix things as soon as they find out, or so it seems to me. They have a reputation to uphold. Without it, they will just wither and die. It amazes me that investors, stock holders, and customers are willing to wait for the next over-hyped release of MS Windows while they suffer the "stones and arrows" of the current version. It appears that no matter how bad commercial software is, people rely on it. Yes, of course there is more to the equation than this simple comparison, but I think this is important. If you weigh what you get against what you pay, F/OSS is a good value. The argument is old, and worn, but ROI is a big deal, and patches make a difference to ROI.

    Is that really what the software industry needs? A set of rules to make things bullet-proof... which of course won't ever happen. That kind of mindset is totally wrong; even though the sentiment is in the right place, you can't regulate quality in this regard. Sure, you can make sure that all gasoline is of a given quality, but I don't trust the government to test and regulate software. The US government already has a dismal record of keeping their own house in order on this account. I don't want them telling me how to do anything or what I can and cannot sell, never mind what I can give away for free under the GPL.

  • by Toby The Economist ( 811138 ) on Monday May 29, 2006 @05:02AM (#15423704)
    "I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."

    Funnily enough, I'm just now reading Darrell Huff's book, "How To Lie With Statistics".

    The problems with her poll are manifold.

    Firstly, her group is composed of security officers who are on the CSO Council; might their views differ from security officers not on the Council? Perhaps they tend to be more of the belong-to-an-organised-body sort? Might they therefore be predisposed towards regulation?

    Secondly, of the officers on the Council, which ones did she ask? All of them? Or did she tend to ask those she already knows will agree? Perhaps those who found it rather boring and aren't quite so pro-organised-bodies just don't turn up at the meetings.

    Thirdly, what's her position in the organisation? If *she* asks the question, are people more likely to say "yes" than they would be to another person?

    Fourthly, are people inclined in this matter to say one thing and do another anyway? E.g. if you do a survey asking how many people read trash tabloids and how many read a decent newspaper, your survey will tell you the decent newspaper should sell in the millions while the trash should sell in the thousands - and as we all know, it's the other way around!

    Fifthly, even if the views of members of the CSO Council truly represent all security officers, and even if they were all polled, who is to say the view of high-level security officers is not inherently biased in the first place - for example, towards regulation?

    So what, at best, can her poll tell you? Well, at best, it can tell you that chief security officers who regularly turn up at meetings will say to a particular pollster - for whatever reason, and there could be widely differing reasons - that they think regulation is a good idea.

    Well, I have to say, that doesn't tell us very much, and that's even assuming the best case for some of the issues, which is highly unrealistic.
  • Re:One problem (Score:2, Insightful)

    by paskie ( 539112 ) <pasky.ucw@cz> on Monday May 29, 2006 @05:03AM (#15423707) Homepage
    Someone sends you an image and tricks you into opening it in Gimp (social engineering of that kind is not really very hard to do). Then, depending on the nature of the bug, he can install a backdoor calling out to him and asking for further commands, or whatever.
  • what a moron (Score:1, Insightful)

    by Anonymous Coward on Monday May 29, 2006 @05:04AM (#15423712)
  • Re:One problem (Score:3, Insightful)

    by fatphil ( 181876 ) on Monday May 29, 2006 @05:07AM (#15423719) Homepage
    If you've ever forwarded an image file to a friend who might forward it to other people, then you are a potential vector for malware. Sure it may look like a picture of a carrot that looks like Tom Hanks, but if it causes a buffer overrun that installs a rootkit, and one of the friends-of-a-friend wants to 'photoshop' out the logo in the corner, then someone's getting as 0wned as if they clicked "yes" after downloading an executable.

    The moment you say that security doesn't matter in one place, that becomes the ideal place for attacks to be focussed.
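
    To make that concrete, here's a deliberately naive C sketch (hypothetical code, not taken from any real image library) of the kind of unchecked copy that turns a "mere picture" into an exploit:

        #include <stdio.h>
        #include <string.h>

        /* Toy "image" format: the file itself declares how much pixel
         * data follows, i.e. the length field is attacker-controlled. */
        struct img_header {
            unsigned int claimed_size;
            unsigned char data[4096];
        };

        void load_image(FILE *f)
        {
            unsigned char pixels[256];
            struct img_header h;

            if (fread(&h, sizeof h, 1, f) != 1)
                return;

            /* The vulnerable version trusts the file:
             *     memcpy(pixels, h.data, h.claimed_size);
             * If claimed_size > 256, that smashes the stack - exactly
             * the buffer overrun described above. */

            /* The safe version clamps to the real buffer size: */
            size_t n = h.claimed_size;
            if (n > sizeof pixels)
                n = sizeof pixels;
            memcpy(pixels, h.data, n);
        }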

    FatPhil
  • This, from Oracle? (Score:5, Insightful)

    by Anonymous Coward on Monday May 29, 2006 @05:07AM (#15423720)
    Whose patches are infamously known to break stuff, released in 6-month batches (maybe just a mite too spaced out?), and so infamously poor at actually patching their bugs that they currently have an open, publicly known 0day with no patch, because they screwed up patching it last time and it's still open?

    And they think security patches are a poor model?

    Maybe that's why they put so little effort into them. Maybe that's because they put so little effort into them. Maybe some people think of it as bridge maintenance, and they want to build the bridge perfectly every time? When they can't even get patches right when they have six months between them? Fat chance.

    Honestly, out of the people in the software industry, even Microsoft do a better job, security-response-wise, than Oracle. And when you're behind Microsoft in that department, you've really got a problem.

    They need to make a serious effort at security response and treat it like a real priority, not show-ponying about regulation when, if they were regulated, they would still be completely unable to respond, but would point to poorly-drafted regulation as "tying them up in red tape".
  • by cyber-vandal ( 148830 ) on Monday May 29, 2006 @05:15AM (#15423739) Homepage
    As soon as the management starts to then so will I. Or did you think unrealistic deadlines and bad overall designs come from the grunts?
  • by Anonymous Coward on Monday May 29, 2006 @05:22AM (#15423748)
    I really don't get all the negative comments. I think it is high time that people really start to address this issue and I can only applaud her for doing it.

    Lack of security, lack of taking responsibility, and the reliance on customers as beta testers really are a big problem in the software industry, and as she rightly notes, it's going to have some serious repercussions for this industry.

    So, if you want to avoid these, get your act together.
    Do something about the problem, but don't shoot the messenger!
  • Just Be Clear (Score:4, Insightful)

    by Enderandrew ( 866215 ) <enderandrew&gmail,com> on Monday May 29, 2006 @05:25AM (#15423753) Homepage Journal
    Often, when consumers are given the choice they prefer to have software sooner, even in a beta state. We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software.

    Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation ultimately require the regulators to have access to source code?

    Do you think major corporations are just going to hand over source code? Can you imagine the leaks?

    Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?
  • by Masa ( 74401 ) on Monday May 29, 2006 @05:33AM (#15423769) Journal
    Well, patches are not nice, and of course it would be better for customers if the product were perfect from the start. It's true that most software products are buggier than they were, for example, fifteen years ago. On the other hand, there are several reasons for the (lack of) quality of modern computer software: tight deadlines, investors, competition, to name a few. And of course it's always possible to cast some blame on the software engineer.

    However...

    I don't like that she is using age-old classics for fear mongering. "National security" and the bridge analogy to be specific.

    Bugs themselves are rarely the problem when we are talking about "national security". For some odd reason it seems that people have forgotten the importance of physical separation between the public network and sensitive information / infrastructure. It's stupid to blame the tools if the user is an idiot (and in this case I mean those "chief security officers" who design these braindead infrastructures for corporate networks).

    I don't understand how anyone in their right mind could suggest any kind of regulatory system for software quality. It's practically impossible to control, and what if there is some sort of accident caused by some regulated and "certified" product? Is this certification (or whatever) a free pass for the software provider? It would turn out to be the ultimate disclaimer for software companies. Or - the other way around - the ultimate responsibility, which would lead to the point where there are no more software engineers because there is too much personal responsibility involved.

    Besides, in my opinion, Davidson insults British people pretty badly when she describes them as "slightly disrespectful of authority, and just a touch of criminal behaviour." I think that's not a very professional comment.

    Anyway, this is what I think about this whole article.
  • by Sycraft-fu ( 314770 ) on Monday May 29, 2006 @05:35AM (#15423774)
    The difference is that software is expected to be cheap, released fast, and to run on all kinds of platforms. Sorry, that leads to errors. You can have software that never needs patching, you just have to make some concessions:

    1) Development cost will be a lot more. You are going to have to spend time doing some serious regression testing, basically testing every possible combination of states that can occur (a toy example of what that means follows this list). May seem pointless, but it's gotta be done to guarantee real reliability.

    2) Development time will be a lot more. Again, more time on the testing. None of this "Oh look, there's a new graphics card out, let's get something to support it in a month." Be ready for it to sometimes take years.

    3) Hardware will be restricted. You are not going to be running this on any random hardware where something might be different and unexpected. You will run it only on hardware it's been extensively tested and certified for. You want new hardware? You take the time and money to retest everything.

    4) Other software will be limited. Only apps fully tested with your app can run on the same system. Otherwise, there could be unexpected interactions. The system as a whole has to be tested and certified to work.

    5) Slower performance. To ensure reliability, things need to be checked every step of the way. Slows things down.
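
    As a toy illustration of point 1 (hypothetical C, nothing to do with any real product): for a function of two 8-bit inputs, "every possible combination of states" is only 65,536 cases, so you really can test them all. Add a single 32-bit input and the same approach needs billions of runs, which is why this kind of exhaustive testing gets expensive so fast.

        #include <assert.h>
        #include <stdint.h>

        /* An "obviously correct" reference and a clever optimized
         * version of the same function (truncating average). */
        static uint8_t avg_ref(uint8_t a, uint8_t b)
        {
            return (uint8_t)(((unsigned)a + b) / 2);
        }

        static uint8_t avg_fast(uint8_t a, uint8_t b)
        {
            return (uint8_t)((a & b) + ((a ^ b) >> 1));
        }

        int main(void)
        {
            /* Exhaustive check: the entire input space, no sampling. */
            for (unsigned a = 0; a < 256; a++)
                for (unsigned b = 0; b < 256; b++)
                    assert(avg_fast((uint8_t)a, (uint8_t)b) ==
                           avg_ref((uint8_t)a, (uint8_t)b));
            return 0;
        }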

    If you aren't willing to take that, then don't bitch and demand rock-solid systems. I mean, such things DO exist. Take the phone switches for example. These things don't crash, ever. They just work. Great, but they only do one thing, you use only certified hardware, they've had like one major upgrade (5ESS to 7R/E) in the last couple decades, and they cost millions. You can do the same basic type of stuff (on a small scale) with OSS PBX software and a desktop, but don't expect the same level of reliability.

    The thing is, if your hypothetical bridge were software (and it's quite simple compared to software), people would expect to be able to put the same design anywhere and have it work, drive tanks over it and not have it collapse, have terrorists explode bombs under it and have it stay up, and so on - and have all that done on 1/10th of the normal budget.

    Until we are willing to settle for some major compromises, we need to be prepared to accept patches as a fact of life. I mean hell, just settling on a defined hardware/software set would do a lot. Notice how infrequent it is to see major faults in console games. It happens, but not as often. Why? Well, because the hardware platform is known, and you are the only code running. Cuts down on problems immensely. However, take the same console code and port it to PC, and you start having unforeseen problems with the millions of configurations out there.

    Me? I'll deal with some patches in return for having the software I want, on the hardware I want, in the way I want, for a price I can afford.
  • by charlievarrick ( 573720 ) on Monday May 29, 2006 @05:36AM (#15423776)
    The whole bridge::software analogy is:

    1. A straw man argument, and a poor one at that. It's not uncommon for civil engineering projects to require "patches": http://en.wikipedia.org/wiki/Big_dig#Reports_of_substandard_work_and_criminal_misconduct [wikipedia.org]

    2. An obviously bad analogy; I'm sure the specifics will be discussed here ad infinitum.
  • by suv4x4 ( 956391 ) on Monday May 29, 2006 @05:48AM (#15423804)
    That's a typical manipulation move: announce a problem we all know exists, ask "why doesn't a solution X exist that solves it?" and then push for solution X to happen.

    Somewhere in between the hype surrounding the issue, no one stops to ask themselves "wait, this solution doesn't even prevent this problem".

    Liability is one thing; regulation before manufacturing is another. Given how much success government institutions have had with software patents, how could we trust our software's security to them?

    First thing they'll do is "regulate" the existence of a backdoor for the police/CIA/FBI into everything that resembles software technology with access control.
  • Re:Of course (Score:5, Insightful)

    by Anonymous Coward on Monday May 29, 2006 @06:14AM (#15423857)
    "Open Source has nothing to do with this and I would suggest that you actually do some research instead of parroting the usual 'Open Source will fix all problems' mantra."

    I said nothing at all about open source fixing all problems, or fixing any problems for that matter.

    If you've ever worked in an industry that's gone from being unregulated to being regulated, you'll know that one of the first things that happens is that the number of participants decreases as all those that can't afford the overhead of the regulations and of maintaining a compliance department (not the same as quality assurance; experts in the interpretation and application of the regulations) leave the field. One of the next things that happens is that the number of new suppliers entering the market plummets.

    There are many disadvantages to being regulated - additional costs, and potential damage to reputation if you conflict with the regulator - but the big advantage is a barrier to competitors entering your market.

    That does NOT mean that regulation is a bad thing - that depends on the specifics. However, if a supplier is arguing for regulation of their own market, then the chances are that they're doing so to cut down the competition. It's unlikely that they're asking for it because they can't control their own engineers and are hoping a regulator will do better.

    If you've observed Oracle at all you'll have noticed that they are worried by competition from open source. It is likely that that's their target in this, though it could be other smaller competitors.
  • by OP_Boot ( 714046 ) on Monday May 29, 2006 @06:20AM (#15423865)
    Does no-one remember the Millennium Bridge across the Thames? http://www.urban75.org/london/millennium.html [urban75.org]
    It was opened, closed within two days, then patched.
  • British "Hackers" (Score:3, Insightful)

    by smoker2 ( 750216 ) on Monday May 29, 2006 @06:21AM (#15423866) Homepage Journal
    Speaking as a Briton -

    the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."
    should read -
    the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, disrespectful of authority, and are not averse to criminal behavior."
    BTW, I see the use of the word "hacking" as a good thing, versus "cracking". Also, "criminal behaviour" is an ever-changing variable, defined by clueless bureaucrats. I break the law every time I play a DVD or MP3 on my Linux system.

    The ideal system (for the government) is one where we are all criminals.

  • Re:Just Be Clear (Score:4, Insightful)

    by erroneus ( 253617 ) on Monday May 29, 2006 @06:24AM (#15423870) Homepage
    "Often, when consumers are given the choice they prefer to have software sooner, even in a beta state. We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software."

    Actually, it does. At least in my case, and in the case of the business I work for. The fact is, we have quite a few programmers on staff due to the realization that we KNOW we cannot trust anyone but ourselves to address the concerns of the company directly and diligently. We don't create our own word processors. We have no plans to write our own Photoshop clone. But for many apps that are critical for business flow, we either wrote it ourselves, or have a great deal of leverage over the development of the apps we use.

    "Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation ultimately require the regulators to have access to source code?"

    OSS would have an inherent exemption. Regardless of where or how it is used, it's still 'hobby' coding. No pretense is made that it is a for-profit effort. However, if there are any OSS projects that are designed to be for-profit, then yeah, perhaps some level of consumer protection is in order. EULAs have questionable legal status as it is, but I think it's time we struck them down as invalid and forced 'professionals' to accept the blame for shoddy work. As for burdens on taxpayers? OMG. Are you serious? And as for regulators having access to source code? Probably not a bad idea! We've all heard of source code escrow. Perhaps it should ALL be that way.

    "Do you think major corporations are just going to hand over source code? Can you imagine the leaks?"

    Yeah, they would, as a continued cost of doing business. Many of the products we use in the physical world are easily duplicated, and most are. Unsurprisingly, there is more than one maker of clothing. More than one burger joint. More than one maker of plastic food storage containers. More than one maker of automobiles. In these cases, it's not the technology that differentiates the product. It's the QUALITY and the reputation of the business (and yeah, the price too) that factors into consumer choice.

    But yeah, I see your point about leaks... it could result in software piracy, copyright violations and all sorts of nasty things that... hrm... hey wait a minute! They are ALREADY a problem! This wouldn't create the problem and I can't imagine it adding too much more fuel to it.

    "Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?"

    No, I don't want to see more government oversight. But I would like to see more consumer protection. Do you think the consumer doesn't need it? If not, then why not? If so, then how would you propose that consumers get that protection?

    Look. There was a time before the FDA and various medical boards. To live life without them protecting the recipients would be rather unimaginable, wouldn't it? We don't want people driving on the streets without all manner of regulation... driver's licenses, safety inspections, liability insurance. We require that many of the products and services we use regularly have regulation to guarantee minimal quality standards, and some of them aren't as 'critical' as software. We don't allow EULAs and disclaimers to get in our way either. There's a cancer warning on every label for cigarettes. Doesn't stop people and governments from going after the tobacco industry. Why should software have such an exemption? Because it's PRESENTLY unregulated, as medical/dental practice once was? Because it's an unimaginable mess to clean up?

    There are ways for government to be involved without being complete morons. How about people with PhDs in software development sitting on the board of regulation?
  • by freedom_india ( 780002 ) on Monday May 29, 2006 @06:29AM (#15423882) Homepage Journal
    Coming from a company that for years has perfected the art of vaporware, and charges the cost of a battleship to build a kid's 2-ft-long boat.

    She forgot to say that if Oracle were to adopt truthfulness in adverts, avoid vaporware, and stop charging the cost of a FULL Salon to set up cardboard emplants, the industry would be $159 billion richer and we would all have witnessed the Second Coming with the money...

    Sheesh, what a rant from a company that is responsible for the Vaporware strategy...

  • If only... (Score:2, Insightful)

    by Jimboscott ( 845567 ) on Monday May 29, 2006 @06:39AM (#15423901)
    "What if civil engineers built bridges the way developers write code?" she asked. If only all IT projects where well defined as briges plans...
  • by SmallFurryCreature ( 593017 ) on Monday May 29, 2006 @08:32AM (#15424147) Journal
    You mean like that double-decker highway that collapsed during an LA earthquake? Or maybe that one that fell apart in a stiff wind?

    Ah, but most bridges don't fall apart that easily. Well no, most bridges are based on millennia-old technology. The more advanced designs are engineered to very fine tolerances.

    Take that "new" super-high bridge in France. It cannot support the weight of an ocean liner. It would collapse if you blew up one of the pillars, and a nuclear strike within a mile would cause it to fall apart. Hell, even a simple typhoon would do it.

    Ah, but none of those things are likely to happen so the bridge wasn't designed for it.

    That is the big difference between software and hardware. Even the simple matter of user-supplied data is different. In software you need to check, and check again, every bit of data to make sure the user hasn't supplied the wrong kind of data. Hasn't the user put 1 gigabyte of data in a bool field?

    In the real world this is kind of easier to check. I think you would notice if a truck, instead of being loaded with 10 tons, was loaded with 10,000 tons. A clue might be the way its axles are buried in the asphalt.

    So the bridge designer only has to design for the entire road deck being filled with trucks filled with lead, and that is it. He can work with real-world limits. The French bridge was really tested like this. It withstood the test and is in theory designed to withstand 2x the load. That ain't much of a tolerance, but in the real world you can easily discount such a heavy load ever being put on the system. Someone driving up with an ocean liner on his trailer would draw attention.

    Not so with software. I can put anything I want in this input form and the software better be designed for it. I am not constrained by real world limits.

    That is what makes software engineering so difficult: you need to account for every possibility. If you checked a piece of data and wrote it to storage, then you need to check it again when you read it. This would be like a bridge engineer testing the steel, then having to check it every day to see if it hasn't turned into porridge by an act of god.
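
    A sketch of what that discipline looks like in practice (hypothetical C, illustrating the principle rather than any particular product): validate at the boundary, then validate again on the way back out of storage, because storage is just another input you can't trust.

        #include <stdbool.h>
        #include <string.h>

        #define NAME_LEN 64   /* max characters, excluding the NUL */

        /* One validator, applied at every boundary crossing. */
        static bool valid_name(const char *s)
        {
            size_t n = strnlen(s, NAME_LEN + 1);
            return n > 0 && n <= NAME_LEN;
        }

        /* Check on the way in... */
        bool save_name(char store[NAME_LEN + 1], const char *user_input)
        {
            if (!valid_name(user_input))
                return false;
            strcpy(store, user_input);   /* safe: length just verified */
            return true;
        }

        /* ...and check AGAIN on the way out: the stored copy may have
         * been corrupted or tampered with since it was written. */
        bool load_name(char out[NAME_LEN + 1], const char *store)
        {
            if (!valid_name(store))
                return false;
            strcpy(out, store);
            return true;
        }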

    Oh, and one final note. A lot of software insecurity only happens under attack. Bridges don't exactly last long under attack. Blowing one up is amazingly easy. Any army engineer can do it.

  • Re:Of course (Score:2, Insightful)

    by Bing Tsher E ( 943915 ) on Monday May 29, 2006 @08:55AM (#15424203) Journal

    First of all it breaks all of their marketing bollocks

    Second it is threatening their sales to customers in



    It sure sounds like an Oracle problem to me. How the hell can they try to drag in a regulatory body, whose essential function would be to raise the barrier to market entry and protect and grow their market share?

    Well, we *know* how they can try. No way in hell they will succeed.
  • by orasio ( 188021 ) on Monday May 29, 2006 @09:08AM (#15424248) Homepage

    "So, what I'm trying to say is that computer science needs some fundamental theories, techniques and tools applicable in real-life situations before software can be trusted by design. Till then, software engineering is just a craft, where testing, patching and the like are needed to keep the system going."


    There are fundamental theories that can prove to you that the software you are using does exactly what it should. You can prove your software right.

    The only problem is that it's too expensive, takes too much time, and no one wants to pay those kinds of costs.

    You could start your own company selling bug-free software, but you would have to compete with Oracle. They only need to _say_ they have no bugs; actually making it a reality would be prohibitively expensive. But you could prove me wrong, of course.
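
    For the curious, here is roughly what "proving your software right" looks like at toy scale, in a modern proof assistant (Lean 4 here; the function and theorem are made up for illustration). Scaling this up to a real database is exactly the cost problem described above.

        -- A max function and a machine-checked guarantee about it.
        def myMax (a b : Nat) : Nat := if a ≤ b then b else a

        -- The checker rejects the file unless this proof is airtight.
        theorem myMax_ge_left (a b : Nat) : a ≤ myMax a b := by
          unfold myMax
          split
          · assumption            -- case a ≤ b : the goal is a ≤ b
          · exact Nat.le_refl a   -- case ¬(a ≤ b) : the goal is a ≤ a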

  • by zacronos ( 937891 ) on Monday May 29, 2006 @09:34AM (#15424319)
    You are making things much more extreme than they need to be. Would the failure of *most* software applications cause you to get hurt? No. Would the failure of a small portion of software potentially cause you injury? Yes. A very small portion. Similarly, although there are some books whose misinformation could cause you physical harm, most software is more akin to a recipe in a book. What happens if it goes wrong? I waste some time and ingredients. Maybe the smoke alarms wake up my neighbors. And that's it. Do I think the entire non-fiction book industry should be regulated to cover those books that might cause you harm? No.

    If an author writes something blatantly dangerous, whether intentional or not, perhaps they should be liable. Perhaps -- it depends on additional circumstances (I don't trust everything I read, so it would partly depend on the general perceived trustworthiness of the source). But regulation is overkill.
  • Re:Nope, sorry (Score:5, Insightful)

    by gbjbaanb ( 229885 ) on Monday May 29, 2006 @09:35AM (#15424320)
    I prefer this scenario, if bridges were like software.

    We all know you can build bridges out of spaghetti (surely you did it at engineering college?), so at a company like the one I work for, a college kid would be hired, lick the boss's arse, and mention that he could build a much better bridge with the tools and new practices he learned at his university, where he was taught the latest, cutting-edge technologies. Boss is impressed and asks for a prototype (not stupid, this boss: prototype first, then ship it. He's learned loads on the 'how to manage technical people' course he went on).

    So, new kid builds the prototype and, yes, it can carry a model car across. It even scales up to carry the stress test of a model lorry! Boss is impressed - "Why can't you old guys do that?" he says, thinking of the praise he'll get at the next board meeting.

    So they set about scaling it up to suit their customers: larger bridge, industrial spaghetti, held together with glue and installed in the customer's city, across the river. Customer is really happy with their upgrade, and after testing it with a compact saloon they realise they can decommission the old steel monstrosity. All's happy... until it rains. But that's OK, it's just new-bridge teething troubles; it just requires a patch with some waterproof paint and rubber sealant.

    Until a lorry decides to cross... and it snaps. But again, it's just patched up by reinforcing with some old-technology steel girders. It doesn't look so pretty and won't be as maintainable, but... what the hell, the project manager declares it a success, so the company is happy, and new customers are told that the company's flagship bridge uses only the latest cutting-edge technologies.

    Unfortunately, in the real world, software is not as visible as a bridge, so new customers can only go by the marketing and sales waffle. Once they've bought it, it's too late.
  • by SmurfButcher Bob ( 313810 ) on Monday May 29, 2006 @09:53AM (#15424382) Journal
    The other irony -

    This, coming from Oracle... whom Litchfield has been bashing non-stop for years, for NOT patching holes -

    http://search.securityfocus.com/swsearch?sbm=%2F&metaname=alldoc&query=litchfield+oracle&x=26&y=7 [securityfocus.com]

  • Pretty easy (Score:5, Insightful)

    by Moraelin ( 679338 ) on Monday May 29, 2006 @10:12AM (#15424447) Journal
    Given what Oracle's problem _is_, probably what they _really_ want isn't regulation of the "you must prove that your software passes this and that criteria to be allowed to sell it." (Which would also raise entry barriers for competitors.) I mean, really, if you were a company which takes five fucking _years_ to bother patching a security hole, and even then only when an exploit was widely publicized, you're not going to ask for a regulation that'll ask you to pull the product off the market until you fix it.

    The kind of regulation they want is more like "you're an evil irresponsible hacker and going to jail if you disclose bugs in someone else's product." Yes, it's security by obscurity. But that way Oracle can happily spew bullshit about being secure and unbreakable, and never have to fix any bugs.

    Basically Oracle doesn't give a shit if Corporation X's database is riddled with bugs and exploits. They just don't want the PHB's at Corporation X to know about it.

    If it also results in some entry barrier, all the better, but that's not the main goal.
  • by spirality ( 188417 ) on Monday May 29, 2006 @02:05PM (#15425278) Homepage
    You make lots of good points. However, software is generally not written from scratch. That is, more people are maintaining existing systems than writing new ones.

    Second, software is malleable. It grows over time, unlike the bridge, which is static. After the initial release you add to it, oftentimes in ways that were unintended by the original designers.

    We do not have Brooklyn Bridge 2.0.

    So yes, everything you mention would improve quality, but because of its malleable nature, software will always be different from things in the physical world.
  • by pe1chl ( 90186 ) on Monday May 29, 2006 @02:07PM (#15425284)
    Of course software can be treated as a science, with mathematical roots and stable foundations.
    Of course people could look upon programmers as they look upon engineers: this is something that you need a good education and training for, and that you should not attempt as a naive bystander.

    In reality, this is not happening. There have been times when unemployed people with some not-so-practical education were retrained as programmers in a couple of weeks. And we see development environments that push "trial and error" development.
    In such an environment it is to be expected that bad quality software is developed.

    It is a natural reaction to say "we will go under when we get stricter quality control requirements". Maybe some badly managed companies will go under. Too bad for them. But a company with good quality products will survive, and the customer will profit from that.
  • by erroneus ( 253617 ) on Monday May 29, 2006 @03:10PM (#15425494) Homepage
    Most coding errors, according to the observations of many, are either typographical or the result of ill-advised programming practices. This is especially true where things like stack overflows are concerned, which are exploited in the vast majority of vulnerabilities.
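
    As a hypothetical illustration of how small a "typographical" error can be, the entire difference between safe and exploitable in the C below is one character in a loop condition:

        #define N 16

        void copy_vals(const int *src, int *out)
        {
            int buf[N];

            /* The typo version:
             *     for (int i = 0; i <= N; i++)   // '<=' writes buf[N]:
             *         buf[i] = src[i];           // off-by-one stack overrun
             */

            for (int i = 0; i < N; i++)   /* '<' stays inside the buffer */
                buf[i] = src[i];

            for (int i = 0; i < N; i++)
                out[i] = buf[i];
        }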

    There are projects out there, such as qmail and related projects, written specifically to be secure and reliable code. If one mature project can walk that path, they all can. The word "childish" was used to describe my argument, and so I respond by suggesting that it is childish not to exercise care in the design of software.

    Yes, hindsight is 20/20, but I think any programmer would be hard pressed to come up with truly original coding techniques. Hindsight being 20/20 should be the very reason all future code is more secure and sturdy. It's not, because even though there is hindsight, no one is learning from these mistakes.
