Oracle Exec Strikes Out At 'Patch' Mentality
An anonymous reader writes "C|Net has an article up discussing comments by Oracle's Chief Security Officer railing against the culture of patching that exists in the software industry." From the article: "Things are so bad in the software business that it has become 'a national security issue,' with regulation of the industry currently on the agenda, she said. 'I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."
Of course (Score:5, Insightful)
The fellowship - ring of corruption (Score:1, Insightful)
Well, obviously.... (Score:4, Insightful)
One of the examples in the article asks, "What if civil engineers built bridges the way developers write code?" and answers, "What would happen is that you would get the blue bridge of death appearing on your highway in the morning." The difference here, however, is that civil engineers couldn't get away with making rickety bridges. There would be public outcry if a bridge broke while people were on it. In the software world, however, consumers scream, the companies just fix it with a patch, and that shuts them up. It saves companies a lot of money and time in testing.
Re:Well, obviously.... (Score:4, Insightful)
When you build software, you just attach a EULA that says "I shall not be held liable" and that's it.
Once software makers, especially the large commercial companies, find themselves in the same boat as other industries and have to pay compensation when bad stuff is released, they will certainly step up quality control to the next level. Because it saves them money.
Wow... is this what the software industry needs? (Score:5, Insightful)
She claimed that the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."
It seems to me that the F/OSS industry has shown that fast and effective patches can be applied, and that software we pay for has less than reasonable responses to such threats. I use F/OSS and I'm quite happy with the response they have to software problems. I don't expect it to be of NASA quality, just to be good, and it is. For the amount that you have to pay for Oracle et al., you expect fast responses on problems. The problem is that they don't respond fast enough. There is NO bullet-proof software, though I give a hat nod to the guys that wrote the code for the Mars rovers. Certainly, Oracle isn't showing that they deserve the price they demand, at least not in this respect.
I might be off topic, but all the F/OSS that I use delivers what I pay for AND MORE. The software that I have to pay for is lacking. When you pay thousands of dollars, you expect patches in a timely manner, and before you get hacked.

I think this is a big reason that F/OSS will continue to win hearts and minds across the world. Despite the financial differences, F/OSS actually cares, or seems to, and they do fix things as soon as they find out, or so it seems to me. They have a reputation to uphold. Without it, they will just wither and die.

It amazes me that investors, stockholders, and customers are willing to wait for the next over-hyped release of MS Windows while they suffer the "stones and arrows" of the current version. It appears that no matter how bad commercial software is, people rely on it. Yes, of course there is more to the equation than this simple comparison, but I think this is important. If you weigh what you get against what you pay, F/OSS is a good value. The argument is old and worn, but ROI is a big deal, and patches make a difference to ROI.
Is it really what the software industry needs? A set of rules to make things bullet-proof... which of course won't ever happen. That kind of mindset is totally wrong; even though the sentiment is in the right place, you can't regulate quality in this regard. Sure, you can make sure that all gasoline is of a given quality, but I don't trust the government to test and regulate software. The US government already has a dismal record of keeping their own house in order on this account. I don't want them telling me how to do anything or what I can and cannot sell, never mind what I can give away for free under the GPL.
How To Lie With Statistics (Score:5, Insightful)
Funnily enough, I'm just now reading Darrell Huff's book, "How To Lie With Statistics".
The problems with her poll are manifold.
Firstly, her group is composed of security officers who are on the CSO Council; might their views differ from security officers not on the Council? perhaps tending to be more of the belong-to-an-organised-body sort? might they perhaps therefore be predisposed towards regulation?
Secondly, of the officers on the Council, which ones did she ask? all of them? or did she tend to ask those she already knew would agree? perhaps those who find it rather boring and aren't quite so pro-organised-bodies just don't turn up at the meetings.
Thirdly, what's her position in the organisation? if *she* asks the question, are people more likely to say "yes" than they would to another person?
Fourthly, are people inclined in this matter to say one thing and do another, anyway? e.g. if you do a survey asking how many people read trash tabloids and how many people read a decent newspaper, you find your survey telling you the decent newspaper should sell in the millions while the trash should sell in the thousands - and as we all know, it's the other way around!
Fifthly, even if the views of members of the CSO Council truly represent all security officers, and even if they were all polled, who is to say the view of high-level security officers is not inherently biased in the first place, for example, towards regulation?
So what, at best, can her poll tell you? well, at best, it can tell you that chief security officers who regularly turn up at meetings will say to a particular pollster, for whatever reason (and there could be widely differing reasons), that they think regulation is a good idea.
Well, I have to say, that doesn't tell us very much, and that's even assuming the best case for some of the issues, which is highly unrealistic.
Re:One problem (Score:2, Insightful)
what a moron (Score:1, Insightful)
Re:One problem (Score:3, Insightful)
The moment you say that security doesn't matter in one place, that becomes the ideal place for attacks to be focussed.
FatPhil
This, from Oracle? (Score:5, Insightful)
And they think security patches are a poor model?
Maybe that's why they put so little effort into them. Maybe some people think of patching as bridge maintenance, and they want to build the bridge perfectly every time? When they can't even get patches right when they have six months between them? Fat chance.
Honestly, out of the people in the software industry, even Microsoft do a better job, security-response-wise, than Oracle. And when you're behind Microsoft in that department, you've really got a problem.
They need to make a serious effort at security response and treat it like a real priority, not show-ponying about regulation when, if they were regulated, they would still be completely unable to respond, but would point to poorly-drafted regulation as "tying them up in red tape".
Re:Engineers vs mechanics (Score:5, Insightful)
I, for one, can only applaud her (Score:1, Insightful)
Lack of security, lack of taking responsibility, and the reliance on customers as beta testers really is a big problem in the software industry, and as she rightly notes, it's going to have some serious repercussions for this industry.
So, if you want to avoid these, get your act together.
Do something about the problem, but don't shoot the messenger!
Just Be Clear (Score:4, Insightful)
Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation somewhat require the regulators to in the end have access to source code?
Do you think major corporations are just going to hand over source code? Can you imagine the leaks?
Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?
Typical fear mongering (Score:4, Insightful)
However...
I don't like that she is using age-old classics for fear mongering. "National security" and the bridge analogy to be specific.
Bugs themselves are rarely the problem when we are talking about "national security". For some odd reason it seems that people have forgotten the importance of physically separating the public network from sensitive information and infrastructure. It's stupid to blame the tools if the user is an idiot (and in this case I mean those "chief security officers" who design these braindead infrastructures for corporate networks).
I don't understand how anyone in their right mind could suggest any kind of regulatory system for software quality. It's practically impossible to control, and what if there is some sort of accident caused by some regulated and "certified" product? Is this certification (or whatever) a free pass for the software provider? This would turn out to be the ultimate disclaimer for the software companies. Or, the other way around, the ultimate responsibility, which would lead to the point where there are no more software engineers because there is too much personal responsibility involved.
Besides, in my opinion, Davidson insults British people pretty badly when she describes them as "slightly disrespectful of authority, and just a touch of criminal behaviour." I think that's not a very professional comment.
Anyway, this is what I think about this whole article.
But it's different things (Score:5, Insightful)
1) Development cost will be a lot more. You are going to have to spend time doing some serious regression testing, basically testing every possible combination of states that can occur. May seem pointless, but it's gotta be done to guarantee real reliability.
2) Development time will be a lot more. Again, more time on the testing. None of this "Oh look, there's a new graphics card out, let's get something to support it in a month." Be ready to have years spent sometimes.
3) Hardware will be restricted. You are not going to be running this on any random hardware where something might be different and unexpected. You will run it only on hardware it's been extensively tested and certified for. You want new hardware? You take the time and money to retest everything.
4) Other software will be limited. Only apps fully tested with your app can run on the same system. Otherwise, there could be unexpected interactions. The system as a whole has to be tested and certified to work.
5) Slower performance. To ensure reliability, things need to be checked every step of the way. Slows things down.
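The exhaustive regression testing point 1 describes can be sketched like this (a toy example; the function and the tiny state space are mine, purely illustrative):

```python
# Toy sketch of exhaustive combinatorial testing: enumerate every
# combination of input states in a small domain and check invariants.
from itertools import product

def clamp(value: int, lo: int, hi: int) -> int:
    """The tiny 'system under test': force value into the range [lo, hi]."""
    return max(lo, min(hi, value))

# Walk every combination of states the small domain allows.
domain = range(-3, 4)
for value, lo, hi in product(domain, domain, domain):
    if lo > hi:
        continue  # skip ill-formed ranges
    result = clamp(value, lo, hi)
    assert lo <= result <= hi       # invariant: result always inside the range
    if lo <= value <= hi:
        assert result == value      # invariant: in-range inputs pass through
```

Even this trivial function needs hundreds of combinations; for a real system the state space explodes combinatorially, which is exactly why this level of testing costs so much time and money.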
If you aren't willing to take that, then don't bitch and demand rock-solid systems. I mean, such things DO exist. Take the phone switches, for example. These things don't crash, ever. They just work. Great, but they only do one thing, you use only certified hardware, they've had like one major upgrade (5ESS to 7R/E) in the last couple decades, and they cost millions. You can do the same basic type of stuff (on a small scale) with OSS PBX software and a desktop, but don't expect the same level of reliability.
The thing is, if your hypothetical bridge were software (and it's quite simple compared to software) people would expect to be able to put the same design anywhere and have it work, drive tanks over it and not have it collapse, have terrorists explode bombs under it and have it stay up and so on and have all that done on 1/10th of the normal budget.
Until we are willing to settle for some major compromises, we need to be prepared to accept patches as a fact of life. I mean hell, just settling on a defined hardware/software set would do a lot. Notice how infrequent it is to see major faults in console games. It happens, but not as often. Why? Well, because the hardware platform is known, and you are the only code running. Cuts down on problems immensely. However, take the same console code and port it to PC, and you start having unforeseen problems with the millions of configurations out there.
Me? I'll deal with some patches in return for having the software I want, on the hardware I want, in the way I want, for a price I can afford.
Shoddy Straw Man (at best) (Score:2, Insightful)
1. A straw man argument, and a poor one at that. It's not uncommon for civil engineering projects to require "patches": http://en.wikipedia.org/wiki/Big_dig#Reports_of_s
2. An obviously bad analogy; I'm sure the specifics will be discussed here ad infinitum.
Typical manipulation (Score:3, Insightful)
Somewhere amid the hype surrounding the issue, no one stops to ask "wait, this solution doesn't even prevent this problem".
Liability is one thing; regulation before manufacturing, another. Given how much success government institutions have with software patents, how could we trust our software's security to them?
First thing they'll do is "regulate" the existence of a backdoor for the police/CIA/FBI into everything that resembles software technology with access control.
Re:Of course (Score:5, Insightful)
I said nothing at all about open source fixing all problems, or fixing any problems for that matter.
If you've ever worked in an industry that's gone from being unregulated to being regulated, you'll know that one of the first things that happens is that the number of participants decreases as all those that can't afford the overhead of the regulations and of maintaining a compliance department (not the same as quality assurance; experts in the interpretation and application of the regulations) leave the field. One of the next things that happens is that the number of new suppliers entering the market plummets.
There are many disadvantages to being regulated (additional costs and potential damage to reputation if you conflict with the regulator), but the big advantage is a barrier to competitors entering your market.
That does NOT mean that regulation is a bad thing; that depends on the specifics. However, if a supplier is arguing for regulation of their market, then the chances are that they're doing so to cut down the competition. It's unlikely that they're asking for it because they can't control their own engineers and are hoping a regulator will do better.
If you've observed Oracle at all you'll have noticed that they are worried by competition from open source. It is likely that that's their target in this, though it could be other smaller competitors.
Re:Shoddy Straw Man (at best) (Score:2, Insightful)
It was opened, closed within two days, then patched.
British "Hackers" (Score:3, Insightful)
The ideal system (for the government) is one where we are all criminals.
Re:Just Be Clear (Score:4, Insightful)
Actually, it does. At least in my case, and in the case of the business I work for. The fact is, we have quite a few programmers on staff due to the realization that we KNOW we cannot trust anyone but ourselves to address the concerns of the company directly and diligently. We don't create our own word processors. We have no plans to write our own Photoshop clone. But for many apps that are critical for business flow, we either wrote it ourselves, or have a great deal of leverage over the development of the apps we use.
Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation somewhat require the regulators to in the end have access to source code?
OSS would have an inherent exemption. Regardless of where or how it is used, it's still 'hobby' coding. No pretense is made that it is a for-profit effort. However, if there are any OSS projects that are designed to be for-profit, then yeah, perhaps some level of consumer protection is in order. EULAs have questionable legal status as it is, but I think it's time we struck them down as invalid and forced 'professionals' to accept the blame for shoddy work. As for burdens on taxpayers? OMG. Are you serious? And as for regulators having access to source code? Probably not a bad idea! We've all heard of source code escrow. Perhaps it should ALL be that way.
Do you think major corporations are just going to hand over source code? Can you imagine the leaks?
Yeah, they would, as a continued cost of doing business. Many of the products we use in the physical world are easily duplicated, and most are. Unsurprisingly, there is more than one maker of clothing. More than one burger joint. More than one maker of plastic food storage containers. More than one maker of automobiles. In these cases, it's not the technology that differentiates the product. It's the QUALITY and the reputation of the business (and yeah, the price too) that factors into consumer choice.
But yeah, I see your point about leaks... it could result in software piracy, copyright violations and all sorts of nasty things that... hrm... hey wait a minute! They are ALREADY a problem! This wouldn't create the problem and I can't imagine it adding too much more fuel to it.
Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?
No, I don't want to see more government oversight. But I would like to see more consumer protection. Do you think the consumer doesn't need it? If not, then why not? If so, then how would you propose that consumers get that protection?
Look. There was a time before the FDA and various medical boards. To live life without them protecting the recipients would be rather unimaginable, wouldn't it? We don't want people driving on the streets without all manner of regulation: driver's licenses, safety inspections, liability insurance. We require that many of the products and services we use regularly have regulation to guarantee minimal quality standards, and some of them aren't as 'critical' as software. We don't allow EULAs and disclaimers to get in our way either. There's a cancer warning on every label for cigarettes. Doesn't stop people and governments from going after the tobacco industry. Why should software have such an exemption? Because it's PRESENTLY unregulated, as medical/dental practice once was? Because it's an unimaginable mess to clean up?
There are ways for government to be involved without being complete morons. How about people with PhDs in software development sitting on the board of regulation?
Coming from a company.... (Score:3, Insightful)
She forgot to say that if Oracle were to adopt truthfulness in adverts, avoid vaporware, and stop charging the cost of a FULL Salon to set up cardboard implants, the industry would be $159 billion richer and we would all have witnessed the Second Coming with the money...
Sheesh what a rant from a company that is responsible for the Vaporware strategy...
If only... (Score:2, Insightful)
Bridge of blue death (Score:5, Insightful)
Ah, but most bridges don't fall apart that easily. Well, no; most bridges are based on millennia-old technology. The more advanced designs are designed to very fine tolerances.
Take that "new" super-high bridge in France. It cannot support the weight of an ocean liner. It would collapse if you blew up one of the pillars, and a nuclear strike within a mile would cause it to fall apart. Hell, even a simple typhoon would do it.
Ah, but none of those things are likely to happen so the bridge wasn't designed for it.
That is the big difference between software and hardware. Even the simple thing of user-supplied data is different. In software you need to check and check again every bit of data to make sure the user hasn't supplied the wrong kind of data. Has the user put a gigabyte of data in a bool field?
In the real world this is kind of easier to check. I think you would notice if a truck, instead of being loaded with 10 tons, was loaded with 10,000 tons. A clue might be the way its axles are buried in the asphalt.
So the bridge designer only has to design for the entire road deck being filled with trucks filled with lead, and that is it. He can work with real-world limits. The French bridge was really tested like this. It withstood the test and is in theory designed to withstand 2x the load. That ain't much of a tolerance, but in the real world you can easily discount such a heavy load ever being put on the system. Someone driving up with an ocean liner on his trailer would draw attention.
Not so with software. I can put anything I want in this input form and the software better be designed for it. I am not constrained by real world limits.
That is what makes software engineering so difficult: you need to account for every possibility. If you checked a piece of data and wrote it to storage, then you need to check it again when you read it. This would be like a bridge engineer testing the steel, then having to check it every day to see if it hasn't turned into porridge by an act of god.
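The "check and check again" discipline this comment describes might look like the following toy sketch (the function, the token format, and the size limit are all mine, purely illustrative):

```python
# Toy sketch of input validation at a trust boundary: since no real-world
# limit constrains user-supplied data, the code must impose its own.

MAX_BOOL_BYTES = 16  # a boolean field should never need more than this

def parse_bool_field(raw: bytes) -> bool:
    """Parse a user-supplied boolean, rejecting oversized or malformed input."""
    # Size check first: never let "a gigabyte of data in a bool field" through.
    if len(raw) > MAX_BOOL_BYTES:
        raise ValueError("input too large for a boolean field")
    token = raw.strip().lower()
    if token == b"true":
        return True
    if token == b"false":
        return False
    raise ValueError(f"not a boolean: {raw!r}")
```

The point is that the size and format checks come before anything touches the data, and the same checks have to be repeated at every boundary the data crosses, just as the comment says.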
Oh, and one final note. A lot of software insecurity only happens under attack. Bridges don't exactly last long under attack either. Blowing one up is amazingly easy. Any army engineer can do it.
Re:Of course (Score:2, Insightful)
First of all it breaks all of their marketing bollocks
Second it is threatening their sales to customers in
It sure sounds like an Oracle problem to me. How the hell can they try to drag in a regulatory body, whose essential function would be to raise the barrier to market entry and protect and grow their market share?
Well, we *know* how they can try. No way in hell they will succeed.
Re:But it's different things (Score:3, Insightful)
So, what I'm trying to say is that computer science needs some fundamental theories, techniques, and tools applicable in real-life situations before software can be trusted by design. Till then, software engineering is just a craft, where testing, patching, and the like are needed to keep the system going.
There are fundamental theories that can prove to you that the software you are using does exactly what it should. You can prove your software correct.
The only problem is that it's too expensive, takes too much time, and no one wants to pay for that kind of costs.
You could start your own company, selling bug-free software, but you would have to compete with Oracle. They only need to _say_ they have no bugs; actually making it a reality would be prohibitively expensive. But you could prove me wrong, of course.
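As a toy illustration of what "proving your software correct" can mean, here is a minimal machine-checked proof in Lean 4 (the function and theorem are mine; real verified systems such as seL4 or CompCert apply the same idea at vastly larger scale, which is where the cost comes from):

```lean
-- A tiny "verified" function and a proof, checked by the compiler,
-- that it always returns exactly 2 * n. The file will not compile
-- unless the proof is complete.
def double (n : Nat) : Nat := n + n

theorem double_eq_two_mul (n : Nat) : double n = 2 * n := by
  unfold double
  omega
```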
Re:Well, obviously.... (Score:2, Insightful)
If an author writes something blatantly dangerous, whether intentional or not, perhaps they should be liable. Perhaps -- it depends on additional circumstances (I don't trust everything I read, so it would partly depend on the general perceived trustworthiness of the source). But regulation is overkill.
Re:Nope, sorry (Score:5, Insightful)
We all know you can build bridges out of spaghetti (surely you did it at engineering college?), so at a company like the one I work for, a college kid would be hired, lick the boss's arse, and mention that he could build a much better bridge with the tools and new practices he learned at his university, where he was taught the latest, cutting-edge technologies. Boss is impressed and asks for a prototype (not stupid, this boss: prototype first, then ship it. He's learned loads on the 'how to manage technical people' course he went on).
So, new kid builds the prototype and, yes, it can carry a model car across. Even scales up to carry the stress test of a model lorry! Boss is impressed: "why can't you old guys do that?" he says, thinking of the praise he'll get at the next board meeting.
So they set about scaling it up to suit their customers: larger bridge, industrial spaghetti, held together with glue and installed in the customer's city, across the river. Customer is really happy with their upgrade, and after testing it with a compact saloon they realise they can de-commission the old steel monstrosity. All's happy... until it rains. But that's OK, it's just new-bridge teething troubles; it just requires a patch with some waterproof paint and rubber sealant.
Until a lorry decides to cross... and it snaps. But again, just patch it up by reinforcing with some old-technology steel girders. Doesn't look so pretty and won't be as maintainable, but... what the hell, the project manager declares it a success, so the company is happy, and new customers are told that the company's flagship bridge uses only the latest cutting-edge technologies.
Unfortunately, in the real world, software is not as visible as a bridge, so new customers can only go with the marketing and sales waffle. Once they've bought it, it's too late.
Re:If software was built like bridges! (Score:3, Insightful)
This, coming from Oracle... whom Litchfield has been bashing non-stop for NOT patching holes for years:
http://search.securityfocus.com/swsearch?sbm=%2F&
Pretty easy (Score:5, Insightful)
The kind of regulation they want is more like "you're an evil irresponsible hacker and going to jail if you disclose bugs in someone else's product." Yes, it's security by obscurity. But that way Oracle can happily spew bullshit about being secure and unbreakable, and never have to fix any bugs.
Basically Oracle doesn't give a shit if Corporation X's database is riddled with bugs and exploits. They just don't want the PHB's at Corporation X to know about it.
If it also results in some entry barrier, all the better, but that's not the main goal.
Re:But it's different things (Score:4, Insightful)
Second, software is malleable. It grows over time, unlike the bridge, which is static. After the initial release you add to it, oftentimes in ways that were unintended by the original designers.
We do not have Brooklyn Bridge 2.0.
So yes, everything you mention would improve quality, but because of its malleable nature, software will always be different from things in the physical world.
Re:Well, obviously.... (Score:3, Insightful)
Of course people could look upon programmers as they look upon engineers: this is something that you need a good education and training for, and that you should not attempt as a naive bystander.
In reality, this is not happening. There have been times when unemployed people with some not-so-practical education were retrained as programmers in a couple of weeks. And we see development environments that push "trial and error" development.
In such an environment it is to be expected that bad quality software is developed.
It is a natural reaction to say "we will go under when we get stricter quality control requirements". Maybe some badly managed companies will go under. Too bad for them. But a company with good quality products will survive, and the customer will profit from that.
Re:hindsight is always 20/20 (Score:3, Insightful)
There are projects out there such as qmail and related projects written specifically to be secure and reliable code. If one mature project can walk that path, they all can. The word "childish" was used to describe my argument and so I respond suggesting that it is childish to not exercise care in the design of software.
Yes, hindsight is 20/20 but I think any programmer would be hard pressed to come up with truly original coding techniques. Hindsight being 20/20 should be the very reason all future code should be more secure and sturdy. It's not because even though there is hindsight, no one is learning from these mistakes.