Comment Re:A small vat of organic liquid? (Score 1) 98
we already have those - look at all the 'bots posting on here for starters...
The rationale was probably to rack up charges: if the machine was not in use at the time, they might as well use it.
CT _scanners_, of course, are pretty expensive; it would be interesting to know how they decontaminated it, if they did.
If the numerous people reviewing Bash - from multiple companies and disciplines - didn't find the issue with the first patch, then how would Microsoft, with a far more limited set of people looking at the code, be able to get the same kind of patch correct the first time, with all the corner cases figured out and fixed, before releasing the first patch?
Because they have a "far more limited" team full of security specialists. Some (maybe all) of the later bugs were found using standard fuzzing tools, which should have been part of the test process the first time, as soon as the parser was found to be broken once on non-standard input. In fact it should have been noticed, whilst still under embargo, that the whole idea of parsing code out of untrusted input was a security hole that would need to be patched (as it eventually was); even non-security experts with some idea about security could have predicted that (as I did - http://slashdot.org/comments.p...)
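"Standard fuzzing tools" here need not mean anything exotic. A crude sketch of the idea - purely illustrative, an assumption about approach rather than a claim about what any auditor actually ran - is just throwing random garbage at the function-import parser and watching for crashes:

```shell
# Crude parser-fuzzing sketch (illustrative only; real audits would use
# dedicated fuzzers). Feed random bytes into a function-looking env var
# and see whether a child bash dies from a signal while parsing it.
for i in $(seq 1 50); do
  fuzz="() { $(head -c 8 /dev/urandom | base64); };"
  env f="$fuzz" bash -c ':' 2>/dev/null
  status=$?
  # Exit status >= 128 means bash was killed by a signal (e.g. SIGSEGV).
  if [ "$status" -ge 128 ]; then
    echo "parser crashed (signal $((status - 128))) on: $fuzz"
  fi
done
```

On a patched bash the loop prints nothing, because patched versions no longer parse arbitrary environment values as function definitions at startup.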
I'm not saying the Bash devs had 1 million eyes on this, but they certainly had a few hundred, if not a thousand or so, in total.
During the embargo? Really?
Agreed - kinda. The main point at the origin of this thread (article?) was that F/LOSS software could not deal with these kinds of bugs as well as proprietary software; that somehow the proprietary vendors could do better - both at catching them and at responding to them.
Actually the article doesn't say that proprietary is any better, just that FOSS hasn't turned out to be as good at it as people were assuming ("many eyes make all bugs shallow", etc.).
But here's the kicker - there is a similar exploit for cmd.exe. It's yet to be patched.
The cmd.exe parser has a bug, or maybe a feature. The bash parser had a bug, or several, or maybe a feature.
The big, big, big difference is that cmd.exe doesn't execute, or echo, or parse all its environment variables at startup - that is the actual bash Shellshock vulnerability (not the various parser bugs), and cmd.exe doesn't have it. No one has yet found an exploit for this cmd.exe bug, let alone a remote one.
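That startup-parsing behaviour is easy to demonstrate with the widely circulated one-liner check (the exact output depends on your bash version):

```shell
# Classic Shellshock (CVE-2014-6271) check: export a crafted function
# definition, then start a child bash. A vulnerable bash executes the
# trailing command while importing the function from the environment
# at startup, before running anything you asked it to.
env x='() { :;}; echo vulnerable' bash -c 'echo shellshock test'
```

A patched bash prints only "shellshock test"; a vulnerable one prints "vulnerable" first, despite never being told to run that echo.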
I didn't say MS was better, I said the bash response was poor, and the poster I replied to couldn't possibly have had fixes in place within minutes as claimed.
Oh, and in your argument "up to 30 days" suddenly becomes "taken 30 days" - actually, if bugs come in uniformly distributed within the 30-day cycle, then the average would be 15 days, or lower, since sometimes they do go out-of-band.
Plus, the second (and third and fourth and so on) patches are only needed if the first (and second and third...) one is inadequate and not properly tested. Maybe MS are just as bad at that too, but the developers of Bash were certainly not good at it.
There aren't because:
1. no one is paying for them (or at least not enough to make a difference and catch stuff like heartbleed and shellshock)
2. auditing existing code doesn't "scratch an itch" for anyone on the hobbyist side
Closed source companies like MS have to weigh up costs of security auditing vs. cost of reputational damage of getting it wrong (i.e. if you think safety is expensive try having an accident). For a long time, MS was so secure as a monopoly that the reputational damage wasn't worth them worrying about - that isn't the case now, and they are better at security than they were, but they have a very large legacy mess still to clear up.
For open source companies, the reputational damage is spread or lands elsewhere (shellshock is a GNU bug not a Linux bug or a RedHat or Debian or...), so there is even less incentive. Your competition benefits equally from your auditing but you take the whole cost. Therefore it will need collective funding by competing companies - which is always a lot harder to organise.
How did you fix them in minutes when it took several days for correct patches to come out, for entirely predictable reasons (the laughable approach of trying to find and fix all the bugs at once in a parser never designed to be secure, when the real issue is that it should never be fed untrusted input)?
To my mind, that is the biggest failure of open source / free software in this case
- 20+ yr old bug / insecure feature in an obscure corner of a system never designed for today's threat environment - forgivable
- responsible disclosure, working with maintainers under embargo - good
- publication along with a patch that was broken again within hours if not minutes - fail
- everyone and his dog then panic-issuing further patches for one parser vulnerability after another before eventually someone (actually more than one different approach) fixes it properly the way it should have been done in the first place - spectacular fail
There is a difference between good engineering and engineering a successful product. Edison was much better at understanding the latter; he also understood, and played, the patents system. He was in the end by far the better capitalist / businessman, hence he won, financially - and winners write the history books.
Before writing Tesla off as the great engineer who never got successful, it is worth remembering that he did make a fortune (tens of millions in today's money) from his AC patents before he gave up on the royalties. But he died a pauper because, too confident in his own promised results, he blew that fortune self-funding research into ideas that just didn't work.
Try page margins, there's lots of room and they don't interfere as much with legibility.
Tried that, I had a great proof of this colonization concept, but this margin was too small to contain it...
Yeah, and who exactly is this afflicted user? Right: normally apache or some other unprivileged user, who has relatively little power - though, granted, you don't even want unprivileged users logging in from the Internet.
For some, the ability to run their own code on a server with high bandwidth outward connection to the internet is all the power they need/want.
If the server is an authorised mail source for a domain (e.g. spf) then so much the better.
If the user has access to some writable disk space that can be used to host some interesting files, then there are uses for that.
If the user can read the web site source or config files, that may have value in itself or may lead to further penetration - but I'm sure you've never seen a DB user/pw in the source for a production website, ever?
The easiest thing to do whether or not you can get a patched bash yet is to disable Apache's cgi-bin module.
and qmail, procmail, exim, postfix, sshd, OpenVPN, some inetds... and all the vectors that haven't been found yet.
The easiest thing to do is to fix bash - using rm if necessary
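For what it's worth, the Apache suggestion above is a one-liner on Debian-style systems (this assumes the a2dismod helper is present; on other layouts you would comment out the mod_cgi/mod_cgid LoadModule lines in httpd.conf instead):

```shell
# Debian/Ubuntu-style Apache: disable the CGI modules (cgi for the
# prefork MPM, cgid for threaded MPMs) and reload the server so no
# CGI handler remains to pass attacker-controlled headers to bash.
a2dismod cgi cgid
service apache2 reload
```

This only closes the Apache vector, of course - the other services listed above still need a fixed bash.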
C. Like it or not, the bitcoins represent evidence. Seizing evidence is par for the course in any criminal case.
Is selling the evidence off before the trial (which it was seized to be used in) also par for the course?
What happens to seized evidence after a not-guilty verdict - or do certain people already know that is not a possibility?
Nothing was stopping the angels either; per Genesis 6:4, "when the sons of God came in unto the daughters of men" they begat the Nephilim (or were the Nephilim, depending on your translation). Enoch appears to make clear they are angels, but Enoch is apocryphal, so you may or may not believe it enlightens the story in Genesis. It matters not, though, because the story _is_ in Genesis, which is canon, and whether the "sons of God" (or Enoch's Watchers) are angels or aliens, they are clearly not-men but are capable of interbreeding with the "daughters of men" and producing a race of giants (the Nephilim).
As regards opening up a whole new mess: all of this is post-Eden and pre-flood, so man can always complain to God, "you said I had dominion over everything, so what are these more powerful beings doing taking all the good-looking girls?", and God will say, "yeah, and? Can I remind you that was back in Eden, when I also told you not to eat the ****ing apples?".
At least with Linux, if a security hole is found (and made public, or released to experts in the security community, or to the relevant developers, or whatever), the number of people who are able to investigate and fix the hole (and make official or unofficial fixes available) is, in most cases, significantly larger than the number of people who would be able to deal with issues in Microsoft code. And the Linux guys can have patches out much faster (and they can get into distros fairly fast too).
There is a flip side to that, which is that too many cooks with no common management may royally spoil the broth, which is what has happened.
When found, this was embargoed and should have been fixed under embargo, properly. Instead, someone pushed out half-baked patches that were proved to be still vulnerable within hours (if not minutes), and by then everything was public. So now we have everyone and his dog patching in panic mode, three, four or more(?) sets of patches now, with each one still proving vulnerable, because collectively we blew our chance to fix it (and test it) properly the first time.
The eventual fix will have to be to stop playing who-finds-the-parser-bugs-first and accept that the whole feature is a security hole and needs to be removed or radically changed - something that could and should have been predicted at the start. There are various patches flying around now that do just that, probably in different incompatible ways. Those changes will, inevitably, break some stuff - and of course we could have looked and tried to figure out the extent of that, and mitigated the effect and been ready with the announcement, if it were still under embargo, but we blew that chance.
I am all for saying that the open source development model is better - but this really is not the best example to use to illustrate it, in fact it's already heading into embarrassing farce territory.
Modeling paged and segmented memories is tricky business. -- P.J. Denning