Mozilla Foundation Top 20 Excuses for Not Fixing Firefox Bugs (Last updated in 2009.)
This looks awfully similar to my KDE bug tracker experience.
Emphasis mine. The summary almost makes it sound like the researchers just used HIV as we know it.
The article also said that they used the virus as a delivery vessel to inject genes of their own choice into the white blood cells. So it has not much to do with the real HIV, and ultimately it's not that "HIV cancels out cancer". Using viruses to change genes seems like a "normal" thing to do to me, as it has been noted in the press before.
I know what monoculture means in a security context. Let me restate my opinion: presenting 10 or so choices of popular distros is not going to make a significant difference compared to only one choice.
As for botnets or harvesting data: they are doing it. Run a honeypot and you'll get yourself an IRC-based botnet in 2-3 days on average. Faster than snail mail!
I don't see what you were trying to say about servers. Obviously, the user factor vanishes on a headless machine, but OTOH servers usually get compromised via buggy webapps. The OS's role in this is relatively minor.
Your monoculture argument is wrong. From the dawn of time, Linux exploits have come tailored for the most common distributions, and some are even intelligent enough to determine the environment at run time. Some can even adjust for non-standard parts replaced by the user. And they have a very good success rate indeed. The number of possible combinations for a typical Linux server or workstation is not nearly high enough to pose any problem due to environment diversity.
You have a far better point than the other reply to my comment, but nevertheless...
Kernel and other patches are a reactive measure, not a proactive one such as microkernels, sandboxing, mandatory access controls, and shifting drivers to userspace (of which Linux has the least).
One of the pillars of good security, i.e. ex-post detection of malicious behavior, is completely missing from Linux installations, and seemingly from the mentality of the Linux community. On Windows, by contrast, it is the norm to have antivirus software, which can be pretty efficient at detecting userspace threats and sometimes even stands some chance against kernelspace intrusions.
The point about being able to run a VM legally on Linux is valid, but no widespread practical application of that is currently available. In fact, there are a lot of fine security solutions for Linux (unfortunately not counting the kernel itself), but they are all brutally under-utilized. From that perspective, the Linux desktop is only at the very beginning of the road towards security. I stand by my previous assessment that the lack of Linux-based malware is, for the most part, caused by minimal interest on the part of the criminals.
And yes, when Linux becomes popular enough to attract serious malware, the plan to move to another less-known OS is pretty good.
The problem is the inability of consumers or managers to understand the 3-part rule: speed, quality, cost; pick two.
This rule concerns the external quality of the product, as perceived by the customer. The problem with bad code is the *internal* quality, which has an impact on the "quotient" for the 3-part rule. The worse the internal quality, the smaller the overall pile from which you "pick two". Eventually, with very bad code, development just grinds to a halt.
I think the best analogy for this is furniture. Mennonites make great furniture. It takes a long time, and it is very expensive. It is a craft, and they are craftsmen. IKEA makes some pretty shitty furniture, but it's good enough for many applications. It is cheap, and it is fast.
Yeah, and internal quality (bad code vs. good code) would then be the tools with which the furniture is made. Poor shops assemble every piece by hand with inexperienced workers; great shops have experienced guys working on top-grade machinery. Both poor and great shops can still do the "pick two" tradeoff, although it is obvious how their overall productivity compares. And that's the way it is with shitty code (a bunch of cowboy coders working against a yesterday deadline) and great code (tests, refactoring, good planning, experienced people).
35% of technology professionals said they would sacrifice up to 10% of their salaries for full-time telecommuting. The average tech pro was paid $79,384 last year, according to Dice's annual salary survey, which means a 10% pay cut is equivalent to $7,900 on average.
Wrong calculation -- the average pay of the 35% who are willing may not equal the average pay of all tech pros, so the 10% cut might be far from the quoted figure.
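The point can be sketched with made-up numbers (only the $79,384 overall average comes from the article; the individual salaries below are invented to illustrate how the subgroup average can diverge):

```python
# Hypothetical salaries chosen so the overall average matches the
# article's $79,384 figure, while the telecommute-willing 35% skew lower.
willing = [60_000, 65_000, 70_000]     # those willing to take a pay cut
unwilling = [90_000, 95_000, 96_304]   # the rest

overall = willing + unwilling
overall_avg = sum(overall) / len(overall)   # 79,384.0 -- the article's base
willing_avg = sum(willing) / len(willing)   # 65,000.0

print(round(overall_avg * 0.10))  # 7938 -- the article's naive figure
print(round(willing_avg * 0.10))  # 6500 -- the cut actually averaged over the willing
```

With these (invented) numbers the real average sacrifice is $6,500, well short of the ~$7,900 the article derives by applying 10% to the population-wide average.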
Yep, the one-line answer is:
It's too CPU-intensive for the server.
Cost could be an issue, but it's like $100 a year? Hardly a problem for anything but the most amateur of blogs.
It's not that. It's key distribution. Without that, HTTPS is simply not secure. I'm surprised that TFA doesn't mention it. In fact, it talks nonsense: "Everyone knows HTTPS is more secure." Secure from what? Mosquitoes?
But that's the whole point of this discussion: what if there is a bug in the library that renders that *data*? All of a sudden, your data is no longer very data-y, and much more executable-y than you might have intended.
For reference, take a look at the (lengthy) list of bugs in any of the image processing libraries.
Well, bugs can be fixed. But if you make it a deliberate feature to recklessly run anything that comes onto your computer, then there's little hope.
How is Apple worse? When did they rootkit your iPod or iPhone? Who did they take to court for jailbreaking?
Wasn't there some story about Apple bricking all the jailbroken gadgets some time ago?
If it has syntax, it isn't user friendly.