It's an authority and leadership problem. The thing the email cert dealers miss out on, in my opinion, is the sale of directory services. If a dozen or so CAs could come up with a giant LDAP server of people they've verified and issued certificates to, and you could just plug that into your email client, then you could rely upon a centralized database of public keys that are all trusted. The owners could perhaps set some mail preferences ("I prefer encrypted mail," "I prefer signed mail," etc.) and then the whole thing could develop some use. Thing is, people don't want to be "listed in the phone book" on the internet. PGP/GPG could possibly do some similar things; I mean, you could build services on top of them such that everyone published a public key to a central server and then maybe used some social-networking-type stuff to encourage the web of trust to expand.
Either way, it seems like a central directory is pretty key to email certs working well, trusted CA or not. How do I know who has a key until we've already emailed? That alone prevents encryption, and if it's only authentication then the value prop drops. In fact, even Zimmermann suggests not signing most of your emails...
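The directory idea above can be sketched as a trivial lookup service. Everything here is hypothetical (the class, its methods, the preference names); a real deployment would sit behind LDAP and serve actual certificates, but this is the shape of it:

```python
# Minimal sketch of a central key directory: each verified user has a
# public key (opaque bytes here) plus mail preferences the owner sets.
# All names are made up for illustration.

class KeyDirectory:
    def __init__(self):
        self._entries = {}  # email -> {"key": ..., "prefs": {...}}

    def publish(self, email, public_key, prefers_encrypted=False, prefers_signed=False):
        """A CA adds a verified user's key and the owner's mail preferences."""
        self._entries[email] = {
            "key": public_key,
            "prefs": {"encrypted": prefers_encrypted, "signed": prefers_signed},
        }

    def lookup(self, email):
        """A mail client asks: do I have a key before I've ever emailed them?"""
        return self._entries.get(email)


directory = KeyDirectory()
directory.publish("alice@example.com", b"-----BEGIN PUBLIC KEY-----...",
                  prefers_encrypted=True)

entry = directory.lookup("alice@example.com")
print(entry["prefs"]["encrypted"])  # the client can honor "I prefer encrypted mail"
```

The whole value is in that `lookup` before first contact; without it you're back to exchanging keys by hand.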
Can you emulate what JWZ did? Sure. Does that mean you'll be successful? I doubt it. This discussion isn't about unit tests or dev methodologies in the traditional sense. It's about guys that threw that stuff out and achieved success in spite of it. It's much more subtle than being a hack that gets stuff "done." It's about being a hacker that gets stuff Done (with a big D). One is who you want on your team; they are rare individuals that can guide you towards riches. The other? Well, you've probably got them around, they're probably paid too much, and they actually cost the organization over the long term rather than help it.
There is tons of code reuse in Linux already; it just might be at the wrong level. Take drivers, for example: there are basically three kinds of drivers: network, character, and block. All drivers are essentially one of those. Some subsystems like ieee1394 and USB kind of add some new types of drivers on top of that and provide a framework for them. There isn't exactly an ethernet device framework, though; every ethernet device is basically a network device that then implements all the chip-specific stuff, and in some cases the code could probably be identical between chips but for a couple of registers that are different. By the time there is enough pressure to come up with a "WiFi device" framework, half the devs that contributed WiFi drivers have abandoned them and moved on: they wrote them, got them working, got them included, debugged them, and they're done. This is a double-edged problem: you want to keep the hardware support, but you don't want to change drivers without testing them, and ideally the original authors will play along and help out, but they aren't always around.
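The "identical but for a couple registers" point can be sketched like this. Real kernel drivers are C, and every name here is invented; the point is just that the framework owns the common sequence and a chip driver contributes only its register map:

```python
# Sketch of a hypothetical "ethernet framework": the reset logic lives in
# one place, and each chip driver supplies only its register offsets.

COMMON_RESET_SEQUENCE = ["disable_irq", "reset_mac", "load_mac_addr", "enable_irq"]

class EthernetFramework:
    def __init__(self, name, regs):
        self.name = name
        self.regs = regs  # the chip-specific register map is the ONLY difference

    def reset(self):
        # Same sequence for every chip; only the register addresses vary.
        return [(step, self.regs[step]) for step in COMMON_RESET_SEQUENCE]

# Two "chips" that are identical but for their register offsets:
chip_a = EthernetFramework("chipA", {"disable_irq": 0x10, "reset_mac": 0x14,
                                     "load_mac_addr": 0x18, "enable_irq": 0x1C})
chip_b = EthernetFramework("chipB", {"disable_irq": 0x20, "reset_mac": 0x24,
                                     "load_mac_addr": 0x28, "enable_irq": 0x2C})

print([addr for _, addr in chip_a.reset()])
print([addr for _, addr in chip_b.reset()])
```

Today each such driver re-implements the whole sequence itself, which is exactly the duplication being described.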
You could probably make some great arguments about the VFS and LVM layers too; they have caused tons of public arguments over the years. For many things they are very well-defined interfaces; for some new and exotic types of things (think Reiser4), those interfaces are broken. These are hard problems to solve. Moreover, LVM + ext3 work really well for 90+% of the users out there.
There is a bigger missing piece too: Linux hasn't had a "dev branch" in ages. 2.5 took too long to stabilize, so they say, but with a dev branch you can have high-level goals for the entire community: "We won't ship 3.0 until x, y, and z are done... when they're done, they'll be done." The current model allows for more rapid integration of stuff, but there is always a cost with that. I think Alan Cox suggested a while back that it was time to start a 2.9/3.0 branch in order to start dumping some of the abandoned drivers and such. It's a good thing for the community psychology.
Nice and concise. I'll do you one better: they want some non-UNIX APIs, and they also want a lot of the legacy UNIX APIs at the same time. It seems Microsoft and BeOS, and really Apple too, think that a proper UI depends upon kernel support, be it for message queues or for various services that need kernel-like superiority over clients. I've been wondering about this for a long time; the X guys don't approach the problem that way, but no matter what kinds of hacks they do, Linux desktops just don't have anything like the cohesive feel of Windows, BeOS, or Mac OS X. Let us not forget, though: BeOS had terrible networking support and terrible hardware support.
So why didn't Haiku start with a Linux kernel, which is really good at a lot of stuff, add some patches/drivers to provide the missing mechanisms they desire, and then build on top of that? I have no belief that the interfaces the BeOS guys provide would get accepted by the Linux kernel or a BSD kernel, at least not any time soon, but I'd like to see those interfaces defined, and it's a perfect job for a distribution to apply that patch and build a product out of it. Then, at the very least, this new Haiku OS would have a chance in hell of maintaining hardware compatibility and running on interesting stuff.
It also seems that if you came up with a good set of audio APIs and built user-space stuff that used them, you could legitimately take over Linux audio.
I'd love to see something like this succeed, and I applaud their tenacity. It's just hard to take seriously when you start writing a kernel from scratch and have only a vague explanation about software consistency as to why you can't use an existing kernel (one that happens to be very good and have great support). Provide some kernel patches to Linux, and start a completely-from-scratch distribution and software model on top...
Coding is nice and all, but good communication, good engineering, and good design (not graphic design) are parts of the old-skool "hacker ethic." There was a time when, if you touched a program, you tried to leave it better than you found it and anticipate what the next guy would need or want. That's becoming a rarity, and some modern paradigms like Agile seem to ignore it completely: you're wasting energy and time if you "over-engineer" anything or try to build something before you need it.
Now a lot of folks might think they have HDTV and have a DVD player that is either 480p or an upscaling one, but that's not HD-DVD. It just doesn't seem possible for those numbers to be correct. If you look at the income distribution as well, it suggests to me that the sample set is flawed, if nothing else. Computer ownership went down? HDTV ownership is substantially different from the Nielsen numbers. Original Xbox numbers are consistent, but PS2 numbers went down? The $50k-to-$75k folks own way more gadgets than the $75k+ crowd? 'splain that to me.
Maybe put nice blank plates over the jacks if it bothers you that much. By "better" I'd mean fishing Cat5, Cat6, or structured wiring to each jack and then home-running them somewhere. A loop is no good; you'll only make what's there worse with any other scheme.
The only thing worse than trying to un-fuck the wiring in a new place you just bought because the last owner did some "project" is being that home owner and trying to get it all unfucked on your own because an inspector told the potential buyers that the wiring is all screwed up. Trust me on this. Your wife will be at defcon 0 with the stress of moving. You'll be either paying two mortgages or dealing with the close on your new place, trying to get things timed just right. (And they never can be timed "just right.") The new buyers will be ready to close yesterday, except for the list of stupid crap you need to fix and/or explain. A contractor will want to tear up walls and fix it that way, for a couple grand (maybe more if they know you're bent over the table), and you'll have to re-clean the place with that lovely drywall dust just about everywhere... And it's going to be about 200 degrees in your attic, where you cleverly "hid" most of your dirty work... If you're there forever, then knock yourself out, but if you plan on selling the place, just realize that a lot of people still like to have phones in rooms and phone service (even Vonage or 8x8 or whatever can run over the old loop if you plug it into the house instead of a phone).
Or maybe the new buyer will get a kick out of your "intercom" system or home-brewed HPNA, with the speaker about two feet off the ground where the phone jack was... You never know.
MSNBC and HNN have nearly the same format: a morning variety show with varied opinion (but definitely not a "just news" program), some number of hours of news readers, and then opinion guys/gals for primetime.
Nobody from Fox News would ever claim that O'Reilly is a news man (well, he might, who knows? His program clearly isn't a news program, though, and even he'd say that). Same with MSNBC: Olbermann has been very outspoken about the fact that he's paid to give his opinion, that's the point of his show, and as such, it's not a news program. It was MSNBC that really botched it with the convention coverage and tried to use the prime-time opinion lineup for news.
Bottom line, though, and it affects papers too: people tend to like to read opinions and editorials, and they seem to like to watch them more than they like real news. Make either the papers or the broadcast news non-profit and you probably have to dump that opinion stuff. There is probably a greater problem here if you take a step back: ABC, NBC, and CBS have been scaling back news for decades; they're basically down to a 30-minute evening news broadcast, and that's about it without some sort of entertainment/investigative-journalism spin. More people want to watch Jeopardy than "The News." Making papers non-profit might be a good way to make them cover more news and to protect them a little bit, but it remains unclear to me that people actually want to read news; they kind of like how they get to pick the kinds of "news" they read or watch on their own and listen to the bias.
Even the financial news has become a sham, and if there is ever something you should be able to report on without bias, it's the markets. They do more cheerleading than real news. They're pooh-poohing Jon Stewart's criticism, and he may be the wrong messenger, but his points are 100% valid. Honestly, I think a whole lot fewer people watch, and you can hardly run one 24-hour network with real news, let alone the dozen or so that we've got. It's hard to put the horse back in the barn.
How much embrace/extend can you do to Gecko or WebKit before you're in the same position again? Admittedly, I think it's all about Silverlight. Why not just let IE kind of die off, encourage users to install a new browser or something, and then focus on plugins for all the other browsers?
The only thing I can think of that would make any sense to me is that this is the worst recession ever; they probably have better forecasting than we do, and they can see that it might be decades before the decadence is back, if ever. In that case, they need to cut all the fat they can. They did just RIF a bunch of folks... IE isn't exactly free to develop, and it's not clear that they're getting much from it.
Lock-in is one thing. The other thing is that they're a huge target for hackers; they have to be responsible for fixing things, and they've never shown that they can play nice with open source, and open source tends to be hostile towards them. If they did use WebKit, it would effectively be a branch, just because of their requirements. They'd be going from zero to full-blown dependence on something they don't completely own.
It'd be huge; I mean, if MS ever wanted to show that they have changed their colors, supporting open browsers would be a start. I just can't see it, though.
Didn't they claim that they couldn't take IE out of Windows?
It's a sensational headline, that's for sure, but for the vast, vast majority of Americans in the work force, with families and roots in America, it's not even remotely an option. If you're here in the US on an H-1B, then it's a different kind of issue.
Also, FWIW, if it really is labor costs alone (and it probably is), this is about as perfect a time as there is. When you have GM going to Congress begging for money while their laborers are making near $80k a year with gold-plated benefits, the public is as open to it as ever.
I'm not going to push the ideology here, but the last decade or so has shown, in more than a few cases, that the only time this seems to matter is when some company doesn't have the resources to build something but wants to put some tweak on it and sell that. If you're writing some kind of optimizer that you need to keep "secret" but you can't build a full compiler, then it's hard to offer much sympathy. If you're building some sort of static analyzer or something that you need to keep "secret," again, I think there are more than enough holes here: you really can just link to GCC, write your plugin, GPL it, and have it dump whatever intermediate form you need. Custom language or custom hardware support? There are probably some more treacherous areas; I'd imagine that some of the better ILs are somehow protected, and I'd also imagine more than a few compiler jocks would like to graft some of that stuff together, you know, GCC's parser and Yoyodyne Corp's optimizer and code generator, stuff like that.
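The "dump the intermediate form" hole can be sketched like so. This is not real GCC plugin code (those are written in C/C++ against GCC's plugin API); every function here is hypothetical, just showing the shape of the scheme: a GPL'd piece serializes the IR to plain data, and the "secret" tool consumes the data without ever linking against the compiler.

```python
import json

# Hypothetical GPL'd plugin side: walks the compiler's internal form and
# dumps it as plain data. The GPL obligations stop at this serialized file.
def gpl_plugin_dump(ir_nodes):
    return json.dumps([{"op": op, "args": args} for op, args in ir_nodes])

# Hypothetical proprietary side: reads the dump and runs its "secret"
# optimizer over the data, never linking against the compiler itself.
def secret_optimizer(dump):
    nodes = json.loads(dump)
    # toy optimization: drop no-ops
    return [n for n in nodes if n["op"] != "nop"]

dump = gpl_plugin_dump([("load", ["r1"]), ("nop", []), ("store", ["r1"])])
print(secret_optimizer(dump))
```

Whether that arms-length arrangement actually satisfies the license in a given case is a lawyer question, but it's the loophole people point at.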
Having worked with more than one chip vendor that "sold" a GCC derivative supporting their hardware, to be completely honest, it would have helped both our cause and theirs to just GPL the code to begin with.
Everyone thinks compiler plugins are cool for one reason or another, and just about everyone will hate it when there is some interesting plugin that costs $2500 a seat and does some cool stuff but only runs on Red Hat's enterprise Linux in 32-bit mode... with a version of GCC that is two revs back.
It's really absurd when you take a step back: Google bought Postini to deal with spam, and that's a nontrivial investment. Spam filters for Exchange and mail systems can be very costly to a business. Years back, the "good guys" started blacklists, but a lot of legitimate organizations that didn't have the same tech savvy were snared; it was really vigilante-style network defense. Some spammers even took offense at that and escalated things, like they were offended by the attempts to stop spam. To really fix the problem, we need to fix the email protocols: we need strong authentication from SMTP peer to SMTP peer, and we should consider end-user authentication while we're at it. Until we do that, there will be spam. If Bill Gates wanted to help, he'd encourage MSN and the Exchange team to work with Google and come up with a plan to secure SMTP and make it default "on" in future versions of Exchange. Before, we had the lame excuse that there were too many different mail servers and clients to do it; now, if you got Google, Hotmail, and Exchange to adopt a new protocol, it could cover a huge percentage of the world and everyone else would follow suit.
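As a toy sketch of what peer-to-peer SMTP authentication buys you: a receiving server only accepts mail whose envelope is signed by a peer it knows, so mail from an unknown relay is simply refused. This uses a shared-secret HMAC purely for illustration; any real scheme would use certificates and TLS, and all the names here are made up:

```python
import hashlib
import hmac

# Toy model of authenticated SMTP peering. Each known peer shares a secret
# with us; a message is accepted only if its envelope signature verifies.

PEER_SECRETS = {"mx.example.net": b"s3cret"}  # hypothetical peer registry

def sign_envelope(peer, envelope):
    """Sending server signs the envelope with the shared secret."""
    return hmac.new(PEER_SECRETS[peer], envelope, hashlib.sha256).hexdigest()

def accept_mail(peer, envelope, signature):
    """Receiving server: verify the peer before accepting anything."""
    secret = PEER_SECRETS.get(peer)
    if secret is None:
        return False  # unknown peer: refuse outright, which is where spam dies
    expected = hmac.new(secret, envelope, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

env = b"MAIL FROM:<alice@example.net> RCPT TO:<bob@example.com>"
sig = sign_envelope("mx.example.net", env)
print(accept_mail("mx.example.net", env, sig))    # True: known, verified peer
print(accept_mail("spam.example.org", env, sig))  # False: nobody we trust
```

The hard part isn't the crypto; it's getting enough of the big mail operators to turn it on by default, which is exactly the Google/Hotmail/Exchange point.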