In reading about this mess, I would like to see if my understanding is correct:
Before Oracle acquired Sun, Sun seemed to be a fairly cool company. They came out with the Java language and released it in such a manner that anyone and everyone could use it, which is why it was picked up by educators and such to teach the OO style of programming. Sun was also very active in updating it and fostering a community around it. They even released a good-sized portion of it as open source (this, I assume, is how Google and a lot of other open source projects came to build on it).
Everything was fine and dandy until Oracle bought them up. Now, from what I've been reading and can remember (I've been paying attention to the tech industry since about 1998 - graduated HS and got a CS degree since then), Oracle is a company whose philosophy seems to be to buy up whatever competes with them, then, for all intents and purposes, destroy the parts it deems "not a high return on investment" and bleed the other parts dry until they are worthless.
This Sun deal seems to be a great example. They don't seem to see much of a market for OpenSolaris & OpenOffice, so they pretty much just cut them off immediately. With Java, they see the potential of putting in a pay structure somehow, and step one seems to be suing everyone who has their hands in the Java pot (so to speak). If I'm not mistaken, the same thing is going on with MySQL.
To me, the whole situation sucks. Going through school, I was taught Java. Up until my senior year of college, I only had two classes taught in non-Java languages, and as such I feel I have a connection to the language. Now that Oracle seems to be completely messing the situation up (and unfortunately it seems they have the right to), I don't get all warm and fuzzy when I think of Java.
IMHO, it seems very unfortunate (and please don't think I'm supporting Oracle by saying this; I'm not), but a judge or someone should tell Google (and not just Google) to just leave anything Java behind and let Oracle do what they want with it. Then, when people stop using Java because Oracle has run it into the ground, Java will be put out of its misery. I really don't want to see this happen, but I don't see any other way. In the end, Oracle will be out a lot of $ and the world will be out of Java. While I'd welcome one of those outcomes, I don't think I could live in a Java-less world.
I guess I can't say we would be Java-less; there are the open source Java implementations... Anyway, please correct me if you think I am off base in any way.
P.S.: I uninstalled Java from all my machines, as well as all software that uses Java, once I heard about Oracle suing Google over this. I don't want to support Oracle in any way, shape, or form.
I wouldn't say that the Linux desktop is dead.
I would say that he does raise some points that I have noticed and that I think the community has been turning a blind eye to.
The fragmentation is there. While this can be a good thing, a user who is not tech-savvy but wants to know more, or wants to try Linux, may go to DistroWatch and just be overwhelmed. I've found that when I try to talk to non-techies about Linux, they refer to "Linux" as a singular item. While that is technically the correct way to think of it, the context of the questions quickly shifts to things that are distribution-specific, and I have to launch into a whole side topic about what a distribution is. About three seconds into this, I lose them and they just tune it out. I really think there should be a community effort to push out one solid desktop. Ubuntu is the clear leader in this category IMHO (I'm actually a fan of openSUSE). Yeah, I could skip the whole bit on distributions and just focus on Ubuntu, but that would only cause confusion once they were out on their own, looked up "Linux", and expected a single result. Instead they'd get all sorts of information overload.
As for the open source fanaticism, this is an interesting topic. Throughout college, I was a huge flag-waver. I had Linux on anything and everything. However, afterwards, I had to adjust to the real world, which used all MS products. I am now really adept at C# and familiar with a lot of MS concepts. I have two reasons for softening my stance on MS: 1) Those skills pay the bills. I live in a city where there is only one shop that uses Linux; all the rest use Windows. Moving isn't an option, since my fiancée and I have lots of family around here. 2) MS just works. I have to admit that after I graduated, I thought it was nice to have all sorts of time to do "computery" things. For a while it was nice, but then I realized there was a world outside of computers. Do I want to spend a weekend setting up an Ubuntu/openSUSE box and getting network printing and file sharing going, or do I want to go hiking? Yeah, you could just set it up once, but the distributions are released so frequently that you are almost always updating and fixing things. My 'doze boxes just keep humming along without me thinking about it.
I'm definitely not going to say I'm an MS fanboy by any means, but I have had my eyes opened. Linux provides a good service to computer users: it keeps MS honest. If users didn't have the option of Linux, I feel MS would just start raking them over the coals to make a lot more $ off their products. MS also has to stay on its toes in terms of features.
I wouldn't take this article as what it tries to pass itself off as. I would take it as coming from a person who just needed to write an article and didn't care if he started a controversy. I would use it as a call to arms to solidify the Linux community and tone down the fanboyism. But then again, that's just my $0.02.
Have to agree with ya there, iONiUM... Seriously, one more electronic device?
It would be nice if you could manually put your own credit card info in there, so you only have one card to carry, and it could switch to whichever profile you want based on a button combo. But you know it will only work with a specific bank, and you'd end up with multiple digital CCs anyway.
My point is, they obviously knew they were inexperienced and that the code would have numerous problems. That's why the article said only die-hard fans with blinders on would try to set this up and subject themselves to the security holes.
What I'm trying to say in my post is that since they knew there were problems, they went ahead and released the code so others could look at it. This is one of the great strengths of open source: if you know you have problems in your code, you can release it and have others look it over and provide insight into what you are or are not doing correctly.
Should inexperienced people be trusted to create a highly secure network protocol and implementation? No. Not even remotely. BUT they took it upon themselves to get the process started. Once they felt they had something worth others looking at, they released the code, and professionals with more experience provided feedback.
Yes, I understand that any security vulnerability is a bad thing. On that score, this is a bad thing. BUT...
These are people fresh out of college who haven't gotten a lot of real-world experience. I, myself, am only out of college by a year and a half. The first year was spent as a sysadmin, and the past six months as a developer. They have probably heard of some types of attacks but are unfamiliar with the details. Others, if they are like me, they haven't even thought of. All of that comes from being "in the trade".
This is why open source is good. It can rapidly increase a programmer's competency if they get constructive criticism. It sounds like they are getting plenty of that, but the article kind of makes it sound like they should already know all this.
I, for one, am glad they are doing this and that they decided to release some code early for review. Not only will it allow bugs to be fixed early, but it will also teach them lessons for the future.
Air is water with holes in it.