Sure, thank you for bolstering my point.
um, arguing on the internet is like running in the Special Olympics. Even if you win, you're still retarded. Being a language nazi makes you doubly so.
Mod parent up +1000!!! Holy crap, it's about time people other than liberals started posting on popular internet forums.
"The watering-down of titles for the Wii certainly isn't universal. Almost every game released by Nintendo is solid. The story lines are outstanding, the controls capture the essence of the Wiimote, and the graphics are just fine. Super Mario Galaxy and The Legend of Zelda: Twilight Princess immediately come to mind when I think of Wii games that aren't watered down. They are stellar titles that anyone should play. And they match any full-featured game on other consoles. The same can be said for the vast majority of titles built exclusively for the Wii. Punch Out was great. Wii Sports provides an incredibly fun experience. Simply put, there are a variety of compelling games on the Wii that still make it a worthwhile console. But unfortunately, the vast majority of those full-feature Wii games have been developed by Nintendo. The reality is that many third-party developers haven't been able to capture the true power of the Wii and thus water down their games to bring them to the popular console. If gamers want the best experience for those games, they'll need to play them on another console."
So in other words, the problem is not the Wii, it's the capability of the developers? Why is it the Wii's fault that third-party developers water down games because they can't develop properly for the Wii? Do third-party developers not have all the tools, knowledge, etc. they need to develop for the Wii? Is Nintendo holding back on third-party developers to ensure Nintendo always publishes the "best" titles? (I hope not!) Based on this paragraph, I am led to believe that Nintendo is perfectly capable of writing awesome games for the Wii while everyone else is incapable of doing the same.
But may I ask: what is the real benefit of totally handcoding a site as opposed to using web design applications? I really like the ability to create a layout in Fireworks and then have that imported into Dreamweaver, where I can continue to design graphically or code by hand where I feel it is necessary. I can see what it will look like instantly in the WYSIWYG, and then test it in the million different browsers I have installed on my system. Some of us do not want to code our sites 60 hours a week; we want to spend the time figuring out the actual look and feel of the site, writing copy, editing graphics, etc. Maybe if you have a full team of developers and a marketing department it's a different story...marketing writes the copy, creates the images, has the concept for the layout of the site, and then developers just code it to marketing's specs.
But many organizations, especially small businesses that like to do things in-house, don't have those luxuries. I have been in the position of being responsible for ALL ASPECTS of a corporate site, from copy to images to layout and coding. Many people in positions like mine love being able to quickly put a site together, have an automated tool confirm the code is compliant with whatever standard we desire, and then dive into the code where things just aren't right, or write the dynamic portions of the site that can't be put together in a WYSIWYG environment. Tools like Dreamweaver (especially Dreamweaver, I've used it since Dreamweaver MX) and really the entire Macromedia Studio/Adobe Web Design package as a whole have been a godsend.
Ultimately, if the page looks great, runs well, is secure, is built quickly and cost-effectively, and meets all the requirements of the organization or customer, what's the problem? Other than personal ego and bragging rights (neither of which has anything to do with creating a website), I don't see the big deal.
You, sir, should be modded up to the sky, as this is the best explanation I have ever seen concerning the "nature of Linux" as you describe it.
However, might I add that Linux can still keep this identity despite things like merging distros, the LSB, making connections to the non-free world, non-free components, etc. You mentioned Red Hat but not Ubuntu; I will use Ubuntu as my example.
For example, Ubuntu is Debian, except that Ubuntu has been willing to play nice with corporations and put some real money and business sense behind it...which includes helping to drive the development and supply of non-free/restricted drivers. Ubuntu would still be Linux even if, for example, someone at Canonical got Linux Mint to close up shop because it might "create confusion among distros". Having your own distribution does not make you any more or less of a contributor to Linux or the Open Source movement. If there were no metadistributions of Ubuntu, and Ubuntu somehow caused that, it would not cease to be Linux. Some could say the same about Ubuntu vs. Debian as well...but the difference is that the people involved with Ubuntu would not be able to do what they are doing if they were part of the Debian organization. Many minor distros cannot make this argument. I believe minor distros or metadistros help Linux when they do something that cannot be done by simply contributing to the parent distribution. Otherwise they do not.
Despite this and other things, I don't believe that Ubuntu (as an example) has moved far enough away from the "Nature of Linux" to say that it is no longer Linux. They seem to straddle the line between "Linux" and "Linux-like thing but not Linux" pretty well. It is almost completely free/open source and plays well with the open source community. What is non-free, closed, or proprietary is not forced on the user - I can have a completely FOSS box and still run Ubuntu. Changes have been made to "improve" the user experience and ease of use for Joe User without throwing away what your typical Linux hacker holds dear. While I use GNOME, I am not forced to keep it. While I use Firefox, I can dump it and use Iceweasel instead, and do so with ease. I believe Linux as the beacon of software freedom and choice not only continues to exist - it has been enhanced through vendors/distros such as Ubuntu.
However, I agree with your view that OS X is not BSD. It certainly is possible for a vendor to take Linux/BSD far enough away from its roots to say that it is no longer that thing. I'm not quite sure at what point that line is crossed, but I believe Red Hat and Ubuntu are dancing on that line; Apple has jumped over it.
So let the bank get robbed because insurance will pay out? That's pretty weak. Shit rolls downhill. Banks are insured, and insurance costs money. Insurance costs go up when perceived risk goes up, to the detriment of non-rich bank customers, who lose money in the form of reduced interest rates and increased fees to cover new insurance premiums as well as increased security. Federal money also covers bank account funds, so non-rich taxpayers everywhere can lose out there as well. In modern times, a bank robbery can also include theft of non-rich people's personal financial information, which has value. Safe deposit boxes at banks also contain physical property of great value, monetary or otherwise, and plenty of non-rich people have those as well. Insurance can compensate for these things, but it can never truly replace things like private information or family relics.
I wasn't trying to imply that someone could steal my house by robbing a bank, and I realize now that it sounded like that. The intent was simply to further express how banks are tied to non-rich people. Plenty of non-rich people do business with banks, even if it is just for their mortgage.
Five days before the computer genius who killed his wife led police to her body, he was remorseless and angry in defense of his innocence.
A computer scientist is someone who fixes things that aren't broken.