You've just described a whole lot of inertia. Business practices that, you know, may need to change.
Preventing tired surgeons from operating is the kind of thing that could force such change.
Why should I care about distributed source code control in a monolithic commercial development environment? I can see its value in a distributed open-source project, but I really don't understand the necessity otherwise.
Let's let Joel (the original Joel) explain why in his own words: http://www.joelonsoftware.com/items/2010/03/17.html
Did Joel think this was important? You be the judge:
In that podcast, I said, “To me, the fact that they make branching and merging easier just means that your coworkers are more likely to branch and merge, and you’re more likely to be confused.”
Well, you know, that podcast is not prepared carefully in advance; it’s just a couple of people shooting the breeze. So what usually happens is that we say things that are, to use the technical term, wrong. Usually they are wrong either in details or in spirit, or in details and in spirit, but this time, I was just plain wrong. Like strawberry pizza. Or jalapeño bagels. WRONG....
...And here is the most important point, indeed, the most important thing that we’ve learned about developer productivity in a decade. It’s so important that it merits a place as the very last opinion piece that I write, so if you only remember one thing, remember this:...
...This is too important to miss out on. This is possibly the biggest advance in software development technology in the ten years I’ve been writing articles here.
Or, to put it another way, I’d go back to C++ before I gave up on Mercurial.
If you are using Subversion, stop it. Just stop. Subversion = Leeches. Mercurial and Git = Antibiotics. We have better technology now.
While they may still try to apply deep packet inspection to regular net connections (i.e., web usage), I suspect that most of these ideas will, in fact, apply to *Apps* on mobile devices, rather than to web usage.
So they could do various kinds of novel charges for using a YouTube App, etc., but possibly leave alone use of YouTube through a browser (other than overall bandwidth limitations). Now whether they would try to marginalize web browsing generally in favor of (favored) app access, I don't know...
None of the things I have learned from these leaks surprised me at all.
While I haven't read any of the releases directly or read too many reports about them, I agree with this statement so far. I mean, is there really anything particularly shocking here? Is there some compelling reason for us to be keeping such massive amounts - I'm sure this is only a tiny fragment of it all - of fairly obvious and unsurprising information secret?
Here's what has typically happened in the past 30-50 years:
- Republicans tend to spend as much money as Democrats, but instead of investing in infrastructure, education, research, health, etc., they plow it into starting wars, putting people in prison, and spying on everyone.
- On the revenue side, Republicans tend to lower taxes for the rich (but not, contrary to popular supposition, the poor or middle class), thus substantially increasing the deficit while not helping anyone that really needs "relief" from taxes.
By making government borderline useless to ordinary people and fiscally bankrupt, Republicans can make a case for the ineffectiveness and inefficiency of government, allowing them to cut MORE public services (while not cutting overall spending - i.e., plowing even more into the military, etc.), and cut taxes on the rich even more (again, while not cutting others' taxes), which makes government seem even more bankrupt, which allows them to complain some more, which gives them license...
You are very correct, but people harp on this because they feel swindled, and like there is nowhere else to turn. We were all told that in this post-manufacturing jobs market, we should pursue a college degree to become part of the "knowledge economy". Then, within just a few years, many of those jobs, too, got shipped overseas, and those that were left became vulnerable to corrupt H1B insourcing.
I wouldn't say that IE6 was really the problem. While it did things differently from Netscape (and it may have been clear even then that those differences didn't comply with the standards being formed), it was still probably the best browser in existence when it was released (as IE4 and IE5 had been). Remember, its competition was Netscape 4...
The problem was what came after: IE7 wasn't released until late 2006 (five years later!), and it required XP Service Pack 2 or higher to install. That left developers with far too long a stretch of IE6 as the state of the art in browsers (or at least as the latest IE) to target, and users with relatively little time since then to upgrade. MS browsers also don't do anything to encourage users to upgrade (aside from generally sucking). The SP2 requirement also meant corporate users on Windows 2000 and home users on Me/98 couldn't install IE7 at all.
IE7 really should have been released in 2004, should have run on Windows 2000, Me, and maybe even some versions of 98, and should have been included as at least an optional component of XP Service Pack 2. (No XP service pack has ever suggested users install a more modern browser than IE6.)
Then IE8 didn't come out until 2009. This is the browser Microsoft should have released in 2007 (which could have made it the default Vista browser), and it could have been included as an optional upgrade with XP SP3. It should also have been more backward compatible, at least to some versions of Windows 2000.
(Considering that IE8 has the same system requirements as IE7 and is better in every way, nobody really has any excuse for using IE7, whereas there are still some excuses to use IE6.)
Now they are doing it again: IE9 should have been released (fully, not just in beta) in 2010 and should have had better backward compatibility, but it's not coming out until next year, and it requires Vista SP2 to install. (On the other hand, they are catching up, moving from a five-year gap to a two-year browser release cycle, and IE9 really does look competitive. Even IE8 is a decent browser if you aren't using any CSS3/HTML5 features or relying too heavily on JavaScript that could run slowly.)
Considering the slow uptake of IE8, and the significantly higher system requirements of IE9 (despite its being released just a couple of years later), I don't think IE9 is going to see very significant uptake any time soon. IE6/7/8 (with their total lack of HTML5/CSS3 support) will still make up the bulk of IE usage for years.
As such, we developers need to keep doing more to encourage people to switch from pre-IE9 versions of IE - using CSS3 more freely for cosmetic enhancements that IE users simply won't see, and fixing only functional problems in IE6 and IE7, not cosmetic ones (there's a sketch of the feature-detection side of this below) - or we're going to be shackled to outdated development practices for years. Microsoft sure isn't doing much to encourage users to switch (you'd almost think they were discouraging it, based on the history above).
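To make that concrete, here's a minimal sketch of the feature-detection approach (the same idea libraries like Modernizr are built on), assuming a browser environment; the "css3-capable" class name is just a hypothetical hook I'm inventing for scoping the cosmetic rules:

```typescript
// Minimal sketch: detect CSS3 support at runtime and opt in to the cosmetic
// layer only where it will actually render; IE6-8 keep the plain look.

function supportsCssProperty(prop: string): boolean {
  // Browsers expose the CSS properties they understand on element style
  // objects; old IE simply has no "borderRadius" or "boxShadow" there.
  return prop in document.documentElement.style;
}

if (supportsCssProperty("borderRadius") && supportsCssProperty("boxShadow")) {
  // Stylesheet rules scoped under ".css3-capable" (rounded corners, shadows,
  // etc.) then apply only in capable browsers; nothing functional depends
  // on them.
  document.documentElement.className += " css3-capable";
}
```

The point is that the enhancement is purely additive: an IE6 user never hits a broken page, they just see square corners.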
Hear, hear. I especially like the series of bugs they refuse to acknowledge because they think that's how it should work. Look, people: I don't care if you categorize them as "intentional design flaw bugs", they're still bugs.
Oh? Let's not forget that copyright isn't a one-sided thing - end users (society) have rights, too. And "copy"rights aren't inalienable, they're granted by society to the original copyright holder. So it SHOULD mean managing the balance of rights of the copyright holder and the end user. DRM in practice tends to ignore user rights (such as fair use), and thus ought to be illegal as it's generally practiced.
"Writing about pure gameplay is tough."
Yes, this is exactly the problem: trying to describe games in the wrong terms and evaluate them in the wrong framework. We can probably all agree that great games are great because of their gameplay mechanics, and that story doesn't really matter. (Some great games also have good stories, but it's certainly not necessary, and for me, if the story drones on too long, even a good one just gets in the way of actually playing the game - like how you always skip cutscenes after the first time through.)
Yet non-gamers seem to think of games in a story-driven entertainment sense, like "how does this compare to a movie?" The answer should be "it doesn't; it compares to chess and poker and ping pong and billiards (and car racing and tennis and other sports, minus the sweat)." Games are GAMES: do you care whether checkers or Monopoly or bridge or badminton have great stories? So why do you care if a video game does?
Of course, the entertainment industry doesn't help by putting out endless big-budget, story-driven games often derived from other forms of entertainment, but which have crappy gameplay (if there's much actual gameplay at all), thus feeding the stereotype...
In fact you need XP SP2 or higher to install IE7.
IE6 was actually a great browser when it came out - the best browser in existence, hands down (its main competition was Netscape 4) - and it grew to something like a 90% share of all browsers in use. But that was nearly 10 years ago... Nobody should be using it today, yet it still accounts for 5-15% of browsers in use.
The situation isn't IE6's fault, but it is Microsoft's fault for not making IE7 backward compatible with at least Win2K and early XP. Actually, nobody should be using IE7 any more either... Firefox and Chrome users (and probably Safari users, for the most part) manage to upgrade to fairly recent versions without any difficulty, so what is it about Internet Explorer that leaves more than half its users on versions that are years out of date?
But still, even IE8 has essentially zero support for CSS3 or HTML5, so even it needs massive help if developers want to move the web forward... We can all hope IE9 lives up to its promises, but how many years will it take before even half of IE users have switched over to it?
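By "massive help" I mostly mean shims. The best-known trick (the one html5shiv is built on - this is a loose sketch of the idea, not the library's actual code) relies on the fact that old IE won't apply styles to unknown elements unless document.createElement has been called for them, so you run something like this in the <head>, before the body is parsed:

```typescript
// Loose sketch of the createElement shim idea (see html5shiv for the real
// thing): touching each new HTML5 tag via document.createElement makes old
// IE's parser treat those elements as styleable containers.
var html5Tags = ["article", "aside", "figure", "footer", "header", "nav", "section"];
for (var i = 0; i < html5Tags.length; i++) {
  document.createElement(html5Tags[i]); // side effect only; result discarded
}
```

Even then the new elements only become styleable; anything genuinely new like <canvas> or <video> needs much heavier polyfills, which is why the share of pre-IE9 browsers matters so much.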
They are called computers simply because computation is the only significant job that has so far been given to them.