Your comment is a tad cynical. Especially since, at least in my area, we're not seeing a decline in salary for good developers at all. But maybe that is the point: good developers make money. The people who have a problem are those who spend 2.5 years running through a "CS" degree at ITT Technical Institute's home-study program and then come out with a resume that reads like a who's who of fast food workers, yet expect a starting salary of $85,000 "because my college recruiter totally said that is what I could make". Those people will be making $4 an hour as the next great Friday-night waiter or waitress at Applebee's (though they might also make a tip, occasionally).
Is it hard to find a job? Well, that depends. Do you know how to collaborate? Do you come up with innovative solutions in a group setting? Do you keep up on your skills and network in the IT community? If you do, you'll have a good, well-paying job.
And, from the article, that is what these children are being taught in these classes. Good for them!
But to learn what? I was in the very top percentile of my class at every school I went to. Unfortunately for me, very few of the teachers could teach me anything that I did not find remedial. In the 7th grade a math teacher gave me the greatest insight I have ever had the pleasure of realizing. She said I would never learn anything from the teachers or textbooks in school that I couldn't easily figure out on my own. She encouraged me to help others and to learn new and interesting things from those around me by observation.
This opened up a whole new world for me. Yes, I tutored many people for a hefty sum (enough to comfortably pay for college without incurring any debt). But I also helped those who couldn't afford my services. I made friends, I learned as I taught, I gained valuable social and managerial skills, and most of all I got a great experience out of school, even though I hated just about every textbook I ever picked up and most of the lectures where teachers attempted to prepare me for "life" (which I guess is a code word for some standardized test that helps them get funding for the school).
For me, collaboration is the way to go. Ultimately, in good companies, that is how things work. I have my strengths, and the 6 people on my team sitting around me right now have theirs. We complement one another and we work well. Personally, I am glad I learned that while I was in school, and I have mostly forgotten all the lectures that bored me so badly.
Are you really attempting to say that writing code on a blackboard or a piece of paper in an interview setting is substantially similar to pair programming or getting a code review? I can't imagine how that comparison could be made at all. First of all, I have been in a lot of pairing sessions and a lot of code reviews; I either know the person pairing with me or reviewing me very well, or they have a stake in the situation as high as my own. Second, yes, comfort and familiarity with your coding environment are important. I find nothing comfortable about "coding" on a blackboard with people staring at me who may be my next employer, and who have to look at 4 other candidates that day.
That being said, I think asking a developer to develop is a good idea. I have often given developers a real-world task to complete at home before they come into the interview (sent a few days ahead of time). They turn in their work, which they accomplish in whatever environment they feel comfortable, and then we talk about the results when they come in. It works very well, and it is a lot less stressful on them than blackboard coding.
I absolutely agree. I run an HP EliteBook 8540w (Intel Core i7 with 8 GiB memory) with Fedora 15, and absolutely everything that he mentions in the post works fine. I run it dual monitor with a docking station, and all the features you would expect run without a hitch. The wifi question always cracks me up, since I have not had issues with wifi running Linux on a laptop in years.
Unless you are looking for Linux out of the box, I don't think any issue is insurmountable.
I know that 20 minutes is pretty short, but I have found that a presentation like this has the most impact when I actually give the audience the knowledge they need to start developing something on their own. A little research before the session (much like you are doing right now, kudos to you!) can produce a half-page sheet on how to set up a development environment, a short tutorial that solves a real problem, and even some links to further tutorials.
In a high school setting I am sure you will get a huge mix of people who range from very interested all the way down to super bored. With 20 minutes, play to the interested learners and show them the real deal. If they always seem interested in game programming, make sure to have a link on your half page to a game programming tutorial.
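As an illustration of the "short tutorial that solves a real problem" idea, a half-page handout might center on a snippet like this (a hypothetical example in Python; the comment doesn't prescribe any particular language):

```python
# A tiny "real problem" a student can run and then extend:
# count how often each word appears in a line of text.

def word_counts(text):
    """Return a dict mapping each lowercase word to its count."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

line = "the quick brown fox jumps over the lazy dog the end"
print(word_counts(line))  # 'the' shows up 3 times
```

Something this small fits on a half page, runs in any stock Python install, and leaves obvious hooks for the interested students (read a whole file, ignore punctuation, sort by count) to chase after the session.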
That makes a lot of really big assumptions. For example, in the case of my company, which may switch away from Oracle, we have ongoing licensing costs. That means we haven't "bought" Oracle; we are "buying" Oracle, and continuing to do so over and over again, every year.
Also, as another person mentioned, we use only a small percentage of the features that Oracle provides. For us, and I am assuming for a lot of others who are paying up the ying-yang for licenses, switching to a PostgreSQL solution makes a lot of sense. Really, all we want and need is a stable and cost-effective environment. (Now, I must say we are looking at enterprise PostgreSQL support, which isn't cheap and is far from free. But it is still a significant savings over Oracle's licensing fees.)
Yes, but in reality the developers who are now working faster and more efficiently because of their freed-up time probably won't profile or optimize. Most* will consider themselves done and pat themselves on the back for a rewriting job well done, without any further thought until something bad happens.
*Most -- Yes, I am aware that anyone reading this is the exception to the rule: you profile, optimize, and test extensively while not wasting time. You are on top of it.
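For what it's worth, a first profiling pass costs almost nothing. A minimal sketch using Python's built-in cProfile (Python is an assumption here; the thread names no language):

```python
# Run a function under cProfile, then print the most expensive calls
# before declaring the rewrite "done".
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately naive loop; a candidate for later optimization.
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
result = slow_sum(100_000)
profiler.disable()

# Capture the top 5 entries sorted by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

Five lines of harness around the code under test is usually enough to show whether the hotspot is where you assumed it was.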
There are surges in power usage as well, so I am not sure I see your point there. For example: solar power does not produce at night. True, but we can store the energy gathered during the day, during peak production and the most efficient portions of the day in various regions, and then draw on it at the times when that method of production is not producing. This isn't rocket science.
I happen to think that nuclear plants and to a lesser extent gas and coal power plants aren't as bad as they are being made out to be. But to say we need them because renewable or "green" sources are not stable is inaccurate. What we need in those cases are better and smarter grids for storing and handling the capacity needed versus the collection of power. And honestly, we need better and smarter grids even if we stick with purely coal powered plants.
I guess that last part is the real question, and my problem with this news as a whole. Is Google making a single language with coherent syntax and semantics? I know I shouldn't prejudge, having not seen the language, but given past experience I would say it is unlikely that Dart is moving toward the goals you mention.
I also agree with your statement "If they made it using a lisp-like syntax, they would be doing God's work." Not that I am any huge fan of Lisp (though I am), but the point is: if you are Google, why not throw your weight behind an existing solution, making it better and fixing its shortcomings, instead of producing the "next new thing"? Unfortunately, the truth might be that for marketing purposes it is more interesting to create a whole new language, get buzz behind it, send people off to conferences to talk about it, and ignore the existing communities. Existing solutions have issues, but I wouldn't say all of them are broken to the point where they cannot be fixed effectively, which is the only time a completely new language would make sense.
A year spent in artificial intelligence is enough to make one believe in God.