I go out and talk to kids of various ages, sometimes, and I do a lot of mentoring. I've talked to girls who want to program, and I've talked to other girls who don't. When I poll the ones who don't want to go into it, the girls at elementary-school age tell me:
"I can't do that, because I'm no good at math."
This just kills me. There is NO math these kids are doing in elementary school that is any indication of programming ability, whatsoever. I've been programming professionally for almost 20 years, and I'm terrible at elementary-school level arithmetic.
When I actually engage these same kids in a programming exercise, they light right up. They get right into it. Who is telling these girls they can't do this? It breaks my heart.
The only way to stop boring people is to stop being boring.
I think it's a lot more complex than that.
Of course it is.
We're starting to see news stories that children can operate tablets, but can't use building blocks.
Future-shock news stories are a perennial favorite. The news story you linked is Yet Another concerned-teachers story. Come to me when you have the results of a peer-reviewed research study to share.
A friend's daughter can use a tablet, but she also reads (or something close to it), and plays with lots of actual toys and the like. But they know some children in her age group who seem to have lesser skills when it comes to actual physical tasks instead of digital ones.
Anecdotes, confirmation bias, and hearsay. I mean, seriously, come on... how the heck do you know how much those other kids play with toys at home, or whether they might just naturally be less adept at these things, or even whether your friends might be naturally biased in favor of their own kid's abilities?
You start a 2 year old playing games on your smartphone or tablet, and they're going to always view it as a game.
In your opinion. A 2013 report from The Millennium Cohort Study showed conduct problems, emotional symptoms, and relationship problems among kids who spent more than three hours a day watching TV and other video content, but did not show the same negative behavioral effects from age-appropriate videogames.
I can't tell you how often I see mothers with their very young children playing on the phone as a keep-them-quiet measure.
In public. Where their kids might otherwise be climbing on the clothing racks in the store, like I did, when I was a kid, because I was incredibly bored. So what?
Are they doing the same thing at home? You don't know. The previous generation left the TV on to keep their kids out of their hair. As I showed above, that may actually be worse.
I'm not saying that it's not possible that tablet or phone usage may be causing some kind of trouble. I'm saying that I want real, scientific evidence of this, and not piles of "concerned" people spouting unproven hypotheses and biased anecdotes. Those are no basis by which to form any kind of sane public policy or parenting guidelines.
And I'm not at all surprised to see that by the time they reach school they've not got the attention span (or in some cases motor skills) they should.
And, if every time they've gotten bored or fussy someone gives them a phone, then when they hit school and that's not really an option, they're going to have NO idea of what to do, because they've always been given these things to keep them quiet. They've never learned that sometimes they have to suck it up and deal with it.
And thirty years ago, this would have read: "And if every time they've gotten bored and fussy someone puts them down in front of a TV, then when they get to school and that's not really an option, they're going to have NO idea of what to do, because they've always had a TV to keep them quiet. They've never learned that sometimes they have to suck it up and deal with it."
Me, I'm not surprised at all that people are seeing this.
I'm not surprised that there are people who see the Virgin Mary in pieces of toast.
Hell, I see a lot of kids where they're all looking at their phones -- and I wonder if they're texting one another from 3 feet away instead of interacting with one another.
You never passed notes, in school? Hell, thirty years ago, we were ominously told how text was soon going to be obsolete. Guess they were wrong about that!
Unfortunately it seems like what most people want is less "CLI and BASIC interpreter" and more "toaster".
Well, can you blame them, really? I don't really care how my toaster works.
You have to specifically seek out a rootable model, or purchase an expensive dev kit, to get any kind of access beyond "Touch the blue button for Facebook."
Or, you could download Unity3D, and do all kinds of cool stuff with it. Or, you could just install Codea on it, and write lua games right on the device. There are a lot of cool programmery things you can do with an iOS device -- it's just at a shallower level than you would like.
There will always be a section of the population genuinely interested in how computers work, and that will always be a smaller section. In the 70s, being interested in computers was a prerequisite to using one. Nobody was going to drop $8k on an Apple II without being interested enough to tinker a bit. The general population (toaster-types) simply were not using computers. Thus, people in the Apple II era seemed more interested. Either they were interested and involved, or they were out of the picture - most people being out of the picture.
All true. Where this really concerns me is that it has become a lot more difficult for teachers and guidance counselors to identify students with the potential to be computer science majors. Back in the day, if you just used a computer, you were a shining candidate. Today, kids are getting really, really wrong advice.
It drives me crazy, every time an elementary-school-aged kid tells me that she can't be a programmer, because the teacher said you have to be good at math. For Pete's sake, there's no math that you do in elementary school that is relevant to programming ability. I always tell the kids that I was terrible at elementary school math (because I was). What they really need to show potential for, at that age, is logic and computational thinking, but most elementary school curricula never address those. Most kids in elementary school have no way to know if they would be good at programming, without trying it out.
I hear you, but sitting behind the computer and doing Facebook and Trackmania is not the same as peeking and poking your Apple II in BASIC.
That's a valid point. Though, it is the things that we love on the computer that first inspire us to learn to program. For me, it was games.
The problem now is that people take computers for granted. It's a freakin' toaster, as far as most people are concerned. People are never given any incentive to look under the covers. I'm interested in what we can do to encourage more exploration.
My thoughts exactly.
This sounds like round 36 of "kids today and their rock-and-roll music." Teachers indulging in future-shock is just plain trite. Boring classes have always been boring. Kids like me have always had trouble slogging through them. If the kids have trouble paying attention to something that isn't exciting, then, for the love of all that is good, be more engaging. The only way to stop boring people is to stop being boring.
If computers actually impeded the ability to learn, I'd still be coding in BASIC.
The term man and words derived from it can designate any or even all of the human race regardless of their sex or age. The word developed into Old English man, mann meaning primarily "adult male human" but secondarily capable of designating a person of unspecified gender, "someone, one" or humanity at large.
Language pedantry from an Anonymous Coward? Aww, it feels like home...
I'll just leave this article here, since it will save me some typing: Think twice before using "mankind" to mean "all humanity," say scholars.
Everybody knows software development is a "young man's game"? Did you seriously say that?
HELLS no, man.
First off: I've been programming since I was 8, but I was never a man, and I will never be a man, and I have never suffered under the idiotic delusion that this was ever exclusively a man's game -- young or otherwise. This is my game.
I am still programming at 40, and I assure you that youth offers no advantages over experience, either.
But, that doesn't stop me from mentoring. My interns may not be able to program like I do, but I'll give 'em every advantage I can. It's great to teach them some of those intrinsics that they don't get in school. That gives them some of the advantages of an experienced developer, even if they're younger. This isn't a zero-sum game. We all need good devs, so we should try to make everyone who is working with us better -- whether they are young or old. We all get better software, that way.
Right. Exactly. The point that I'm making is that I think we need to create more demand for the critical thinkers. A+ certifications encourage your newest IT staff to not ever think or ask questions, because they don't have the authority to use their own brains for anything. I would venture that this is the heart of the reason for the lack of critical thinking skills in IT. Yet, countless junior IT positions demand A+ Certs, so that is the requirement that education is ultimately satisfying. When we use certifications as a crutch to determine if people will make good IT workers, we are getting exactly the non-critical thinkers that we deserve.
If IT workers knew how to think critically, they would go into programming, instead.
*cough* OK, that was mean. The thing is, critical thinking skills are notoriously difficult to teach effectively. Maybe we should put more effort into hiring IT workers who can solve problems, instead of looking for people with the right combination of resume bullet-points. If we created greater demand for critical thinkers, instead of creating demand for certifications, perhaps we would see more effort put into learning to solve problems.
Or not. Maybe we just wouldn't find anyone to hire.
Here are some of the benefits:
1. We have taken up the discipline of writing check-in messages that are easy to digest.
2. Players have an opportunity to get excited about what we are doing before it is released, but after we have done the work.
3. Players can see that it's a living project -- that we are actively improving the game.
4. Players can see that bugs are being fixed -- that we care.
Obviously, we make an effort not to post things that are going to compromise our security.
Has there been a downside? It hasn't bitten us yet. There is usually no reason to hide what technologies you are using, unless you are using something that is highly vulnerable, or you are making other bad choices. Don't do that. There is no reason to hide that your software has bugs. Everyone knows you have bugs. It's only shameful if you aren't fixing them.
Are we really crazy?
Mwahaha! Perhaps, instead of "like playing console games," I should have said, "like playing Little Big Planet."
The open source world cares as much about the PS3 as a microwave oven. Yep. That about sums it up nicely.
If that were the case, then PS3s would be nearly ubiquitous, and no one would remember how to get by without one.
That is an entirely fair point! I've worked on embedded systems, too, and you are totally right. It is entirely plausible that my microwave could be running Linux. I suppose it would have been more correct to say that I couldn't install Linux on my microwave, rather than that my microwave doesn't run it. And if I could, the level of difficulty probably wouldn't be worth the (admittedly awesome) ability to reprogram the preset buttons to perfectly nuke my favorite microwavable foods, instead of whatever random assortment of crap the manufacturer thought I would want to microwave.
While I'm dreaming, I would love to be able to write new programs for my bread machine, damn it. It has no setting that can accommodate the extended kneading and rise time necessary for a loaf of whole-grain sourdough started from a home-grown levain. Clearly, I should invest in one of the more expensive programmable models, but that possibility doesn't keep me from itching to hack the one I've got.
I agree. This quote really made me giggle:
But by omitting the option to install GNU/Linux on its new PS3, it has removed the final reason for the open source world to care about Sony.
Unless they -- I don't know -- like playing console games, like the vast majority of people who buy game consoles. My microwave oven doesn't run Linux, either, but it somehow manages to still be useful to me.
Honestly, I think out-of-touch rants like this only serve to further reinforce the "Linux zealot" stereotype, and drive the mainstream away from Linux.